Mele's Digital Zygote: Developer Responsibility for Neural Networks.

Bibliographic Details
Title: Mele's Digital Zygote: Developer Responsibility for Neural Networks.
Authors: Søgaard A; Department of Computer Science, University of Copenhagen, København, Denmark., Stamatiou F; Department of Communication, University of Copenhagen & Department of Philosophy, Stellenbosch University, Stellenbosch, South Africa. filippostam@gmail.com.
Source: Science and engineering ethics [Sci Eng Ethics] 2025 Nov 26; Vol. 31 (6), pp. 40. Date of Electronic Publication: 2025 Nov 26.
Publication Type: Journal Article
Language: English
Journal Info: Publisher: Opragen Publications Country of Publication: England NLM ID: 9516228 Publication Model: Electronic Cited Medium: Internet ISSN: 1471-5546 (Electronic) Linking ISSN: 1353-3452 NLM ISO Abbreviation: Sci Eng Ethics Subsets: MEDLINE
Imprint Name(s): Original Publication: Guildford, Surrey, UK : Opragen Publications, c1995-
MeSH Terms: Neural Networks, Computer*; Social Responsibility*; Technology*/ethics; Humans; Zygote
Abstract: Competing Interests: The authors declare no conflicts of interest.
Should developers be held responsible for the predictions of their neural networks, and if not, does that introduce a responsibility gap? The claim that neural networks introduce a responsibility gap has seen significant pushback, with philosophers arguing that the gap can be bridged or did not exist in the first place. We show how the responsibility gap turns on whether we can distinguish between foreseeable and unforeseeable neural network predictions. Empirical facts about neural networks tell us we cannot, which seems to force developers either to assume full responsibility or none at all, introducing a responsibility gap, unless, of course, the same empirical facts hold true of humans, in which case there is no gap and the trouble lies instead with the classical notion of responsibility. We revisit and revise Mele's Zygote, as well as the famous Palsgraf case, and argue that what complicates responsibility assignment for neural networks in fact also complicates responsibility assignment for humans, and that humans confront us with the same all-or-nothing dilemma. Thus, we agree that there is no technology-induced responsibility gap (there was no gap in the first place), but for slightly different reasons than our predecessors.
(© 2025. The Author(s).)
References: Camb Q Healthc Ethics. 2021 Jul;30(3):435-447. (PMID: 34109925)
Nat Mach Intell. 2019 May;1(5):206-215. (PMID: 35603010)
Br J Soc Psychol. 2000 Sep;39 ( Pt 3):313-25. (PMID: 11041004)
Trends Cogn Sci. 2020 Sep;24(9):694-703. (PMID: 32682732)
Mol Biol Evol. 2017 Aug 1;34(8):2057-2064. (PMID: 28525580)
Minds Mach (Dordr). 2024;34(3):20. (PMID: 38855350)
Sci Eng Ethics. 2020 Aug;26(4):2051-2068. (PMID: 31650511)
Sci Eng Ethics. 2018 Aug;24(4):1201-1219. (PMID: 28721641)
Contributed Indexing: Keywords: Accountability; Developer responsibility; Ethics of AI; Moral responsibility; Responsibility gaps
Entry Date(s): Date Created: 20251126 Date Completed: 20251126 Latest Revision: 20251129
Update Code: 20251129
PubMed Central ID: PMC12657523
DOI: 10.1007/s11948-025-00566-9
PMID: 41296098
Database: MEDLINE