To explain or not to explain?—Artificial intelligence explainability in clinical decision support systems

Bibliographic Details
Published in: PLOS Digital Health, Vol. 1, No. 2, p. e0000016
Main Authors: Amann, Julia, Vetter, Dennis, Blomberg, Stig Nikolaj, Christensen, Helle Collatz, Coffee, Megan, Gerke, Sara, Gilbert, Thomas K., Hagendorff, Thilo, Holm, Sune, Livne, Michelle, Spezzatti, Andy, Strümke, Inga, Zicari, Roberto V., Madai, Vince Istvan
Format: Journal Article
Language: English
Published: Public Library of Science (PLoS), United States, 01.02.2022
ISSN: 2767-3170
Description
Summary: Explainability for artificial intelligence (AI) in medicine is a hotly debated topic. Our paper presents a review of the key arguments in favor of and against explainability for AI-powered Clinical Decision Support Systems (CDSSs), applied to a concrete use case, namely an AI-powered CDSS currently used in the emergency call setting to identify patients with life-threatening cardiac arrest. More specifically, we performed a normative analysis using socio-technical scenarios to provide a nuanced account of the role of explainability for CDSSs in this concrete use case, allowing for abstraction to a more general level. Our analysis focused on three layers: technical considerations, human factors, and the designated role of the system in decision-making. Our findings suggest that whether explainability can provide added value to a CDSS depends on several key questions: technical feasibility, the level of validation in the case of explainable algorithms, the characteristics of the context in which the system is implemented, the designated role in the decision-making process, and the key user group(s). Thus, each CDSS will require an individualized assessment of explainability needs, and we provide an example of what such an assessment could look like in practice.
Competing interests: VIM (Vince Istvan Madai) reported receiving personal fees from ai4medicine outside the submitted work. There is no connection, commercial exploitation, transfer, or association between the projects of ai4medicine and the results presented in this work.
DOI: 10.1371/journal.pdig.0000016