Choose your explanation: a comparison of SHAP and Grad-CAM in human activity recognition
| Published in: | Applied Intelligence (Dordrecht, Netherlands) Vol. 55; no. 17; p. 1107 |
|---|---|
| Main Authors: | Tempel, Felix; Groos, Daniel; Ihlen, Espen Alexander F.; Adde, Lars; Strümke, Inga |
| Format: | Journal Article |
| Language: | English |
| Published: | New York: Springer US, 01.11.2025 (Springer Nature B.V) |
| Subjects: | Explainable AI (XAI); Human Activity Recognition (HAR); SHAP; Grad-CAM |
| ISSN: | 0924-669X (print); 1573-7497 (online) |
| Online Access: | Get full text |
| Abstract | Explaining machine learning (ML) models using eXplainable AI (XAI) techniques has become essential to make them more transparent and trustworthy. This is especially important in high-risk environments like healthcare, where understanding model decisions is critical to ensure ethical, sound, and trustworthy outcome predictions. However, users are often confused about which explainability method to choose for their specific use case. We present a comparative analysis of two explainability methods, Shapley Additive Explanations (SHAP) and Gradient-weighted Class Activation Mapping (Grad-CAM), within the domain of human activity recognition (HAR) utilizing graph convolutional networks (GCNs). By evaluating these methods on skeleton-based input representations from two real-world datasets, including a healthcare-critical cerebral palsy (CP) case, this study provides vital insights into both approaches’ strengths, limitations, and differences, offering a roadmap for selecting the most appropriate explanation method based on specific models and applications. We qualitatively and quantitatively compare the two methods, focusing on feature importance ranking and model sensitivity through perturbation experiments. While SHAP provides detailed input feature attribution, Grad-CAM delivers faster, spatially oriented explanations, making both methods complementary depending on the application’s requirements. Given the importance of XAI in enhancing trust and transparency in ML models, particularly in sensitive environments like healthcare, our research demonstrates how SHAP and Grad-CAM could complement each other to provide model explanations. |
|---|---|
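The abstract contrasts attribution-style explanations (SHAP) with gradient-based saliency (the idea behind Grad-CAM). As an illustrative sketch only — not the paper's GCN pipeline — the contrast can be shown on a toy linear model, where exact Shapley values have a closed form and the gradient is available analytically:

```python
import numpy as np

# Toy differentiable "model": a linear scorer over 4 input features.
# For a linear model f(x) = w.x, the exact SHAP value of feature i is
# w_i * (x_i - mean_i): attribution relative to a background dataset.
rng = np.random.default_rng(0)
w = np.array([3.0, -1.0, 0.5, 2.0])        # model weights
background = rng.normal(size=(100, 4))     # background data for SHAP
x = np.array([1.0, 2.0, -1.0, 0.5])        # instance to explain

def f(x):
    return w @ x

# SHAP-style attribution (exact for linear models; satisfies the
# efficiency property: attributions sum to f(x) - f(background mean))
shap_vals = w * (x - background.mean(axis=0))

# Gradient-based saliency (gradient x input; the gradient of f is w).
# Grad-CAM applies the same gradient idea to conv feature maps.
grad_saliency = w * x

shap_rank = np.argsort(-np.abs(shap_vals))
grad_rank = np.argsort(-np.abs(grad_saliency))
print("SHAP ranking:    ", shap_rank)
print("Gradient ranking:", grad_rank)
```

The rankings can disagree: SHAP measures deviation from a background distribution, while gradient saliency reflects only the local slope at the input — one way to see why the paper treats the two methods as complementary rather than interchangeable.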
| ArticleNumber | 1107 |
| Author | Ihlen, Espen Alexander F.; Strümke, Inga; Groos, Daniel; Tempel, Felix; Adde, Lars |
| Author affiliations | Tempel, Felix (Faculty of Informatics, Norwegian University of Science and Technology; ORCID 0009-0005-6310-408X; felix.e.f.tempel@ntnu.no); Groos, Daniel (Faculty of Medicine and Health Sciences, Norwegian University of Science and Technology); Ihlen, Espen Alexander F. (Faculty of Medicine and Health Sciences, Norwegian University of Science and Technology); Adde, Lars (Department of Clinical and Molecular Medicine, NTNU, Clinic of Rehabilitation, St. Olavs Hospital, Trondheim University Hospital); Strümke, Inga (Faculty of Informatics, Norwegian University of Science and Technology) |
| ContentType | Journal Article |
| Copyright | The Author(s) 2025. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
| DOI | 10.1007/s10489-025-06968-3 |
| Discipline | Computer Science |
| EISSN | 1573-7497 |
| GrantInformation | NTNU Norwegian University of Science and Technology (incl. St. Olavs Hospital - Trondheim University Hospital) |
| ISSN | 0924-669X |
| IsDoiOpenAccess | true |
| IsOpenAccess | true |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 17 |
| Keywords | HAR; XAI; SHAP; Explainable AI; Human Activity Recognition; Grad-CAM |
| Language | English |
| ORCID | 0009-0005-6310-408X |
| OpenAccessLink | https://link.springer.com/10.1007/s10489-025-06968-3 |
| PublicationDate | 2025-11-01 |
| PublicationPlace | New York |
| PublicationSubtitle | The International Journal of Research on Intelligent Systems for Real Life Complex Problems |
| PublicationTitle | Applied intelligence (Dordrecht, Netherlands) |
| PublicationTitleAbbrev | Appl Intell |
| PublicationYear | 2025 |
| Publisher | Springer US; Springer Nature B.V |
| StartPage | 1107 |
| SubjectTerms | Artificial Intelligence; Artificial neural networks; Computer Science; Datasets; Decision making; Explainable artificial intelligence; Health care; Human activity recognition; Kinematics; Machine learning; Machines; Manufacturing; Mechanical Engineering; Processes; Trustworthiness |
| Title | Choose your explanation: a comparison of SHAP and Grad-CAM in human activity recognition |
| URI | https://link.springer.com/article/10.1007/s10489-025-06968-3; https://www.proquest.com/docview/3273745422 |
| Volume | 55 |