Toward a unified methodology for preliminary digital evidence assessment: Standardizing forensic investigations.

Bibliographic Details
Title: Toward a unified methodology for preliminary digital evidence assessment: Standardizing forensic investigations.
Authors: AlBusaidi AJ; Computer Science and Information Technology, University of Malaya, Kuala Lumpur, Malaysia., Kiah LBM; Computer Science and Information Technology, University of Malaya, Kuala Lumpur, Malaysia., Abdul Wahab AWB; Computer Science and Information Technology, University of Malaya, Kuala Lumpur, Malaysia.
Source: Journal of forensic sciences [J Forensic Sci] 2025 Jul; Vol. 70 (4), pp. 1571-1583. Date of Electronic Publication: 2025 May 14.
Publication Type: Journal Article
Language: English
Journal Info: Publisher: Blackwell Pub Country of Publication: United States NLM ID: 0375370 Publication Model: Print-Electronic Cited Medium: Internet ISSN: 1556-4029 (Electronic) Linking ISSN: 0022-1198 NLM ISO Abbreviation: J Forensic Sci Subsets: MEDLINE
Imprint Name(s): Publication: 2006- : Malden, MA : Blackwell Pub.
Original Publication: [Chicago, Ill.] : Callaghan and Co., 1956-
MeSH Terms: Forensic Sciences*/standards; Forensic Sciences*/methods; Image Processing, Computer-Assisted*; Humans; Bayes Theorem
Abstract: The increasing reliance of forensic investigations on digital evidence raises concerns about reliability, standardization, and misinterpretation. Inconsistent forensic evaluations necessitate a structured approach for examining the strength of digital evidence, which impacts judicial outcomes. This study aimed to propose a systematic preliminary digital evidence assessment methodology that integrates Bayesian reasoning to enhance evaluative interpretations. A phase-by-phase structured framework is introduced to guide forensic practitioners in assessing digital evidence through observation, hypothesis generation, and inference. The methodology uses the Certainty Scale (C-Scale) to improve consistency among forensic assessments, standardizing evaluative opinions. Additionally, developing a proof-of-concept database of digital evidence manipulation cases is essential to support the determination of evidence strength in investigations. The results showed that this approach advances transparency and limits cognitive bias in forensic evaluations. Aligned with international forensic regulatory frameworks and standards such as ISO 21043, the proposed methodology enhances forensic decision-making, particularly for investigators who lack digital forensic expertise. The study contributes to forensic science by presenting a standardized method for examining digital evidence strength, bridging the gap between theoretical evaluation models and practical forensic applications. To enhance transparency and provide a balanced perspective on the evidential value of observed digital evidence, it is crucial to standardize the approach that digital forensic practitioners take in formulating and articulating their preliminary evaluative opinions.
(© 2025 American Academy of Forensic Sciences.)
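
Note: The abstract refers to Bayesian reasoning and a Certainty Scale (C-Scale) for expressing evidence strength, but the record itself contains no formulas. The following minimal Python sketch only illustrates the general likelihood-ratio form of Bayesian evidence evaluation; the function names, probability values, and scale thresholds are assumptions chosen for demonstration and are not the authors' published C-Scale or methodology.

    # Illustrative sketch only: a generic Bayesian likelihood-ratio calculation
    # mapped onto a hypothetical ordinal certainty scale. Thresholds and
    # probabilities are assumed for demonstration, not taken from the article.

    def likelihood_ratio(p_e_given_h1: float, p_e_given_h2: float) -> float:
        """Return LR = P(E | H1) / P(E | H2) for two competing hypotheses."""
        if p_e_given_h2 == 0:
            raise ValueError("P(E | H2) must be non-zero to form a likelihood ratio.")
        return p_e_given_h1 / p_e_given_h2

    def certainty_level(lr: float) -> str:
        """Map an LR onto a hypothetical ordinal scale (illustrative thresholds)."""
        if lr < 1:
            return "evidence supports the alternative hypothesis (H2)"
        if lr < 10:
            return "weak support for H1"
        if lr < 100:
            return "moderate support for H1"
        return "strong support for H1"

    if __name__ == "__main__":
        # Example: a file-system timestamp pattern judged far more probable if the
        # record was manipulated (H1) than if it arose from normal use (H2).
        lr = likelihood_ratio(p_e_given_h1=0.8, p_e_given_h2=0.02)
        print(f"LR = {lr:.1f}: {certainty_level(lr)}")  # LR = 40.0: moderate support for H1

In the standard Bayesian updating step the abstract alludes to, the posterior odds of the competing hypotheses are obtained by multiplying the prior odds by this likelihood ratio; the scale labels above merely verbalize the ratio for non-specialist investigators.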
Contributed Indexing: Keywords: digital forensic strength of evidence; digital forensics; digital forensics report of findings; digital forensics statement of opinions; forensic investigations
Entry Date(s): Date Created: 20250515 Date Completed: 20250703 Latest Revision: 20250703
Update Code: 20250703
DOI: 10.1111/1556-4029.70070
PMID: 40369776
Database: MEDLINE