The Effect of the Question Mark Option in Progress Testing: A Large-Scale Longitudinal Study.

Saved in:
Bibliographic Details
Title: The Effect of the Question Mark Option in Progress Testing: A Large-Scale Longitudinal Study.
Authors: van Wijk EV; Center for Innovation in Medical Education, Leiden University Medical Center, The Netherlands., Donkers J; School of Health Professions Education, Faculty of Health, Medicine and Life Sciences, Maastricht University, The Netherlands., de Laat PCJ; Department of Pediatrics, Erasmus Medical Center, Rotterdam, The Netherlands., Meiboom AA; Department of General Practice and Elderly Care Medicine, Amsterdam University Medical Center, Amsterdam, The Netherlands., Jacobs B; Department of Neurology, University of Groningen, University Medical Center Groningen, Groningen, The Netherlands., Ravesloot JH; Department of Physiology, Amsterdam University Medical Center, Amsterdam, The Netherlands., Tio RA; Department of Cardiology, Catharina Hospital Eindhoven, Eindhoven, The Netherlands., Oud FMM; Education Centre, Department of Medical Education, University Medical Center Utrecht, Utrecht, The Netherlands., Kooman JP; Department of Internal Medicine, Division of Nephrology, Maastricht University, The Netherlands., Bremers AJA; Department of Surgery, Radboud University Medical Center, Nijmegen, The Netherlands., Langers AMJ; Department of Gastroenterology and Hepatology, Leiden University Medical Center, Leiden, The Netherlands.
Source: Perspectives on medical education [Perspect Med Educ] 2025 Dec 03; Vol. 14 (1), pp. 891-904. Date of Electronic Publication: 2025 Dec 03 (Print Publication: 2025).
Publication Type: Journal Article
Language: English
Journal Information: Publisher: Ubiquity Press; Country of Publication: Netherlands; NLM ID: 101590643; Publication Model: eCollection; Cited Medium: Internet; ISSN: 2212-277X (Electronic); Linking ISSN: 2212-2761; NLM ISO Abbreviation: Perspect Med Educ; Subsets: MEDLINE
Imprint Name(s): Publication: 2023- : [London] : Ubiquity Press
Original Publication: [Houten] : Bohn Stafleu van Loghum
MeSH Terms: Educational Measurement*/methods; Educational Measurement*/standards; Educational Measurement*/statistics & numerical data; Students, Medical*/statistics & numerical data; Students, Medical*/psychology; Humans; Longitudinal Studies; Retrospective Studies; Netherlands; Reproducibility of Results; Male; Female
Abstract: Introduction: Formula scoring, widely used in medical progress tests (PT), includes a question mark option to discourage guessing, but this feature may disadvantage risk-averse students and bias results due to test-taking strategies. To enhance reliability and assess ability more accurately, Dutch medical schools recently transitioned to a computer-adaptive PT (CA-PT) based on Item Response Theory, which adjusts question difficulty dynamically and excludes the question mark option. This transition provided a unique opportunity to evaluate the impact of the question mark option in a large cohort. We specifically explored the relationship between question mark use in the conventional PT and performance on the CA-PT.
Methods: Retrospective data from medical students across seven faculties who took both PT formats were analyzed. Z-scores for the total score and the question mark score (the number of unanswered questions) on the conventional PT, and the theta score for the CA-PT, were assessed. A linear model estimated the effect of the question mark score on theta, corrected for the conventional PT score. Cluster analysis explored student subgroups per study year.
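The linear model described in the Methods (theta regressed on the question mark z-score, corrected for the conventional PT z-score) can be sketched as follows. This is a minimal illustration on synthetic data; all variable names, effect sizes, and the sample size are assumptions for demonstration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # hypothetical number of students

# Hypothetical standardized (z) scores from the conventional PT:
z_pt = rng.standard_normal(n)   # total-score z-score
z_qm = rng.standard_normal(n)   # question-mark-score z-score (unanswered items)

# Hypothetical CA-PT ability estimate (theta) with an assumed positive
# question-mark effect of 0.2 after correcting for the PT score:
theta = 0.8 * z_pt + 0.2 * z_qm + rng.normal(0, 0.3, n)

# Linear model: theta ~ intercept + question-mark score + conventional PT score
X = np.column_stack([np.ones(n), z_qm, z_pt])
coef, *_ = np.linalg.lstsq(X, theta, rcond=None)
intercept, b_qm, b_pt = coef
print(f"question-mark effect on theta (PT-corrected): {b_qm:.2f}")
```

Because both predictors enter the model, `b_qm` estimates the association between question mark use and CA-PT ability among students with comparable conventional PT scores, mirroring the comparison reported in the Results.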
Results: Among students with similar conventional PT scores, those who left more questions unanswered on the conventional PT generally performed better on the CA-PT. This effect diminished as students advanced through their studies. Cluster analysis revealed a variable effect across students, most pronounced in year 4, and a reversed effect in year 5.
Discussion: Use of the question mark option significantly affected student performance on the PT, with remarkable variability among students. This variability suggests that formula scoring captures more than knowledge alone, highlighting the need to align scoring methods with the intended assessment goals.
(Copyright: © 2025 The Author(s).)
Competing Interests: The authors have no competing interests to declare.
References: BMC Med Educ. 2017 Nov 09;17(1):192. (PMID: 29121888)
J Contin Educ Health Prof. 2011 Summer;31(3):157-64. (PMID: 21953655)
R J. 2016 Aug;8(1):289-317. (PMID: 27818791)
Perspect Med Educ. 2016 Feb;5(1):51-5. (PMID: 26754310)
Med Teach. 2002 Jan;24(1):23-6. (PMID: 12098453)
Adv Health Sci Educ Theory Pract. 2024 Nov;29(5):1665-1688. (PMID: 38502460)
J Pers Soc Psychol. 1999 Dec;77(6):1121-34. (PMID: 10626367)
Med Teach. 2012;34(9):683-97. (PMID: 22905655)
Perspect Med Educ. 2024 Jul 26;13(1):406-416. (PMID: 39071727)
Psychol Bull. 1954 Jul;51(4):380-417. (PMID: 13177802)
Int J Environ Res Public Health. 2022 Feb 02;19(3):. (PMID: 35162729)
Med Teach. 2016 Nov;38(11):1125-1129. (PMID: 27117670)
Med Educ. 2012 Jan;46(1):71-9. (PMID: 22150198)
Med Educ. 1999 Apr;33(4):267-75. (PMID: 10336757)
Adv Health Sci Educ Theory Pract. 2015 Dec;20(5):1325-38. (PMID: 25912621)
Med Educ. 2005 Feb;39(2):163-70. (PMID: 15679683)
Adv Health Sci Educ Theory Pract. 2015 May;20(2):431-40. (PMID: 25103688)
Mil Med. 2012 Sep;177(9 Suppl):31-7. (PMID: 23029858)
Psychon Bull Rev. 1994 Mar;1(1):126-9. (PMID: 24203422)
Psychometrika. 2015 Mar;80(1):1-20. (PMID: 24499939)
Med Educ. 2003 Aug;37(8):739-45. (PMID: 12945568)
Psychol Rev. 2012 Jan;119(1):80-113. (PMID: 22022833)
BMC Med Educ. 2016 Oct 14;16(1):267. (PMID: 27741945)
Entry Date(s): Date Created: 20251208 Date Completed: 20251208 Latest Revision: 20251210
Update Code: 20251210
PubMed Central ID: PMC12680002
DOI: 10.5334/pme.1673
PMID: 41356409
Database: MEDLINE