TY - JOUR
T1 - Online test administration results in students selecting more responses to multiple-choice-multiple-response items
AU - Olsho, Alexis
AU - Smith, Trevor I.
AU - Eaton, Philip
AU - Zimmerman, Charlotte
AU - Boudreaux, Andrew
AU - White Brahmia, Suzanne
N1 - Publisher Copyright:
© 2023 authors. Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license (https://creativecommons.org/licenses/by/4.0/). Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.
PY - 2023/1
Y1 - 2023/1
N2 - We developed the Physics Inventory of Quantitative Literacy (PIQL) to assess students' quantitative reasoning in introductory physics contexts. The PIQL includes several "multiple-choice-multiple-response" (MCMR) items (i.e., multiple-choice questions for which more than one response may be selected) as well as traditional single-response multiple-choice items. In this paper, we discuss differences in performance on MCMR items that seem to result from differences in administration method (paper versus online). In particular, we find a tendency for "clickiness" in online administration: students choose more responses to MCMR items when taking the electronic version of the assessment. Student performance on single-response multiple-choice items was not affected by administration method. These results suggest that MCMR items may provide a unique opportunity to probe differences in online and on-paper administration of low-stakes assessments.
AB - We developed the Physics Inventory of Quantitative Literacy (PIQL) to assess students' quantitative reasoning in introductory physics contexts. The PIQL includes several "multiple-choice-multiple-response" (MCMR) items (i.e., multiple-choice questions for which more than one response may be selected) as well as traditional single-response multiple-choice items. In this paper, we discuss differences in performance on MCMR items that seem to result from differences in administration method (paper versus online). In particular, we find a tendency for "clickiness" in online administration: students choose more responses to MCMR items when taking the electronic version of the assessment. Student performance on single-response multiple-choice items was not affected by administration method. These results suggest that MCMR items may provide a unique opportunity to probe differences in online and on-paper administration of low-stakes assessments.
UR - http://www.scopus.com/inward/record.url?scp=85153893514&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85153893514&partnerID=8YFLogxK
U2 - 10.1103/PhysRevPhysEducRes.19.013101
DO - 10.1103/PhysRevPhysEducRes.19.013101
M3 - Article
AN - SCOPUS:85153893514
SN - 2469-9896
VL - 19
JO - Physical Review Physics Education Research
JF - Physical Review Physics Education Research
IS - 1
M1 - 013101
ER -