
Development of a New Scoring System To Accurately Estimate Learning Outcome Achievements via Single, Best-Answer, Multiple-Choice Questions for Preclinical Students in a Medical Microbiology Course

    Authors: Yodying Dangprapai1, Popchai Ngamskulrungroj2,*, Sansnee Senawong3, Patompong Ungprasert4, Azian Harun5
    Affiliations: 1: Department of Physiology, Faculty of Medicine, Siriraj Hospital, Mahidol University, Bangkok 10700, Thailand; 2: Department of Microbiology, Faculty of Medicine, Siriraj Hospital, Mahidol University, Bangkok 10700, Thailand; 3: Department of Immunology, Faculty of Medicine, Siriraj Hospital, Mahidol University, Bangkok 10700, Thailand; 4: Clinical Epidemiology Unit, Department of Research and Development, Faculty of Medicine, Siriraj Hospital, Mahidol University, Bangkok 10700, Thailand; 5: Department of Medical Microbiology and Parasitology, School of Medical Sciences, Universiti Sains Malaysia, 16150 Kubang Kerian, Kelantan, Malaysia
    AUTHOR AND ARTICLE INFORMATION
    • Received 18 February 2019 Accepted 20 November 2019 Published 28 February 2020
    • ©2020 Author(s). Published by the American Society for Microbiology
    • [open-access] This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-NoDerivatives 4.0 International license (https://creativecommons.org/licenses/by-nc-nd/4.0/ and https://creativecommons.org/licenses/by-nc-nd/4.0/legalcode), which grants the public the nonexclusive right to copy, distribute, or display the published work.

    • Supplemental materials available at http://asmscience.org/jmbe
    • *Corresponding author. Mailing address: Department of Microbiology, Faculty of Medicine, Siriraj Hospital, Mahidol University, 2 Wang Lang Rd., Siriraj, Bangkok Noi, Bangkok 10700, Thailand. Phone: +66 2 419 7053. Fax: +66 2 418 4148. E-mail: [email protected].
    Source: J. Microbiol. Biol. Educ. February 2020 vol. 21 no. 1 doi:10.1128/jmbe.v21i1.1773

    Abstract:

    During the preclinical years, single-best-answer multiple-choice questions (SBA-MCQs) are often used to test the higher-order cognitive processes of medical students (such as application and analysis) while simultaneously assessing lower-order processes (such as knowledge and comprehension). Consequently, it can be difficult to pinpoint which learning outcome has been achieved or needs improvement. We developed a new scoring system for SBA-MCQs using a step-by-step methodology to evaluate each learning outcome independently. Enrolled in this study were third-year medical students (n = 316) who had registered in the basic microbiology course at the Faculty of Medicine, Siriraj Hospital, Mahidol University during the academic year 2017. A step-by-step SBA-MCQ with a new scoring system was created and used as a tool to evaluate the validity of traditional SBA-MCQs that assess two separate outcomes simultaneously. The scores for the two methods, expressed as percentages, were compared using two different questions (SBA-MCQ1 and SBA-MCQ2). SBA-MCQ1 tested the students’ knowledge of the causative agent of a specific infectious disease and the basic characteristics of the microorganism, while SBA-MCQ2 tested their knowledge of the causative agent of a specific infectious disease and the pathogenic mechanism of the microorganism. The mean score obtained with the traditional SBA-MCQs was significantly lower than that obtained with the step-by-step SBA-MCQs (85.9% for the traditional approach versus 90.9% for step-by-step SBA-MCQ1, p < 0.001; and 81.5% for the traditional system versus 87.4% for step-by-step SBA-MCQ2, p < 0.001). Moreover, 65.8% and 87.8% of the students scored lower with the traditional SBA-MCQ1 and the traditional SBA-MCQ2, respectively, than with the corresponding step-by-step SBA-MCQs.
These results suggest that traditional SBA-MCQ scores need to be interpreted with caution because they have the potential to underestimate the learning achievement of students. Therefore, the step-by-step SBA-MCQ is preferable to the traditional SBA-MCQs and is recommended for use in examinations during the preclinical years.
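The contrast between the two scoring schemes can be sketched in code. This is an illustrative sketch only: the function names and the equal per-outcome weighting (0.5 per outcome) in the step-by-step scorer are assumptions for demonstration, not the authors' published rubric, which readers should take from the article itself.

```python
def traditional_score(outcome1_correct: bool, outcome2_correct: bool) -> float:
    """Traditional SBA-MCQ: one point only if both outcomes are answered
    correctly; a correct answer for just one outcome earns nothing."""
    return 1.0 if (outcome1_correct and outcome2_correct) else 0.0


def step_by_step_score(outcome1_correct: bool, outcome2_correct: bool,
                       w1: float = 0.5, w2: float = 0.5) -> float:
    """Step-by-step SBA-MCQ: each outcome is scored independently.
    The weights w1 and w2 are hypothetical; equal weighting is assumed."""
    return w1 * outcome1_correct + w2 * outcome2_correct


# A student who identifies the causative agent (Outcome 1) but misses the
# second outcome earns nothing under traditional scoring, yet receives
# partial credit under the step-by-step system -- which is why traditional
# scores can underestimate achievement of the first learning outcome.
print(traditional_score(True, False))    # 0.0
print(step_by_step_score(True, False))   # 0.5
```

Averaged over a cohort, the step-by-step scores can therefore only equal or exceed the traditional scores for the same answers, consistent with the direction of the differences reported above.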



Figures

FIGURE 1

An example of the traditional SBA-MCQs used for preclinical-year students. A) SBA-MCQ1 asking two outcomes simultaneously (Outcome 1 and Outcome 2); C) SBA-MCQ2 asking Outcome 1 and Outcome 3 simultaneously; B) and D) To score 1 point from the SBA-MCQ1 or the SBA-MCQ2, a student must display correct knowledge for both outcomes. Correct knowledge for only one of the two outcomes will result in zero points. SBA-MCQ = single-best-answer multiple-choice question.

FIGURE 2

An example of a step-by-step SBA-MCQ and its scoring system. Representative scores when using the scoring system of the traditional SBA-MCQ are also presented for comparison. *Not applicable as the traditional SBA-MCQ score only depends on the answer for Outcome 2 or Outcome 3; **Score = 0 if the answer is something other than the options listed in the figure. SBA-MCQ = single-best-answer multiple-choice question.
