A Comparison of Two Low-Stakes Methods for Administering a Program-Level Biology Concept Assessment

    Authors: Brian A. Couch1,*, Jennifer K. Knight2
    Affiliations: 1: School of Biological Sciences, University of Nebraska, Lincoln, NE 68588; 2: Department of Molecular, Cellular, and Developmental Biology, University of Colorado, Boulder, CO 80309
    AUTHOR AND ARTICLE INFORMATION
    • For Course 1, the first semester was co-taught by an additional instructor, but the course structure and instructional materials remained nearly identical across terms.
    • *Corresponding author. Mailing address: 204 Manter, Lincoln, NE 68588-0118. Phone: 402-472-8130. Fax: 402-472-2083. E-mail: [email protected].
    • ©2015 Author(s). Published by the American Society for Microbiology.
    Source: J. Microbiol. Biol. Educ. December 2015 vol. 16 no. 2 178-185. doi:10.1128/jmbe.v16i2.953

    Abstract:

    Concept assessments are commonly used in undergraduate science courses to assess student learning and diagnose areas of student difficulty. While most concept assessments align with the content of individual courses or course topics, some have been developed for use at the programmatic level to gauge student progress and achievement over a series of courses or an entire major. The broad scope of a program-level assessment, which exceeds the content of any single course, creates several test administration issues, including finding a suitable time for students to take the assessment and adequately incentivizing student participation. These logistical considerations must also be weighed against test security and the possibility that students will use unauthorized resources, which could compromise test validity. To understand how potential administration methods affect student outcomes, we administered the Molecular Biology Capstone Assessment (MBCA) to three pairs of matched upper-division courses in two ways: as an online assessment taken by students outside of class and as a paper-based assessment taken during class. We found that overall test scores were not significantly different between the two administration methods and that individual item difficulties were highly correlated. However, in-class administration resulted in reduced completion rates for items at the end of the assessment. Taken together, these results suggest that an online, outside-of-class administration produces scores comparable to those from a paper-based, in-class format, with the added advantages that instructors do not have to dedicate class time and students are more likely to complete the entire assessment.

Figures

FIGURE 1

Effect of format on overall scores. Bars represent average percent correct for each class ± SEM. Filled bars indicate the semester in which the test was given online, outside of class. Unfilled bars indicate the semester in which the test was given on paper, in class. Two-factor ANOVA (format × course): main effect of format, F = 0.10, p = 0.76; main effect of course, F = 4.62, p = 0.01; interaction, F = 0.26, p = 0.77. SEM = standard error of the mean.
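
The ANOVA reported above can be reproduced with standard statistical software. The following is a minimal Python sketch, assuming a hypothetical long-format table with one row per student; the file and column names are illustrative, not the authors' actual data or code.

```python
# Two-factor ANOVA (format x course) as in FIGURE 1. Assumes a
# hypothetical CSV with columns "score" (percent correct),
# "format" ("online" or "paper"), and "course" (1, 2, or 3).
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

scores = pd.read_csv("mbca_scores.csv")  # hypothetical input file

# Fit a linear model with both main effects and their interaction,
# then print the ANOVA table (an F and p value for each term).
model = smf.ols("score ~ C(format) * C(course)", data=scores).fit()
print(anova_lm(model, typ=2))
```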

FIGURE 2

Comparison of T/F statement difficulties. Symbols represent difficulties for each T/F statement (4 per question) given either online, outside of class (black circles) or on paper, in class (gray triangles). Lines between data points are included to help visually trace the two administration formats. Note that a higher difficulty indicates a higher proportion of correct answers (i.e., an easier question). Correlation between statements: Pearson's r = 0.92. Mantel-Haenszel differential item functioning: † = Category B (10d, 14c, 15b); ‡ = Category C (7c). T/F = true/false.
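
For readers wishing to replicate these analyses, the sketch below shows one way to compute the Pearson correlation and a Mantel-Haenszel DIF statistic in Python. The input files, the score-matching strata, and the use of the ETS delta transform are assumptions for illustration, not the authors' published code.

```python
# Sketch of the FIGURE 2 analyses; all inputs are hypothetical.
import numpy as np
from scipy.stats import pearsonr
from statsmodels.stats.contingency_tables import StratifiedTable

# (1) Correlation of per-statement difficulties across the two formats.
online = np.loadtxt("online_difficulties.txt")  # hypothetical files
paper = np.loadtxt("paper_difficulties.txt")
r, p = pearsonr(online, paper)  # caption reports r = 0.92

# (2) Mantel-Haenszel DIF for one statement. Each stratum (students
# matched on total score) contributes a 2x2 count table:
# [[online_correct, online_incorrect], [paper_correct, paper_incorrect]].
def mh_dif(tables):
    st = StratifiedTable([np.asarray(t) for t in tables])
    delta = -2.35 * np.log(st.oddsratio_pooled)  # ETS delta (MH D-DIF) scale
    pvalue = st.test_null_odds().pvalue          # test of no DIF
    return delta, pvalue  # |delta| and significance drive the A/B/C flags
```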

FIGURE 3

Individual T/F statement completion rates. Symbols represent the percent of students marking an answer for each T/F statement given either online, outside of class (black circles) or on paper, in class (gray triangles). T/F = true/false.
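
Completion rates like these are straightforward to compute. A minimal sketch, assuming a hypothetical response matrix in which unanswered statements appear as blank cells:

```python
# Percent of students answering each T/F statement, as in FIGURE 3.
# Assumes a hypothetical CSV with one row per student and one column
# per statement; blank cells (unanswered) are read as NaN.
import pandas as pd

responses = pd.read_csv("responses.csv")     # hypothetical input
completion = responses.notna().mean() * 100  # percent answered, per statement
print(completion.tail())  # end-of-test items show the in-class drop-off
```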

FIGURE 4

Total assessment completion time for the online, outside-of-class administration. Gray bars represent the percent of students whose total completion time falls within each bin. Labels indicate the upper threshold of each bin; for example, the right-most bin contains students who took more than 85 minutes but no more than 90 minutes.
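
The binning can be reproduced as follows; the sketch assumes a hypothetical vector of total times in minutes and uses 5-minute bins to match the caption's example.

```python
# Histogram binning for FIGURE 4. "completion_times.txt" is a
# hypothetical file of per-student total times in minutes. Note that
# numpy's bins are left-closed [lower, upper), a minor difference from
# the caption's (lower, upper] convention.
import numpy as np

times = np.loadtxt("completion_times.txt")
edges = np.arange(0, times.max() + 5, 5)     # 5-minute-wide bins
counts, _ = np.histogram(times, bins=edges)
percent = 100 * counts / counts.sum()        # percent of students per bin
```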

FIGURE 5

Time per question for the online, outside-of-class administration. Central bars represent the median question time, boxes represent the inner quartiles, and whiskers represent the 5th and 95th percentiles.
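
Matplotlib draws box plots with exactly these conventions; the sketch below uses synthetic stand-in data, since the per-question timing data are not reproduced here.

```python
# Box plot in the style of FIGURE 5, with synthetic stand-in data.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Illustrative only: lognormal per-question times for 10 questions.
question_times = [rng.lognormal(mean=1.0, sigma=0.5, size=100) for _ in range(10)]

# whis=(5, 95) places the whiskers at the 5th and 95th percentiles,
# matching the caption; the box spans the inner quartiles by default.
plt.boxplot(question_times, whis=(5, 95))
plt.xlabel("Question")
plt.ylabel("Time (minutes)")
plt.show()
```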

FIGURE 6

Relationship between total assessment time and overall test scores for the online, outside-of-class format. Each gray dot corresponds to the overall score for a single student taking the indicated amount of time. The black line shows the linear correlation between variables (r = 0.24).
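
The fitted line and correlation can be obtained with an ordinary least-squares fit; a minimal sketch with placeholder file names:

```python
# Linear fit of overall score against total time, as in FIGURE 6.
# Input files are hypothetical placeholders for paired per-student data.
import numpy as np
from scipy.stats import linregress

times = np.loadtxt("total_times.txt")
scores = np.loadtxt("overall_scores.txt")

fit = linregress(times, scores)  # ordinary least-squares line
print(fit.slope, fit.intercept)  # parameters of the plotted line
print(fit.rvalue)                # caption reports r = 0.24
```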

