
A Comparison of Two Low-Stakes Methods for Administering a Program-Level Biology Concept Assessment

    Authors: Brian A. Couch1,*, Jennifer K. Knight2
    Affiliations: 1: School of Biological Sciences, University of Nebraska, Lincoln, NE 68588; 2: Department of Molecular, Cellular, and Developmental Biology, University of Colorado, Boulder, CO 80309
    • For Course 1, the first semester was co-taught by an additional instructor, but the course structure and instructional materials remained nearly identical across terms.
    • *Corresponding author. Mailing address: 204 Manter, Lincoln, NE 68588-0118. Phone: 402-472-8130. Fax: 402-472-2083. E-mail: bcouch2@unl.edu.
    • ©2015 Author(s). Published by the American Society for Microbiology.
    Source: J. Microbiol. Biol. Educ. December 2015 vol. 16 no. 2 178-185. doi:10.1128/jmbe.v16i2.953

    Abstract:

    Concept assessments are used commonly in undergraduate science courses to assess student learning and diagnose areas of student difficulty. While most concept assessments align with the content of individual courses or course topics, some concept assessments have been developed for use at the programmatic level to gauge student progress and achievement over a series of courses or an entire major. The broad scope of a program-level assessment, which exceeds the content of any single course, creates several test administration issues, including finding a suitable time for students to take the assessment and adequately incentivizing student participation. These logistical considerations must also be weighed against test security and the ability of students to use unauthorized resources that could compromise test validity. To understand how potential administration methods affect student outcomes, we administered the Molecular Biology Capstone Assessment (MBCA) to three pairs of matched upper-division courses in two ways: an online assessment taken by students outside of class and a paper-based assessment taken during class. We found that overall test scores were not significantly different and that individual item difficulties were highly correlated between these two administration methods. However, in-class administration resulted in reduced completion rates of items at the end of the assessment. Taken together, these results suggest that an online, outside-of-class administration produces scores that are comparable to a paper-based, in-class format and has the added advantages that instructors do not have to dedicate class time and students are more likely to complete the entire assessment.

References & Citations

1. Adams WK, Wieman CE2011Development and validation of instruments to measure learning of expert-like thinkingInt J Sci Educ331289131210.1080/09500693.2010.512369 http://dx.doi.org/10.1080/09500693.2010.512369
2. American Educational Research Association (AERA), American Psychological Association (APA), National Council on Measurement and Education (NCME)2014The standards for educational and psychological testingWashington, DC
3. Anderson DL, Fisher KM, Norman GJ2002Development and evaluation of the conceptual inventory of natural selectionJ Res Sci Teach3995297810.1002/tea.10053 http://dx.doi.org/10.1002/tea.10053
4. Beno BA2004The role of student learning outcomes in accreditation quality reviewNew Dir Commun Coll1266572
5. Couch BA, Wood WB, Knight JK2015The Molecular Biology Capstone Assessment: a concept assessment for upper-division molecular biology studentsCBE Life Sci Educ14ar10257130984353076
6. Crocker L, Algina J2006Introduction to classical and modern test theoryWadsworth Pub. Co.Mason, OH
7. Ding L, Reay NW, Lee A, Bao L2008Effects of testing conditions on conceptual survey resultsPhys Rev ST Phys Educ Res401011210.1103/PhysRevSTPER.4.010112 http://dx.doi.org/10.1103/PhysRevSTPER.4.010112
8. Duckworth AL, Quinn PD, Lynam DR, Loeber R, Stouthamer-Loeber M2011Role of test motivation in intelligence testingProc Natl Acad Sci1087716772010.1073/pnas.1018601108215188673093513 http://dx.doi.org/10.1073/pnas.1018601108
9. Ewell PT1988Implementing assessment: some organizational issuesNew Dir Institutional Res15152810.1002/ir.37019885904 http://dx.doi.org/10.1002/ir.37019885904
10. Garvin-Doxas K, Klymkowsky MW2008Understanding randomness and its impact on student learning: lessons learned from building the Biology Concept Inventory (BCI)CBE Life Sci Educ722723310.1187/cbe.07-08-0063185196142424310 http://dx.doi.org/10.1187/cbe.07-08-0063
11. Gross LJ1982Scoring multiple true/false tests: some considerationsEval Health Prof545946810.1177/016327878200500407 http://dx.doi.org/10.1177/016327878200500407
12. Haslam F, Treagust DF1987Diagnosing secondary students’ misconceptions of photosynthesis and respiration in plants using a two-tier multiple choice instrumentJ Biol Educ2120321110.1080/00219266.1987.9654897 http://dx.doi.org/10.1080/00219266.1987.9654897
13. Higgerson ML1993Important components of an effective assessment programJ Assoc Commun Adm219
14. Howitt S, Anderson T, Costa M, Hamilton S, Wright T2008A concept inventory for molecular life sciences: how will it help your teaching practice?Aust Biochem391417
15. Kalas P, O’Neill A, Pollock C, Birol G2013Development of a meiosis concept inventoryCBE Life Sci Educ12655664242972923846516
16. Knight JK2010Biology concept assessment tools: design and useMicrobiol Aust3158
17. Libarkin J2008Concept inventories in higher education scienceNational Research CouncilWashington, DC
18. Linacre JM2014Winsteps Rasch measurement computer programWinsteps.comBeaverton, OR
19. Linacre JM2014Winstep Rasch measurement computer program user’s guideWinsteps.comBeaverton, OR
20. Liu OL, Bridgeman B, Adler RM2012Measuring learning outcomes in higher education: motivation mattersEduc Res4135236210.3102/0013189X12459679 http://dx.doi.org/10.3102/0013189X12459679
21. Marbach-Ad G, et al2007A faculty team works to create content linkages among various courses to increase meaningful learning of targeted concepts of microbiologyCBE Life Sci Educ615516210.1187/cbe.06-12-0212175488771885905 http://dx.doi.org/10.1187/cbe.06-12-0212
22. Marbach-Ad G, et al2010A model for using a concept inventory as a tool for students’ assessment and faculty professional developmentCBE Life Sci Educ940841610.1187/cbe.10-05-0069211236862995757 http://dx.doi.org/10.1187/cbe.10-05-0069
23. Marbach-Ad G, et al2009Assessing student understanding of host pathogen interactions using a concept inventoryJ Microbiol Biol Educ10435010.1128/jmbe.v10.98236536893577151 http://dx.doi.org/10.1128/jmbe.v10.98
24. Middaugh MF2009Planning and assessment in higher education: demonstrating institutional effectivenessJossey-BassHoboken, NJ10.1002/9781118269572 http://dx.doi.org/10.1002/9781118269572
25. Odom AL, Barrow LH1995Development and application of a two-tier diagnostic test measuring college biology students’ understanding of diffusion and osmosis after a course of instructionJ Res Sci Teach32456110.1002/tea.3660320106 http://dx.doi.org/10.1002/tea.3660320106
26. Shi J, Wood WB, Martin JM, Guild NA, Vicens Q, Knight JK2010A diagnostic assessment for introductory molecular and cell biologyCBE Life Sci Educ945346110.1187/cbe.10-04-0055211236922995763 http://dx.doi.org/10.1187/cbe.10-04-0055
27. Smith MK, Wood WB, Knight JK2008The Genetics Concept Assessment: a new concept inventory for gauging student understanding of geneticsCBE Life Sci Educ742243010.1187/cbe.08-08-0045190474282592048 http://dx.doi.org/10.1187/cbe.08-08-0045
28. Smith M, Thomas K, Dunham M2012In-class incentives that encourage students to take concept assessments seriouslyJ Coll Sci Teach425761
29. Tsai F-J, Suen HK1993A brief report on a comparison of six scoring methods for multiple true-false itemsEduc Psychol Meas5339940410.1177/0013164493053002008 http://dx.doi.org/10.1177/0013164493053002008
30. Wise SL, DeMars CE2005Low examinee effort in low-stakes assessment: problems and potential solutionsEduc Assess1011710.1207/s15326977ea1001_1 http://dx.doi.org/10.1207/s15326977ea1001_1
31. Wise SL, DeMars CE2010Examinee noneffort and the validity of program assessment resultsEduc Assess15274110.1080/10627191003673216 http://dx.doi.org/10.1080/10627191003673216
32. Wolf LF, Smith JK1995The consequence of consequence: motivation, anxiety, and test performanceAppl Meas Educ822724210.1207/s15324818ame0803_3 http://dx.doi.org/10.1207/s15324818ame0803_3
33. Zwick R, Thayer DT, Lewis C1999An empirical Bayes approach to Mantel-Haenszel DIF analysisJ Educ Meas3612810.1111/j.1745-3984.1999.tb00543.x http://dx.doi.org/10.1111/j.1745-3984.1999.tb00543.x
34. Zwick R2012A review of ETS differential item functioning assessment procedures: flagging rules, minimum sample size requirements, and criterion refinementETS Res Rep Ser2012i3010.1002/j.2333-8504.2012.tb02290.x http://dx.doi.org/10.1002/j.2333-8504.2012.tb02290.x

Figures

FIGURE 1

Effect of format on overall scores. Bars represent average percent correct for each class ± SEM. Filled bars indicate the semester in which the test was given online, outside of class. Unfilled bars indicate the semester in which the test was given on paper, in class. Two-factor ANOVA (format × course): main effect of format, F = 0.10, p = 0.76; main effect of course, F = 4.62, p = 0.01; interaction, F = 0.26, p = 0.77. SEM = standard error of the mean.
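The two-factor ANOVA reported in this caption can be reproduced with standard statistical tooling. The sketch below is a minimal illustration rather than the authors' actual analysis script; it assumes a long-format table with one row per student and hypothetical column names score (percent correct), format, and course.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical long-format data: one row per student, with overall
# percent correct, administration format ("online" or "paper"), and
# course (1, 2, or 3). The file name and columns are assumptions.
df = pd.read_csv("mbca_scores.csv")

# Two-factor ANOVA with interaction, matching the format x course
# design described in the caption.
model = ols("score ~ C(format) * C(course)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F and p for each effect
```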

FIGURE 2

Comparison of T/F statement difficulties. Symbols represent difficulties for each T/F statement (4 per question) given either online, outside of class (black circles) or on paper, in class (gray triangles). Lines between data points are included to help visually trace the two administration formats. Note that a higher difficulty indicates a higher proportion of correct answers (i.e., an easier question). Correlation between statements: Pearson's r = 0.92. Mantel-Haenszel differential item functioning: † = Category B (10d, 14c, 15b); ‡ = Category C (7c). T/F = true/false.
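The item-level comparison combines a Pearson correlation across statement difficulties with Mantel-Haenszel DIF flagging. The following is a minimal sketch under stated assumptions, not the authors' code: the difficulty arrays are invented, and the ETS category thresholds are simplified (the full flagging rules also require statistical significance tests).

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-statement difficulties (proportion correct)
# under each administration format.
online = np.array([0.85, 0.62, 0.74, 0.91])
paper = np.array([0.83, 0.65, 0.70, 0.93])
r, p = pearsonr(online, paper)
print(f"Pearson's r = {r:.2f}")

def mh_ddif(tables):
    """Mantel-Haenszel common odds ratio across total-score strata,
    mapped to the ETS delta scale: D-DIF = -2.35 * ln(alpha_MH)."""
    num = den = 0.0
    for (a, b), (c, d) in tables:  # [[ref correct, ref wrong],
        n = a + b + c + d          #  [focal correct, focal wrong]]
        num += a * d / n
        den += b * c / n
    return -2.35 * np.log(num / den)

def ets_category(ddif):
    # Simplified thresholds: A = negligible, B = moderate, C = large.
    return "C" if abs(ddif) >= 1.5 else "B" if abs(ddif) >= 1.0 else "A"
```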

FIGURE 3

Individual T/F statement completion rates. Symbols represent the percent of students marking an answer for each T/F statement given either online, outside of class (black circles) or on paper, in class (gray triangles). T/F = true/false.

FIGURE 4

Total assessment completion time for the online, outside-of-class administration. Gray bars represent the percent of students whose completion time falls within each bin. Labels indicate the upper threshold of each bin; for example, the right-most bin contains students who took more than 85 minutes but no more than 90 minutes.

FIGURE 5

Time per question for the online, outside-of-class administration. Central bars represent the median question time, boxes span the interquartile range, and whiskers extend to the 5th and 95th percentiles.

FIGURE 6

Relationship between total assessment time and overall test scores for the online, outside-of-class format. Each gray dot corresponds to the overall score for a single student taking the indicated amount of time. The black line shows the linear correlation between the variables (r = 0.24).
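Assuming paired arrays of completion times and scores, the fitted line and correlation in this figure can be sketched with scipy; the data values below are invented for illustration.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical paired observations: total assessment time (minutes)
# and overall percent correct for each student.
minutes = np.array([22, 35, 41, 48, 55, 63, 72, 88])
scores = np.array([61, 70, 64, 75, 72, 80, 69, 74])

fit = linregress(minutes, scores)  # least-squares line and r
print(f"r = {fit.rvalue:.2f}, slope = {fit.slope:.2f} points/min")
```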
