
Designing a Curriculum-Aligned Assessment of Cumulative Learning about Marine Primary Production to Improve an Undergraduate Marine Sciences Program

    Authors: Ryan A. Weatherbee1,*, Sara M. Lindsay2
    Affiliations: 1: Husson University, Bangor, ME 04401; 2: School of Marine Sciences and Maine Center for Research in STEM Education, University of Maine, Orono, ME 04469-5741
    Source: J. Microbiol. Biol. Educ. December 2018 vol. 19 no. 3 doi:10.1128/jmbe.v19i3.1396

    Abstract:

    We developed an assessment to track changes in understanding of marine primary production, a key concept taught across our undergraduate curriculum. Question content was informed by investigating student misunderstandings, conducting faculty interviews, and mapping primary production concepts to the curriculum. Content questions were paired with questions asking students how confident they were in their answers. Although students gained knowledge of marine primary production across educational levels, confidence data and item analysis revealed student misunderstandings about several concepts. Many students had difficulty with questions that required interpreting graphs or applying other higher-order thinking skills. The results set the stage for additional focused assessment and curriculum revision, and the questions may be useful in developing a large-scale, interdisciplinary marine sciences concept inventory.



Figures

FIGURE 1

Flowchart describing general assessment development process. Potential misunderstandings about marine primary production were identified from existing first-year student homework and then mapped onto a concept map describing the implemented curriculum regarding marine primary production; assessment questions were drafted to target key topics for which we had evidence of some misunderstandings. Faculty experts guided question development, and some questions were revised based on student performance and comments following a pilot deployment. See Appendix 1 for details. MPP = marine primary production.

FIGURE 2

Rasch analysis results. (a) Item map showing how each survey question varied in difficulty (y-axis) and how well it fit the Rasch model (x-axis). Question difficulty increases with the vertical logit value. The vertical size of each symbol shows the standard error of the estimated difficulty. (b) Person map showing each student's ability on the assessment (y-axis) and how well each student's responses fit the Rasch model (x-axis). Students with greater ability have larger vertical ability values. To aid readability, error estimates are not shown by symbol size; the average standard error of estimated ability was 0.65 (±0.11 SD). For both maps, the fit of questions/students to the model is indicated by position on the x-axis: items/students farther from 0 fit the model less well. The vertical lines at infit values of −2 and 2 mark the acceptable range for items fitting the Rasch model.
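
As a rough illustration of what underlies these maps, here is a minimal Python sketch, not the authors' analysis pipeline (which would typically use dedicated Rasch software), that fits a Rasch (one-parameter logistic) model to a simulated binary response matrix and computes a mean-square infit per item. Note the figure reports standardized infit (t) values, which differ from the raw mean squares shown here; all data and function names below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
X = (rng.random((86, 20)) < 0.6).astype(float)  # fake data: 86 students x 20 items

def rasch_jml(X, n_iter=500, lr=0.05):
    """Joint maximum-likelihood fit of P(correct) = sigmoid(theta - beta)."""
    n, k = X.shape
    theta = np.zeros(n)   # person ability logits
    beta = np.zeros(k)    # item difficulty logits
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))
        resid = X - p                    # gradient of the Bernoulli log-likelihood
        theta += lr * resid.sum(axis=1)  # ascent step for abilities
        beta -= lr * resid.sum(axis=0)   # ascent step for difficulties
        beta -= beta.mean()              # anchor the scale: mean item difficulty = 0
    return theta, beta

def item_infit(X, theta, beta):
    """Information-weighted mean-square fit (infit) per item; values near 1 fit well."""
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))
    var = p * (1.0 - p)
    return ((X - p) ** 2).sum(axis=0) / var.sum(axis=0)

theta, beta = rasch_jml(X)
print(item_infit(X, theta, beta))
```

Items whose infit departs strongly from 1 (or whose standardized infit falls outside ±2, as marked in the figure) misfit the model and warrant review.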

FIGURE 3

Mean assessment scores. Mean person parameter logit values at each SMS educational level. Larger (less negative) values indicate higher performance on the assessment. Error bars show standard error. The number of students at each educational level was: level 1 = 20, level 2 = 17, level 3 = 34, and level 4 = 15. The mean score at level 1 was significantly different from that at all other levels (Tukey's LSD, α = 0.1). LSD = least significant difference; SMS = School of Marine Sciences.
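
To illustrate the kind of group comparison behind this figure, the sketch below runs pairwise comparisons of simulated ability logits across the four levels at α = 0.1 using statsmodels' Tukey HSD. This is an assumption-laden stand-in: the caption's "Tukey's LSD" procedure may differ in detail, and the ability values here are fabricated, not the study's data.

```python
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
sizes = {1: 20, 2: 17, 3: 34, 4: 15}   # group sizes taken from the caption
levels = np.concatenate([np.full(n, lvl) for lvl, n in sizes.items()])
# Simulated person-parameter logits with a different mean per educational level
abilities = np.concatenate(
    [rng.normal(-1.5 + 0.4 * lvl, 0.6, n) for lvl, n in sizes.items()])

print(pairwise_tukeyhsd(abilities, levels, alpha=0.1))  # pairwise mean differences
```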

FIGURE 4

Relationship between assessment difficulty and cognitive demand of assessment questions. Linear correlation between the mean logit difficulty of each question, pooled over all SMS educational levels, and the cognitive demand of each question as categorized by Bloom's Taxonomy level (R = 0.37, p = 0.028). Note that the signs of the item difficulty values have been inverted to emphasize the increased difficulty of questions at higher Bloom's levels. SMS = School of Marine Sciences.
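
The statistics quoted in this caption, and in Figures 5 and 7 below, are Pearson correlations with a least-squares regression line; a minimal sketch with made-up values follows. The bloom/difficulty arrays are hypothetical illustrations, not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

bloom = np.array([1, 1, 2, 2, 3, 3, 4, 4, 5, 5])              # hypothetical Bloom's levels
difficulty = np.array([-1.2, -0.8, -0.5, -0.1, 0.0, 0.3,
                        0.2, 0.6, 0.7, 1.1])                  # hypothetical logit difficulties

r, p = pearsonr(bloom, difficulty)                            # Pearson R and p-value
slope, intercept = np.polyfit(bloom, difficulty, 1)           # least-squares line
print(f"R = {r:.2f}, p = {p:.3f}; fit: y = {slope:.2f}x + {intercept:.2f}")
```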

FIGURE 5

Relationship between self-reported response confidence and assessment score. The graph shows the relationship between categorical mean self-reported question response confidence and overall assessment score over all questions and SMS educational levels. Each symbol represents a unique student (n = 86). The least-squares regression line is plotted (Pearson correlation, R = 0.00, α = 0.1, p = 0.956). SMS = School of Marine Sciences.

FIGURE 6

Self-reported confidence for survey Question 1. Responses are pooled over all educational levels, with confidence in each item indicated by color. The question was: The majority of actual weight (dry biomass) gained by open-ocean photosynthetic marine algae as they grow comes from which of the following substances? The possible responses were:

  1. Molecules containing carbon. (correct answer)
  2. Particulate matter.
  3. Dissolved forms of nitrogen.
  4. Energy from the sun.

FIGURE 7

Relationship between question cognitive demand and mean student confidence. The graph shows the linear relationship between the Bloom's Taxonomy level of each question and categorical mean self-reported response confidence, pooled over all educational levels. The least-squares regression line is plotted (Pearson correlation, R = 0.24, α = 0.1, p = 0.089).

