
Evaluating the Impact of a Classroom Response System in a Microbiology Course

    Authors: ERICA SUCHMAN1,*, KAY UCHIYAMA2, RALPH SMITH1, KIM BENDER3
    Affiliations: 1: Department of Microbiology, Immunology and Pathology; 2: School of Education; 3: Office of the Provost, Colorado State University, Fort Collins, CO 80523
    • *Corresponding author. Mailing address: Department of Microbiology, Immunology and Pathology, Colorado State University, Fort Collins, CO 80523. Phone: (970) 491-6521. Fax: (970) 491-1815. E-mail: [email protected].
    • Copyright © 2006, American Society for Microbiology. All Rights Reserved.
    Source: J. Microbiol. Biol. Educ. May 2006 vol. 7 no. 1 3-11. doi:10.1128/154288106X14285807012764

    Abstract:

    The use of a Classroom Response System (CRS) was evaluated in two sections, A and B, of a large lecture microbiology course. In Section B, the instructor used the CRS technology only at the beginning of the class period, posing an extra-credit question on content from the previous class; students earned the credit by answering correctly. Section A also began each class with an extra-credit CRS question, but additional CRS questions were integrated into the lecture throughout the class period. We compared the two sections to determine whether augmenting lectures with this technology increased student learning, confidence, and attendance, and improved the instructor's ability to respond to students' misconceptions, relative to using the CRS simply as a quizzing tool. Student performance was compared using shared examination questions, categorized by how the content had been presented in class: all questions covered the instructors' common lecture content, some of which was taught without CRS questions in either section and some of which Instructor A taught with both lecture and CRS questions. Although Section A students scored significantly better on both types of examination questions, there was no demonstrable difference in learning attributable to CRS question participation. However, student survey data showed that students in Section A expressed higher confidence in their learning and knowledge and reported more interaction with other students than did students in Section B. In addition, Instructor A recorded more modifications to lecture content and more student interaction in the course than did Instructor B.






Figures

FIG. 1

Comparison of scores of students answering CRS questions during lecture. Score 1 (light bar) is the percentage of correct responses when students first answered a question after a brief lecture on the topic by the instructor; students were then instructed to discuss the question and answer it again. Score 2 (dark bar) is the percentage correct upon reanswering.

FIG. 2

The average student performance on six or seven exam questions that were identical in each section and whose content was covered either by lecture only in both sections (non-CRSQ) or by clicker questions in Section A only (CRSQ). Section A (light bar) used CRS questions during lecture; Section B (dark bar) used CRS questions only as quizzes at the beginning of class. Error bars indicate the standard deviation of scores.

FIG. 3

End of the semester student learning questionnaire.

