
Evaluating the Impact of a Classroom Response System in a Microbiology Course

    Authors: ERICA SUCHMAN1,*, KAY UCHIYAMA2, RALPH SMITH1, KIM BENDER3
    Affiliations: 1: Department of Microbiology, Immunology and Pathology; 2: School of Education; 3: Office of the Provost, Colorado State University, Fort Collins, CO 80523
    • *Corresponding author. Mailing address: Department of Microbiology, Immunology and Pathology, Colorado State University, Fort Collins, CO 80523. Phone: (970) 491-6521. Fax: (970) 491-1815. E-mail: erica.suchman@colostate.edu.
    • Copyright © 2006, American Society for Microbiology. All Rights Reserved.
    Source: J. Microbiol. Biol. Educ. May 2006 vol. 7 no. 1 3-11. doi:10.1128/154288106X14285807012764

    Abstract:

    The use of a Classroom Response System (CRS) was evaluated in two sections, A and B, of a large lecture microbiology course. In Section B, the instructor used the CRS technology at the beginning of the class period, posing a question on content from the previous class; students could earn extra credit if they answered the question correctly. Section A also began with an extra-credit CRS question, but in addition CRS questions were integrated into the lecture throughout the class period. We compared the two sections to see whether augmenting lectures with this technology increased student learning, confidence, and attendance, and improved the instructor's ability to respond to students' misconceptions, over simply using the CRS as a quizzing tool. Student performance was compared using shared examination questions, categorized by how the content had been presented in class: all questions came from the instructors' common lecture content, some covering material taught without CRS questions and some covering material for which Instructor A used both lecture and CRS questions. Although Section A students scored significantly better on both types of examination questions, there was no demonstrable difference in learning attributable to CRS question participation. However, student survey data showed that students in Section A expressed higher confidence in their learning and knowledge and indicated that they interacted more with other students than did the students in Section B. In addition, Instructor A recorded more modifications to lecture content and more student interaction in the course than did Instructor B.

Figures

FIG. 1. Comparison of scores of students answering CRS questions during lecture. Score 1 (light bar) is the percentage of correct answers when students first answered the question after a brief lecture on the topic by the instructor; students were then instructed to discuss the question and answer it again. Score 2 (dark bar) is the percentage correct upon reanswering.

FIG. 2. Average student performance on six or seven exam questions that were identical in each section and whose content was covered either by lecture only in both sections (non-CRSQ) or also by CRS questions in Section A only (CRSQ). Section A (light bar) used CRS questions during lecture; Section B (darker bar) used CRS questions only as quizzes at the beginning of class. Error bars indicate the standard deviation of scores.

FIG. 3. End-of-semester student learning questionnaire.
