
Encouraging Active Student Participation in Chemistry Classes with a Web-based, Instant Feedback, Student Response System

Author(s): 

Charles R. Ward, James H. Reeves, and Barbara Heath, University of North Carolina at Wilmington

03/28/03 to 04/03/03
Abstract: 

Among the most difficult challenges faced by chemistry instructors is how to actively involve students in classroom discussion. The advent of networked computing devices provides new possibilities for facilitating active student involvement while simultaneously providing "on-the-fly" information about their understanding of the material being covered. At the University of North Carolina at Wilmington, we have developed a Student Response System (SRS) that provides instantaneous graphical summaries of answers supplied by students using Web-based answer pads. This paper will discuss the design of the SRS system, present data about the level of student participation, provide examples of questions that are best suited to this approach, and conduct a live demonstration of the system.

Paper: 

 Introduction

Much of the debate surrounding the use of lecture-based classes as a method for teaching science has focused on the passive role assumed by students in lecture (1, 2, 3). The communication channel is primarily instructor-to-student. Numerous studies have shown that student performance in science classes improves with increasing levels of active participation by students in classroom discussions (4, 5, 6). Cooperative learning techniques, classroom discussion, and turn-to-your-neighbor (7) activities are all methods designed to improve not only instructor-to-student communication, but also student-to-instructor and student-to-student communication. In this paper, we describe a new technology-based Student Response System (SRS) for improving instructor-to-student and student-to-instructor communication that we have used effectively in both large and small classes.

Student Response Systems, also referred to as Audience Response Systems (ARS), Personal Response Systems (PRS), and Classroom Communication Systems (CCS), all incorporate the following components: 

  • a mechanism for presenting a question to a group of students,
  • some type of mechanical or electronic device to collect answers or feedback from students, and
  • a mechanism for anonymously displaying to the students the responses collected from the students.

This last item, displaying anonymous student responses to the class, is what separates SRS systems from electronic testing systems such as those embedded in course management systems (e.g. WebCT, Blackboard) or systems specifically designed to test students or employees (e.g. VUE, ETS, Questionmark).

Early versions of SRS systems were homemade affairs usually installed in university lecture halls. Later, companies such as GE and IBM created commercial versions (8, 9). These systems used long wires to connect the student response units (often called response pads) to the data collection unit. Research into the use and effectiveness of the first-generation SRS systems focused on the impact the systems had on student achievement. In reviewing these studies, four distinct patterns of classroom use emerge.

  1. Questions interspersed during lecture (10-13).
  2. Questions presented at beginning of class only (14, 15).
  3. Questions embedded in an auto-run, multi-media program (16, 17).
  4. Questions requesting feeling or belief indicators (18, 19).

Although many of the articles published about the first generation of SRS systems mention the need to improve interactions between students and instructors, the classroom interactions described in many of these early studies seem to be limited to the communication of answers, not discussion of the content. Six of the ten articles reviewed described using the system as a quizzing tool during class (10-12, 14, 15, 18). In five of the studies, researchers attempted to measure achievement differences between students who used SRS systems and those who did not. No significant improvement was noted when the SRS system was used (10, 12, 14, 15, 18).

Two studies (19, 13) focused on the use of SRS systems as a means to improve classroom discussion of the concepts being covered. They showed that using SRS systems promoted class discussion and lowered barriers to speaking and asking questions in class. These studies were not designed to measure differences in student achievement.  

Few research reports covering the use of SRS systems were published from the late 1970s through the middle of the 1990s. This was probably due to the cumbersome and expensive nature of the first-generation SRS systems, which limited their classroom appeal. Interest in SRS systems began to build again in the late 1990s as advanced networking technologies such as local area networks (LANs), along with infrared-based (IR) and radio frequency-based (RF) communications, started to be incorporated into SRS systems and tested in classrooms (20-24). Commercial systems based on these technologies also appeared, including Classtalk (IR), Respondex (RF), eInstruction (IR), and Educue (IR).

Classroom research associated with the second-generation SRS systems focused on the multiple functions incorporated in the systems for assisting an instructor with such things as attendance taking, quizzing, classroom discussion, and course pacing (20, 22, 25). These studies present evidence that using SRS systems for attendance and quizzing increases both attendance and participation in class (21, 26). Small group discussion also appears to be improved by the use of SRS systems (23). Few studies with second-generation SRS systems focused on the use of these systems as a means to increase participation of the entire class in discussion.

The current implementation of SRS systems appears to be limited at all grade levels. Factors working against widespread adoption include the relatively high cost of commercial systems, limited classroom applications for these systems, and a paucity of research evidence supporting the effectiveness of this form of instructional technology. Other issues related to the current systems include software upgrading, maintenance, and reliability (27).

Numina II SRS: A Third-Generation SRS System 

Believing in the potential of a well-designed SRS system for improving science and mathematics instruction, a research team comprised of faculty from chemistry, mathematics/physics, and computer science at the University of North Carolina at Wilmington set out to design a system that would overcome existing SRS limitations while providing new capabilities that would be impossible to deliver with second-generation designs. This third-generation design, called the Numina II Student Response System, incorporates the following features. 

  • Web-based. The system relies on Web technologies for all communications that take place between the system and users. This means that any device capable of displaying a simple Web page can be used to interact with the system.
  • Hardware independent. The system is independent of the hardware platform available to students and instructors, so it will work with whatever computing technology a school already owns: PCs, Macs, workstations, laptops, desktops, Pocket PCs, Palm devices, and Web-enabled mobile phones.
  • OS independent. The system is independent of the operating system present on devices, so it will work with Windows, Mac OS, Palm OS, Pocket PC OS, Unix, and Linux. (Certain advanced features require the operating system to support Macromedia Flash.)
  • Utilizes existing network infrastructure. The system operates over wired as well as wireless networks using whatever network infrastructure already exists on campus.

  • Multiple response interfaces. The availability of a rich variety of interfaces means that students are not limited to simple multiple-choice or yes-no responses (Figure 1).
  • Database driven. All data related to a session, regardless of whether questions were presented from the database or on-the-fly, are stored in an online database for either immediate display and analysis or for review at a later time.
  • Student anonymity. Classroom management data (e.g. attendance) are stored separately from student response data to ensure the anonymity of student responses.

Figure 1
Various Student Response Interfaces Shown on Pocket PCs

When using the Numina II SRS system, the instructor presents a question or series of questions to the class. The questions can be drawn from a database of questions or asked on-the-fly. Each question is displayed to the entire class using a large monitor or projection system as shown in Figure 2.

Figure 2
Instructor and Classroom View

The format of the student response depends on the interface chosen by the instructor for the particular question being asked (i.e. multiple choice, yes-no, true-false, confidence, graphics, etc.). This interface is displayed on the student's device (e.g. desktop, laptop, Pocket PC, cell phone, etc.). Examples of student response interfaces are shown in Figure 1.

Once the student has decided on his or her response, the response is submitted to and stored in the database. The instructor has the option of allowing each student to respond only once to a particular question or to respond multiple times. The instructor also has the option of deciding to show the summary of student responses as they are being received or to wait until all of the students have finished responding.  

The format for the display of student data depends on the type of interface that was chosen for the response. For example, data from the multiple-choice response interface are displayed as a bar chart (Figure 4) whereas data from the graphics response interface are displayed as a scatter chart (Figure 6).
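The tallying step behind a multiple-choice bar chart can be sketched in a few lines. This is a minimal illustration, not the actual Numina II code; the function name, option labels, and sample vote counts below are all hypothetical.

```python
from collections import Counter

def summarize_choices(responses, options=("a", "b", "c", "d")):
    """Tally multiple-choice responses into whole-number percentages,
    one entry per option, suitable for plotting as a bar chart.
    Options receiving no votes still appear, so every bar is shown."""
    counts = Counter(responses)
    total = len(responses)
    return {opt: round(100 * counts.get(opt, 0) / total) for opt in options}

# A hypothetical class of 23 respondents, dominated by one distractor.
votes = ["c"] * 5 + ["b"] * 14 + ["a"] * 2 + ["d"] * 2
print(summarize_choices(votes))  # → {'a': 9, 'b': 61, 'c': 22, 'd': 9}
```

Note that rounding each bar independently can make the percentages sum to slightly more or less than 100, which is harmless for an at-a-glance classroom display.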

The use of the Numina II SRS system in lecture and laboratory classes has resulted in noticeable improvements in a number of important areas. 

  •  Student participation in question sessions is consistently near 100%.

  • Student-instructor interaction, in the form of discussion, is increased (frequency) and more widespread (distribution).

  • Instructors gain immediate information regarding the extent to which students understand the concepts or procedures being presented.

  • Instructors make informed decisions based on student data that impact the pace of class, the curriculum, and classroom procedures.

  • Fewer procedural questions are asked during laboratory sessions, freeing time for the students to think more about what they are doing and giving the instructor more time to assist with technique and data analysis.

These results have been reported not only for chemistry classes, but also for classes in mathematics, educational psychology, computer science, instructional technology, and business management. The system has even been used to assist with decision making during meetings of the University Executive Council. Student opinion surveys have indicated nearly unanimous support for using the SRS system over other methods of classroom questioning.

Examples from Chemistry Classes at UNCW 

The Numina II SRS system has been used four different ways within the chemistry department at UNCW. Instructors have used the system to: (1) ask content questions during lecture and lab, (2) check understanding of procedures and techniques prior to starting a laboratory exercise, (3) solicit end-of-semester feedback, and (4) train teaching assistants. Examples of two of these uses (content questioning and TA training) are described here. 

Content Questioning. Prior to class, the instructor prepares a series of questions related to the concepts to be covered that day. The questions are stored within the SRS system and assigned to a set that will be used with that day's session. Questions should be carefully constructed to avoid misinterpretation and should include answer choices that provide clear information to the instructor about student thinking. At UNCW, we use Pocket PCs equipped with wireless network cards as student input devices. The Pocket PC comes with a version of the Internet Explorer Web browser, which includes support for Macromedia Flash. As students enter the classroom, they pick up a Pocket PC from one of the distribution carts (Figure 3) and return it at the end of class.

Figure 3
Distribution System for Pocket PCs

At an appropriate time during the lecture, the instructor opens the SRS system and displays the desired question from the question set. As soon as the question is displayed, the students' response units are automatically synchronized with the current question number and display screen. For example, when the multiple-choice question shown in Figure 4 is displayed to the class, the screen of each Pocket PC in the room is updated with the multiple-choice response pad shown in Figure 5.

Figure 4
Question and Responses: Multiple Choice Interface

 

Figure 5
Student Response Pad: Multiple Choice Interface

 

The instructor has the option of displaying the graph of student responses as they are submitted or waiting until everyone has finished before displaying the graph.  The most common technique is to wait until everyone has finished and then show the results.  This avoids the problem of slow responders being influenced by the results of those who responded quickly.

 

A total of 23 out of 24 students answered the question shown in Figure 4.  The correct answer to this problem is c.  It is clear from the pattern of results that most students who answered this question did not consider the fact that ionic compounds, such as Mg(NO3)2, dissociate in aqueous solution.  Only 20% of the class chose the correct answer while 60% of the class chose the option where the magnesium nitrate is dissolved but undissociated.  The instructor now has the information he or she needs to make a decision about instruction for the rest of the class period (i.e. review this concept, ask more questions, go on to the next topic, etc.).  By using the SRS system, the instructor is able to base this decision on results obtained from 96% of the class rather than the two or three students who normally raise their hands to answer questions like this.  

Figure 6 illustrates a question and answer format that, to our knowledge, is unique to the Numina II SRS system.  In this question, students are asked to point to an area on the graph that corresponds to equilibrium conditions.  What the student sees on his or her response pad is shown in Figure 7.

Figure 6
Question and Responses: Graphics Interface

 

Figure 7
Student Response Pad: Graphics Interface

The student selects a region of the graph displayed on his or her device and taps the screen (or clicks the mouse button on a desktop or laptop computer). This action places a pointer on the graph at the tapped location. When satisfied with the choice, the student taps the Submit button to send the coordinates of the selection to the database. The SRS system displays the results of these selections for the entire class as a scatter plot (Figure 6).
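Scoring a graphics question of this kind reduces to a point-in-region test on the submitted coordinates. The sketch below is a minimal illustration under assumed names; the rectangular region, sample taps, and function name are hypothetical, not the Numina II implementation.

```python
def fraction_in_region(taps, x_range, y_range):
    """Return the fraction of (x, y) screen taps that fall inside a
    rectangular target region, e.g. the flat portion of a
    concentration-vs-time curve where equilibrium has been reached."""
    x_lo, x_hi = x_range
    y_lo, y_hi = y_range
    hits = sum(1 for x, y in taps
               if x_lo <= x <= x_hi and y_lo <= y <= y_hi)
    return hits / len(taps) if taps else 0.0

# Hypothetical taps from a small class; the target region spans
# x (time) from 40 to 60 and y (concentration) from 0.4 to 0.6.
taps = [(45, 0.5), (52, 0.45), (48, 0.55), (20, 0.5)]
print(fraction_in_region(taps, (40, 60), (0.4, 0.6)))  # → 0.75
```

A more faithful region (e.g. a band around the equilibrium curve itself) would only change the membership test, not the overall structure.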

In this example, 22 out of 24 students responded to the question. The instructor can quickly determine that most students (70%) chose the correct region of the graph, but 30% of the students believe that equilibrium represents the condition where the concentrations of reactants and products are equal. Again, the system has provided valuable information about the extent to which the entire class understands this important concept. 

The design of the Numina II SRS system makes it easy to incorporate graphic images into both questions and answers. The question shown in Figure 8, part of a pre-lab briefing, was designed to determine how well students had learned the colors of common ions in solution as well as a set of simple solubility rules for inorganic compounds. The results indicate that 100% of the students responded to the question, but only 50% of them chose the correct answer (d). Interestingly, 40% of the students chose NH4Cl as the correct answer, which indicates that a significant portion of the class is confused about the difference between ammonia and the ammonium ion. This is an important concept and one that can have disastrous consequences in lab if not properly understood. It is highly unlikely that the instructor of this lab would have been aware of this problem had it not been for his use of the SRS system.

Figure 8
Question and Responses: Multiple Choice Interface

The Numina II SRS can also be used to ask questions on-the-fly. In this mode, the instructor selects a response format and then presents a question, along with possible answers, to the class verbally. Student responses are displayed just as they are for pre-prepared questions. The responses are stored in the database along with a text field in which the instructor can type the question that was asked in class.  
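An on-the-fly session needs only a lightweight record per question, since the question text is supplied after the fact. The sketch below shows one plausible shape for such a record; every field name here is an illustrative assumption, not the actual Numina II database schema.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class OnTheFlyQuestion:
    """Hypothetical record for a question asked verbally in class.
    Field names are illustrative only, not the Numina II schema."""
    session_id: int
    response_format: str                  # e.g. "multiple-choice", "true-false"
    asked_at: datetime = field(default_factory=datetime.now)
    question_text: str = ""               # typed in later by the instructor
    responses: list = field(default_factory=list)

q = OnTheFlyQuestion(session_id=7, response_format="true-false")
q.responses.extend(["T", "T", "F"])       # collected just like pre-prepared questions
q.question_text = "Is an aqueous NH4Cl solution colored?"
```

Storing the responses against a timestamped placeholder lets the summary display work identically for prepared and improvised questions.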

TA Training. The attitudes toward and knowledge of legal issues related to teaching (sexual harassment, FERPA, etc.) that teaching assistants (TAs) bring can be a significant factor in their success or failure as instructors (not to mention the consequences for their students and the department). Unfortunately, it is difficult to assess and even more difficult to discuss these attitudes with beginning graduate students. We have found the SRS system to be an ideal mechanism for facilitating group discussions of these sensitive topics. Figure 9 shows an example of the results obtained from a group of 13 TAs during their fall training program on a question related to improper conduct. It is clear that the group was evenly divided on this issue. What is not revealed in the data is the lively discussion that followed the presentation of these results to the group. The faculty leading the discussion had never witnessed such a frank and open discussion of these issues, and they attributed it to the ability of the SRS system to draw everyone into the discussion.

Figure 9
Questions and Responses: True-False Interface

Issues Related to Implementing SRS Systems with Mobile Devices

Although the Numina II SRS system was designed to work with many different types of computing equipment, much of the design work focused on making the system compatible with mobile devices such as Personal Digital Assistants (PDAs) and Web-enabled mobile phones. Many instructors believe that using PDAs and mobile phones in class could lead to disruptive behavior that might interfere with the instructional intent of the SRS system. The following three questions are the ones most frequently asked when discussing our use of the SRS system.

  1. Does off-task behavior increase when students have mobile Internet-ready devices in their hands?
  2. Do technical problems with the devices and wireless network connections interfere with the implementation of the system?
  3. Does the distribution and collection of the devices take up too much class time?

Several studies were designed to investigate these questions and determine whether access to mobile computing devices and the Internet had any negative impact on the use of the SRS system in classrooms and laboratories. A time management study was conducted in a large lecture hall and in several smaller laboratory sections to determine how much class time was lost due to the distribution and collection of PDAs. It was found that it took less than four minutes to distribute and collect 100 PDAs in the lecture hall and less than three minutes to distribute and collect 24 PDAs in the lab. Furthermore, most of the distribution and collection occurred during class changes so that, on average, less than two minutes of instructional time was lost.  

Two lab sections were studied to determine the degree to which instruction was disrupted by off-task behavior and one lab section was observed for the effect of technical problems. Trained observers used specially designed instruments to record off-task behavior among students as well as any technical problems they encountered. Observations were made while the lab instructors conducted post-lab sessions using the SRS system.  

Only two technical problems were found among the 20 students observed. Neither of these problems affected the entire class and both were resolved in less than two minutes. These observations matched anecdotal information from other instructors and confirmed that technical issues related to PDAs and wireless networking have a negligible impact on instruction. 

Figures 10 and 11 summarize the observations made of off-task behavior during SRS sessions using Web-enabled PDAs. The data show that at all times at least 80 percent of the students were on task and that for approximately half of the time over 90 percent were on task.

Figure 10
Percent of Students Off-task vs. Time for Lab Section 1

 

Figure 11
Percent of Students Off-task vs. Time for Lab Section 2

The most frequently observed off-task behaviors were Web surfing, reading e-mail, and playing solitaire. These data indicate that, during an SRS session, students are actually more engaged in the instructional process than they are in other classroom or laboratory activities. This information is corroborated by less formal observations made during numerous SRS sessions.

Conclusions 

Student response systems, such as the Numina II SRS, are effective means of actively engaging students in thinking about concepts being discussed in class. When used with well-designed questions, nearly 100% of the students in a class will respond. The amount of classroom discussion of a topic is increased and more widespread when SRS systems are used to initiate the discussion. 

Evidence suggests that students are more attentive in class during an SRS session and exhibit fewer off-task behaviors. Students report that they like using SRS systems and prefer them to more traditional means of classroom questioning. 

A Microsoft PowerPoint presentation is available that describes in more detail the technology used in the Numina II SRS system, the history of the project, and the project personnel. More information about the Numina II SRS, including operating instructions, can be accessed at the project Web site: http://aa.uncwil.edu/numina/srs

Acknowledgements

 

The authors wish to acknowledge the following members of the Numina Project team at the University of North Carolina at Wilmington for their contributions to the design, programming, and testing of the Numina II Student Response System: Dr. Ronald Vetter, Department of Computer Science, Dr. Gabriel Lugo, Department of Mathematics and Statistics, Dr. Russell Herman, Department of Mathematics and Statistics, and Ms. Jennifer Bishoff, Department of Chemistry. 

Funds for the development of the Numina II SRS were provided by the National Science Foundation, NSF Award #IIS-0002935, Pearson Education, and the University of North Carolina at Wilmington.

Literature Cited

  1. Cooper, M.M. J. Chem. Educ. 1995, 72, 162-164.
  2. Kraft, R.G. Coll. Teach. 1985, 33, 149-154.
  3. Ebert-May, D.; Brewer, C. Bioscience 1997, 47, 601-608.
  4. Francisco, J.S.; Nicoll, G.; Trautmann, M. J. Chem. Educ. 1998, 75, 210-213.
  5. White, J.M. J. Chem. Educ. 1972, 49, 772-774.
  6. Hunter, W.E. J. Coll. Sci. Teach. 1973, 2, 35-38.
  7. Cooper, J.L.; Robinson, P. New Dir. Teach. Learn. 2000, 81, 17-24.
  8. Lewis, P. Nat. Sch. 1967, 80, 60-66.
  9. Ely, D.P. In Instructional Hardware/A Guide to Architectural Requirements; EFL: city, 1970; pp 99-102.
  10. Casanova, J. J. Chem. Educ. 1971, 48, 453-455.
  11. Littauer, R. Ed. Tech. 1972, 12, 69-71.
  12. Whitehead, J.L.; Bassett, R.E. AV Inst. 1975, 20, 22-25.
  13. Uno, G.E. Am. Biol. Teach. 1984, 46, 229-232.
  14. Bessler, W.C.; Nisbet, J.J. Sci. Ed. 1971, 55, 275-284.
  15. Brown, J.D. JEE 1972, 40, 12-20.
  16. Olsen, R.W.; Lukas, T.G. J. Coll. Sci. Teach. 1977, 6, 54-55.
  17. Roush, D.L. Ed. Tech. 1968, 8, 12-13.
  18. Rubin, S. In AERA Proceedings; Minneapolis, MN, 1970; pp 1-15.
  19. Garg, D.P. Conference on Computers in the Undergraduate Curricula, Fort Worth, TX, June 1975; pp 3-17.
  20. Carver, C.A.; Ressler, E.K.; Biehler, M.A. Journal of IS Education On-line. http://gise.org/JISE/Vol7/v72_4.htm (accessed Nov 2001).
  21. Shapiro, J.A. J. Coll. Sci. Teach. 1997, 26, 408-412.
  22. Guttenberg, N.; Bennhold, C.; Feldman, G. Interactive Student Engagement with an Electronic Response System. http://www.cidd.gwu.edu/tools/aapt_01.pdf (accessed Dec 2002).
  23. Madill, B. Case Study. http://www.be.coventry.ac.uk/BPBNetwork/casestudy/uce_tla3i.htm (accessed Dec 2002).
  24. Robertson, L.J. Med. Teach. 2000, 22, 237-239.
  25. Dufresne, R.J.; Gerace, W.J.; Leonard, W.J.; Mestre, J.P.; Wenk, L. J. Comput. High. Educ. 1996, 7, 3-47.
  26. Ober, D. The Compleat Learner. http://web.bsu.edu/tla/resources/news/1997-98/Nov1997A1.htm (accessed Dec 2002).
  27. Kaitchuck, R. A student response system. http://web.bsu.edu/abit/meet/feb00rk.htm (accessed Dec 2002).

*Author to whom correspondence should be addressed.

Copyright © by Ward, Reeves, & Heath, all rights reserved