Audience Response Systems
Polling students during class to check whether they understand the concepts being lectured on, and adjusting instruction based on the results, has been a useful strategy for several decades. It helps drive in-class participation, student attention, feedback and remediation, and a more learner-centered (rather than content-centered) approach to teaching. Used well, polling can provide the instructor with meaningful ways to gauge student understanding, broach sensitive topics by allowing anonymous responses, demonstrate lack of consensus as a means to explore nuance, and engage learners in dialogue during a face-to-face class session.
Audience response systems (ARS) comprise hardware and software used in conjunction with face-to-face educational processes to support, deepen, and enhance learning by promoting greater interaction among all those engaged in a learning activity (Banks, 2006). Audience response systems go by different names in the literature, such as classroom response systems (e.g., Fies & Marshall, 2006), student response systems (e.g., Kaleta & Joosten, 2007), and classroom communication systems (e.g., Beatty, 2004). Within Academic Technology Services at Minnesota State University, Mankato, we place all of these names under the umbrella term Audience Response Systems, which covers the basic pattern: an instructor or presenter poses a question, collects individualized (sometimes anonymous) answers from a large audience or student body, and presents the aggregated results.
So-called hardware-based ARSs consist of “clickers” or “keypads” that are used to collect data from individuals. The data, gathered mainly through multiple-choice questions, is then processed by the system and can be displayed in class or reviewed afterward. To use these systems in the classroom, either the instructor or the students must acquire specialized hardware. Academic Technology Services does not provide an enterprise solution for hardware-based ARSs; however, this approach is not uncommon among faculty.
Software-based or cloud-based systems run entirely online in a web environment, allowing the instructor or presenter and the audience to use the system through their own devices, such as computers, tablets, or phones. Academic Technology Services supports a campus-wide enterprise solution of this type, Poll Everywhere, which faculty, staff, and students can use with their campus username and password. Poll Everywhere supports a wider spectrum of question types than clicker-based ARSs, including multiple-choice, true/false, and free-response questions.
Many of the strategies in this document align directly with Poll Everywhere, as it is ATS’s enterprise tool, but efforts are made to recognize other tools suitable for achieving the stated results.
Quick Start Guide – Audience Response Systems 101
- Determine the educational purpose of questions: “…when the educational purpose for using the technology is firmly in place, the results in terms of student and lecturer satisfaction are better than when the technology is used for its own sake” (Banks, 2006, p. 68). The use of an ARS will be more effective if instructors first determine which lesson objectives are met through the implementation of student feedback and which issues they want to address with the help of an ARS. Quintin Cutts from the University of Glasgow suggests using ARSs to address four educational issues: engagement in lectures, focus on study habits, early engagement with course material, and testing fundamental concepts (Banks, 2006, p. 68).
- Compose questions: The efficacy of ARSs depends on the quality of the questions. Creating effective questions is difficult and differs from creating exam and homework problems (Beatty et al., 2006). When designing a question, it is important to think about the role the question will play within the course, the specific goals the question is designed to attain, and the channels or mechanisms by which it can attain those goals.
- Implement questions into coursework: When using an ARS in the classroom, consider having questions strategically placed and interspersed throughout a presentation. The literature suggests asking a question at least once every 10 minutes during a lecture to keep your audience engaged and interested (Banks, 2006). If you and your students are first-time users of an ARS, allot some time to practice using it. Before using the ARS, explain to students why you are using the system and what you expect from them. By setting expectations, you can help students contribute to the successful implementation of this technology in your course. Make sure students know whether their answers will be anonymous. If you are using the ARS to monitor attendance, do so on a daily basis (Caldwell, 2007).
- Monitor the results and adjust classroom activities: Have the results displayed on a screen, either as graphs and values or as an aggregate of free responses in the form of a word cloud. When planning the lesson, consider setting aside time to facilitate a discussion of the results you have collected. Make sure to summarize the discussion and explain the correct answer afterward (Caldwell, 2007).
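For instructors curious about what the aggregation step involves, the tally behind a results chart or word cloud is simple counting. The sketch below is purely illustrative (it is not Poll Everywhere code, and the sample responses are invented); it counts multiple-choice answers and word frequencies for free responses using Python's standard library.

```python
from collections import Counter

# Hypothetical sample data, as an ARS might collect it.
mc_responses = ["A", "C", "A", "B", "A", "C"]  # multiple-choice answers
free_responses = ["more examples", "examples please", "pace too fast"]

# Tally multiple-choice answers, as a bar chart of results would.
mc_counts = Counter(mc_responses)
for choice, count in sorted(mc_counts.items()):
    print(f"{choice}: {count}")  # A: 3, B: 1, C: 2

# Word frequencies for a word-cloud-style display of free responses.
word_counts = Counter(
    word for response in free_responses for word in response.lower().split()
)
print(word_counts.most_common(3))  # "examples" appears twice, the rest once
```

A real system would also handle de-duplication (one vote per participant) and live updating, but the underlying aggregation is no more than this.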
This section outlines how you might begin to think about adopting the teaching strategies below and the tools you might consider employing.
- Just-in-Time Teaching: Audience response systems can be used to conduct a Just-in-Time Teaching (JiTT) strategy. JiTT is a pedagogical approach that combines the best features of traditional in-class instruction with the communication channels of 21st-century technology. Academic Technology Services provides further details on how you might approach a JiTT strategy using fully supported campus technology.
- Peer Instruction: According to Mazur (1997), Peer Instruction is a pedagogical approach in which the instructor stops lecturing periodically to pose a question to the students. Students are given time to think over the question and report an individual answer. Having committed to an answer, they then discuss the question with a neighbor and determine whether they should revise their answer; students may try to convince their partner of why they believe their answer is correct. As the last step, the class reconvenes for a large-group discussion. Audience response systems, particularly Poll Everywhere, are well suited to this pedagogical approach: the system lets you implement multiple-choice questions whose possible answers represent common student ideas, and you can visually show how student responses changed over the course of the activity.
- Conduct backchannel discussion: As a cloud-based system, Poll Everywhere makes it possible to create and implement different types of questions to foster discussions. The literature suggests implementing more dialogic types of questions, “…allowing a backchannel with a cloud-based representation of the most commonly chimed thoughts” (Higdon, Reyerson, McFadden, & Mummey, 2011).
- Critical Incident Questionnaire: This is a pedagogical method for finding out how students are experiencing their learning and the instructor’s teaching. The CIQ is a questionnaire delivered online, or via clickers, at the end of the last class of the week. It comprises five questions, each asking students about events that happened in class that week; it gets them to focus on specific, concrete happenings that were significant to them (Brookfield, 1995). You can find the questions here.
For further information and ideas on how to construct audience response questionnaires, please see our supplemental material, Effective Design: Audience Response Questions.
Banks, D. A. (2006). Audience response systems in higher education: Applications and cases. Hershey, PA: Information Science Pub.
Beatty, I. (2004). Transforming student learning with classroom communication systems. EDUCAUSE Center for Applied Research (ECAR) Research Bulletin, 2004(3), 1–13.
Beatty, I. D., Gerace, W. J., Leonard, W. J., & Dufresne, R. J. (2006). Designing effective questions for classroom response system teaching. American Journal of Physics, 74(1), 31–39.
Brookfield, S. (1995). Becoming a critically reflective teacher. San Francisco: Jossey-Bass.
Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best practice tips. CBE Life Sciences Education, 6(1), 9–20.
Fies, C., & Marshall, J. (2006). Classroom response systems: A review of the literature. Journal of Science Education and Technology, 15(1), 101–109.
Higdon, J., Reyerson, K., McFadden, C., & Mummey, K. (2011). Twitter, Wordle, and ChimeIn as student response pedagogies. EDUCAUSE Quarterly, 34(1).
Kaleta, R., & Joosten, T. (2007). Student response systems: A University of Wisconsin system study of clickers. EDUCAUSE Research Bulletin, 2007 (10), 1-12.
Kay, R. H., & LeSage, A. (2009). Examining the benefits and challenges of using audience response systems: A review of the literature. Computers & Education, 53, 819–827.
Mazur, E. (1997). Peer instruction: A user’s manual. Upper Saddle River, NJ: Prentice Hall.
Thalheimer, W. (2007). Questioning strategies for audience response systems: How to use questions to maximize learning, engagement, and satisfaction. Retrieved November 31, 2009, from http://www.work-learning.com/catalog/