Clickers, or Does Technology Really Cure What Ails You?

ACRLog welcomes a guest post from Cori Strickler, Information Literacy Librarian at Bridgewater College.

During idle times at the reference desk, or when the students are gone for a break, I find myself creating instruction “wish lists” of tools or gadgets that I’d love to have for my sessions. One item that has been on my list for a few years now is clickers, or student response systems as they are officially called. In academic classrooms they are used for taking attendance, administering quizzes, and other more informal assessments. I saw clickers as a way to solve one of my most basic and frustrating problems: getting students engaged during my sessions. Students have little desire to participate in library sessions, and trying to get them to comment on their library experience is like pulling teeth, except that the process is a lot more painful for me than it is for the students.

For those of you who haven’t heard of clickers before, they are little remote-control-like devices that allow students to answer multiple choice questions by sending their responses to the computer for real-time analysis. They are sort of like the devices used on Who Wants to Be a Millionaire to poll the audience.

My library doesn’t have the budget for clickers, but this semester through a chance discussion with the director of the health services department, I learned that the college received a grant for 100 TurningPoint clickers and the necessary software. The director rarely needed all of the clickers at the same time, so she offered about fifty for me to use during my instruction sessions.

So, I now have access to a tool I had coveted for years, but that was only the easy part: I still have to figure out how to meaningfully integrate the technology into my sessions.

My overall goals are relatively simple. I want to encourage student involvement in any way possible so that I don’t have to lecture for fifty minutes straight; my voice just can’t handle the pressure. To be successful, though, I need to be purposeful about how I include the clickers. I can’t just stick a clicker quiz at the beginning of a session and assume that the students will suddenly be overwhelmed with a desire to learn everything there is to know about the library. Most faculty who schedule a library instruction session have a particular purpose in mind, so I also need to be sure that I fulfill their expectations as well.

After much consideration, I decided not to add the clickers to all my sessions. Instead, I decided to focus on first-year students, who hopefully aren’t quite as jaded as the upperclassmen and haven’t already decided that they know everything about research.

For my first clicker experiment, I used them with a quiz to help me gauge the class’s knowledge of the library. I also decided to use them as an alternative way to administer our session evaluation survey. Ultimately, I had mixed results. The students did respond better than before, but I did not get full participation. While that wasn’t a big issue for the quiz, the lack of participation was a problem for the evaluation survey. For most survey questions I was missing responses from five or six students, more than I typically miss with paper surveys, which could potentially affect my survey results.

Their lack of participation could be due to a number of factors. The students claimed they were familiar with the clickers, but they did not seem to be as adept as they claimed. Given my own inexperience with the clickers, there might also have been a malfunction with the devices themselves. Or maybe the students just didn’t want to engage, especially since there was still no incentive to participate. When I looked back through the survey results, they did not indicate any greater satisfaction with the sessions.

This first experience with the clickers left me a bit skeptical, but I decided to try them again. This time, I created brief quizzes on brainstorming keywords and on types of plagiarism. My second class was smaller than the first, and engagement seemed better. The clickers also seemed to let the students be more honest on the surveys: they appeared more comfortable indicating their disinterest in the information presented, though the results also suggested that they saw its overall value.

I have used the clickers in about twelve sessions this semester, and overall they were well received by the students. However, I am not completely sure that they add significantly to engagement, and I have not seen any indication in the surveys that my sessions are better or worse with their inclusion. I have discovered, though, that some sessions and topics may be better suited to clickers than others. Upper-level classes where I am trying to show specific resources do not lend themselves as readily to clickers, and the time may be better spent on other activities or instruction.

I am still in the process of learning how clickers will fit into my classes, but I would generally call them a success, if only because they make the survey process easier. They aren’t the panacea for student engagement I had hoped for, though. Activity type and student familiarity appear to be the key variables affecting clicker success.

Unfortunately, the overall nature of one-shot instruction seems to be the greatest contributor to student disengagement. Student and faculty buy-in is the necessary component for library instruction success, whether it includes clickers or not.


3 thoughts on “Clickers, or Does Technology Really Cure What Ails You?”

  1. Do you find that the clickers allow assessment to be quicker? That’s the purpose I could see them being useful for — rather than having to spend a bunch of time administering a quiz that wouldn’t count for a ‘real’ grade anyway, I imagine I could at least get some quick baseline feedback.

  2. Olivia, I did find that to be the case. A paper copy of the survey takes maybe 5 minutes, and the oral “quizzes” take about 10 minutes or so. Using the clickers for the assessment definitely took less time. The only issue I came across with the clicker survey is that I would sometimes have to wait a few moments until all the students had submitted their answers before moving on to the next question. The students either wouldn’t want to respond or were having trouble with the technology. I’m still not really sure how long to wait for the response of that one last person before moving on, especially since, as you said, the quizzes and surveys don’t count for a real grade.

  3. I appreciated your discussion about what did not seem to work well for you with clickers. I’ve been using them for a couple of years now with one-shot sessions for groups of 80-90 students where engagement is a real challenge. However, what I’ve found is that they seem to be most effective when I intersperse questions throughout the session, rather than trying to use them as a pre-assessment or quiz tool. It’s great to test the students’ preconceptions of concepts, see the mix of their candid responses and then be able to either congratulate them on widespread understanding of the concept or, more usually, use it as a way to get them to discuss why they answered a particular way and lead them to a common understanding of the “right” answer. So, for instance, in discussing the peer review process with an introductory science course, I’ll get them to talk through the process and then ask them things like “How much do peer reviewers get paid?” Another dyad is “All scholarly publications are peer-reviewed. True or false” then “All peer-reviewed journals are scholarly. True or false.” The responses are always quite revealing and often elicit gasps and many students turning to one another to say something about the outcome. This kind of use of the clickers not only leads to a series of teachable moments, but it does seem to engage and re-engage them during the session.
