Tag Archives: library instruction

Flipping Out: Reflections Upon Landing

Last month, I shared my plans for creating “flipped” library instruction sessions. Now, after wrapping up my last flipped session, having several conversations with my colleagues, and co-facilitating a “Flipped Classroom” faculty workshop, I am still digesting and evaluating all that I have learned. However, a few key takeaways are bubbling to the forefront of my mind and actively shaping the rest of my instruction this semester.

“Did you know the whole section would be about my topic?”
Or – Understanding the flipped classroom as a vehicle for active learning

As I planned my flipped sessions, I struggled to understand how flipped instruction relates to “active learning” and/or “problem-based learning.” The library instruction program at my university already places a heavy emphasis on incorporating active learning exercises into our sessions, and we regularly attempt to tie library instruction directly to course research assignments. This means that as I worked on my flipped session, I found myself modifying some existing in-class activities to promote deeper levels of understanding, rather than starting from scratch.

In one class, I had enough time to ask first-year students to search the catalog for a book about their research topic, go into the stacks to find their book, bring the book back to class, and then debrief about the experience with their classmates. It was much more effective, and quite frankly more fun, to talk about LC Classification and Subject Headings after one student spontaneously exclaimed – “I picked TWO books about my topic because I realized THE WHOLE SHELF was about sports technology!”

After this experience, I’ve come to understand my flipped classroom as a vehicle for creating additional space for active learning in the classroom. Of course, there is no such thing as “the” flipped classroom, and other interpretations abound. For me, providing students with a pre-class “lecture” foundation on which to build with active learning in the classroom was more successful than trying to cram both tasks into the regular class time.

“Oh… those videos before class weren’t optional?”
Or – Students might not complete the pre-class work. And that’s O.K.

Of course, this ideal “flipped classroom as a vehicle for active learning” assumes students come to class prepared. And a frequent concern about the flipped classroom is: “What if students don’t complete the pre-class work?” Unfortunately, there will always be students who come to class unprepared, and considering what the consequences will be for students if they don’t complete the pre-assigned work is important. Our students are smart – they learn quickly whether preparing for class is really necessary. Designing in-class assignments which require the prior knowledge gained through the pre-class homework is one way to show students it’s worth their while to come prepared.

In practice, other “consequences” for failing to complete pre-class work may mean students must complete the pre-class materials during in-class time before they are allowed to continue to the more interesting and challenging application exercises. We know some students may still struggle through class, since exposure to the pre-class activities does not necessarily guarantee students achieved any level of mastery with the material. In my sessions, I tried to purposefully use group activities in-class to emphasize peer learning, assuming students who completed and understood the material could be models for students who did not. Additionally, students were encouraged to review pre-class video materials if needed, and to ask questions as they worked through their activities.

The short quiz I paired with my pre-class material helped me monitor how many students completed the pre-class work and how well they understood the material. In each of my flipped class sessions, over three-quarters of the students completed the work before class; I considered this a relatively high success rate. It was also helpful to go into the in-class session knowing the bulk of the class had at least attempted the pre-class work, and knowing where the problem areas we really needed to address might be.

“But… aren’t you going to talk first?”
Or – Students are also curious about the lack of direct lecture in class.

Students often comment in their course evaluations or session feedback that library instruction should include more time for activities and less time devoted to lecture. As a new instructor, I struggle with this for a variety of reasons, not the least of which is that creating active-learning based instruction that allows students to “discover” answers to questions or build their own skills is frequently harder than falling back into the “sage on the stage” routine. Of course, there is also no guarantee that students will use class time appropriately when given the requested discussion or problem-based activities. So I was extremely interested to find out how students would respond to the lack of direct, in-class instruction in the flipped sessions.

During my first flipped class, I decided to give a “quick” review of the pre-class material before students started on their activities. Big mistake: the “review” quickly turned into a regular lecture. During my second flipped class, however, I simply asked students to come in and get started on their activity, reminding them to work together and to review the video materials or ask questions as necessary. At first, they were confused about not starting with a lecture, but they eventually dove into the activity with success. And encouraging students to first attempt the activity allowed me to eventually review only the concepts that the majority of the class was consistently struggling with (for instance, correctly combining both “ANDs” and “ORs” in a complex database search). Overall, this session was much more enjoyable for both me and my students, and it was one of the few times I left a session confident that class time was used to its fullest advantage.
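The AND/OR trouble students kept hitting comes down to one rule: synonyms for a single concept get ORed together inside parentheses, and the parenthesized concept groups get ANDed. A small sketch makes the pattern explicit (the topic and keywords here are hypothetical examples, and real databases may vary slightly in their quoting syntax):

```python
def boolean_query(concepts):
    """Build a database-style boolean search string.

    `concepts` is a list of concept groups; each group is a list of
    synonymous keywords. Synonyms within a group are ORed, and the
    parenthesized groups are ANDed together.
    """
    groups = []
    for synonyms in concepts:
        # Quote multi-word phrases so the database treats them as units.
        quoted = ['"{}"'.format(term) if " " in term else term
                  for term in synonyms]
        groups.append("(" + " OR ".join(quoted) + ")")
    return " AND ".join(groups)

query = boolean_query([
    ["sports technology", "athletic equipment"],  # concept 1: the topic
    ["injury", "safety"],                         # concept 2: the angle
])
print(query)
# ("sports technology" OR "athletic equipment") AND (injury OR safety)
```

The common student mistake is writing the flat string `sports technology OR athletic equipment AND injury`, which most databases interpret with AND binding tighter than OR, producing very different results than the parenthesized version.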

Talking about teaching has value.

My final point of reflection is not limited to “flipped instruction,” but has grown out of conversations with my colleagues inspired by our participation in the flipped project. Given heavy instruction loads, faculty or student expectations, and other pressing projects, it’s easy to fall back into comfortable patterns of the same ol’ library session. Sometimes, simply carving out the time to talk about teaching seems like a luxury we cannot necessarily afford. Given the increasing emphasis on instruction in academic libraries, our mission to arm students with multifaceted critical information skills, and the trend toward providing evidence that our instruction adds value to the library and our parent institutions, deeper discussions about teaching and pedagogy can’t just be a luxury – they should be the reality.

I am lucky to have a job where I am encouraged to think about teaching, talk about teaching, and take calculated risks to grow as an instructor. Incorporating new pedagogical strategies like the “flipped classroom” is just one example of how this might happen.

Flipping Out: Preflip Planning

One of my current professional goals is to experiment with new ways to improve my library instruction sessions and grow as an instructor. So when our residency librarian decided to lead a group of instruction librarians to test the “flipped classroom” in library instruction, I welcomed the opportunity to discover how “flipping” might transform my classes. Given the previous interest in “flipping” here at ACRLog, I’ve also decided to share a bit of my planning, implementation, and reflection to continue the discussion about “flipping out” in the library world.

At first, re-envisioning my instruction sessions was a bit overwhelming – although I am still a newbie library instructor, I spent a great amount of time last semester crafting lessons and developing my own teaching style. I can only imagine how daunting this may seem to more experienced instructors who have honed their own lessons and style over several years of teaching!

Although I’ve used different lesson planning methods during graduate school classes and in my first semester of teaching (e.g., Backward Design and Madeline Hunter’s model), I had trouble using these methods to plan my flip. Pretty soon, I found myself falling back on the “5 W’s” – Who, What, Where, When, and Why – to organize my thoughts. My considerations for each question are below.

Photo: By Ted Hood (Courtesy of State Library of New South Wales)

WHO: Who are the students in my flipped class? Who is the professor? Which class will lead to the most successful flipped experience?

If considering only learning outcomes and session materials, nearly any of my instruction sessions could be flipped. However, since the professor for my assigned freshman seminar class is equally interested in trying out new instruction techniques, I decided his class would be a good match for the trial flipped sessions. Given his support and investment in the process, I feel confident he will actually distribute pre-class materials to students and will motivate them to complete the assigned pre-class work. (As an added bonus, I also have three 75-minute instruction sessions with this class, which leaves a cushion to “catch up” if for some reason the entire flipped experience falls apart.)

WHAT: What are the student learning outcomes? What will students learn through pre-class materials? What activities will students complete during class to cement learning?

Answering these questions has been the most difficult part of planning my flipped classroom. During my “regular” classes, I already try to involve students with hands-on, active learning experiences whenever possible. The challenge with the “flip” has been to make those activities more complex, pushing students to deeper levels of learning, as well as to identify what types of pre-class background students need to successfully complete those activities. Our residency librarian presented this as “What are the basics students should come to class knowing? What are the complexities that in-class sessions will address?”

Like many of the librarians in our “flipping” group, I am using the library’s existing collection of online tutorials as the basis of my flipped materials. I decided to give students 2-3 short videos to watch before class to cover basic skills, like the “click-by-click” mechanics of searching a database and the beginnings of constructing a search. Then, in-class activities will challenge them to apply those skills to their group research project at increasingly challenging levels.

WHERE: Where will flipped materials live, and how will they be delivered to students?

I’m already a big fan of using Google Forms to collect student feedback at the end of instruction sessions. Since I wanted to pair the pre-class videos with a measure of how many students completed the activities and how well they understood the material, Google Forms once again turned out to be an easy solution. For each flipped session, I created a Google form with links to videos along with quiz questions, and the course professor will distribute the form to students before our session.

WHEN: When should students complete pre-class activities?

Students will have access to the pre-class materials the week before our in-class session. Any earlier and I worry the connection between pre-class videos and in-class activities would be lost. This decision was fairly easy to nail down, and getting the date on my calendar is a good reminder to finish materials with enough time to review the plan with the professor, distribute them to students, etc.

WHY: Why is “flipping” a method I want to try for library instruction?

Although “flipping” is one way I’m fulfilling my goal to explore new instructional techniques, the deeper I dig into planning, the more I think it’s a model that can be useful in library instruction. Most of the librarians I work with or have observed are already moving away from lectures and database demonstrations. But it’s hard to jump into more complex applications and exploratory activities during a traditional 50 or 60 minute class if students don’t have a basic foundation on which to build advanced skills. Off-loading the procedural instructions, like how to navigate the library’s website or basic catalog searching, to pre-class activities can free up in-class time for librarians to help students work through more complex activities.

My flipped experiment is also allowing me to carve out a chunk of in-class time to address additional material, including brainstorming and concept mapping. Last semester, I noticed students in the seminar struggling to craft a manageable research question, which later affected their ability to construct effective searches and to evaluate information for its relevance to their topic. This semester, since I’m providing some of the procedural instruction outside of class, I can build more hands-on experiences into the class and set students up for better guided learning.

Ready, Set, Go!

The first round of pre-class materials is going out to students this week, and our first in-class session is next week! I am excited for student responses to the pre-class material to start coming in and to dive into the full flipped experience. I’m planning to report back in March with my thoughts about how the flip unfolds!

Do you have experience with the flipped classroom? What considerations do you think are vital when planning “the flip?”

Not as simple as “click-by-click”

One of the projects I inherited as emerging technologies librarian is managing our library’s collection of “help guides.” The online learning objects in this collection are designed to provide asynchronous guidance to students completing research-related tasks. Over the last few months, my focus has been on updating existing guides to reflect website and database interface changes, as well as ensuring compliance with federal accessibility standards. With those updates nearly complete, the next order of business is to work with our committee of research and instruction librarians to create new content. The most requested guide at the top of our list? How to use the library’s discovery service rolled out during the Fall 2012 semester.

Like many other libraries, we hope the discovery service will allow users to find more materials across the library’s collections and beyond. Previously, our library’s website featured a “Books” search box to search the catalog, as well as an “Articles” search box to search one of our interdisciplinary databases. To ease the transition to the discovery system, we opted to keep the “Books” and “Articles” search boxes, in addition to adding the “one search box to rule them all”; however, these format search boxes now search the discovery tool using the appropriate document type tag. Without going into the nitty gritty details, this method has created certain “quirks” in the system that can lead to sub-optimal search results.
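To make that setup concrete, here is a minimal sketch of how a format-specific search box might hand a pre-tagged query to a discovery service. The endpoint and parameter names here are hypothetical illustrations, not our vendor’s actual API:

```python
from urllib.parse import urlencode

# Hypothetical discovery-service endpoint and parameter names.
DISCOVERY_URL = "https://discovery.example.edu/search"

def format_box_url(query, doc_type=None):
    """Build a discovery search URL.

    The "Books" / "Articles" boxes are just ordinary discovery searches
    with a document-type facet pre-applied on the user's behalf.
    """
    params = {"q": query}
    if doc_type:
        # This pre-applied facet is the source of the "quirks": results
        # the user never chose to exclude are filtered out up front.
        params["format"] = doc_type
    return DISCOVERY_URL + "?" + urlencode(params)

print(format_box_url("climate change", doc_type="book"))
# https://discovery.example.edu/search?q=climate+change&format=book
```

Because the facet is baked into the box rather than chosen by the searcher, a student typing into “Books” never sees that their results were silently limited, which is exactly the behavior our screencasts have to decide whether to demonstrate or avoid.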

This back-story leads to my current question about creating instructional guides for our discovery system – how do we design screencasts to demonstrate simple searches by format?

So far, this has boiled down to two options:

  1. Address the way students are most likely to interact with our system. We know users are drawn to cues with high information scent to help them find what they need; if I’m looking for a book, I’m more likely to be drawn to anything explicitly labeled “Books.” We also know students “satisfice” when completing research tasks, and many are unfortunately unlikely to care if their searches do not retrieve all possible results. Additionally, whatever we put front-and-center on our homepage is, I think, a decision we need to support within our instructional objects.
  2. Provide instruction demonstrating the way the discovery system was designed to be used. If we know our system is set up in a less-than-optimal way, it’s better to steer students away from the more tempting path. In this case, that means searching the discovery system as a whole and demonstrating how to use the “Format” limiters to find specific types of materials. While this option requires ignoring the additional search options on our website, it will also allow us to eventually phase out the “Books” and “Articles” search boxes on the website without significant updates to our screencasts.

While debating these options with my colleagues, it’s been interesting to consider how this decision reflects the complexities of creating standalone digital learning objects. The challenge is that these materials are often designed without necessarily knowing how, when, or why they will be used; our job is to create objects that meet students at a variety of point-of-need moments. Given that objects like screencasts should be kept short and to-the-point, it’s also difficult to add context that explains why the viewer should complete activities as shown. And library instruction is not usually designed to make our students “mini-librarians.” Our advanced training and interest in information systems mean it is our job to be the experts, but our students do not necessarily need this same level of knowledge to be successful information consumers and creators.

Does this mean we also engage in a bit of “satisficing” to create instructional guides that are “good enough” but not, perhaps, what we know to be “best?” Or do we provide just enough context to help students follow us as we guide them click-by-click from point A to point B, while lacking the complete “big picture” required to understand why this is the best path? Do either of these options fulfill our goals toward helping students develop their own critical information skills?

No instruction interaction is ever perfect. In person or online, synchronous or asynchronous, we’re always making compromises to balance idealism with reality. And in the case of creating and managing a large collection of online learning objects, it’s been interesting to have conversations which demonstrate why good digital learning objects are not synonymous with “click-by-click” instructions. How do we extend what we know about good pedagogy to create better online learning guides?


Waiting on Wikipedia

Recently while I was teaching a class the instructor asked me whether I thought that Wikipedia would ever come to be considered a generally trustworthy, credible source. I always talk about Wikipedia in my one-shot instruction sessions, especially with first year students, but this was the first time I’d ever gotten a question along those lines. And I’ve been thinking about it ever since.

In my classes I point out to students that most of us — students, faculty, librarians, everyone — use Wikipedia all the time. My usual strategy for talking about Wikipedia in library instruction is likely similar to many librarians: I show students how to use it for brainstorming and background information, suggest that they mine the references, and point out the View history link to show them how the entry has changed. I end by noting that Wikipedia is a great place to start but that students shouldn’t cite it in their assignments because it’s much too general, just as they wouldn’t cite a general print encyclopedia. Instead, they should use Wikipedia to point them to other resources that are more appropriate for use in college work.

But I do wonder when Wikipedia will cross the line into acceptable-for-use-as-a-cited-source territory. Will it ever? Has it already?

Full disclosure: I cited Wikipedia in a scholarly journal article I wrote last year. I had what I thought were (and still think are) good reasons. I was writing about using games in information literacy instruction, and I used Wikipedia to define several specific genres of videogames. I felt that the Wikipedia definitions for those types of games were more current and accurate than definitions I found in other published sources. In this case the fluidity and impermanence of Wikipedia were assets. Genres and micro-genres can evolve and change quickly, and I think that most Wikipedia entries on popular culture (in which I’d include videogames) are probably written and edited by fans of those topics. There’s an argument to be made that those fans are the subject experts, so it’s the information they’ve put together that I was most confident in citing. While one of the peer reviewers did note the Wikipedia citations, the journal editor and I discussed it and agreed to keep them.

Of course, Wikipedia won’t always be the best source. Right now I’m working on writing up the results of a project and needed to find the construction dates for campus buildings at one of my research sites. After scouring the college’s website with no luck, I stumbled upon the information in Wikipedia only to come up against a dilemma I’m sure our students face all the time: the information seems true, it’s not blatantly, obviously false, but there’s no citation for it. In this case I didn’t feel comfortable citing Wikipedia so I emailed the college archivist for more information, which she quickly and graciously provided. But what do our students do in a situation like this? There won’t always be a readily identifiable person or source to check with for more information.

According to this recent article in the Atlantic, Wikipedia seems to be moving into a more mature phase. The rate at which Wikipedia articles are edited is decreasing, as is the rate for adding new articles. What does this slowdown mean for Wikipedia? Is it really “nearing completion,” as the article suggests? And when Wikipedia is finished, will it then become a citable source?

Clickers, or Does Technology Really Cure What Ails You?

ACRLog welcomes a guest post from Cori Strickler, Information Literacy Librarian at Bridgewater College.

During idle times at the reference desk, or when the students are gone for a break, I find myself creating instruction “wish lists” of tools or gadgets that I’d love to have for my sessions. One item that has been on my list for a few years now is clickers, or student response systems as they are officially called. In academic classrooms they are used for attendance, quiz taking, or other more informal assessments. I saw clickers as a way to solve one of my most basic and frustrating problems: getting students to be engaged during sessions. Students have little desire to participate in library sessions, and trying to get them to comment on their library experience is like pulling teeth, except that the process is a lot more painful for me than it is for the students.

For those of you who haven’t heard of clickers before, they are little remote-control-like devices that allow students to answer multiple choice questions by sending their responses to the computer for real-time analysis. They are sort of like the devices used on Who Wants to Be a Millionaire to poll the audience.

My library doesn’t have the budget for clickers, but this semester through a chance discussion with the director of the health services department, I learned that the college received a grant for 100 TurningPoint clickers and the necessary software. The director rarely needed all of the clickers at the same time, so she offered about fifty for me to use during my instruction sessions.

So, I now have access to a tool that I had coveted for many years, but that was only the easy part. I still have to figure out how to meaningfully integrate this technology into my sessions.

My overall goals are relatively simple. I want to encourage student involvement in any way possible so I would not have to lecture for fifty minutes straight. My voice just can’t handle the pressure. To be successful, though, I need to be purposeful with my inclusion. I can’t just stick a clicker quiz at the beginning of a session and assume that the students will suddenly be overwhelmed with a desire to learn everything there is about the library. Most faculty who schedule a library instruction session have a particular purpose in mind, so I also need to be sure that I fulfill their expectations as well.

After much consideration, I decided not to add the clickers to all my sessions. Instead, I decided to focus on first-year students, who hopefully aren’t quite as jaded as the upperclassmen and haven’t already decided that they know everything about research.

For my first clicker experiment, I used them with a quiz to help me gauge the class’s knowledge of the library. I also decided to use them as an alternative way to administer our session evaluation survey. Ultimately, I had mixed results with the clickers. The students did respond better than before, but I did not get full participation. While this isn’t a big issue with the quiz, the lack of participation was a problem when students were asked to complete the evaluation survey. For most survey questions I lacked responses from five or six students, which was a larger number than when I used the paper surveys and could potentially affect my survey results.

Their lack of participation could be due to a number of reasons. The students claimed they were familiar with the clickers, but they did not seem to be as adept as they claimed. Also, due to my inexperience with the clickers there might have been a malfunction with the devices themselves. Or, maybe the students just didn’t want to engage, especially since there was still no incentive to participate. When I looked back through the survey results, they did not seem to indicate any greater amount of satisfaction regarding the sessions.

This first experience with the clickers left me a bit skeptical, but I decided to try them again. This time, I created brief quizzes related to brainstorming keywords and types of plagiarism. My second class was smaller than the first, and engagement seemed better. The clickers also seemed to allow students to be more honest with the surveys; they seemed more comfortable indicating their disinterest in the information presented, though the results also indicated that they saw the overall value in the information.

I have used the clickers in about twelve sessions this semester, and overall they were well received by the students. However, I am not completely sure they add significantly to engagement. I also have not seen any indication in the surveys that my sessions are better or worse with their inclusion. I have discovered, though, that some sessions and topics may be better suited to clickers than others. Upper-level classes where I am trying to show specific resources do not initially lend themselves to clickers, and the time may be better spent on other activities or instruction.

I am still in the process of learning how clickers will fit into my classes, but I would generally call them a success, if only for the fact that they make the survey process easier. Still, they aren’t the panacea for student engagement I had hoped for. Activity type and student familiarity are essential variables that appear to affect clicker success.

Unfortunately, the overall nature of one shot instruction seems to be the greatest contributor to student disengagement. Student and faculty buy-in is the necessary component for library instruction success, whether it includes clickers or not.