Facilitating student learning and engagement with formative assessment

Information literacy instruction is a big part of my job. For a little context, I teach somewhere in the range of 35-45 classes per semester at my small liberal arts college. While a few of the sessions might sometimes be repeats for a course with multiple sections, they’re mostly unique classes running 75 minutes each. I’ve been teaching for some time now, and while I’m a better teacher than I was five or ten years ago, or even last year, there’s always plenty of room for improvement, of course. A few months ago, I wrote a post about reflection on and in my teaching, about integrating “more direct discussion of process and purpose into my classes […] to lay bare for students the practice, reflection, and progression that complicates [information literacy] work, but also connects the gaps, that brings them closer to crossing the threshold.” Each year, I’ve been devoting more attention to trying to do just that: integrate process and purpose into my classes to improve student learning and engagement.

It didn’t start out as anything momentous, just a little bit at a time. Initially, it was only a small activity here or there to break things up, to give students a chance to apply and test the concept or resource under discussion, and to scaffold to the next concept or resource. I would demo a search strategy or introduce a new database and then ask students to try it out for their own research topics. I would circulate through the class and consult individually as needed. After a few minutes of individual exploration, we would come back together to address questions or comments and then move on to the next resource, strategy, or concept. This appeared to be working well enough. Students seemed to be on board and making progress. By breaking a class into more discrete chunks and moderating the pace a bit, students had more of a chance to process and develop along the way. Spacing out the hands-on work kept students engaged all class long, too.

For some time, I’ve started classes by reviewing the assignment at hand to define and interpret related information needs, sometimes highlighting possible areas of confusion students might encounter. Students expressed appreciation for this kind of outlining and the shape and structure it gave them. I felt a shift, though, when I started asking students, rather than telling them, about their questions and goals at the outset of a class. Less “Here are the kinds of information sources we’ll need to talk about today” and more “What kinds of information do you think you need to know how to access for this assignment? What do you hope that information will do for you? What have been sticky spots in your past research experiences that you want to clarify?” I wanted students to acknowledge their stake in our class goals, and this conversation modeled setting a scope for learning and information needs. We then used our collective brainstorm as a guiding plan for our class. More often than not, students offered the same needs, questions, and problems that I had anticipated and used to plan the session, but it felt more dynamic and collaboratively constructed this way. (Of course, I filled in the most glaring gaps when needed.)

So why not, I finally realized one day, extend the reach of this approach into the entire class? While scaffolding instruction with small activities had helped students process, develop, and engage, I was still leading the charge at the pace I set. But what if we turned things around? What if, essentially, they experimented on their own in order to determine something that worked for them (and why!) and shared their thoughts with the class? What if we constructed the class together? Rather than telling them what to do at the outset of each concept chunk, I could first ask them to investigate. Instead of demonstrating, for example, recommended search strategies and directing students to apply them to their own research, I could ask students to experiment first with multiple search strategies in a recommended database for a common topic in order to share with the class the strategies they found valuable. The same goes for navigating, filtering, and refining search results, for evaluating sources and selecting the most relevant, or for any concept or resource, for that matter. Why not, I thought, ask students to take a first pass and experiment? We could then share ideas as a class, demonstrating and discussing the strengths and weaknesses of their tactics along the way, collaboratively building a list of best-practice strategies. Students could then revisit their work, applying those best practices where needed.

This kind of experiment-first-then-build-together-then-revise approach is simple enough, but its advantages feel rather significant to me. It makes every class exciting, because it’s, in part at least, unique and responsive to precisely those students’ needs. Of course, I have a structure and goals in mind and prepared notes in hand, but it’s a flexible approach. While it’s not appropriate for every class, the low-stakes, low-prep makeup is readily applicable to different scenarios and content areas. The students and I are actively involved in constructing the work of the class together. Everyone has a chance to contribute and learn from each other. In particular, more experienced students get to share their knowledge while less experienced students learn from their peers. The expectation to contribute helps students pay attention to the work and to each other. Its scaffolded and iterative design helps students digest and apply information. Its reflective nature reveals for students practice and process, too; it models the metacognitive mindset behind how to learn, how to do research. I don’t mean to get too ebullient here. It’s not a panacea. But it has made a difference. It’s probably no surprise that this kind of teaching has required a degree of comfort, a different kind of classroom leadership, and a different kind of instinct that would have been much, much harder to conjure in my earlier teaching.

While I wasn’t aware of it initially and didn’t set out to make it so, I now recognize this as formative assessment. Not only do these small activities increase opportunities for engagement and learning, they serve as authentic assessment of students’ knowledge and abilities in the moment. They provide immediate evidence of student learning and immediate opportunities for action. With that input, I can adjust the nature and depth of instruction appropriately at the point of need, and all in a way that’s authentic to and integrated into the work of the class.

The informality of this approach is part of what makes it flexible, low prep, and engaging. Yet it’s also a rich site for documentation and evaluation of student learning. I want to capture the richness of this knowledge, demonstrate the impact of instruction, and document students’ learning. But I’m struggling with this. I haven’t yet figured out how to do it effectively and systematically. Some formative assessments result in student work artifacts that can illustrate learning or continuing areas of difficulty, but the shape my implementation has so far taken results in less tangible products. At the ACRL 2015 conference a few weeks ago, I attended a great session led by Mary Snyder Broussard, Carrie Donovan, Michelle Dunaway, and Teague Orblych: “Learning Diagnostics: Using Formative Assessment to Sustainably Improve Teaching & Learning.” When I posed this question in the session, Mary suggested using a “teacher journal” to record my qualitative reflections and takeaways after each class and to notice trends over time. I’m interested in experimenting with this idea, but I’m still searching for something that might better capture student learning, rather than only my perception of it. I’m curious to read Mary’s book Snapshots of Reality: A Practical Guide to Formative Assessment in Library Instruction, as well as Michelle and Teague’s article “Formative Assessment: Transforming Information Literacy Instruction,” to see if I might be able to grab onto or adapt any other documentation practices.

Do you use formative assessment in your teaching? How do you document this kind of informal evidence of student learning? I’d love to hear your thoughts in the comments.

Failing Forward, Supporting Students

My son starts middle school in a week, so I’ve been more susceptible than usual to headlines about how parents can help their kids succeed academically. A couple of recent articles in the New York Times caught my eye. First was an opinion piece by psychologist Madeline Levine called “Raising Successful Children.” Levine is the author of Teach Your Children Well: Parenting for Authentic Success, and she encourages parents to stand back and let children make mistakes (within reasonable safety parameters, of course), rather than jump in to fix problems that kids should learn how to solve themselves. More recently I read a review of a new book called How Children Succeed by journalist Paul Tough. He echoes many of Levine’s points about giving kids the space to try, fail, and try again, but cautions that unless children are supported in their efforts, it will be difficult for them to pick themselves up and keep going. The reviewer refers to this as a “character-building combination of support and autonomy.”

It’s easy to imagine strategies for encouraging students to try, fail, and try again in a college course, since there’s time over the semester for students to work on problems and concepts that may initially elude them. I’m interested in games-based learning, and this is a familiar theme in all good games; noted education scholar James Paul Gee calls it “failing forward.” In a videogame, for example, I usually don’t finish the boss level on my first try, but I learn its attributes and weaknesses so that I can apply what I’ve learned in my next attempt (and repeat until victorious).

In academic libraries we don’t usually have the semester-length relationship with students that classroom faculty have. How can academic librarians allow — or even encourage — students to fail, but be there to support and encourage them when they do?

  • As an instruction librarian, one obvious strategy that leaps to mind is giving students the space to practice their research and library skills during our instruction sessions and workshops. I still struggle with my tendency to want to tell students every single thing about the library, but I’m getting better about keeping my presentation short and preserving time for students to search on their own as I make myself available to answer their questions (and watch closely so I can offer help to students who don’t explicitly ask). And if I happen to fail when demonstrating a search to students, so much the better.
  • At the Reference Desk, we can allow students to “drive” their search for information by turning the computer keyboard over to them so they can type their search query. We can support them as they sort through their results and suggest strategies for revising their search to produce better results. This might be tricky at busy times, of course, so we might not always be able to use this approach with students. We can also think of roving Reference as an opportunity to help students fail forward: librarians can roam the study areas in search of students who look like they may have a question or be in need of assistance.
  • On our websites, we can embed instructional text, tutorials, and “ask a librarian” links within our electronic resources and services, or on the web pages that link to them. Ideally, students will try to use these research tools themselves, but if they run into trouble or don’t find what they need, they can easily find support or reach out and ask for our help. One caveat: it may be difficult to determine whether students are taking advantage of the support offered rather than just failing and moving on, though usability studies and web analytics could be employed to gather information about usage.

I’m sure there are lots of other ways that academic librarians can help students try, fail, and try again, and I’d be interested to hear about them. And what about students who won’t or can’t seek encouragement? How can we support them when they try and fail?