Small Steps, Big Picture

As I thought about composing a blog post this week, I felt that familiar frustration of searching not only for a good idea, but a big one. I feel like I’m often striving (read: struggling!) to make space for big picture thinking. I’m either consumed by small to-do list items that, while important, feel piecemeal, or puzzling over how to make a big idea more precise and actionable. So it feels worthwhile now, as I reflect back on the semester, to consider how small things can have a sizable impact.

I’m recalling, for example, a few small changes I’ve made to some information evaluation activities this semester in order to deepen students’ critical thinking skills. For context, here’s an example of the kind of activity I had been using. I would ask students to work together to compare two sources that I gave them and discuss what made the sources reliable or not and whether one source was more reliable than the other. As a class, we would then turn the characteristics they articulated into criteria that we thought generally make for reliable sources. The activity seemed to help students identify and articulate what made those particular sources reliable or not, and it permitted us to abstract to evaluation criteria that could be applied to other sources.

While effective in some ways, I began to see how this activity contributed to, rather than countered, the problem of oversimplified information evaluation. Generally, I have found that students can identify key criteria for source evaluation such as an author’s credentials, an author’s use of evidence to support claims, the publication’s reputation, and the presence of bias. Despite their facility with naming these characteristics, though, I’ve observed that students’ evaluation of them is sometimes simplistic. In this activity, it felt like students could easily say evidence, author, bias, etc., but those seemed like knee-jerk reactions. Instead of creating opportunities to balance a source’s strengths/weaknesses on a spectrum, this activity seemed to reinforce the checklist approach to information evaluation and students’ assumptions of sources as good versus bad.  

At the same time, I’ve noticed that increased attention to “fake news” in the media has heightened students’ awareness of the need to evaluate information. Yet many students seem more prone to dismiss a source altogether as biased or unreliable without careful evaluation. The “fake news” conversation seems to have bolstered some students’ simplistic evaluations rather than deepened them.

In an effort to introduce more nuance into students’ evaluation practices and attitudes, then, I experimented with a few small shifts and have so far landed on revisions like the following.

Small shift #1 – Students balance the characteristics of a single source.
I ask students to work with a partner to evaluate a single source. Specifically, I ask them to brainstorm two characteristics about a given source that make it reliable and/or not reliable. I set this up on the board in two columns. Students can write in either/both columns: two reliable, two not reliable, or one of each. Using the columns side-by-side helps to visually illustrate evaluation as a balance of characteristics; a source isn’t necessarily all good or all bad, but has strengths and weaknesses.

Small shift #2 – Students examine how other students balance the strengths and weaknesses of the source.
Sometimes different students will write similar characteristics in both columns (e.g., comments about evidence used in the source show up on both sides), helping students to recognize how others might evaluate the same characteristic as reliable when they see it as unreliable, or vice versa. This helps illustrate the ways different readers might approach and interpret a source.

Small shift #3 – Rather than develop a list of evaluation criteria, we turn the characteristics they notice into questions to ask about sources.
In our class discussion, we talk about the characteristics of the source that they identify, but we don’t turn them into criteria. Instead we talk about them in terms of questions they might ask of any source. For example, they might cite “data” as a characteristic that suggests a source is reliable. With a little coaxing, they might expand, “well, I think the author in this source used a variety of types of evidence – statistics, interviews, research study, etc.” So we would turn that into questions to ask of any source (e.g., what type(s) of evidence are used? what is the quantity and quality of the evidence used?) rather than a criterion to check off.

Despite their smallness, these shifts have helped make space for conversation about pretty big ideas in information evaluation: interpretation, nuance, and balance. What small steps do you take to connect to the big picture? I’d love to hear your thoughts in the comments.

Making strategy more transparent

I’m not one to make new year’s resolutions, per se. Still, I have been trying to work on something resolution-esque for the past few months, or maybe even for a year now, though it didn’t begin with any formal shape or label. Now it’s mid-February. It’s the end of week four of the semester and things are feeling rather hectic. My resolve seems weak and my desire for hibernation and Girl Scout cookies is strong. So right about now feels like a good time to check in for a kind of status report and a little refocusing and reinvigoration.

My “resolution” centers around the notion of strategy. I’ve been trying to work on better communicating with others the strategy behind what I’m doing and thinking. That is to say, not just the items I cross off each day’s to-do list, but how those items intersect in service of a larger plan or aim. For example, not just the classes I’m teaching today or next week or this month, but how selected classes connect as part of a scaffolded information literacy instruction plan for anchor, or core, courses in academic majors. Or that the assessment project I’m working on now is part of a larger plan for assessment that contributes to our multi-faceted understanding of students’ information literacy learning and outcomes. I’m not trying to blow smoke here. I’m just saying that what I see as strategy isn’t always apparent to others. How could it be if I didn’t tell anyone what I’m thinking? I’m trying to work on this in large part by just talking about it more.

By talking about it more, I mean I’m trying to clarify my strategy for myself and articulate it more clearly for others. I’m trying to communicate in different ways–both abstract concepts and concrete examples, both words and graphics–to make stronger connections. I’m trying to be more transparent about what I’m thinking and how I’m connecting the dots. But I’m also trying to listen carefully to what others have to say to see how my thinking and my work are part of a still larger whole. The librarian-led scaffolded information literacy instruction plan for a series of anchor courses in the psychology major that I mentioned a moment ago, for example, is only part of still more expansive information literacy teaching and learning for psychology students. So when I meet with psychology faculty, I talk about students’ development across that series of courses, but I also ask about where and how they are teaching information literacy in those courses and others as well. We talk together about assignment design and course goals and students’ needs. It’s not about some great reveal, as if by magic, at the end. Talking about it along the way makes the individual steps and component parts more connected, more meaningful, more collaborative, and, therefore, more successful.


“R-chie overlapping structure arc diagram by Daniel Lai, Jeff Proctor, Jing Yun and Irmtraud Meyer” by dullhunk is licensed under CC BY 2.0

I’ve been focusing on strategy directly with students, too, in the classroom and during research consultations. When I ask students to experiment with a research question in a database, for example, I frame our discussion of their approaches as “best practices for search strategies.” We talk not only about which words they typed in, but why they picked the words they did and what impact their choices had on search results. We add things like “identify major concept words” and “use synonyms for major concept words” to our list of strategies. I think this metacognitive approach helps students turn a concrete experience into a framework for future application. I am increasingly talking with students about what their strategies are, how they are (and should be) developing strategies, and how strategies can give them agency over their research processes and learning. When we talk about strategies for organizing, reading, or synthesizing sources, students are (mostly, not all–let’s be real) interested. I try to be transparent about my strategies, too: why we’re doing what we’re doing in the classroom. Students seem eager for a framework that helps them decode, maneuver, manage, and direct their work. They are engaged in these conversations. Never have I seen them take more notes than when we talk about strategy.

It’s well and good to intend to work on strategy and think about the big picture–indeed, it’s an attitude or habit of mind–but the reality is that it takes practice, requires space, and demands reflection. Part of my “resolution” is also to get better at strategic thinking and work. My attempts to make time and space have so far included three approaches.

  1. Visual organization. I’m a big fan of lists and post-its and paper. I write everything down to keep track of ideas and tasks big and small. I regularly organize and reorganize these notes. I’ve started grouping them by theme or project in a chart, rather than just simple lists. The visual layout has been a helpful reminder of how small items are part of a larger whole. It helps me think about connections.
  2. Scheduling time for strategic thinking. I’m not doing so well on this one, to be honest. It’s rather easy to lose the thread of this practice when you’re suffering from email/instruction/meeting/life overload. As a case in point, I jotted down about three (probably more interesting) ideas for this blog post that I was excited about, but they all required more big picture thinking and research than I could make happen before this deadline. I’ve been trying to schedule time in my calendar for strategy, just like I schedule meetings. But then I catch up on email instead, or I schedule in a student who needs last-minute help, or I cross a few other little things off my to-do list. Even though I blocked two hours in my schedule to review the results of recent assessment projects and find connecting themes across them, I let the other stuff in. Those things were more pressing, but also just more easily accomplished. Of course, the pace of the semester doesn’t always permit open blocks of time to devote to the bigger picture. But I also need to work on sticking to it.
  3. Research, presentation, and publication. The motivation of an approaching conference presentation or a writing commitment forces my hand to think and reflect more strategically and meaningfully, not just in passing, about the big picture of my daily work. I’ve been seeking more opportunities for this kind of structure because it’s been so helpful for processing, interpreting, and meaning-making.

How do you motivate your strategic thinking? How do you make room in your daily and weekly schedule? Or perhaps, how do you use small chunks of time for big picture thinking and work? I’m eager to hear your strategies in the comments…