
The Emphasis on Text(s)

The current dominant paradigm of information literacy emphasizes the importance of connecting with textual information. This produces a deficit model of information literacy which does not take into account the importance of information learning or other sources of information which are accessed through communication or action.

–Annemaree Lloyd, Information Literacy: Different Contexts, Different Concepts, Different Truths?, as cited by Eamon Tewell in The Problem with Grit

It all started with this quote. I was sitting in Eamon Tewell’s presentation at LOEX earlier this month, learning about the problematic nature of grit narratives in education and libraries, when these two sentences showed up in his slide deck. Eamon was convincingly linking the popularity of grit to current deficit models of information literacy education. By defining information literacy in academic libraries in a particular way, we categorize students as academically deficient. They may be able to solve complex information problems on their own, in their own way, but because, as Annemaree Lloyd states, we emphasize text as information in academia, their experiences and abilities are invalidated. Our academic librarian version of information literacy is rooted in the written word, and not just any written word, but words of a certain kind: academic journal articles, scholarly books, book chapters, reports, grey literature, legal documents, etc. Our emphasis as librarians is on the things we can read that signal some connection to the academy.

We see examples of this in our work all the time. We might say something like, “You are used to using Google, but Google won’t help you in this situation” (spoiler: it probably still will). Or, “Let’s start our research with the library databases.” We might try to branch out from scholarly texts by encouraging students to use Wikipedia or news sources as launch pads for research, but these are all still resources rooted in the written word. I can always count on Library Twitter to help me process problematic ideas and issues, so I posed the following questions to my colleagues:

Responses were so thoughtful and thought-provoking. Desmond Wong, Outreach Librarian at the University of Toronto, shared the problematic nature of current information literacy education in relation to searching for and accessing indigenous peoples’ knowledge. This idea is seconded by research done by Alex Watkins of the University of Colorado Boulder, who sees this emphasis on academic textual sources as “academics policing the boundaries of authority as well as elevating a particular way of knowing.” (Side note: both Desmond and Alex have done excellent work researching indigenous knowledge practices and information literacy.) And Karen Nicholson pointed me to the great chapter by Alison Hicks on this very topic in The Politics of Theory and the Practice of Critical Librarianship, her recent book co-edited with Maura Seale.

In Making the Case for a Sociocultural Perspective on Information Literacy, Alison Hicks moves beyond the ACRL Framework vs. Standards debate to advocate for a sociocultural approach to information literacy. This focuses on the ways in which information literacy “shows itself” in different communities, and the ways in which it is shaped by different contexts. A sociocultural approach to information literacy shows us that the way we’ve defined information literacy as librarians is just one version of information literacy. It is a “social practice that emerges from a community’s information interactions” (p. 73). But by adopting a “single understanding of information literacy” as the information literacy, we impose one group’s knowledge practices on another (p. 75). What we are teaching in academic libraries is specific to an academic context, but we are teaching it as though it is universal.

I can already hear the dissent brewing, because so entrenched is my relationship to a particular type of information literacy that I had a similar, initial, knee-jerk reaction. “But we need to teach students how to use and understand these textual, scholarly resources precisely because they are new and they have never used them before!” I had to counter my own reaction with a blog post I read a few years ago by the ever-prolific Barbara Fister. Referencing the PIL study that looks at info-seeking behavior of recent college graduates, she laments the difficulty these young adults have setting up their own personal learning networks. We’ve focused so strongly on information as a textual source in information literacy education, that we don’t address the information literacy practices of different communities, including the workplace. Think about the last time you started a new job and how you gathered new information about your place of work. Did you immediately start digging into scholarly articles about best practices? Or did you set up formal and informal information appointments with your new colleagues? I think we all know the power of information in the workplace and our lives, and we’d be lying if we said we got all of this information from reading text, much less academic texts.

I’ve been deep in this idea lately as I start a new job and seek resources for my son who has various learning differences. As much as I want to say that scholarly, empirical research articles have been my go-to information sources, they absolutely haven’t. For me, as a new employee, my information literacy practices have centered around talking to people and learning from their experiences and institutional knowledge. For me, as a parent of a child with learning differences, my information literacy practices have centered around meeting and speaking with other parents and special needs education advocates. This is the information literacy I practice in my daily life, and I am starting to think more and more about how to incorporate this into the work I do with students, librarians, and faculty as an information literacy educator.


Developing a Campus Framework for Digital Literacy

ACRLog welcomes a guest post from Julia Feerrar, Head of Digital Literacy Initiatives at Virginia Tech.

During the summer of 2016 my library began to envision a more coordinated effort around supporting digital literacy on our campus. We began by examining the scope of digital literacy at Virginia Tech and have since developed a framework to help us build towards a shared definition and language for our context.

For me this process has been a really interesting chance to reflect on the relationship between information and digital literacy (as well as media, data, and many other literacies), and to explore perceptions and needs around these literacies on my campus. Building towards consensus around a nebulous, multifaceted concept like digital literacy can be very challenging, but we’ve been able to have some exciting conversations and build connections across campus as we move towards shared understanding.

Our framework

As Alexander et al. illustrate in the NMC Horizon Project Strategic Brief on Digital Literacy in Higher Education, Part II, definitions and frameworks for digital literacy vary, particularly in their emphasis on technical skills, critical thinking and creative abilities, and social or cultural competencies. Considering that these pieces can shift in different contexts, I think that it was important for us to begin with the what and why of digital literacy, before jumping into the how of digital literacy on our campus.

An initial task force within the University Libraries began the work of navigating existing definitions for digital literacy and identifying needs in our context. The task force was particularly influenced by Jisc’s Digital Capability Framework, which positions digital literacy as “capabilities which fit someone for living, learning and working in a digital society.” We discussed these capabilities as including engagement with a variety of digital tools, types of content, creation processes, and decision-making. The ACRL Framework for Information Literacy in Higher Education and its emphasis on students as “consumers and creators of information who can participate successfully in collaborative spaces” was also foundational to our thinking. It was important to us to think about digital literacy as flexible enough to include common or foundational skills related to critical consumption, creation, and collaboration, and to support learners in achieving their own goals for their digital lives.

Following the Libraries’ task force, we drafted a framework graphic and sought feedback across the Virginia Tech community. We reached faculty and graduate students through existing professional development opportunities as well as by hosting a day-long digital literacy symposium, which also served as an opportunity to build community among those who support digital literacy at Virginia Tech. During these feedback conversations, we asked participants about any elements they saw as missing from the framework draft as well as where they saw their work connecting to it. With all of this feedback in mind, we revised the framework to a final (for now) version.

[Infographic illustrating Virginia Tech's digital literacy framework]

This framework represents four aspects, or layers, of digital literacy at Virginia Tech.

  1. The learner at the center, who might engage with the other areas in the framework in any combination or order.
  2. Core competencies that each include technical, critical thinking, and social aspects.
  3. Key values that connect and contextualize the competencies. I see these as particularly tied to the why of digital literacy and our hopes for our learners as engaged digital citizens.
  4. Multiple literacies that frame the outside of our framework. I think of the literacies as our lens or lenses on digital literacy.

Navigating literacies

Our framework approaches digital literacy as a kind of umbrella or metaliteracy that includes information, data, media, and invention literacies. While a particular class session, workshop, or online learning module might focus on one of these in particular, they come together to inform the way we think about digital literacy as a whole.

While I find this to be a useful way to think about the relationship between these several overlapping literacies, I want to acknowledge that it is certainly not the only way. As Jennifer Jarson points out in her 2015 post, many of us might conceptualize information literacy as the broader category that includes digital literacy. I think it’s possible to take any number of literacies into the foreground as a lens for others and I find that my own thinking shifts depending on the context. As individuals we might gravitate towards one literacy or another, perhaps depending on disciplinary background, but ultimately I think that looking at them in conjunction can help us to think more expansively about our hopes for our learners.

Our framework in action

Looking forward, our framework will guide the continued development of digital literacy initiatives. Within VT Libraries, I see this framework as helping us with two major activities: more strategically coordinating and sequencing our existing library educational offerings around digital literacy (course-embedded instruction, co-curricular workshops and events, new spaces and technology for creation) and identifying areas for further development. More broadly, my hope is that our framework will also help us build shared language and a shared vision for digital literacy learning as we continue to develop partnerships in support of student learning.

Ghosts in the Library – A Collaborative Approach to Game-Based Pedagogy

ACRLog welcomes a guest post from Mandy Babirad, Instructional Services Librarian at SUNY Morrisville State College, Heather Shimon, Science and Engineering Librarian at the University of Wisconsin Madison, and Lydia Willoughby, Reference Librarian, Research and Education, at SUNY New Paltz.

Mandy Babirad (now at SUNY Morrisville), Heather Shimon (now at UW-Madison), and Lydia Willoughby (SUNY New Paltz) created an instructional game called Ghosts in the Library (Ghosts) to use in English Composition I library sessions (Comp I) at SUNY New Paltz in Fall 2015.

The game aligns with established Comp I learning outcomes and includes self-directed learning, problem solving, collaborative learning, and peer review. In the game, students work in groups and use the library catalog and databases to research a notable person with ties to New York State (a “ghost” who is haunting the library), and then create a digital artifact based on that research to appease the ghost. The “ghosts” are people of color and women who have made significant contributions to New York State, yet are underrepresented in the historical record. With the library’s namesake of Sojourner Truth, and student protests against a predominantly white curriculum occurring in Fall 2015, the game was also an attempt to include marginalized voices within the library collection and course syllabi.

The primary goal for Ghosts was to frame a 75-minute one-shot library instruction session with a pedagogy of possibility. Roger Simon[1], in work that drew from deep collaboration with Henry Giroux, thought about student-centered learning as a choice of hope, and teaching as an act of hope: “Hope is the acknowledgement of more openness in a situation than the situation easily reveals… the hopeful person acts” (3). Being open to possibilities is the only mindful and clear choice for teaching librarians facing technology distraction and student disinterest in a required library session. Bringing in curiosity as play engages inquiry as an affective process that asks student and teacher to act and reveal a more whole self in the classroom.

Ghosts Game Play

The game has one central goal: appease your team’s ghost so that the ghost will leave the library and our campus alone. Each team gets a ghost card, team members choose role cards, and the team then uses its tool cards to hunt down information that will help appease its ghost. The final, culminating component of the game is the team’s creation of a historical marker.

All game materials can be downloaded from the Ghosts research guide.

Players in the Ghosts game receive a packet that contains the following game materials:

  • Map of the Sojourner Truth Library (with a key matching call numbers to floors)
  • Worksheet for the game, to be completed during class time (the worksheet contains the rubric that teams use to evaluate their work and the success of their historical marker at appeasing their ghost)
  • Game rules (like all rules, these are probably more useful to the librarian and teachers than to students; the rules were a key element of our game design process, though they are likely the least-used part of the game during actual play)
  • A packet of cards (each packet contains 1 ghost card, 3 role cards, and 3 tool cards)
    • Ghost cards are randomly given to each group and are all women and people of color from New York State history that have a tie to the Hudson Valley region.
    • The 3 role cards include a historian, a presenter, and a facilitator. If the composition of the class you are teaching requires groups of more than 3 people, you can double up on historian role cards. All role cards contribute to information gathering and drafting the text of the historical marker.
      • The historian takes notes on the worksheet and enters the team’s text on the historical marker that the team is working together to create.
      • The presenter is the person that presents the team’s historical marker to the class.
      • The facilitator keeps the team on track and ensures that all tool cards have been used in information gathering and that the team’s work meets all the criteria of the rubric.
    • The 3 tool cards correspond to the library research tools students use on the library website to conduct research.
      • A tool card for databases that guides students to Academic Search Complete to find scholarly articles
      • A tool card for the library catalog that helps them discover books
      • A tool card for reference resources that helps students find background and biographical information on their ghost using Gale Virtual Reference Library

The final part of the worksheet is a space where teams can draft the text of their historical marker, a synthesis of their respective roles and tools in the research process. Once teams have completed the worksheet, they go to our custom historical marker website to enter their text. When they publish their historical marker and hit “Create,” their original text appears on a digital artifact that looks like a ‘real’ 1940s-era NY State Education Department historical marker. The artifact-creation component of this game is designed to encourage student learning with a pedagogy that helps students connect to something ‘real’ and physical in the research process. Students present their historical markers, and all game players receive a ghost button and an FAQ zine about the library. A summary discussion concludes the session, focused on what kinds of information the students gleaned from which kinds of library resources.

The game was tested with library staff, librarians, and student staff before being used in the classroom for the Science and Technology Entry Program and for one Comp I section in Spring 2016. Ghosts launched as a pilot in Fall 2016 and has continued as a pilot for Comp I sessions through Fall 2017. Ghosts has been taught in roughly 42% of Comp I sessions since its launch. Our assessment and feedback are based on the worksheets completed by students and a survey given to faculty and students.

Student, Teacher Feedback on Ghosts

We found student input on the worksheet question, “Why did you choose this?,” to be the most valuable for assessing student learning. Even though only 52% of students reported that they would definitely use the resources from Ghosts again (and 42% reported ‘kind of’), their worksheets suggested otherwise. The student worksheets demonstrated skill in describing the research process in detail, showing an ability to evaluate information sources and needs. Even so, only 35% of students reported that the game had definite value for their course assignments, and 48% said the game was ‘kind of’ valuable to their course work. There is a disconnect between students’ ability to reflect on their own research and their view of the usefulness of those skills. In other words, as with all library instruction, the value of learning systemic thinking struggles to be visible and relevant to course assignments when it is delivered in required sessions. The students were describing their research process, but not equating that task with the value of learning how to research. Interestingly, 71% of students definitely felt included while playing the game, and 31% ‘kind of’ felt included.

In the future, more evaluative questions will be posed in both the worksheet completed during the game and the post-assessment. The final product, the historical marker, won’t have a word count. Editing the marker down to 50 words took up a lot of time and stressed some students out, which in turn may have influenced their evaluation of the game. The game could also be tightened up and the worksheets transferred to online forms, making the game more of an online tutorial; this would facilitate flipped learning and free up class time for a more discussion-based session informed by the work done outside of class. One idea for a follow-up assignment is to have students write a letter or postcard to their ghost describing how they researched its history and what kinds of things they found, encouraging them to again practice describing their process. It is hard to get students to reflect on process; it is not practiced, and it is rarely evaluated or asked for in their graded assignments.

The code for the historical marker would not have been possible without the work of software developer Andrew Vehlies. He created the marker from scratch in consultation with the librarians and posted the code on Github (available here) so that other history enthusiasts can benefit from his work. Once the code was developed and posted publicly, library technician (and human grumpy cat) Gary Oliver was able to post to local servers so that it could be used by students. Many thanks to instruction coordinator Anne Deutsch, at SUNY New Paltz, for letting us pilot Ghosts in the first place, and for supporting the game development in the library instruction program.

[1] Simon, R.I. (1992). Teaching against the grain: Texts for pedagogy of possibility. Greenwood Publishing Group.

Narrative as Evidence

This past week I attended the MLGSCA & NCNMLG Joint Meeting in Scottsdale, AZ. What do all these letters mean, you ask? They stand for the Medical Library Group of Southern California and Arizona and Northern California and Nevada Medical Library Group. So basically it was a western regional meeting of medical librarians. I attended sessions covering topics including survey design, information literacy assessment, National Library of Medicine updates, using Python to navigate e-mail reference, systematic reviews, and so many engaging posters! Of course, it was also an excellent opportunity to network with others and learn what different institutions are doing.

The survey design course was especially informative. As we know, surveys are a critical tool for librarians. I learned how certain question types (ranking, for example) can be misleading, how to avoid asking double-barreled questions, and how to avoid leading questions (e.g., “Do you really, really love the library?!?”). These design practices reduce bias and help a survey capture the most accurate results possible. The instructor, Deborah Charbonneau, reiterated that you can only do the best you can with surveys. And while this seems obvious, I feel that librarians can be a little perfectionistic. But let’s be real: it’s hard to know exactly what everyone thinks and wants through a survey. So yes, you can only do the best you can.

The posters and presentations about systematic reviews covered evidence-based medicine. As I discussed in my previous post, the evidence-based pyramid prioritizes research that reduces bias. Sackett, Rosenberg, Gray, Haynes, and Richardson (1996) helped conceptualize the three-legged stool of evidence-based practice: evidence-based clinical decisions should integrate (1) the best research evidence, (2) clinical expertise, and (3) patient values and preferences. As medical librarians we generally focus on delivering strategies for finding the best research evidence. Simple enough, right? Overall, the conference was informative, social, and not overwhelming – three things I enjoy.

On my flight home, my center shifted from medical librarianship to Joan Didion’s Slouching Towards Bethlehem. The only essay I had previously read in this collection was “On Keeping a Notebook,” which I had been assigned for a memoir writing class I took a few years ago. (I promise this is going somewhere.) In this essay, Didion discusses how she has kept a form of a notebook, not a diary, since she was a child. Within these notebooks were random notes about people or things she saw or heard, sometimes with a time or location. These tidbits couldn’t possibly mean anything to anyone else except her. And that was the point. The pieces of information she jotted down over the years reminded her of who she was at that time. How she felt.

I took this memoir class in 2015 at Story Studio Chicago, a lofty spot in the Ravenswood neighborhood of Chicago. It was trendy and up and coming. At the time, I had just gotten divorced, my dad had died two years prior, and I discovered my passion for writing at the age of 33. So, I was certainly feeling quite up and coming (and hopefully I was also trendy). Her essay was powerful and resonated with me (as it has for so many others). After I started library school, I slowed down with my personal writing and focused on working and getting my degree, allowing me to land a fantastic job at UCLA! Now that I’m mostly settled in to all the newness, I have renewed my commitment to writing and reading memoir/creative non-fiction. I feel up and coming once again after all these new changes in my life.

As my plane ascended, I opened the book and saw that I had left off right at this essay. I found myself quietly verbalizing “Wow” and “Yeah” multiple times during my flight. I was grateful that the hum of the plane drowned out my voice, but I also didn’t care if anyone heard me. Because if they did, I would tell them why. I would say that the memories we have are really defined by who we were at that time. I would add that memory recall is actually not that reliable. Ultimately, our personal narrative is based upon the scatterplot of our lives: our actual past, present, future; our imagined past, present, future; our fantasized past, present, and future. As Didion (2000) states:

I think we are well advised to keep on nodding terms with the people we used to be, whether we find them attractive company or not. Otherwise they turn up unannounced and surprise us, come hammering on the mind’s door at 4 a.m. of a bad night and demand to know who deserted them, who betrayed them, who is going to make amends. We forget all too soon the things we thought we could never forget. We forget the loves and the betrayals alike, forget what we whispered and what we screamed, forget who we were. (p. 124)

What does this have to do with evidence-based medicine? Well, leaving a medical library conference and floating into this essay felt like polar opposites. But were they? While re-reading this essay, I found myself considering how reducing bias (or increasing perspectives) in research evidence and personal narrative can be connected. They may not seem so, but they are really part of a larger scholarly conversation. While medical librarians focus upon the research aspect of this three-legged stool, we cannot forget that clinical expertise (based upon personal experience) and patient perspective (also based upon personal experience) provide the remaining foundation for this stool.

I also wonder about how our experiences are reflected. Are we remembering who we were when we decided to become librarians? What were our goals? Hopes? Dreams? Look back at that essay you wrote when you applied to school. Look back at a picture of yourself from that time. Who were you? What did you want? Who was annoying you? What were you really yearning to purchase at the time? Did Netflix or Amazon Prime even exist?? Keeping on “nodding terms” with these people allows us to not let these former selves “turn up unannounced”. It allows us to ground ourselves and remember where we came from and how we came to be. And it is a good reminder that our narratives are our personal evidence, and they affect how we perceive and deliver “unbiased” information. I believe that the library is never neutral. So I am always wary to claim a lack of bias with research, no matter what. I prefer to be transparent about the strengths of evidence-based research and its pitfalls.

A couple of creative ways I have seen this reflected in medicine are narrative medicine, JAMA’s Poetry and Medicine section, and expert opinion pieces (the bottom of the evidence-based pyramid) in journals. Yes, these are biased. But I think it’s critical that we not forget that medicine ultimately heals the human body, which carries the human experience. Greenhalgh and Hurwitz (1999) propose:

At its most arid, modern medicine lacks a metric for existential qualities such as the inner hurt, despair, hope, grief, and moral pain that frequently accompany, and often indeed constitute, the illnesses from which people suffer. The relentless substitution during the course of medical training of skills deemed “scientific”—those that are eminently measurable but unavoidably reductionist—for those that are fundamentally linguistic, empathic, and interpretive should be seen as anything but a successful feature of the modern curriculum. (p. 50)

Medical librarians are not doctors. But librarians are purveyors of stories, so I do think we reside in more legs of this evidence-based stool. I would encourage all types of librarians to seek these outside perspectives to ground themselves in the everyday stories of healthcare professionals, patients, and of ourselves.



  1. Didion, J. (2000). Slouching towards Bethlehem. New York: Modern Library.
  2. Greenhalgh, T., & Hurwitz, B. (1999). Why study narrative? BMJ: British Medical Journal, 318(7175), 48–50.
  3. Sackett D.L., Rosenberg W.M., Gray J.A., Haynes R.B., & Richardson W.S. (1996). Evidence based medicine: What it is and what it isn’t. BMJ: British Medical Journal, 312(7023), 71–2. doi: 10.1136/bmj.312.7023.71.


Small Steps, Big Picture

As I thought about composing a blog post this week, I felt that familiar frustration of searching not only for a good idea, but a big one. I feel like I’m often striving (read: struggling!) to make space for big picture thinking. I’m either consumed by small to-do list items that, while important, feel piecemeal or puzzling over how to make a big idea more precise and actionable. So it feels worthwhile now, as I reflect back on the semester, to consider how small things can have a sizable impact.

I’m recalling, for example, a few small changes I’ve made to some information evaluation activities this semester in order to deepen students’ critical thinking skills. For context, here’s an example of the kind of activity I had been using. I would ask students to work together to compare two sources that I gave them and talk about what made the sources reliable or not and whether one source was more reliable than the other. As a class, we would then turn the characteristics they articulated into criteria that we thought generally make for reliable sources. The activity seemed to help students identify and articulate what made those particular sources reliable or not, and it permitted us to abstract to evaluation criteria that could be applied to other sources.

While effective in some ways, I began to see how this activity contributed to, rather than countered, the problem of oversimplified information evaluation. Generally, I have found that students can identify key criteria for source evaluation such as an author’s credentials, an author’s use of evidence to support claims, the publication’s reputation, and the presence of bias. Despite their facility with naming these characteristics, though, I’ve observed that students’ evaluation of them is sometimes simplistic. In this activity, it felt like students could easily say evidence, author, bias, etc., but those seemed like knee-jerk reactions. Instead of creating opportunities to balance a source’s strengths/weaknesses on a spectrum, this activity seemed to reinforce the checklist approach to information evaluation and students’ assumptions of sources as good versus bad.  

At the same time, I’ve noticed that increased attention to “fake news” in the media has heightened students’ awareness of the need to evaluate information. Yet many students seem more prone to dismiss a source altogether as biased or unreliable without careful evaluation. The “fake news” conversation seems to have bolstered some students’ simplistic evaluations rather than deepen them.

In an effort to introduce more nuance into students’ evaluation practices and attitudes, then, I experimented with a few small shifts and have so far landed with revisions like the following.

Small shift #1 – Students balance the characteristics of a single source.
I ask students to work with a partner to evaluate a single source. Specifically, I ask them to brainstorm two characteristics about a given source that make it reliable and/or not reliable. I set this up on the board in two columns. Students can write in either/both columns: two reliable, two not reliable, or one of each. Using the columns side-by-side helps to visually illustrate evaluation as a balance of characteristics; a source isn’t necessarily all good or all bad, but has strengths and weaknesses.

Small shift #2 – Students examine how other students balance the strengths and weaknesses of the source.
Sometimes different students will write similar characteristics in both columns (e.g., comments about the evidence used in the source show up on both sides), helping students recognize that others might evaluate the same characteristic as reliable when they see it as unreliable, or vice versa. This helps illustrate the ways different readers might approach and interpret a source.

Small shift #3 – Rather than develop a list of evaluation criteria, we turn the characteristics they notice into questions to ask about sources.
In our class discussion, we talk about the characteristics of the source that they identify, but we don’t turn them into criteria. Instead we talk about them in terms of questions they might ask of any source. For example, they might cite “data” as a characteristic that suggests a source is reliable. With a little coaxing, they might expand, “well, I think the author in this source used a variety of types of evidence – statistics, interviews, research study, etc.” So we would turn that into questions to ask of any source (e.g., what type(s) of evidence are used? what is the quantity and quality of the evidence used?) rather than a criterion to check off.

Despite their smallness, these shifts have helped make space for conversation about pretty big ideas in information evaluation: interpretation, nuance, and balance. What small steps do you take to connect to the big picture? I’d love to hear your thoughts in the comments.