Category Archives: Teaching

A Conceptual Model for Interdisciplinary Collaboration

ACRLog welcomes a guest post from Laura MacLeod Mulligan, M.L.S., Information Services Librarian, and Dr. Adam J. Kuban, Assistant Professor of Journalism, both at Ball State University.

Academic buzzwords such as “interdisciplinary” and “collaboration” get paid ample lip service in university administration strategic plans and current scholarship, but practically speaking it can be difficult to begin or sustain such a partnership. With strong faculty support, public services librarians can become embedded in courses, revise assignments, review student output, and assess student learning—playing a more meaningful role in the physical and virtual classroom. We wish to reveal our methods of interdisciplinary collaboration—specifically what has given it longevity and made it successful. From evidence grounded in aggregate literature and personal anecdotes, we have developed a conceptual model for effective collaboration that could apply to any interdisciplinary partnership.

Our conceptual model

Our own collaborative efforts began in January 2012 in order to revise the curriculum of an introductory journalism research course for undergraduates in the Department of Journalism at Ball State University. This ultimately led to the creation of an innovative, technology-based capstone exercise that exemplified the nexus of screencasts with library database instruction. We have also embarked on a research study that assesses the same students’ comprehension of information literacy concepts à la ACRL’s new Framework for Information Literacy. One of our current projects is a practical consideration of interdisciplinary collaboration (in particular between library professionals and faculty in the disciplines).

Scholars who collaborate rarely read the literature on collaboration before beginning their endeavors. Even if you wanted to brush up on best practices for successful collaboration, you would have to wade through case studies and data surrounding discipline-specific scenarios. We began this project with a conceptual model based on personal anecdotes (i.e., a “model-first” approach) simply because it is natural to begin with “what has worked for us.” Please see our full paper from the 2014 Brick & Click conference for a complete literature review, in which we discuss trends and themes in the literature and make recommendations for further reading. As we read others’ stories and studies and noticed patterns in what led to successful collaboration, we looked for areas of support as well as additional attributes that ought to be added to elaborate the initial model presented.

We identified and organized a non-discipline-specific conceptual model outlining the (1) workplace conditions; (2) qualities/attitudes; and (3) common goals that have enhanced our collaborative, interdisciplinary experience and could thus serve as a model for any faculty-librarian partnership. To help unpack the importance of these three facets, we sketched a visual depiction of it (see figure 1) and also shared personal anecdotes from our experiences (see table 1).

Figure 1: Our conceptual model for successful interdisciplinary collaboration

Two of these elements can be controlled: (a) favorable attitudes and personal qualities conducive to interdisciplinary engagement and (b) common goals determined by the involved parties. The third element—(c) workplace conditions—is largely out of the collaborators’ control but still impacts the partnership. When all three facets come together, we believe successful collaboration can occur. In the event that one facet is absent or lacking, we believe that collaboration can still function but may be difficult to sustain.

Table 1. Qualifiers for a three-faceted conceptual model for successful collaboration

Workplace Conditions
  • Regular communication
  • Standing meetings
  • Physical space
  • Administrative support

Qualities/Attitudes
  • Cooperative—able to compromise
  • Equitable—respect for roles
  • Trust—perceived competence
  • Shared vulnerability—safe setting to explore, inquire & critique
  • Enthusiasm—desire to continue collaboration

Common Goals
  • Identify individual strengths
  • Select conference & publication venues that “count” for both, or alternate
  • Establish research “pipeline” & philosophy
  • Articulate/update timelines

Workplace conditions

Essential to our collaboration has been regular communication. Keeping a standing meeting throughout the year has given us at least an hour per week to touch base, bounce ideas off one another, strategize, delegate, and debrief ongoing tasks. Booking a conference room in the university library gave us a neutral space in which to talk, think, and work without distraction. Having a coffee machine, audio/visual equipment (including a projection screen and speakers), and a large table made us feel comfortable and well equipped for any task—whether it be critiquing student screencasts, sketching out a four-foot-by-eight-foot poster, drafting correspondence to journal editors, or working side-by-side on separate computers.

Arguably most important in this facet is visible administrative support. We are fortunate to have current supervisors who embrace our collaborative endeavors and value them in subsequent reviews and evaluations. Without that support, the interdisciplinary collaboration would likely end, as one or both of us would deem it too high-risk to continue.

Qualities/attitudes

We have found that if there are common emotional qualities, a collaborative relationship can remain collegial and productive. In our experience, the following stood out as ideal qualities: a cooperative and compromising attitude; respect for and equitable treatment of individual collaborator roles; trust in one another’s competence; ability to be vulnerable, open, honest, and willing to learn; and an enthusiasm for the projects pursued.

Collaboration among faculty and librarians sometimes results in the librarian acting in a supporting role to help execute the vision of a faculty member. In our collaboration, the roles are refreshingly equitable, leaving each person feeling like a co-leader. For example, Adam would not finalize student grades in his introductory research course without receiving feedback from Laura regarding their capstone projects (i.e., screencast database tutorials) in case there were incorrect aspects related to the library resources that she, as an information professional, could identify. This arrangement sustains the momentum and collegiality longer than a leader-follower partnership.

Common goals

While research styles and philosophies differ from discipline to discipline, we discovered that we share similar interests in information literacy, critical thinking skills, student engagement, and assessment driven by qualitative data. Projects stemming from these research interests have been undertaken more easily because of mutual pedagogical interests and shared research methods. We have been able to identify professional development activities that “count” for both of us, and we alternate the focus of activities to make for an even distribution. For example, after presenting at a journalism educators’ conference in summer 2012, we took a derivative of the material to a state library conference in fall 2012 to share our work with that audience. We’ve come to call this our “research pipeline,” and it keeps our activities equitable and interdisciplinary.

What’s missing from the model?

Once we had consulted the literature, one noteworthy qualifier emerged that deserves mention as part of an ongoing effort to build an evolving model of effective interdisciplinary partnership.

It seems paradoxical that the literature acknowledges the benefit of interdisciplinary scholarship, advocating that “it likely yields more innovative and consequential results for complex problems than traditional, individual research efforts” (Amey & Brown 30), yet institutionalized traditions within academia continue to stymie interdisciplinary efforts. Amey and Brown explain that graduate students who identify with a specific discipline spend years being socialized into that culture and taught to maintain a research identity lodged within the confines of their discipline. In a qualitative study by Teodorescu and Kushner, untenured junior faculty understood the theoretical benefits of interdisciplinary collaboration but felt compelled to abstain from it until after tenure, viewing it as a high-risk activity. KerryAnn O’Meara, an associate professor of higher education at the University of Maryland, issues a call to action in an essay for Inside HigherEd: “Let’s not assume all candidates must make their case for tenure and promotion based on one static, monolithic view of scholarship.”

Similarly, LIS programs may not adequately prepare their students for interdisciplinary endeavors. Kim Leeder notes that “librarians are not initiated into [their] fields in the same way that faculty are: by reading scholarship, identifying [their] own specific area(s) of specialization, presenting at conferences, and building a network of colleagues whose interests overlap.”

This phenomenon could fit under the Workplace Conditions facet (as it results from administrative attitudes beyond our control) or under the Qualities/Attitudes facet (where it impedes the expression of vulnerability needed to explore problems and work together toward solutions).

Conclusion

Postsecondary educators want students ready for an integrated marketplace. Programs of study require students to complete coursework outside of their chosen major(s). Experiential, immersive, and/or service learning are topics of discussion at conferences about college teaching. It seems that, as educators, we recognize the globalization of society and the overlapping nature of most occupations, and we want our students to have diverse, interdisciplinary experiences—thus it seems prudent to adopt a similar mindset for our own scholarly endeavors. We should set an example for our students, valuing efforts to “reach across the aisle” and emphasizing interdisciplinary opportunities.

We believe our conceptual model could assist others as they begin to embark on interdisciplinary initiatives. In time, facets and qualifiers will evolve, transforming the notion of what equates to successful interdisciplinary collaboration.

Facilitating student learning and engagement with formative assessment

Information literacy instruction is a big part of my job. For a little context, I teach somewhere in the range of 35–45 classes per semester at my small liberal arts college. While a few of the sessions might sometimes be repeats for a course with multiple sections, they’re mostly unique classes running 75 minutes each. I’ve been teaching for some time now, and while I’m a better teacher than I was ten or five years ago, or even last year, there’s always plenty of room for improvement, of course. A few months ago, I wrote a post about reflection on and in my teaching, about integrating “more direct discussion of process and purpose into my classes […] to lay bare for students the practice, reflection, and progression that complicates [information literacy] work, but also connects the gaps, that brings them closer to crossing the threshold.” Each year, I’ve been devoting more attention to trying to do just that: integrate process and purpose into my classes to improve student learning and engagement.

It didn’t start out as anything momentous, just a little bit all the time. Initially, it was only a small activity here or there to break things up, to give students a chance to apply and test the concept or resource under discussion, and to scaffold to the next concept or resource. I would demo a search strategy or introduce a new database and then ask students to try it out for their own research topics. I would circulate through the class and consult individually as needed. After a few minutes of individual exploration, we would come back together to address questions or comments and then move on to the next resource, strategy, or concept. This appeared to be working well enough. Students seemed to be on board and making progress. By breaking a class into more discrete chunks and moderating the pace a bit, students had more of a chance to process and develop along the way. Spacing out the hands-on work kept students engaged all class long, too.

For some time, I’ve started classes by reviewing the assignment at hand to define and interpret related information needs, sometimes highlighting possible areas of confusion students might encounter. Students expressed appreciation for this kind of outlining and the shape and structure it gave them. I felt a shift, though, when I started asking students, rather than telling them, about their questions and goals at the outset of a class. Less “Here are the kinds of information sources we’ll need to talk about today” and more “What kinds of information do you think you need to know how to access for this assignment? What do you hope that information will do for you? What have been sticky spots in your past research experiences that you want to clarify?” I wanted students to acknowledge their stake in our class goals, and this conversation modeled setting a scope for learning and information needs. We then used our collective brainstorm as a guiding plan for our class. More often than not, students offered the same needs, questions, and problems that I had anticipated and used to plan the session, but it felt more dynamic and collaboratively constructed this way. (Of course, I filled in the most glaring gaps when needed.)

So why not, I finally realized one day, extend the reach of this approach into the entire class? While scaffolding instruction with small activities had helped students process, develop, and engage, I was still leading the charge at the pace I set. But what if we turned things around? What if, essentially, they experimented on their own in order to determine something that worked for them (and why!) and shared their thoughts with the class? What if we constructed the class together? Rather than telling them what to do at the outset of each concept chunk, I could first ask them to investigate. Instead of demonstrating, for example, recommended search strategies and directing students to apply them to their own research, I could ask students to experiment first with multiple search strategies in a recommended database for a common topic in order to share with the class the strategies they found valuable. The same goes for navigating, filtering, and refining search results, or for evaluating sources and selecting the most relevant, or for any concept or resource for that matter. Why not, I thought, ask students to take a first pass and experiment? We could then share ideas as a class, demonstrating and discussing the strengths and weaknesses of their tactics along the way, collaboratively building a list of best-practice strategies. Students could then revisit their work, applying those best practices where needed.

This kind of experiment-first-then-build-together-then-revise approach is simple enough, but its advantages feel rather significant to me. It makes every class exciting, because it’s—in part, at least—unique and responsive to precisely those students’ needs. Of course I have a structure and goals in mind, prepared notes in hand, but it’s a flexible approach. While it’s not appropriate for every class, the low stakes/low prep makeup is readily applicable to different scenarios and content areas. The students and I are actively involved in constructing the work of the class together. Everyone has a chance to contribute and learn from each other. In particular, more experienced students get to share their knowledge while less experienced students learn from their peers. The expectation to contribute helps students pay attention to the work and to each other. Its scaffolded and iterative design helps students digest and apply information. Its reflective nature reveals for students practice and process, too; it models the metacognitive mindset behind how to learn, how to do research. I don’t mean to get too ebullient here. It’s not a panacea. But it has made a difference. It’s probably no surprise that this kind of teaching has required a degree of comfort, a different kind of classroom leadership, and a different kind of instinct that would have been much, much harder to conjure in my earlier teaching.

While I wasn’t aware of it initially and didn’t set out to make it so, I now recognize this as formative assessment. Not only do these small activities increase opportunities for engagement and learning, they serve as authentic assessment of students’ knowledge and abilities in the moment. They provide evidence of student learning and opportunities for action immediately. With that immediate input, I can adjust the nature and depth of instruction appropriately at the point of need. All in a way that’s authentic to and integrated in the work of the class.

The informality of this approach is part of what makes it flexible, low prep, and engaging. It’s such a rich site for documentation and evaluation of student learning, though. I want to capture the richness of this knowledge, demonstrate the impact of instruction, document students’ learning. But I’m struggling with this. I haven’t yet figured out how to do this effectively and systematically. Some formative assessments result in student work artifacts that can illustrate learning or continuing areas of difficulty, but the shape my implementation has so far taken results in less tangible products. At the ACRL 2015 conference a few weeks ago, I attended a great session led by Mary Snyder Broussard, Carrie Donovan, Michelle Dunaway, and Teague Orblych: “Learning Diagnostics: Using Formative Assessment to Sustainably Improve Teaching & Learning.” When I posed this question in the session, Mary suggested using a “teacher journal” to record my qualitative reflections and takeaways after each class and to notice trends over time. I’m interested in experimenting with this idea, but I’m still searching for something that might better capture student learning, rather than only my perception of it. I’m curious to read Mary’s book Snapshots of Reality: A Practical Guide to Formative Assessment in Library Instruction, as well as Michelle and Teague’s article “Formative Assessment: Transforming Information Literacy Instruction” to see if I might be able to grab onto or adapt any other documentation practices.

Do you use formative assessment in your teaching? How do you document this kind of informal evidence of student learning? I’d love to hear your thoughts in the comments.

Intentional teaching, intentional learning: Toward threshold concepts through reflective practice

ACRLog welcomes a guest post from Jennifer Jarson, Information Literacy and Assessment Librarian at Muhlenberg College.

This fall marked the start of my tenth academic year as a librarian. It startles me, to say the least, to count up the years and arrive at (almost) ten. Having spent the majority of my career so far at a small college, I’ve been fortunate to be involved in a wide variety of projects. As a public services librarian, though, my attention has most frequently been directed to reference, instruction, and all things information literacy. It’s no surprise that, six-ish weeks into the semester, information literacy instruction is on my agenda and my mind.

Just the other week, a faculty member and I were chatting about our past versus present selves in the classroom. A critical eye back over the years dredges up some pretty squirm-worthy memories. Because they were performed in front of an audience of students and faculty, these mis-steps are especially embarrassing to bring to mind. I cringe to recall, for example, some excruciating moments in early years in which I droned on about the minutiae of search strategies, students’ eyes glazing over, drool practically trickling down their chins. I’m grateful, then, to look back and also recognize successes and, more importantly, evolution in my teaching. Perhaps it’s just those most awkward and agonizing of moments that best surface the need for change and fuel experimentation with alternate approaches.

For many, a protocol of reflection and experimentation, of trial and error, seems a natural drive. Yet demands on our time and attention might cause us to repeat an ineffective session because we don’t have the time to examine its inadequacies and restructure. Our many competing obligations might prevent us from effecting the more wholesale change we sometimes desire. In an effort to promote the “intentionality” of my reflection and experimentation, as Booth (2011) might say, and to pay it more of the attention it deserves, I’ve been compelling myself to make space for it, adding it to my to-do lists, to my annual goals. In years past, themes of my reflection-for-self-improvement-in-the-classroom regimen have included, for example, scaffolding skills to slow the pace adequately for students’ development and enhancing student engagement through more constructive (and constructivist) in-class activities. This intentional reflection is giving me the perspective and head space to uncover my assumptions and shortcomings and to motivate improvement, rather than revisit the same practice again and again for no good reason.

I don’t mean to claim that I’m reinventing any library instruction wheels. Far from it. But I do hope I’m oiling its sometimes rusty squeak for a smoother, more productive and engaging ride that takes us all (student, faculty, and librarian alike) a little further down the pike. As are you, I don’t doubt. How I will feel in another ten years when I look back on yesterday’s class or today’s blog post, even, is up for grabs. But on we march. And thank goodness for this drive forward, for the chance to reflect, learn from these shortcomings, and try again. Moment to moment, class to class, semester to semester. Small or large, these steps trend toward progress.

As I reflect on my practice in this particular year, then, I think that what I’m trying to teach—and where I’m still coming up short—is the practice of reflection. Too often, I know I have focused on the how at the expense of the why. I long ago moved beyond the point-and-click method of library instruction. Yet despite my efforts thus far to model, scaffold, and construct our way toward information literacy, a connecting piece still seems to be sometimes missing. When I look back now to find my in-class nods to the what for, I better recognize their nuance and how hard it must have been for inexperienced students to catch them, decode them. While my modeling and scaffolding certainly have had the why at their core, many students haven’t had the frame of reference to recognize its presence. I want to uncover for students the habits of mind—the “knowledge practices” and “dispositions,” so to speak—of information literacy, not just the clickpaths to mimic it.

So this year I’m looking to add reflective, metacognitive moments to help expose rationales, purposes, and processes for students. With the metacognitive mindset made visible, I hope students will develop a more flexible information literacy lens to apply to their future paths. I think this is a strength of the new (draft) Framework for Information Literacy for Higher Education: highlighting the reflective practice and metacognitive mettle at the core of information literacy. Metacognitive awareness is, no surprise to us, inherent in information literacy skills and development; the new draft framework helps us to enhance its prominence.

Now to the business of actually doing this. How, you might ask? Good question. I wish I had more answers. So far I’m trying to integrate more direct discussion of process and purpose into my classes. I’m trying to lay bare for students the practice, reflection, and progression that complicates this work, but also connects the gaps, that brings them closer to crossing the threshold. And I’m trying to work with faculty to extend this work beyond the limitations of single, isolated library sessions. I see some successes so far, but it feels more than a little premature to claim I’ve conquered such a problem. By their nature, these concepts and this work are complex and protracted. For now, I am (mostly) satisfied to be working on it.

I feel I can’t so much as stick a toe into these waters without at least a nod to their expansiveness. I imagine you recognize, too, the shared roles of librarians and faculty in this kind of information literacy instruction. These are not topics and goals isolated to a one-shot instruction session. This is the work of not one class, but many. This is the work not only of librarians, but of faculty, too. We work to establish the library as a leader in information literacy on our campuses, but it’s also our aim to recognize the extensive information literacy work that takes place outside the library-instruction-specific classroom. Our ambitions to promote shared faculty and librarian understanding of information literacy, common investment in students’ learning, and opportunities for collaboration and curricular development are ever more relevant.

As I recognize the role of intentional reflection in my own development, then, I’m struck to see its place of primacy in my teaching goals, as well. I might typically brush this aside as a self-apparent truth requiring no further deliberation. In my reflection-oriented state, though, I’m more inclined to pause for a moment and consider the parallels of these themes in information literacy teaching and in information literacy learning. With my ongoing push (some days it’s a bit more like a shove) into an intentionally reflective practice, I’m aiming to improve student learning as a more effective, responsive, and flexible instructor. I’m simultaneously aiming for a congruent push toward a reflective and metacognitive student mentality to tip their scales toward greater engagement and transformation. As Townsend, Brunetti, and Hofer (2011) wrote, it’s these “big ideas that make information science exciting and worth learning about.”

What about you? What are you uncovering and developing in your pedagogy? What roles have reflection, metacognition, and threshold concepts played in your instructional evolution? I’d love to hear your thoughts in the comments.

If At First You Don’t Assess, Try, Try Again

ACRLog welcomes a guest post from Katelyn Tucker & Alyssa Archer, Instruction Librarians at Radford University.

Instruction librarians are always looking for new & flashy ways to engage our students in the classroom. New teaching methods are exciting, but how do we know if they’re working? Here at Radford University, we’ve been flipping and using games for one-shot instruction sessions for a while, and our Assessment Librarian wasn’t going to accept anecdotal evidence of success any longer. We decided that the best way to see if our flipped and gamified lessons were accomplishing our goals was to evaluate the students’ completed assignments. We tried to think of every possible issue in designing the study. In hindsight, however, our results had issues that could have been prevented. We want you to learn from our mistakes so you are not doomed to repeat them.

Our process

Identifying classes to include in this assessment of flipped versus gamified lessons was a no-brainer for us. A cohort of four sections of the same course that use identical assignment descriptions, assignment sheets, and grading rubrics meant that we had an optimal sample population. All students in the four sections created annotated bibliographies based on these same syllabi and assignment instructions. We randomly assigned two classes to receive flipped information literacy instruction and two to play a library game. After final grades had been submitted for the semester, the teaching faculty members of each section stripped identifying information from their students’ annotated bibliographies and sent them to us. We assigned each bibliography a number and then assigned two librarian coders to each paper. We felt confident that we had a failsafe study design.

Using a basic rubric (see image below), librarians coded each bibliography for three outcomes using a binary scale. Since our curriculum lists APA documentation style, scholarly source evaluation, and search strategy as outcomes for the program, we coded for competency in these three areas. This process took about two months to complete, as coding student work is time-consuming.

Figure: Assessment rubric chart

The challenges

After two librarians independently coded each bibliography, our assessment librarian ran inter-rater reliability statistics, and… we failed. We had previously used rubrics to code annotated bibliographies for another assessment project, so we didn’t spend any time reviewing the process with our experienced coders. As we only hit around 30% agreement between coders, it is obvious that we should have done a better job with training.
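For readers curious about the statistics here: on a binary rubric, raw percent agreement can overstate reliability, because two coders marking most items "competent" will agree often by chance alone. A chance-corrected measure such as Cohen's kappa accounts for this. The sketch below is purely illustrative — the ratings are invented, not drawn from the Radford study:

```python
# Percent agreement vs. Cohen's kappa for two coders scoring the same
# bibliographies on a binary (0 = not competent, 1 = competent) rubric.
# Ratings are hypothetical, for illustration only.

def percent_agreement(a, b):
    """Fraction of items on which the two coders gave the same score."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Chance-corrected agreement between two coders on binary codes."""
    n = len(a)
    observed = percent_agreement(a, b)
    # Expected chance agreement, from each coder's marginal rate of 1s
    p1, p2 = sum(a) / n, sum(b) / n
    expected = p1 * p2 + (1 - p1) * (1 - p2)
    return (observed - expected) / (1 - expected)

coder1 = [1, 1, 0, 1, 0, 1, 1, 0]
coder2 = [1, 0, 0, 1, 1, 1, 0, 0]

print(percent_agreement(coder1, coder2))        # 0.625
print(round(cohens_kappa(coder1, coder2), 2))   # 0.25
```

Even 62.5% raw agreement here corresponds to a kappa of only 0.25, which is why norming conversations matter more than the headline agreement figure suggests.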

Because we had such low agreement between coders, we weren’t confident in our success with each outcome. When we compared the flipped sections to the gamified ones, we didn’t find any significant differences in any of our outcomes. Students who played the game did just as well as those who were part of the flipped sections. However, our low inter-rater reliability threw a wrench in those results.

What we’ve learned

We came to understand the importance of norming: discussing among coders what the rubric means and incorporating meaningful conversations about how to interpret assessment data into the norming process. Our inter-rater reliability issues could have been avoided with detailed training and discussion. Even though our earlier coding projects made us think we were safe on this one, the length of time between assessments created some large inconsistencies.

We haven’t given up on norming: including multiple coders may be time-intensive, but when done well, gives our team confidence in the results. The same applies to qualitative methodologies. As a side part of this project, one librarian looked at research narratives written by some participants, and decided to bravely go it alone on coding the students’ text using Dedoose. While it was an interesting experiment, the key point learned was to bring in more coders! While qualitative software can help identify patterns, it’s nothing compared to a partner looking at the same data and discussing as a team.

We also still believe in assessing output. As librarians, we don’t get too many opportunities to see how students use their information literacy skills in their written work. By assessing student output, we can actually track competency in our learning outcomes. We believe that students’ papers provide the best evidence of success or failure in the library classroom, and we feel lucky that our teaching faculty partners have given us access to graded work for our assessment projects.

Musings on Outreach as Instruction

Last week, librarians from many branches of our university gathered for a Teaching Librarians Retreat. The retreat was organized and hosted by a few wonderful colleagues, whom I cannot thank enough for their efforts and a fantastic event. The goal of the retreat was to promote a community of sharing, peer support, and ongoing learning among UI librarians who teach; it was also a chance to reflect on the year and find colleagues with similar interests and concerns about teaching. Making dedicated time for sharing and reflection is especially important in an institution as large and with as many librarians as ours.

We broke out into discussion groups for part of the retreat, and my group gathered to talk about “outreach as instruction.” What struck me first as we each shared our thoughts is that “outreach” can mean so many different things. We had people contributing to the conversation from perspectives of social media, events and programming, marketing, digital badges, special collections, working with student organizations, and outreach to faculty vs. students vs. the community.

My take on “outreach as instruction” and why it matters has to do with the limitations of one-shot sessions and ways we can expand the impact of instruction beyond traditional methods. One-shot sessions are valuable as point-of-need instruction for academic coursework, but relying solely on them is limiting: only a fraction of students receive library instruction, and a number of them may not be particularly interested in the General Education required course that brought them into the library. This is where I think outreach can be powerful – in the many possibilities to connect with students outside of a classroom setting, while still teaching something. Here are a few ideas on how to go about doing that:

  1. Connect over something interest-based, rather than academics-based. For example, I’ve heard of academic libraries having knitting sessions (which is also closely tied with stress-relief activities during finals week), but it could be something else. The draw to participate is something of general interest that can also be connected to research and resources available at the library.
  2. Communicate with student organizations, and let the student leaders know how the library can support their group and members. This can lead to tailored teaching opportunities for students who are involved and invested in a group that may not get this attention and instruction otherwise.
  3. Use the collection creatively. We’ve found ways to do this by using images from the Iowa Digital Library on buttons, postcards, and Valentine cards. Those are all short and simple activities that can naturally lead to learning something new about a variety of resources. (You can see the Valentine’s activities here.)

Those are just a few ideas, which clearly come from my perspective as an Undergraduate Services Librarian (and barely crack the surface of our group discussion at the Teaching Librarians Retreat). For you, “outreach as instruction” could mean building on relationships with faculty, an emphasis on social media, or something else. Outreach itself is a broad concept with multiple definitions, but that also means there are so many variations and opportunities for librarians to engage with their users and community.

When I hear “outreach as instruction,” I think of how we can connect with undergraduates in ways other than in the classroom for a one-shot session, and incorporate what I like to call “nuggets of information literacy.” What does it mean for you and your library?