ACRLog welcomes a guest post from Donna Witek, Associate Professor and Public Services Librarian at the University of Scranton.
Photo by Moyan Brenn on Flickr
When I first learned about assessment at the very beginning of my professional work as a librarian, there was one aspect of the process that made complete sense to me. I was instructed that an assessment plan is just that–a plan–and that it is not only OK but expected for the plan to change at some point, either during or after it’s been put into action.
Now, the specifics of how these changes happen, what counts as best practice in altering an assessment plan, and the relationship between the integrity of the assessment data gathered and any changes made are all complex questions. I am in my seventh year working as an instruction librarian in an academic library, and I consider myself at best an engaged learner-practitioner when it comes to assessment–I am by no means an expert, and I offer this as a disclaimer as I share some thoughts on assessment and the Framework for Information Literacy for Higher Education [pdf].
In the years since I was first trained in basic assessment practices, I still find the recursive, cyclical nature of assessment to be the aspect of the process that legitimizes the rest. Learning is a messy process, and as instructors we understand that there are multiple ways to reach the same goal–or learning outcome–and that different learners learn differently. This could mean our approach to teaching (i.e., our pedagogy) needs to be adapted–sometimes on the fly!–to meet the needs of the students in front of us. Or, maybe the way I articulated one of the learning outcomes for an instruction session turns out to be far too ambitious for the scope of the instruction, and ten minutes in I realize I need to change the formulation of the outcome in my mind in order for my teaching and the students’ learning to harmonize.
What I love about the principle that an assessment plan is meant to be changed (at some point) is that it means the above scenarios are not failures, but part of an authentic teaching and learning process. This is empowering for teachers and students alike.
Now, it is my understanding that all assessment plans change eventually. In the case of an assessment plan that from the outset is harmonized perfectly to the learning context to which it is applied, it isn’t changed until the end of the assessment cycle, but it still changes and develops in response to the information (call it data if you’d like) gathered throughout the process.
At the end of this week and after almost two years of development and review by the profession, the Framework will be considered for adoption by the ACRL Board of Directors during ALA Midwinter. The Framework is not conceived as an assessment document, as it “is based on a cluster of interconnected core concepts, with flexible options for implementation, rather than on a set of standards or learning outcomes or any prescriptive enumeration of skills” (Framework [pdf], p.2).
This raises the questions: What is the relationship between the Framework and assessment? And how does this in turn relate to the revision task force’s recommendation that the Information Literacy Competency Standards for Higher Education be sunsetted in July 2016 (Board of Directors Action Form [pdf], p.3)?
Before I share some ideas in response to these questions, I want to point readers to Megan Oakleaf’s “A Roadmap for Assessing Student Learning Using the New Framework for Information Literacy for Higher Education” [pdf] (JAL 40.5, 2014). I highly recommend reading Oakleaf’s roadmap, as my own ideas touch on many of the same points found in her “Ok, So Now What?” section, though I want to fold into the discussion the relationship between this process and the proposed sunsetting of the Standards.
Here I offer just one of many possible paths toward incorporating the Framework into your local information literacy instructional practice. It is a theoretical model, because it has to be at this point: the Framework is not yet adopted. As will hopefully be made clear, not enough time has passed for this model to have been fully implemented, though some libraries have begun the process. (1)
The first step I would recommend, based on evidence from libraries that have taken this approach and found it fruitful and impactful on both student learning and programmatic practices, is to read the Framework, both individually and as a group with colleagues in your instruction program, and through reflection and discussion identify intersections between the Framework and the information literacy instruction work you are already doing. (2) Rather than feel pressured to overhaul an entire instruction program overnight, instead use the Framework as a new way to understand and build upon the things you’re already doing on both the individual and programmatic levels.
If your current practices are heavily situated within the Standards, I think this exercise will surprise you by unearthing the connections that do in fact exist between the Standards and the Framework, even as the latter represents a significant shift in our collective approach to teaching and learning. (3)
The next step would be to review your learning outcomes for individual instruction sessions in light of the Framework, to be inspired by the connections, and to be challenged by the gaps–and to rewrite these outcomes based on both your engagement with the Framework and your recent assessment of your own students’ learning using these outcomes. The cycle of assessment for learning outcomes tied to individual instruction is short–these outcomes can and should be reviewed and revised in the period of reflection that immediately follows each instruction session.
In many ways, this makes individual instruction the most immediately fertile context in which to use the Framework to be inspired and to transform your instructional practice, keeping in mind that the complex concepts anchoring the frames require learners to engage with them in multiple learning contexts throughout the curriculum in order to be fully grasped. Still, even a one-shot can incorporate learning outcomes that will help learners progress toward an understanding of these concepts in a manner appropriate to the learner’s current level of training in a discipline or disciplines.
But what of your programmatic information literacy learning outcomes? What about the places where information literacy has been integrated into curricular programs within or across the disciplines? And what about those (fortunate!) institutional contexts in which information literacy is integrated explicitly into the learning outcomes for the institution as a whole?
The beauty of assessment, as I suggest above, is that it is cyclical. Just as all ACRL guidelines and standards undergo cyclical review, so too do our local assessment and curriculum plans–or at least, they should. As each assessment plan comes up for review, librarians who have been engaging the Framework in their individual instructional practice can share “upwards” their experiences and the impact on student learning they observed through that engagement, and so fold the concepts underpinning the Framework into each broader level of assessment.
In this way, the Framework’s influence will cascade upwards within a local institutional context, on a timeline determined by that institution’s review cycles. While the revision task force’s recommendation to the ACRL Board is for the Standards to be sunsetted a year and a half after the Framework’s recommended adoption, I would argue that it is in the spirit of the Framework for local timelines to take precedence over ACRL’s. As long as librarians are engaging the Framework, both individually (in instruction) and collaboratively (as local assessment plans and curricular documents come up for review), and doing so in light of the information literacy instruction work your library has been doing since (or even prior to) the adoption of the Standards fifteen years ago, (4) the worry associated with sunsetting the Standards at the national level will be eclipsed by the particular, robust influence the Framework is having on your own campus, with your own students.
And anyway, we do our best work when we’re focusing on the students in front of us. So, let’s get to work.
(1) Nicole Pagowsky shares the first steps of a similar process underway at the University of Arizona.
(2) The first example of this I’ve encountered is at Trinity College; librarians leading in different areas of Trinity’s information literacy instruction program presented at the 2014 Connecticut Information Literacy Conference their success with this initial approach to implementing the Framework (video and prezi).
(3) Amanda Hovious has created a helpful series of Alignment Charts for ACRL Standards and Proposed Framework, which represent one practitioner’s approach to connecting these two documents. I would argue that there are as many potential charts/models for connecting the Standards to the Framework as there are practitioners interpreting the meaning and content of each. It is for this reason I believe it was prudent for the revision task force to abstain from developing a model for alignment themselves, as such a model would run the danger of being wrongly interpreted as “canonical” because of its association with the task force that developed the Framework. That being said, Hovious’ charts are informed by her training as an instructional designer and, coupled with her notes for interpretation at the beginning of the document, represent a valuable perspective on how these two approaches to information literacy instruction relate. Another example that is equally compelling, in this case because the alignment is anchored to locally developed core competencies, is offered by Emily Krug of King University. It is compelling because it models (literally) the notion that information literacy is locally situated, by using King University’s core competencies as the concrete bridge between Standards and Framework.
(4) Barbara Fister offers an historical perspective in which she recalls the anticipated reception of the Standards when they were first adopted in 2000, and the remarkably similar conversations we are having now in relation to the Framework.