Category Archives: Assessment

Let’s Not (Just) Do the Numbers

Meredith Farkas has a thoughtful post at Information Wants to be Free on our love of numbers and how little they tell us without context. Less traffic at the reference desk: what does that mean? It could mean that students don’t find the help they get there useful, or that your redesigned website or new signage has solved problems that used to require human intervention. More instruction sessions? Maybe more faculty attended conferences and needed a babysitter.

Meredith’s post made me think about the statistics I recently compiled for our annual report. Many of them are things we count in order to share that information with others through national surveys. We dutifully count how much microfiche and microfilm we have added to the collection (seriously?) and how many print periodicals we have (fewer all the time, while our growing access to electronic full text is virtually impossible to measure; does a title with a 12-month embargo count?). We haven’t used this report to share how much use our databases are getting, which journals in those databases are downloaded most often, or what Google Analytics tells us about which web pages attract the most attention. We use that information for decision-making, but it doesn’t become part of the record, because the time series we report was started back when the earth’s crust was still cooling. (Guess what: acquisition of papyrus scrolls, clay tablets, and wax cylinders is way down.)

In the end, I’m not all that interested in the numbers. The really interesting data is usually the hardest to gather. How do students decide which sources to use, and does their ability to make good choices improve over time? When they read a news item that someone has posted to Facebook, are they better prepared after our sessions to determine whether it’s accurate? Do students who figured out how to use their college library transfer those skills to unfamiliar settings after they graduate? Do students grow in their ability to reason based on evidence? Have they developed a respect for arguments that arrive at conclusions with information that isn’t cherry-picked or taken out of context? Can they make decisions quickly without neglecting to check the facts? The kind of literacy we’re hoping to foster goes far beyond being able to write a term paper. And knowing how many microfiche we own doesn’t have anything to do with it.

Now I have a question for our readers. Are there ways you regularly assess the kinds of deep learning that we hope to encourage? What measures of learning, direct and indirect, do you use at your library? Have you conducted studies that have had an impact on your programs? Are you gathering statistics that seem particularly pointless? Should we start an Awful Library Statistics blog? The floor is open for comments.

photo courtesy of Leo Reynolds.

Assessment is the New Black

I’m teaching a course this semester for the Graduate School of Library & Information Science at Illinois called “Libraries, Information, and Society.” Like similar courses, it presents an introduction to a number of core concepts for future information professionals, as well as an introduction to professional skills, values, and employment environments. This week, we heard an excellent presentation from my colleague Tina Chrzastowski, author of “Assessment 101 for Librarians,” an essay that appeared in Science & Technology Libraries in 2008. The point of the presentation, and the message that I hope my students took from it, is that the ability to design an assessment program and to use its results in planning and decision making is a critical skill set for any information professional. Assessment is the new black – it goes with whatever job you have, and it is relevant to every library environment.

Assessment may also be the new instruction, though – a critical skill set for academic librarians that is not clearly and appropriately addressed in LIS programs. It is no coincidence that instruction librarians have been among the early leaders in assessment activities (I’m looking at you, Deb Gilchrist!): this reflects their connection to broader campus efforts to identify student learning outcomes, but also their experience in having to learn critical skills on the job that were not a focus of their professional education. The list of studies showing that teaching skills are required for a wide variety of academic library positions is almost as long as the list of studies showing that few LIS programs have ever made this a focus of their coursework or their faculty hiring (a shout-out to those who break that mold, including the University of Washington and Syracuse University). I imagine that a similar list of studies will find its way into the literature regarding the importance of assessment and evidence-based library and information practice for librarians of all types, and the need for greater attention to those skills across the LIS curriculum. Given that we remain concerned about the attention paid to instruction in LIS programs some 30 years after those first studies started to come out, though, it may take a while to see real change. Of course, it may be that assessment is really the new knowledge management, in which case the courses will be available much more quickly!

As Chrzastowski’s article points out, many resources are available to librarians interested in continuing professional education in assessment. The Association of Research Libraries has held two successful conferences on the topic, and there is an international movement in support of evidence-based practice that sustains a journal and conference programs. As with instruction, there are “lighthouse” LIS programs here, too – in this case the University of North Carolina, which offers a course on “Evidence Based Practices in the Library and Information Sciences.”

What can ACRL do? If assessment is the new instruction, should we see more attention to looking at assessment across the association, and to fostering the development of a corps of academic librarians (beyond assessment coordinators) who see this as a critical area of personal expertise? Since assessment skills are critical not only to public services and collections librarians, but also to technical services and information technology specialists, is this an area of functional specialty that could broaden our appeal across the academic library enterprise, or be an initiative on which we can fruitfully collaborate across ALA divisions?

I don’t have the answers, but I know you all look good in black!