Let’s Not (Just) Do the Numbers

Meredith Farkas has a thoughtful post at Information Wants to be Free on our love of numbers and how little they tell us without context. Less traffic at the reference desk: what does that mean? It could mean that students don’t find the help they get there useful, or that your redesigned website or new signage has solved problems that used to require human intervention. More instruction sessions? Maybe more faculty attended conferences and needed a babysitter.

Meredith’s post made me think about the statistics I recently compiled for our annual report. Many of them are things we count only so we can share them with others through national surveys. We dutifully count how much microfiche and microfilm we have added to the collection (seriously?) and how many print periodicals we have (fewer all the time, though our growing access to electronic full text is virtually impossible to measure; does a title with a 12-month embargo count?). We haven’t used this report to share how much use our databases are getting, which journals in those databases are downloaded most often, or what Google Analytics tells us about which web pages attract the most attention. We use that information for decision-making, but it doesn’t become part of the record because the time series we use was started back when the earth’s crust was still cooling. (Guess what: acquisition of papyrus scrolls, clay tablets, and wax cylinders is way down.)

In the end, I’m not all that interested in the numbers. The really interesting data is usually the hardest to gather. How do students decide which sources to use, and does their ability to make good choices improve over time? When they read a news item that someone has posted to Facebook, are they better prepared after our sessions to determine whether it’s accurate? Do students who figured out how to use their college library transfer those skills to unfamiliar settings after they graduate? Do students grow in their ability to reason based on evidence? Have they developed a respect for arguments that arrive at conclusions with information that isn’t cherry-picked or taken out of context? Can they make decisions quickly without neglecting to check the facts? The kind of literacy we’re hoping to foster goes far beyond being able to write a term paper. And knowing how many microfiche we own doesn’t have anything to do with it.

Now I have a question for our readers. Are there ways you regularly assess the kinds of deep learning that we hope to encourage? What measures of learning, direct and indirect, do you use at your library? Have you conducted studies that have had an impact on your programs? Are you gathering statistics that seem particularly pointless? Should we start an Awful Library Statistics blog? The floor is open for comments.

Photo courtesy of Leo Reynolds.