For all the time this profession has put into promoting the information literacy concept, I was surprised that data from the National Center for Education Statistics’ report Academic Libraries 2006 showed an underwhelming level of penetration into, and acceptance by, academic institutions. Table 13 provides data on the percentage of academic institutions reporting information literacy activities. There are five indicators of information literacy activity. They are:
1. defined information literacy or information literate student
2. incorporated information literacy into institution’s mission
3. incorporated information literacy into institution’s strategic plan
4. has institution-wide committee to implement strategic plan for information literacy
5. strategic plan formally recognizes the library’s role in information literacy instruction
First, here are the corresponding percentages for each of those five items:
After all these years of researching it, writing about it, presenting about it, discussing it, and selling the information literacy concept to our institutions, do these numbers seem as low to you as they do to me? The fourth one is especially surprising. Do our faculty colleagues and academic administrators find information literacy of so little importance that they are unable to justify allocating one committee in the governance structure to it? Apparently so. You might argue that an institution doesn’t need an official committee to develop an information literacy initiative, but I would counter that a committee goes a long way toward legitimizing the initiative and paving the way for better collaboration with faculty. Perhaps it is the way the question is worded. It seems to suggest you should only answer “yes” if the committee is implementing a strategic plan goal, and perhaps more institutions have information literacy committees and task forces that are simply unrelated to a strategic plan.
I was also surprised to learn that smaller institutions, most likely teaching- and learning-focused colleges, reported less activity than larger institutions. So whereas only 31 percent of small colleges (less than 1,500 FTE) had incorporated IL into the mission, at institutions with over 5,000 FTE it was 41 percent. I would have expected that smaller institutions that are more focused on learning would be the ones to move more quickly and fully into integrating information literacy into the curriculum. Masters I and II institutions appear to consistently have the highest levels of activity. While the best results come from the activity of defining information literacy or an information literate student, I can’t say I’m exactly sure what that means. Where is it defined? Does anyone else know about it? Maybe this report needs some better questions.
I’d like to think that, with all the hard work academic librarians and their organizations have put into information literacy initiatives, we’d be doing better by now. It is possible that in 2008, as opposed to 2006, we are doing much better, and that the numbers for these indicators are higher now as a result. That said, these numbers paint a somewhat bleak picture that should give some cause for concern. Or are these numbers meaningless for your institution because there is already a thriving information literacy initiative in place, regardless of committees or strategic plan mentions? What these numbers might suggest in the long run is that NCES needs to do a better job of asking the right questions so we can get a realistic picture of the penetration rate of information literacy initiatives and programming at U.S. colleges and universities.
Since we’re on the topic of information literacy, I’m going to leave you with some words of wisdom from a faculty member who gave a talk titled “Scholars Perspective: Impact of Digitized Collections on Learning and Teaching.” It is a paper presented by David Watt, a faculty member in the Temple University History Department, at the June 4, 2008 RLG Programs Symposium. Here’s an excerpt that provides some advice for librarians on communicating with faculty about information literacy. I think you’ll find it worthwhile reading:
It is also clearly the case that many of Temple’s faculty are deeply resistant to making “informational literacy” a major component of their courses. It is not a category that makes much sense to many Temple professors. To many of them, it sounds like the kind of phrase that educational bureaucrats who don’t do much teaching or research love to throw around. For many of them, it raises the specter of universities built around “assessing student learning outcomes.” So, there is the bad news: we are living in a world in which there is good reason to believe that students really do need to work on their informational literacy and in which faculty seem resistant to helping them do so. Here is the good news: our experiences at Temple suggest that this is a challenge that can sometimes be easily negotiated. All one has to do, some of us at Temple are coming to believe, is stop preaching to faculty about the need for them to take an interest in informational literacy and, instead, start asking faculty about their hopes for their students.
As soon as one begins to do that – as soon as one begins asking historians, for example, about their hopes – one begins to get answers such as the following:
“I want them to understand that they should read all primary sources with a certain amount of skepticism and that they should be even more skeptical when they are reading secondary works.”
“I want them to be able to distinguish between relatively reliable primary sources and ones that are less reliable.”
Now, none of the faculty responses to the question about their hopes for their students contains the magic phrase “informational literacy.” But that is not really the point, is it?