Latest NCES Data Shows Little IL Progress

In a post from August 2008 I shared some data straight out of a report titled Academic Libraries 2006, which presents tabulations for the 2006 Academic Libraries Survey (ALS) conducted by the United States Department of Education's National Center for Education Statistics (NCES). The data on the percentage of libraries reporting information literacy activities were underwhelming when one considers all of the attention and effort our profession puts into information literacy and library instruction initiatives. For the fall of 2006, far too few institutions reported that information literacy was a part of the institutional mission or had been incorporated into the strategic plan. So I was curious when I saw the latest report, Academic Libraries 2008: First Look, which presents tabulations from 2008. How did we do? Was there an increase in reported information literacy activity between 2006 and 2008?

There was some change all right, but not in the right direction. Here are the same five data items identified in the NCES Survey related to information literacy:

1. defined information literacy or information literate student
2. incorporated information literacy into institution’s mission
3. incorporated information literacy into institution’s strategic plan
4. has institution-wide committee to implement strategic plan for information literacy
5. strategic plan formally recognizes the library’s role in information literacy instruction

Here are the corresponding percentages for each of those five items for 2006 versus 2008:

     2006    2008
1.   48.4    46.3
2.   34.3    32.5
3.   30.4    30.3
4.   17.6    17.8
5.   24.8    24.2

So there was either decline or no significant change. That's quite puzzling and somewhat disturbing. Here we are two years later, and academic librarians' efforts to advance the integration of information literacy into our institutions appear to be backsliding. Maybe we need to discount data item number two above. How many academic institutions are going to incorporate something about information literacy into their mission statements? I wouldn't even expect my own institution to do that. And what about the incorporation of information literacy into the strategic plan? At my own institution, an early draft of a new strategic plan written this year included some text about the importance of the library for supporting research, but nothing about information literacy. Even that minimal language was dropped in a later version. So getting the institution to incorporate information literacy into the strategic plan is no easy task. I would expect number 4 to be higher, though; that is an objective worth working toward. And rather than ask about integration of information literacy into the strategic plan or mission, why not ask about integration into a curriculum plan for core education? I think more academic libraries could report that their institution's plan for general or liberal education discusses information literacy, as the one at my institution does.

Academic librarians still have their work cut out for them when it comes to institutional recognition of the value of information literacy. Beyond that, what can the Academic Libraries 2008 report tell us about our performance and contributions to the academic community? Not much. But here are some comparative numbers that may interest you:

Total Circulation: 144,119,450 (06) → 138,102,762 (08), a 4.175% decrease

Interlibrary Loan: 10,801,531 (06) → 11,095,168 (08), a 2.718% increase
(Returnables: 8.676% increase; Non-Returnables: 5.265% decrease)

Gate Counts: 18,765,712 (06) → 20,274,423 (08), an 8.04% increase

Reference Transactions: 1,100,863 (06) → 1,079,770 (08), a 1.916% decrease

Presentations: 471,089 (06) → 498,337 (08), a 5.784% increase

E-Books: 64,365,781 (06) → 102,502,182 (08), a 59.249% increase

FTE Librarians: 26,469 (06) → 27,030 (08), a 2.119% increase
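If you'd like to check the arithmetic, the percent changes quoted above can be reproduced from the raw 2006 and 2008 totals with a quick sketch (the figures are copied from the list above; only the dictionary labels are mine):

```python
# Percent change between the 2006 and 2008 ALS totals quoted above.
# Each entry maps an item to its (2006 total, 2008 total).
data = {
    "Total Circulation": (144_119_450, 138_102_762),
    "Interlibrary Loan": (10_801_531, 11_095_168),
    "Gate Counts": (18_765_712, 20_274_423),
    "Reference Transactions": (1_100_863, 1_079_770),
    "Presentations": (471_089, 498_337),
    "E-Books": (64_365_781, 102_502_182),
    "FTE Librarians": (26_469, 27_030),
}

for item, (y2006, y2008) in data.items():
    pct = (y2008 - y2006) / y2006 * 100          # signed percent change
    direction = "increase" if pct >= 0 else "decrease"
    print(f"{item}: {abs(pct):.3f}% {direction}")
```

Running this reproduces the listed figures, e.g. a 4.175% decrease in circulation and a 59.249% increase in e-books.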

You can find more of these data items in the full report, and it's not too difficult to toggle back and forth between the 2006 and 2008 reports to see where the differences are. As the representative items offered here suggest, there hasn't been much significant change over the two-year period, except for a big increase in the number of e-books. Without doing any sort of detailed analysis, it looks to me like academic libraries are holding their own. There's nothing here to suggest the academic community is abandoning its libraries. Circulation and reference are down a bit, but ILL is still busy, more people are visiting the building, and despite the anemic indicators for information literacy, the number of instruction sessions (included in "presentations," I take it) continues to increase.

I hope that the folks who construct the NCES survey instrument for academic libraries will give more thought to what types of questions would give us a better picture of the status of information literacy integration into the institutional curriculum, rather than the mission or strategic plan. I see they do include a group of academic librarians in the development of the report. Perhaps at their next meeting they'll put this issue on the agenda.

Data Shows Information Literacy Has Far To Go

For all the time this profession has put into promoting the information literacy concept, I was surprised that data from the National Center for Education Statistics' report Academic Libraries 2006 showed underwhelming penetration into, and acceptance by, academic institutions. Table 13 has data for the percentage of academic institutions reporting information literacy activities. There are five indicators of information literacy activity. They are:

1. defined information literacy or information literate student
2. incorporated information literacy into institution’s mission
3. incorporated information literacy into institution’s strategic plan
4. has institution-wide committee to implement strategic plan for information literacy
5. strategic plan formally recognizes the library’s role in information literacy instruction

First, here are the corresponding percentages for each of those five items:

1. 48.4
2. 34.3
3. 30.4
4. 17.6
5. 24.8

After all these years of researching it, writing about it, presenting about it, discussing it, and selling the information literacy concept to our institutions, do these numbers seem as low to you as they do to me? The fourth one is especially surprising. Do our faculty colleagues and academic administrators find information literacy of so little importance that they are unable to justify allocating one committee in the governance structure to it? Apparently so. You might argue that an institution doesn't need an official committee to develop an information literacy initiative, but I would counter that a committee goes a long way toward legitimizing the initiative and paving the way for better collaboration with faculty. Perhaps it is the way the question is worded. It seems to suggest you should only answer "yes" if the committee is implementing a strategic plan goal; perhaps more institutions have information literacy committees and task forces that are in no way related to a strategic plan.

I was also surprised to learn that smaller institutions, most likely teaching- and learning-focused colleges, reported less activity than larger institutions. Whereas only 31 percent of small colleges (fewer than 1,500 FTE) had incorporated information literacy into the mission, at institutions with over 5,000 FTE it was 41 percent. I would have expected that smaller institutions more focused on learning would be the ones to move more quickly and fully into integrating information literacy into the curriculum. Masters I and II institutions appear to consistently report the highest levels of activity. While the best results come from the activity of defining information literacy or an information literate student, I can't say I'm exactly sure what that means. Where is it defined? Does anyone else know about it? Maybe this report needs some better questions.

I'd like to think that with all the hard work academic librarians and their organizations have put into information literacy initiatives, we'd be doing better by now. It is possible that in 2008, as opposed to 2006, we are doing much better, and that the numbers for these indicators are higher now as a result. That said, these numbers paint a somewhat bleak picture that should give some cause for concern. Or are these numbers meaningless for your institution because there is already a thriving information literacy initiative in place, regardless of committees or strategic plan mentions? What these numbers might suggest in the long run is that NCES needs to do a better job of asking the right questions so we can get a realistic picture of the penetration rate of information literacy initiatives and programming at U.S. colleges and universities.

Since we're on the topic of information literacy, I'm going to leave you with some words of wisdom from a faculty member who gave a talk titled "Scholars Perspective: Impact of Digitized Collections on Learning and Teaching." It is a paper presented by David Watt, a faculty member in the Temple University History Department, at the June 4, 2008 RLG Programs Symposium. Here's an excerpt that provides some advice for librarians on communicating with faculty about information literacy. I think you'll find it worthwhile reading:

It is also clearly the case that many of Temple’s faculty are deeply resistant to making “informational literacy” a major component of their courses. It is not a category that makes much sense to many Temple professors. To many of them, it sounds like the kind of phrase that educational bureaucrats who don’t do much teaching or research love to throw around. For many of them, it raises the specter of universities built around “assessing student learning outcomes.” So, there is the bad news: we are living in a world in which there is good reason to believe that students really do need to work on their informational literacy and in which faculty seem resistant to helping them do so. Here is the good news: our experiences at Temple suggest that this is a challenge that can sometimes be easily negotiated. All one has to do, some of us at Temple are coming to believe, is stop preaching to faculty about the need for them to take an interest in informational literacy and, instead, start asking faculty about their hopes for their students.

As soon as one begins to do that—as soon as one begins asking historians, for example, about their hopes—one begins to get answers such as the following:

“I want them to understand that they should read all primary sources with a certain amount of skepticism and that they should be even more skeptical when they are reading secondary works.”

“I want them to be able to distinguish between relatively reliable primary sources and ones that are less reliable.”

Now, none of the faculty responses to the question about their hopes for their students contains the magic phrase “informational literacy.” But that is not really the point, is it?