Category Archives: ACRL News

For ACRL announcements, events, etc.

On Being Valuable: Point-Counterpoint

The POINT: Amy Fry

On Tuesday, September 14, ACRL released Value of Academic Libraries: A Comprehensive Research Review and Report by Dr. Megan Oakleaf. The report lays out the current landscape of academic library assessment and seeks to provide strategies for libraries to demonstrate and quantify their value within the context of institutional missions and goals.

Oakleaf states that internal measures of value, such as use statistics, user satisfaction, and service quality, while interesting to librarians, are less compelling for external stakeholders such as administrators and trustees (11). Instead, she suggests determining externally-focused measures of value such as “library impact” (best measured by observing what users are doing and producing as a result of using the library) and “competing alternatives” (which focuses on defining what users want and how libraries, rather than our competitors, can help them achieve it) (21-22). She suggests ten key areas libraries should try to address in such assessment: enrollment, retention/graduation, student success, student achievement, student learning, student engagement, faculty research productivity, faculty grants, faculty teaching, and institutional reputation (17). Oakleaf also offers strategies for approaching assessment related to each area.

Oakleaf claims that “use-based definitions of value are not compelling to many institutional decision makers and external stakeholders. Furthermore, use is not meaningful, unless that use can be connected to institutional outcomes” (20). In a brief section about e-resources, she explains that usage counts don’t show why a library resource was used or the user’s satisfaction with it (50); she therefore suggests that, rather than collecting and reporting usage data for electronic resources, libraries try to collect qualitative data, like the purpose of the use (using the ARL MINES protocol). She also suggests examining successful grant applications to “examine the degree to which citations impact whether or not faculty are awarded grants.”

The question of how to use e-resources statistics to draw qualitative conclusions about users’ information literacy levels and the effectiveness of electronic collections (or even about the library’s impact on faculty research or student recruitment and retention) is of special interest to me now, as I have just agreed to examine (and hopefully overhaul) my institution’s management of e-resources statistics. However, such questions are overshadowed for me (and for most libraries) by more practical ones: how to effectively gather, merge, and analyze the statistics themselves; what to do with resources that don’t offer statistics at all, or don’t offer them in COUNTER format; and when and how to communicate them internally for collection decisions. It is difficult to envision arriving at higher-level methods of library assessment – overlaying complex demographic, research output, cost, collection, and use data to tell compelling stories about library use and impact – when even the most basic systems for managing inputs and outputs have not been implemented.

I understand and even agree with Oakleaf’s characterization of the shortcomings of “use-based definitions of value,” but I am not sure that surveying users about the purpose of their information use, or linking library collections to successful grant applications, truly gives a more compelling picture of the value of electronic resources collections, or a more complete one. For example, assessing value by linking library collections to grants funded or patents produced seems likely to discount libraries’ value to humanities research, because humanities scholarship will never approach the sciences in the amount of grant dollars coming in.

It is true that libraries currently “do not track data that would provide evidence that students who engage in more library instruction are more likely to graduate on time, that faculty who use library services are more likely to be tenured, or that student affairs professionals that integrate library services into their work activities are more likely to be promoted” (13). But those conclusions seem like no-brainers to me. If we spend a lot of time and energy collecting the data and putting it together to get the numbers that will allow us to make these claims – then what? What’s the payoff for the library? Administrators who don’t think libraries are just black holes for funding? A way to prove to students that they should use the library? If administrators and trustees are not inclined to fund libraries because their backgrounds did not include library use, or students are not inclined to use libraries because they are focused on graduation and employment instead of research, I don’t know that any such externally focused assessment will result in what seems to be, ultimately, the desired outcome – a reassertion of libraries’ relevance to our core constituents. It will, however, be a drain on library staff time and expertise – time and expertise that could be spent on core activities, like collection building, collection access, and public service.

Oakleaf concludes that our focus should be not to prove value but to increase it (141). We should not ask, “Are libraries valuable?” but “How valuable are libraries?” she says. What about “How are libraries valuable?” But this is semantics. No matter what our approach to assessment, I’m afraid the answer will still depend less on what data we present than on whom we ask.

The COUNTERPOINT: Steven Bell

What’s the payoff for the library? That’s an important question when it comes to assessment and efforts to demonstrate the academic library’s value to its own institution and higher education. Amy Fry makes a good point that we could invest considerable time and energy to collect and analyze the data needed to determine our value in any or all of the ten key areas recommended in the ACRL Value Report – but why bother? She states that when it comes to asking whether library instruction sessions can be connected to better grades or on-time graduation, that’s “no-brainer” territory.

But can we in fact assume that just because a student attends an instruction session – or because faculty have access to research databases – they are indeed achieving institutional outcomes? If, as a profession, we thought that was no-brainer territory, why are there hundreds of research articles in our literature that attempt to prove that students who sit through library instruction sessions are better off than the ones who don’t? We clearly aren’t just assuming they are; we want to prove it – and in doing so prove why we make a difference to our students’ education and learning process.

As Barbara Fister points out in her response to the Report, provosts already acknowledge, anecdotally, that they value their libraries and librarians. And we also know that the library is the heart of the institution, and that libraries are like Mom and apple pie; everyone likes the library. You probably couldn’t find an academic administrator who would go on record trashing the academic library (well, maybe this one). But none of that may stop administrators, when push comes to shove, from taking drastic measures with library services to resolve a budget crisis. Being the heart of the institution didn’t stop Arizona’s Coconino Community College from performing radical heart surgery by outsourcing its library operations to Northern Arizona University’s (NAU) Cline Library. Admittedly, that’s a rare occurrence, and I can’t say for sure that even the best set of library value data could have prevented it. Yet one can’t help but imagine that if Coconino’s librarians had had some rock-solid assessment data on hand to confirm their value to administrators – be it how the library keeps students retained or helps students achieve higher GPAs – they’d still have their jobs and be delivering services to their students at their own library (which was instead largely chopped up and parceled out to other academic units).

And better assessment and demonstration of library value can indeed result in a financial payoff for the institution, if it is awarded government grants and the indirect costs associated with conducting research. Those indirect costs, typically a percentage rate negotiated between the institution and the federal government, can make a huge difference in institutional funding for research. Given the size of some grants, just a slight increase – perhaps a percentage point or two – can make a real impact over time. Amy mentions the ARL MINES protocol, which is a process for making a concrete connection between researchers working on grant projects and their use of library resources to conduct that research. Often the contribution of the library is drastically understated, and therefore it is barely reflected in the calculation of the ICR (indirect cost recovery). My own institution is currently conducting a survey similar to MINES so that our “bean counters” (as Barbara likes to refer to them) can more accurately connect the expenditures for library electronic resources to research productivity – and the government’s own bean counters have very rigid rules for calculating increases to the ICR. It can’t be based on anecdotal evidence or simply having researchers state that they use library resources for their research. In this case, asking the users if we provide value doesn’t mean squat. Providing convincing evidence might mean an increase to our ICR of one or two percent – which over time could add up to significant amounts of funding to support research. That is a real payoff, but make no mistake that we have invested considerable time and expense in setting up the survey process.
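To see why a one- or two-point ICR increase matters, a quick back-of-the-envelope calculation helps. All the dollar figures and rates below are hypothetical, invented purely for illustration – the post does not report any institution’s actual grant volume or negotiated rate:

```python
# Hypothetical illustration: how a small increase in the negotiated
# indirect cost recovery (ICR) rate compounds across grant volume.
# Every number here is invented for illustration only.

def extra_recovery(direct_costs: float, old_rate: float, new_rate: float) -> float:
    """Additional indirect-cost dollars recovered at the higher ICR rate."""
    return direct_costs * (new_rate - old_rate)

annual_direct_costs = 40_000_000   # hypothetical annual grant direct costs
old_icr, new_icr = 0.50, 0.52      # hypothetical two-point rate increase

per_year = extra_recovery(annual_direct_costs, old_icr, new_icr)
print(f"Extra recovery per year: ${per_year:,.0f}")
print(f"Over ten years: ${per_year * 10:,.0f}")
```

Even at these made-up figures, a two-point bump yields hundreds of thousands of dollars a year, which is the kind of concrete payoff the post is pointing at.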

For many academic librarians, it may be better to, as Amy suggests, focus on the core activities such as collection building and traditional services (reference, instruction, etc.) – and to keep improving on or expanding in those areas. But I like to think that what drives real advancement in our academic libraries is confronting new mysteries that will force us to seek out new answers that could lead to improvements in fundamental library operations. What happens when we fail to seek out new mysteries to explore is that we simply continue to exploit the same existing answers over and over again until we drive them and ourselves into obsolescence (for more on “knowledge funnel” theory read here).

Lately I’ve been advocating that the new mystery for academic librarians should focus on assessment. We need to get much better at answering a simple question that represents this mystery: How can we tell that we are making a difference – and how will we gather the data to quantitatively prove it? From this perspective the question would be neither “How valuable are academic libraries?” nor “How are libraries valuable?” but “How are academic libraries making a real difference, and how do we prove it?” Perhaps it remains a case of semantics, but any way we approach this new mystery, the road should lead to a better grasp of the value we provide and new ways to communicate it to our communities. Whatever you may think about assessment and the value question, take some time to review the ACRL Value Study. I’ll be at the Library Assessment Conference in DC at the end of October. I’m looking forward to learning more about how academic librarians are approaching the new mystery of assessment, and how we can all do a better job of quantifying and communicating our value proposition.

ACRL 2011 National Conference Update – Paper/Panel Submissions

Just in! Some data on the number of submissions for the contributed paper and panel sessions (plus workshops and preconferences) for ACRL’s National Conference in Philadelphia in 2011. As you might expect – the number of submissions (mostly) continues to increase.

Here’s the data:

Contributed Papers

Number of submissions – 238
Number that can be accepted – 66
Acceptance rate – 28%

Panel Sessions

Number of submissions – 202
Number that can be accepted – 44
Acceptance rate – 22%

Preconferences

Number of submissions – 11
Number that can be accepted – 6
Acceptance rate – 55%

Workshops

Number of submissions – 50
Number that can be accepted – 12
Acceptance rate – 24%

Comparative Numbers for ACRL 2009

Contributed Papers – 230 submissions; 44 accepted; 19% acceptance rate

Panel Sessions – 169 submissions; 35 accepted; 21% acceptance rate

Preconferences – 15 submissions; 6 accepted; 40% acceptance rate

Workshops – 47 submissions; 11 accepted; 23% acceptance rate

ACRL has responded to a major request from the membership – provide more academic librarians with an opportunity to present at the national conference. ACRL is making this possible by increasing the number of papers accepted from 44 to 66, which raises the acceptance rate by nearly 10 points over 2009 (thanks to a stable number of submissions). The trade-off is that each paper presentation is just 20 minutes, so there are now three papers, not two, at every session. Even with nine additional panel sessions, owing to a substantial increase in the number of submissions, the acceptance rate is pretty much the same. It looks like those who submitted a preconference proposal will have the best shot at acceptance. But overall, more of you will be presenting at ACRL!

Good luck to all those who submitted a proposal. I hope you came up with a snappy title (see more on that here).

And in the event your proposal is rejected, keep in mind that the submission deadline for poster sessions, Cyber Zed Shed, roundtables, and virtual conference sessions is November 1, 2010. So there is still plenty of time to submit a proposal. There are a bunch of other innovations being planned for the conference – and you’ve probably now found out who the keynoters are – so I hope you’ll be planning to come to Philadelphia in 2011.

ACRLog Welcomes Its Emerging Leaders

Editor’s Note: ACRLog is pleased to announce that a group of ALA Emerging Leaders was assigned to work with the ACRLog blog team (and ACRL Insider too), and use our little blog to share ideas that will enhance ALA conference attendance for both first-timers and veterans alike. Over the next few months we’ll feature occasional posts from members of the Emerging Leaders team – pictured below. This first guest post is a group effort. We look forward to reading what our Emerging Leaders have to share.

You have likely heard about the ALA Emerging Leaders Program, which began in 2007 as part of past ALA president Leslie Burger’s six initiatives to expand opportunities for involvement and leadership in ALA to newer librarians. What you might not know is that ACRL sponsors a team of Emerging Leaders to support the ACRL 101 program, which is designed to enhance the ALA Annual Conference experience for first-time attendees.

This year, our Emerging Leaders team comes from universities ranging from Alaska to Georgia. We are all enthusiastic about our work in academic libraries and our involvement with ACRL. Through our project with ACRL 101 we will share our conference experiences and help new conference attendees make the most of their first ALA Annual experience. We will offer insight into the structure of ACRL and help extend the network of support that ACRL 101 currently offers to new members.

The ACRLog-ALA Emerging Leaders Team

From left to right, our team of ACRL 101 Emerging Leaders includes: Amanda Dinscore, Public Services Librarian at California State University, Fresno; Wendy Girven, Public Services Librarian at University of Alaska Southeast; Kimberley Bugg, Assistant Head, Information & Research Services, Atlanta University Center; Hui-Fen Chang, Social Sciences Librarian, Oklahoma State University; Rachel Slough, MLS Candidate, Indiana University; and Miriam Rigby, Social Sciences Librarian at the University of Oregon. Not pictured: Mary Jane Petrowski, Associate Director of ACRL, serves as the ACRL staff liaison, and Susanna Boyston, Head of Library Instruction and Collection Development at the Davidson College Library, is the project mentor.

After an initial meeting at ALA Midwinter in Boston, our group is now working with representatives from ACRL to plan and implement a series of ACRLog and ACRL Insider blog posts. These posts will focus on areas of interest to new librarians, such as conference tips, ACRL resources, highlights of selected ACRL sections, and advice on how to get involved. We will also be hosting OnPoint chats for first-time conference attendees, providing insight into conference structure and guidance to help you make the most of your time at ALA Annual in Washington, DC. Finally, we will be planning several ACRL mini-sessions at ALA Annual that will build upon the content covered in the ACRL 101 program.
Keep an eye out for future blog posts from members of our active group on ACRLog and ACRL Insider in the coming weeks, and please support the ACRL 101 Emerging Leaders – whatever your career stage – by giving us your feedback and comments. Last but not least, come visit us at the ALA Pavilion at the Annual Conference in Washington D.C.

Maureen Sullivan – ACRL Academic/Research Librarian Of The Year

Maureen Sullivan, owner of Maureen Sullivan Associates and Professor of Practice in the Simmons College Graduate School of Library and Information Science Ph.D. Program in Managerial Leadership, is the 2010 Association of College and Research Libraries’ (ACRL) Academic/Research Librarian of the Year. The award, sponsored by YBP Library Services, recognizes an outstanding member of the library profession who has made a significant national or international contribution to academic/research librarianship and library development. ACRLog congratulates Sullivan on being named the newest recipient of this prestigious ACRL award.

ACRLog also congratulates the winners of the 2010 ACRL Excellence in Academic Libraries Award: the Bucks County Community College Library, Newtown, Pa.; the A.C. Buehler Library at Elmhurst College, Elmhurst, Ill.; and the Indiana University Bloomington Libraries.

Your ACRL Conference Planning Team

An enormous amount of work goes into planning the ACRL National Conference. No sooner does one end than the cycle of planning starts again for the next one. At ALA, the 2011 conference planning committee had its first official meetings. We first met with members of the 2009 planning group for a debriefing session. Then we moved on to our first major task of identifying the conference themes and trying to come up with catchy names for them. Whereas the Seattle conference had five themes, the Philadelphia conference will likely have seven. We think that will make it easier for those submitting proposals to find a theme into which their ideas fit.

At the end of the day, loads of ACRL members will be involved in making the conference a success, from the many members of the planning committees to everyone who presents and participates. But the backbone of the conference is really three people: the chair of the conference committee and two ACRL staff members who somehow help us clueless members pull this whole thing off. Here is your conference team for 2011:

Your ACRL 2011 Conference Team

On the far left is Margot Conahan, ACRL’s manager of professional development; on the far right is Tory Ondrla, ACRL conference supervisor; and in the center is Pam Snelson, Library Director at Franklin & Marshall College and Chair of the Conference Planning Committee for 2011. Together these three will lead the conference planning committee in organizing another memorable ACRL conference.