Monthly Archives: September 2010

Managing E-Resources For Users, 100%

I returned to electronic resources librarianship – and full-time work – 16 months ago in a brand-new e-resources coordinator position at an academic library. The catch? It was in public services.

Not many e-resources librarians live among the folks in reference and instruction – link resolvers, proxy servers, A-Z lists, COUNTER compliance, and ERMs usually keep us pretty close to our colleagues in acquisitions, serials and IT. Public services librarians, who spend their days building relationships with teaching faculty, performing classroom instruction, and juggling reference questions, don’t have time to worry about the circuitous, detailed process involved in e-resources acquisitions and maintenance. Likewise, technical services and technology staff don’t necessarily see the daily impact their work and decisions have on users. Caught in the middle, I found the transition difficult. As a public services librarian, I got to do things like teach and work reference in a way most e-resources librarians don’t. But I also had limited opportunities to connect with my colleagues on the technical side, leaving me out of the decision-making loop at crucial points.

Despite its necessary involvement in technical processing, electronic resources librarianship is, I feel, actually very well suited to public services. In my previous e-resources position, at a small college, I managed e-resources from a public services perspective because we all did public services, and our close contact with students, faculty and each other helped us stay focused on making decisions that we thought were good for users even if, for collections, they were only good enough. How did that affect my approach to e-resources management? For one, I didn’t get into our systems from the back end – I used the front end, the way our students did, and still do. I didn’t care at all how our records were constructed and linked in the ILS – in fact, most of our e-resources weren’t in the ILS at all, because that’s not how our users found them. Instead, I cared about how items were labeled and displayed so people could understand what they were and what they did. I was never preoccupied with usage statistics; I was more interested in promoting use. Those concerns were at the forefront of my mind because they were on the minds of the people I interacted with most often – other reference and instruction librarians.

Early job ads for e-resources librarians emphasized public services skills like reference and instruction (Fisher 2003: “the position title of Electronic Resources Librarian has been pre-empted by the public service sector of the profession”); over the years, these changed to emphasize more specialized technical skills – licensing, web development and customization (Albitz & Shelburne 2007). Why the shift? My guess is that early e-resources required a lot of instruction to use, even for other librarians (I remember trying to use Infotrac as a frustrated undergraduate in 1998 – a lot of librarian intervention was required before I got it), and public services librarians became the early adopters of a lot of the first online resources. But as CD-ROM databases were replaced by more and more online journals (and the platforms to search these in aggregate), we tried to mainstream them into existing workflows. Only these workflows, created to acquire print objects and hold on to them forever, have proven difficult to adapt.

At the Electronic Resources & Libraries Conference in Austin, Texas, last February, Rick Lugg of R2 Consulting talked about how models for approaching e-resources management have changed. First there was the “hub,” or expert model, in which one person in an organization was the point person for all the specialized processes and expertise required for e-resources management. This worked for small collections, but, as e-resources encompassed more and more of libraries’ content and budgets and became our most-used resources, the lack of scalability of this model demanded another approach. The next management model has tried to place e-resources into traditional workflows. This is the model most of us still try to adhere to, and it is, in my opinion, another reason most e-resources work has come to rest in technical services. As one of my colleagues explained, many librarians whose jobs previously revolved around print materials feel it is essential that they have some responsibility for electronic materials; otherwise, what would their jobs become? Thus, selection and licensing of e-resources at my institution have stayed with collection development, acquisitions has handled processing, serials has handled e-journals, and IT has worked on access issues.

Rick, however, also suggested a model for the future in which libraries push much of the technical work associated with e-resources management up the food chain to consortia and collectives, freeing local librarians to deal more with acquiring, synthesizing and communicating information about virtual materials. Some libraries are further along in this model than others: in Ohio, OhioLINK (for a long time the gold standard for library consortia, in my opinion) handles licensing, acquisition, payment, and sometimes even search interface customization for many of our e-resources, though not all – about a third are still processed locally, meaning that staff and workflows for all aspects of e-resources management must be maintained locally. Smaller consortia can absorb more of the work: the California Digital Library, for example, is focused on just the 10 UCs, which have more in common (from programs to missions to administrative systems) than the 89 OhioLINK libraries. I am interested in seeing what models the enormous new LYRASIS will adopt – it is well positioned to fulfill Rick’s prediction for the future of e-resources management, though I imagine its challenges in doing so will prove to be as huge as the collective itself.

For someone in a public services e-resources position like mine, tracking information about e-resources and the issues that affect every stage of their lifecycles (from technology developments to budget pressures, staff changes, and trends in user behavior) was an important, if not the most important, part of my work. This is supported by Joan Conger & Bonnie Tijerina’s assessment of e-resources management in “Collaborative Library-wide Partnerships: Managing Electronic Resources Through Learning and Adaptation” (in Collins & Carr 2008). The dynamic process of managing e-resources “requires effective incorporation of information from a rich array of sources,” they write (97). The most important information to pursue is most often stored in experience – that of vendors, library professionals, and patrons. To get to this contextual information, they say, librarians must keep current, particularly with users. They suggest “usability tests, library advisory groups, focus groups, direct observation,” as well as informal assessment, to learn new things about user behavior (99). They also remind their readers that it is important to communicate what you learn.

Interfacing between the user experience and the information required to improve it proved to be the part of my job best suited to my location in public services, and in my first year at Bowling Green I focused on user issues. I participated in web and OPAC redesign projects, resource re-description, customization, usability testing, and training. I also made an effort to stay informed: I read (Don’t Make Me Think!, Studying Students, Online Catalogs: What Users and Librarians Want), I talked to vendors, I attended conferences and sat in on webinars. But no matter how much email I sent, how many meetings I attended, or how many blogs and wikis I used, I couldn’t seem to find a way to merge the information I had with the information from my colleagues so that together we could make our management of e-resources more effective for users. I discovered, during this period, that it’s not enough to recognize that lots of people are involved in making e-resources available; you also need a seat at the right tables so you can advocate for these materials and their users, and, in my library at least, I was sitting at the wrong table.

After a retirement incentive program was completed last fiscal year, our technical services department found itself down five people, two of them faculty librarians. Library-wide, we discussed reorganization, and a number of staff changed locations, but I was the only one who actually changed departments: officially, my position is now split, and I am 51% technical services – no longer with reference and instruction, for the first time in my career.

I’m excited about this change – everyone involved thought it would be best for the library and collections. Many of my new tech services colleagues started their careers in reference, so a focus on the patron is embedded in all of their approaches to processing, cataloging and collection management. But I also feel a little like I’ve given up a good fight. Why did I have to move to technical services? I know the answer is because that’s where a lot of e-resources work is still located. The model we had been trying, while I am convinced it is viable and know it worked at my previous job, wasn’t scalable for a large academic library with broadly distributed functions. Not yet. However, while my location has changed, it’s promising that my job description retains many of my public services functions. I will still work reference, teach, work on public web interfaces, and participate in usability efforts. These things may officially only be 49% of my job now, but I still want everything I do to be for users, 100%.

Underground Resource Sharing

One outcome of the Netflix discussion that took place in the library community is that there seems to be general agreement that adhering to licensing agreements is the right thing for academic librarians to do, for a number of good reasons. Not only is it a good way to avoid a potential lawsuit from Netflix or a movie studio, but it also sets the right example for students and faculty. How can we expect them to abide by fair use guidelines and licensing agreements if the campus librarians are openly flouting them? We need to take the moral high ground, even if Netflix represents a reasonably good solution to the DVD distribution challenge.

So I find it interesting that this blogger is complaining about not having access to JSTOR as an alumnus of some college or university. Dr. Koshary writes:

I didn’t think this would happen, now that I’m out of grad school, but I’m feeling a fresh surge of hatred for Dear Old University. I tried to log in to JSTOR to look up an article, and found that I no longer have access to JSTOR through my DOU affiliation.

I’m pretty sure Dr. Koshary knows that JSTOR is a restricted database, and that most libraries are prohibited from allowing alumni to gain access (unless they make some sort of arrangement, which likely isn’t cheap – and Dr. Koshary suspects his alma mater has such an agreement). At the end of his rant against his alma mater he asks:

I don’t suppose any of my readers has a better/cheaper idea for me to regain access to JSTOR?

Turns out they do, and most of those offering advice don’t seem too concerned about taking the moral high ground – or even abiding by their university or library’s guidelines for sharing accounts:

Do what everybody I know who’s been in your position has done: get a friend who has access to a research library and its databases to share their log-in and password with you. I know I’ve helped a few people out in this way, and I’ve done it with a spring in my step and a song in my heart. Sure, it’s technically “wrong” but I’d argue that it’s more wrong to charge underemployed people money for access to scholarly resources.

I just ran into this, where my new school has some journal accesses but not many, and I crowdsourced it on facebook — some current Gradschooland students offered me their proxy server login, and another was already in the library and emailed me the pdf.

Everyone does it. Hell, I’ll give you MY login if you want

Virtually everyone I know who’s not employed by a top-tier R1 has a bootlegged EEBO account: through friends who are still grad students, advisors, or friends with cushier jobs.

Makes you wonder why we even bother with licensing agreements in the first place. As long as you can get it for free somewhere else, that’s all that matters. Just how rampant is this practice? I wish I had a way to do an anonymous poll of faculty, grad students and alums to see how many think it’s all right to provide or borrow an account that gives someone free access to restricted resources. Based on this post, probably a lot more than we think. So much for setting good examples.

Sudden Thoughts And Second Thoughts

A Beloit Mindset Moment

As part of our library event for incoming freshmen we organized a scavenger hunt. They are pretty popular right now, and putting one together takes some thought and effort. But we got the participants around the entire library, had them visit a few service areas and try our text-a-librarian and cell phone tour services, and overall it went pretty well. The students seemed to enjoy it, and we offered a few nice gifts. But clearly we aren’t able to completely put ourselves into the mindset of the college freshman, and as a result one student thought we had an unfair question. It seems we asked the students to record the name of a movie for which we have a poster hanging in our media services area. To find the right poster the students were told to look for Humphrey Bogart. According to this student, she had never heard of him – so how could she know who to look for? (This overlooks the fact that his name is on the poster in 12-inch letters.) My colleagues and I were a bit taken aback by that – could you be 18 and not know Bogie? Then again, when the class of 2014 was born in 1992, he had already been dead for 35 years. Next time, we’ll just go with the poster for the Creature From the Black Lagoon. Every college student knows that guy, right?

Here’s An Idea for an Experiment – No Academic Library for Two Years

I read an anecdote shared by a librarian from a brand-name, elite four-year college about a faculty member who said something along the lines of “Our students graduate and become incredibly successful. They haven’t had much research instruction, and they aren’t particularly good at conducting research, but they are successful. So if that’s the end outcome, why bother with the research instruction?” How do you respond to a comment like that? I’m not sure, but what concerns me is that the librarians will buy into that line of thinking and just give up on instruction altogether. Why bother if the students end up at Wall Street brokerage firms with six-figure incomes? Is that how we measure success? [Quite possibly the faculty member simply means “success” at whatever the students aspire to.]

The next logical step from that line of thinking is why bother having a library at all? Just close the library and cancel all the subscriptions. Allow faculty to use the library budget to get personal subscriptions to the journals they want. Use library funds to buy every student an e-book reader with a quota of a few thousand dollars for whatever books and paywall content they want. If after two years of no library or librarians the results show that students still graduate and still become incredibly successful, that tells us that the library never made a difference in the first place – other than giving faculty and administrators something to gush about as the “heart of the institution” – and a good stop on the campus tour. I wonder if it makes a difference that a faculty observation like this one comes from an elite, brand-name institution where the students arrive with many lifestyle advantages that will contribute to their post-college success. What about the institutions, like Chicago State University, where student failure is the norm? I wonder what faculty there have to say about the need for research instruction. Do they have time to think about it at all?

What’s the Biggest Mistake You’ve Made As a Leader?

It’s a long road and hard work becoming an effective leader, whether you are responsible for the vision and direction of a library, for a single unit or program within the library that needs leadership to survive, or for leading your colleagues in an association effort. Along the way you’ll likely make some mistakes. Hopefully one of them won’t be the “big mistake” that shatters your leadership potential. Best of all, whether you are new on the leadership path or have been traveling it a long time, you can avoid the big mistake by studying the lessons learned by other leaders.

A good opportunity for that type of learning comes from Harvard Business Review’s video piece on “the biggest mistake a leader can make,” which features a mix of academics and executives each sharing what he or she thinks is the biggest mistake. Here’s a quick list of what I gleaned from each expert – but watch the video (it’s just over 7 minutes); there’s more good advice to be had there:

* Putting self-interest before the interests of the organization – leadership is about responsibility for the staff and stakeholders, and putting yourself ahead of them is a fatal error.

* Betraying trust – if you fail here nothing else matters.

* Being certain – once you think you know how it all works there is reluctance to change; great leaders understand the power of uncertainty.

* Not living up to values – if you espouse values and fail to live up to them you will rapidly be found out by followers.

* Being overly enamored with vision – becoming single-minded and obsessed with a vision makes a leader blind to other opportunities and possibilities.

* Personal arrogance or hubris – a leader who confuses the success of the organization with his or her individual persona is primed to make huge mistakes.

* Acting too fast – leaders need to step back and think before they act, and seek out advice from subordinates; re-think the vision/plan and then act.

* Failure to be consistent – followers need to know their leaders are authentic and predictable; if you are pleasant one day and a monster the next it destroys trust.

* Lack of self-reflection – leaders need to constantly review their own behavior and honestly contemplate the effect they have on others; good leaders are self-aware, learn from their mistakes and improve.

In my leadership positions I’ve made any number of these mistakes at one time or another; you can only hope to learn from a bad experience. But I’ve worked very hard never to betray trust, and I think that would be the ultimate leadership mistake. What about you? What is the biggest mistake you’ve made as a leader, what big mistake have you seen a leader make, or which one on this list is the worst sin for you?

Some Writing Advice Worth Your Attention

I hope your regular reading regimen includes the Chronicle. If I had to guess, I’d say it’s the most-read non-library publication for the typical academic librarian. I’m also guessing many academic librarians will only read a Chronicle article or essay if someone else tells them they should. As an academic librarian blogger I try to avoid leaning too heavily on the Chronicle or Inside Higher Ed as a source. It would be all too easy to do that – and then I’d just end up writing about whatever other librarians are already reading and discussing anyway – not too challenging or exciting.

But this essay on improving your writing gave some good advice, and as an academic librarian blogger one thing in particular resonated with me. Number nine on the list of ten reads: your most profound thoughts are often wrong. That’s a profound thought right there. I often find myself second-guessing many of my blog posts because I question whether I’m making sense or effectively communicating my message. Then again, I’ll go ahead and post them anyway, thinking I’ve come up with something profound, only to realize it wasn’t all that great and didn’t make anyone think twice. There’s been talk of the death of blogging for years now. But blogs persist, even though many librarians show a preference for sharing their thoughts – as much as that is possible – with a Facebook or Twitter update. Perhaps in those mediums, since what’s written quickly passes on and fades, there’s not much need to think about whether what’s being written is profound or possibly wrong. With 140 characters, it may not matter much. An exception: when a simple tweet sets off a strong reaction from a blogger. So even though there’s a good chance my profound thoughts are wrong, I’ll likely continue to share some of them with you. One of the best things about blogging at ACRLog is receiving comments that help me re-think what I thought was profound and become clearer about my thinking and writing. Not an easy task.

On Being Valuable: Point-Counterpoint

The POINT: Amy Fry

On Tuesday, September 14, ACRL released Value of Academic Libraries: A Comprehensive Research Review and Report by Dr. Megan Oakleaf. The report lays out the current landscape of academic library assessment and seeks to provide strategies for libraries to demonstrate and quantify their value within the context of institutional missions and goals.

Oakleaf states that internal measures of value, such as use statistics, user satisfaction, and service quality, while interesting to librarians, are less compelling for external stakeholders such as administrators and trustees (11). Instead, she suggests determining externally-focused measures of value such as “library impact” (best measured by observing what users are doing and producing as a result of using the library) and “competing alternatives” (which focuses on defining what users want and how libraries, rather than our competitors, can help them achieve it) (21-22). She suggests ten key areas libraries should try to address in such assessment: enrollment, retention/graduation, student success, student achievement, student learning, student engagement, faculty research productivity, faculty grants, faculty teaching, and institutional reputation (17). Oakleaf also offers strategies for approaching assessment related to each area.

Oakleaf claims that “use-based definitions of value are not compelling to many institutional decision makers and external stakeholders. Furthermore, use is not meaningful, unless that use can be connected to institutional outcomes” (20). In a brief section about e-resources, she explains that usage counts don’t show why a library resource was used or the user’s satisfaction with it (50); she therefore suggests that, rather than collecting and reporting usage data for electronic resources, libraries try to collect qualitative data, like the purpose of the use (using the ARL MINES protocol). She also suggests examining successful grant applications to “examine the degree to which citations impact whether or not faculty are awarded grants.”

The question of how to use e-resources statistics to draw qualitative conclusions about users’ information literacy levels and the effectiveness of electronic collections (or even about the library’s impact on faculty research or student recruitment and retention) is of special interest to me now, as I have just agreed to examine (and hopefully overhaul) my institution’s management of e-resources statistics. However, such questions are overshadowed for me (and for most libraries) by how to effectively gather, merge, and analyze the statistics themselves, what to do with resources that don’t offer statistics at all or don’t offer them in COUNTER format, and when and how to communicate them internally for collection decisions. It is difficult to imagine arriving at higher-level methods for library assessment that involve overlaying complex demographic data, research output data, cost data, collection data, and use data in order to tell compelling stories about library use and impact when even the most basic systems for managing inputs and outputs have not been implemented.
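To give a sense of what even that basic “gather and merge” step involves, here is a minimal sketch of just the merging stage, assuming each vendor report has already been normalized into a plain CSV with a Title column and one column per month of full-text requests. The folder name, file layout, and column names here are all hypothetical – real COUNTER exports differ from vendor to vendor, and normalizing them to a common shape is usually the hard part this sketch glosses over.

```python
# Minimal sketch: combine normalized per-vendor usage exports into one table.
# Assumes each CSV has a "Title" column plus monthly count columns (hypothetical layout).
import csv
from collections import defaultdict
from pathlib import Path


def load_report(path):
    """Read one normalized CSV report into {title: {month: count}}."""
    usage = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            title = row.pop("Title").strip()
            usage[title] = {month: int((count or "0").replace(",", ""))
                            for month, count in row.items()}
    return usage


def merge_reports(folder):
    """Sum counts per title and month across every report in the folder."""
    totals = defaultdict(lambda: defaultdict(int))
    for path in sorted(Path(folder).glob("*.csv")):
        for title, months in load_report(path).items():
            for month, count in months.items():
                totals[title][month] += count
    return totals


if __name__ == "__main__":
    merged = merge_reports("usage_reports")  # hypothetical folder of normalized exports
    for title, months in sorted(merged.items()):
        print(f"{title}\t{sum(months.values())}")
```

Even a modest script like this presumes clean, consistent inputs – which is exactly what resources with no statistics, or non-COUNTER statistics, fail to provide.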

I understand and even agree with Oakleaf’s characterization of the shortcomings of “use-based definitions of value,” but I am not sure that surveying users about the purpose of their information use or linking library collections to successful grant applications truly gives a more compelling, or more complete, picture of the value of electronic resources collections. For example, assessing value by linking library collections to grants funded or patents produced seems like it would discount libraries’ value to humanities research, because humanities scholarship will never approach the sciences in the amount of grant dollars it brings in.

It is true that libraries currently “do not track data that would provide evidence that students who engage in more library instruction are more likely to graduate on time, that faculty who use library services are more likely to be tenured, or that student affairs professionals that integrate library services into their work activities are more likely to be promoted” (13). But those claims just seem like no-brainers to me. If we spend a lot of time and energy collecting the data and putting it together to get the numbers that will allow us to make these claims – then what? What’s the payoff for the library? Administrators who don’t think libraries are just black holes for funding? A way to prove to students that they should use the library? If administrators and trustees are not inclined to fund libraries because their backgrounds did not include library use, or students are not inclined to use libraries because they are focused on graduation and employment instead of research, I don’t know that any such externally focused assessment will result in what seems to be, ultimately, the desired outcome – a reassertion of libraries’ relevance to our core constituents. It will, however, be a drain on library staff time and expertise – time and expertise that could be spent on core activities, like collection building, collection access, and public service.

Oakleaf concludes that our focus should be not to prove but to increase value (141). We should not ask, “Are libraries valuable?” but “How valuable are libraries?” she says. What about “How are libraries valuable?” But this is semantics. No matter what our approach to assessment, I’m afraid the answer will still depend less on what data we present than on who we ask.

The COUNTERPOINT: Steven Bell

What’s the payoff for the library? That’s an important question when it comes to assessment and efforts to demonstrate the academic library’s value to its own institution and higher education. Amy Fry makes a good point that we could invest considerable time and energy to collect and analyze the data needed to determine our value in any or all of the ten key areas recommended in the ACRL Value Report – but why bother? She states that when it comes to questioning whether library instruction sessions can be connected to better grades or students graduating on time, that’s “no-brainer” territory.

But can we in fact assume that just because a student attends an instruction session – or because faculty have access to research databases – they are indeed achieving institutional outcomes? If, as a profession, we thought that was no-brainer territory, why are there hundreds of research articles in our literature that attempt to prove that students who sit through library instruction sessions are better off than the ones who don’t? We clearly aren’t just assuming they are; we want to prove it – and in doing so prove why we make a difference to our students’ education and learning process.

As Barbara Fister points out in her response to the Report, provosts already acknowledge, anecdotally, that they value their libraries and librarians. And we also know that the library is the heart of the institution, and that libraries are like Mom and apple pie; everyone likes the library. You probably couldn’t find an academic administrator who would go on record trashing the academic library (well, maybe this one). But none of that may stop administrators, when push comes to shove, from taking drastic measures with library services to resolve a budget crisis. Being the heart of the institution didn’t stop Arizona’s Coconino Community College from performing radical heart surgery by outsourcing the library operations to Northern Arizona University’s (NAU) Cline Library. Admittedly, that’s a rare occurrence, and I can’t say for sure that even the best set of library value data could prevent it from happening. Yet one can’t help but imagine that if Coconino’s librarians had had some rock-solid assessment data on hand to confirm their value to administrators – be it how the library improves student retention or helps students achieve higher GPAs – they might still have their jobs and be delivering services to their students at their own library (which was largely chopped up and pieced out to other academic units).

And better assessment and demonstration of library value can indeed result in a financial payoff for the institution if it is awarded government grants and the indirect costs associated with conducting research. Those indirect costs, typically a percentage rate negotiated between the institution and the federal government, can make a huge difference in institutional funding for research. Given the size of some grants, just a slight increase – perhaps a percentage point or two – can make a real impact over time. Amy mentions the ARL MINES protocol, which is a process for making a concrete connection between researchers working on grant projects and their use of library resources to conduct that research. Often the contribution of the library is drastically understated, and therefore it is barely reflected in the calculation of the ICR (indirect cost recovery). My own institution is currently conducting a survey similar to MINES so that our “bean counters” (as Barbara likes to refer to them) can more accurately connect the expenditures for library electronic resources to research productivity – and the government’s own bean counters have very rigid rules for calculating increases to the ICR. It can’t be based on anecdotal evidence or simply having researchers state that they use library resources for their research. In this case, asking the users if we provide value doesn’t mean squat. Providing convincing evidence might mean an increase to our ICR of one or two percent – which over time could add up to significant amounts of funding to support research. That is a real payoff, but make no mistake that we have invested considerable time and expense in setting up the survey process.
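To give a feel for why even a point or two matters, here is a minimal sketch of the arithmetic using entirely hypothetical numbers; actual negotiated rates, the cost base the rate applies to, and grant volumes vary widely by institution.

```python
# Entirely hypothetical numbers: how a small change in the negotiated indirect
# cost rate compounds across a research portfolio. Ignores the details of which
# direct costs the rate actually applies to.
annual_direct_costs = 50_000_000   # grant direct costs per year (hypothetical)
current_rate = 0.50                # current negotiated indirect cost rate (hypothetical)
improved_rate = 0.52               # the same rate, two percentage points higher

gain_per_year = annual_direct_costs * (improved_rate - current_rate)
print(f"Additional indirect cost recovery per year: ${gain_per_year:,.0f}")
print(f"Over five years: ${gain_per_year * 5:,.0f}")
```

On those made-up figures, two percentage points is an extra million dollars a year – the kind of difference that can justify the effort of a MINES-style survey.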

For many academic librarians, it may be better to, as Amy suggests, focus on the core activities such as collection building and traditional services (reference, instruction, etc.) – and to keep improving on or expanding in those areas. But I like to think that what drives real advancement in our academic libraries is confronting new mysteries that will force us to seek out new answers that could lead to improvements in fundamental library operations. What happens when we fail to seek out new mysteries to explore is that we simply continue to exploit the same existing answers over and over again until we drive them and ourselves into obsolescence (for more on “knowledge funnel” theory read here).

Lately I’ve been advocating that the new mystery for academic librarians should focus on assessment. We need to get much better at answering a simple question that represents this mystery: How can we tell that we are making a difference – and how will we gather the data to quantitatively prove it? From this perspective the question would be neither “How valuable are academic libraries?” nor “How are libraries valuable?” but “How are academic libraries making a real difference and how do we prove it?” Perhaps it remains a case of semantics, but any way we approach this new mystery, the road should lead to a better grasp of the value we provide and new ways to communicate it to our communities. Whatever you may think about assessment and the value question, take some time to review the ACRL Value Study. I’ll be at the Library Assessment Conference in DC at the end of October. I’m looking forward to learning more about how academic librarians are approaching the new mystery of assessment, and how we can all do a better job of quantifying and communicating our value proposition.

A Personal Touch

Earlier this week the Chron reported on the new Personal Librarian Program at Drexel University. Every incoming freshman this year has been assigned an individual librarian, and students are encouraged to contact their personal librarians throughout the semester whenever they have questions about doing research or using the library. While Drexel’s is not the only academic library offering this service, the publicity around the Drexel program has inspired lots of conversation this week among librarians I know, both in person and online via Twitter, Facebook, and blogs.

It definitely seems like there has been a rise in individual services to students at academic libraries over the past few years. Some libraries are experimenting with librarian office hours; sometimes they’re held in the library, and sometimes a subject librarian will offer consultations in an office in each discipline’s department. Many libraries promote individual consultations by appointment with reference librarians for students and faculty. We started offering this service at my library last semester and it’s working well. It’s been great to be able to offer more in-depth assistance to students without feeling the pressure of the busy reference desk.

As an instruction librarian I’m used to interacting with students in a class, but working with many students at once is very different from a one-on-one interaction with a student. Maybe it’s just in the air, but more and more often I find myself thinking about ways to work with individual students. I think these services are so attractive to me because it seems like they would encourage stronger student engagement with research and critical thinking. No matter how relevant (e.g., assignment-based), timely, interactive, or entertaining a classroom instruction session is, it can be difficult to fully engage every student in the room. But working with students one-on-one removes some of the obstacles–like fear of asking questions in front of the entire class–and lets us work at each individual student’s level of experience and need.

I have to admit that the numbers are a bit scary. The ratio of Personal Librarians to incoming freshmen at Drexel is about 1:100. How can academic libraries at colleges with a different ratio–say, 1:500 or even 1:1,000–offer these kinds of individual services? One thought is to start small, with students in a specific discipline or major, and I’m sure there are other groups of students that would work well for a personal librarian project pilot. And assessment should help us evaluate the impact of individual services as compared to group instruction, and help us decide whether to offer a personal librarian program. (Assessment is on my mind this week as I’ve been making my way through the new ACRL Value of Academic Libraries Report, but that’s a post for another day.)

If you’re experimenting with individual services in your library, what have your experiences been?