Making the space: Researching beyond IRDL

I’ve spent the last week at the Institute for Research Design in Librarianship (IRDL). Most of the workshop has happened in the beautiful William H. Hannon Library on Loyola Marymount’s campus. Last month on the blog I talked about my preparation for this week-long research workshop. The week has been a whirlwind and it’s hard to believe we’re finishing up today (Saturday). I have learned a lot — about the research process, the projects my cohort members are working on, and about librarianship at a variety of institutions. I feel energized and excited about conducting strong LIS research. My research project has changed and evolved, and I’m headed back to Penn State with a stronger version of what I submitted back in January.

Throughout the week, I’ve been thinking about how I’ve been intentional about creating space for this learning and research. When I was preparing for IRDL, my research mentor mentioned in an email that I should set aside my work for the week in LA. I took their words to heart; I put on my out-of-office message, alerted my co-workers that I wouldn’t be responding, and haven’t replied to anything. I put my work in Pennsylvania on hold and that allowed me to concentrate on the material being covered. I had the chance to develop my project, connect with my peers, and apply what I was learning.

And everything was okay.

My colleagues respected my time away and I had the opportunity to immerse myself in this work. This time pushed me to get my wheels turning, read more of the student engagement and involvement literature, and craft a journey map template for student engagement opportunities. During our workshop days, I got to spend time with my peers and work through the research process together. We spent an hour crafting 10 survey questions and an afternoon deciding on a set of questions for a focus group. What I learned was that in order to get the data you need, you have to be willing to devote uninterrupted time to finding ways to ask good questions. A good survey doesn’t just happen; it requires thoughtful decisions, defined variables, and a pilot test. This stuff cannot be rushed.

So yes, it was great that I had this time to think, process, and experiment. This time was exactly what I needed. But I know that once I’m back in Pennsylvania, all those other priorities will return. IRDL has been good for lots of things, including forcing me to consider how I should spend my time when I come home.

The question I keep returning to is: how do you create this meaningful space for research work? How can I replicate the work environment of this week? Can I find ways to be just as intentional about setting aside other work for this research when I’m back in Pennsylvania? I have never been good about blocking time and asking for that time to stay uninterrupted. In order for me to do this project, and to do it well, I’ll need to start defining those boundaries more clearly. It’s a habit to be developed.

But it’s not something that I have to do on my own. Community has always been an important piece of my librarianship, and with research, that kind of support matters just as much. We built LibParlor to create community and now, after a week in Los Angeles, I have a new community to lean on. We tell the students we teach that research isn’t a solo process, and that’s a good reminder for us too. Throughout IRDL, I have seen the strength of collaborating with others on surveys, interview questions, and inferential statistics. It’s better to tackle that stuff with someone else, and I’m thankful my research network continues to grow. And I know they will help hold me accountable for the time I need for this project.

While I’m still figuring this out, I’m sure others have some ideas. So, how have you created this space? How have you balanced the day-to-day of your job with the time to research? How do you depend on and support your research community?


Featured image of the William H. Hannon Library, taken by the author of this post.

Preparing for my IRDL experience

This past weekend, I spent a large amount of time at my dining room table, reading Collecting Qualitative Data: A Field Manual for Applied Research by Greg Guest, Emily E. Namey, and Marilyn L. Mitchell. And I was enjoying it.

Now granted, this wasn’t a book I just happened to pick up as a fun weekend read. This book, along with a few others, is part of the curriculum for the Institute for Research Design in Librarianship (IRDL). I’m a proud member of their sixth cohort. IRDL is an IMLS-funded institute that aims to bring together an enthusiastic and motivated bunch of librarians who want to conduct research but need a little extra training and help. Early this year, I submitted an application, where I proposed my research project, included a one-page cover letter about what I hoped to gain from this experience, and provided a letter from my institution confirming that they would support me if I was accepted into the cohort. Once I received the good news, I booked a flight to Loyola Marymount University for early June, where a weeklong in-person workshop will take place. The workshop is the jumping-off point for our projects; beyond that week, we will meet virtually throughout the next year and talk to our assigned mentor, who is there to make sure our project stays on track. All of these support mechanisms are there to ensure we get our projects completed and to help each other along the way. Now that my spring semester is over, it’s finally sinking in that I’m less than a month away from our in-person workshop. My pre-workshop preparation has stepped up!

Beyond the training and getting to know a cohort of enthusiastic librarians who want to conduct research, I am excited to spend the next year on a meaningful and complex research project. Those who know my position know that I have spent almost two years building relationships, getting library colleagues to define student engagement in a similar way, and understanding how students at Penn State navigate student engagement. I think I’m finally at a point where I’m ready to learn more while also thinking about ways to make an impact and influence future directions. That’s where my IRDL project comes in.

The quick sound bite of my project: I’ll be using journey mapping techniques to have students at Penn State chart their student engagement journeys. What I want to know is how our students actually experience student engagement during their undergraduate careers and which people, units, resources, and opportunities they discover along the way. Of course, through all of this, I’m also curious about how the library has or has not played a role in their student engagement journey. Ultimately, I want to get a more nuanced picture of what our students experience and begin to identify common points where the library could get more involved. While I understand that each student engagement journey will be unique to the student, I assume there will be some trends that emerge from these maps that can inform the work of both the libraries and Penn State’s Student Engagement Network. At times, the project seems daunting, but the more I read in preparation for IRDL, the more I begin to feel ready to take on this project. I know I’ll be learning a lot as I go, and I’ll also get more opportunities to meet students at Penn State, which I am all about.

Well, my qualitative research book is calling to me, so I’ve got to get back to reading! But I will definitely be documenting my time at IRDL and my student engagement journey mapping research in a variety of online spaces: on this blog, on my personal blog, on Twitter, and on LibParlor. So, more soon!


Featured image by rawpixel.com from Pexels

Something Is Better Than Nothing

As you read and learn more about design, a basic principle appears again and again: design for simplicity. In fact, one hallmark of great design is that it makes the complex simple. That said, as Garr Reynolds put it in a recent presentation, simplicity should not be confused with simplistic. Simplistic is about dumbing things down because it is easier for us. Simplicity is about creating clarity where there previously was confusion. The latter best serves the end user.

I got to thinking about this after attending a recent webcast presentation sponsored by Library Journal and Serials Solutions. The point of the webcast, Returning the Researcher to the Library, was to share ideas about how librarians could create a better return on their investment in electronic resources. With all the money we spend on electronic resources, who doesn’t want to create greater awareness about their availability and gather evidence that documents how students and faculty use the library’s e-resources for their research? The presenters shared some good ideas and research findings. One of the speakers shared her library’s experience with a recently implemented catalog overlay – you’d know it from its graphic/visual search functionality. After examining search logs, the presenter pointed out that searches getting zero results in the old catalog did get results in the new catalog. What was the difference? The simplicity of the new overlay.

A good question was asked. Was there any analysis of the results from the searches in the new catalog? In other words, there were results, but were they relevant? Other than one example involving a search that looked more like something a librarian rather than an end user concocted, the answer was no – there was no analysis of the results. All we really know is that the new, simpler interface provided some results whereas the old, complicated interface provided no results. That led to the conclusion that from the user’s perspective “it’s better to find something than nothing”. Do you agree with that? Isn’t it possible that the something you’ll find is so irrelevant or worthless that it may be worse than finding nothing? Or the something found may be only one minuscule sample from a much greater body of information that will be completely ignored. “Oh great. I found something. Now I’m done with my research”. What you miss can often be much more significant than what you find. The logs only show there were zero-result searches in the old catalog. They tell you nothing about whether or not the searcher tried again or went and asked for help. In some cases finding nothing may lead the searcher to re-think the search and achieve improved results. Maybe you think I’m guilty of wishful thinking here.
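To make the kind of follow-up analysis I’m wishing for a bit more concrete, here is a minimal sketch in Python of how one might compare zero-result rates between the two catalogs. The log records and field names are entirely made up for illustration, not taken from the webcast, and as the closing comment notes, counting hits still says nothing about whether those hits are relevant.

```python
# Hypothetical sketch: compare zero-result rates between an old catalog and
# a new discovery overlay. All log records and field names are invented.

old_catalog_log = [
    {"query": "global warming polar bears", "results": 0},
    {"query": "shakespeare sonnets criticism", "results": 12},
    {"query": "nursing evidence based practice", "results": 0},
]

new_catalog_log = [
    {"query": "global warming polar bears", "results": 148},
    {"query": "shakespeare sonnets criticism", "results": 932},
    {"query": "nursing evidence based practice", "results": 57},
]

def zero_result_rate(log):
    """Share of logged searches that returned nothing at all."""
    if not log:
        return 0.0
    return sum(1 for entry in log if entry["results"] == 0) / len(log)

print(f"Old catalog zero-result rate: {zero_result_rate(old_catalog_log):.0%}")
print(f"New catalog zero-result rate: {zero_result_rate(new_catalog_log):.0%}")

# The step missing from the webcast: hit counts alone cannot tell us whether
# any of those "somethings" are relevant. Answering that would mean sampling
# queries and having a person judge the top results they return.
```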

I suppose what mostly had me puzzled was the suggestion that simple search interfaces, rather than instruction for research skill building, are the ultimate solution to better searching and research results. It’s true that at large research institutions it will be difficult to reach every student with instruction, and there are some strategies to tackle that problem. But here’s my issue with the assumption that simple search interfaces are the solution. I don’t care how simple the interface is: if a student lacks the ability to think critically about the search problem and construct a respectable search query, it doesn’t matter what sort of simple overlay you offer; the results are still likely to be poor. Garbage in is still garbage out. That’s why library instruction still has considerable potential for improving student research in the long run.

That said, I find it difficult to argue against the potential value of catalog and database search systems that will find something that can at least get someone started in their research. These simplified systems also offer potential for resource discovery, and we certainly want students and faculty to become aware of what may now be hidden collections. Despite the shortcomings, we need to further explore these systems. At least one system I examined at ALA allows librarians to customize the relevancy ranking to continually fine-tune the quality of the search results. But let’s not proceed to dismantle library instruction just yet. We need to constantly remind ourselves that creating simplicity is not the same as making search systems simplistic. Research is an inherently complex task. Instruction can help learners to master and appreciate complexity. Then, on their own, they can achieve clarity when encountering complex research problems that require the use of complicated search systems. That, I think, is what we mean when we talk about lifelong learning.

There’s More To “Finding” Than We Thought

A Pew Internet & American Life Project study about search engine users indicated that the vast majority of them expressed satisfaction with their search skills. According to the study, 92% of those who use search engines say they are confident about their searching and 87% of searchers say they have successful search experiences most of the time, including some 17% of users who say they always find the information for which they are looking. Now if most Americans are using Google to find the latest information on Paris Hilton or the Academy Awards ceremony, I imagine they find what they need. But in the event they don’t immediately and easily find what they seek, some poor search behavior is likely to emerge.

In his Alertbox newsletter, Jakob Nielsen shared the results of research indicating that while search users have better skills now than they did five years ago, when their first efforts fail, most searchers are incredibly bad at finding, and that’s typically because they don’t know how to search. According to Nielsen, users face three problems:

* Inability to retarget queries to a different search strategy (i.e., revise the strategy)
* Inability to understand the search results and properly evaluate each destination site’s likely usefulness
* Inability to sort through the SERP’s polluted mass of poor results to determine whether a site actually addresses the user’s problem (SERP = Search Engine Results Page)

As academic librarians we assumed that end users only had trouble with our catalogs and library databases because they were oriented to librarian-style searching (which only appeals to librarians), and that making all library databases more like search engines in order to facilitate finding (which is what everyone else wants to do) would bring about a new golden age of end-user information retrieval. I see two significant flaws in that vision. First, end users clearly have a hard time finding information on ultra-findable Google if their first effort fails, and second, the solution to the first problem is better search skills – the type of skills that librarians use to find information. Nielsen refers to current end-user search behavior as Google Gullibility because:

many users are at the search engine’s mercy and mainly click the top links. Sadly, while these top links are often not what they really need, users don’t know how to do better.

And while finding processes can sometimes be simple, at other times they are, according to Louis Rosenfeld, quite circuitous, iterative, and surprising. In other words, finding involves a fair amount of searching. In fact, Rosenfeld’s finding formula is “browse + search + ask = find”. That’s why we need to develop search systems based on the knowledge that there “is more than meets the eye when it comes to the process of finding” and not simply on an assumption that finding is simple, intuitive, and completely different from searching. Searching is an integral part of finding. Searching involves decision making, and so does finding. Searching does assume more of a plan of attack, while finding suggests a more carefree and random approach. But as Rosenfeld points out, “most of the systems we design don’t really support finding.” I’ll take that to mean both web search engines and commercial library databases.

Finding, as Rosenfeld puts it, “is arguably at the center of all user experiences.” I agree. Everyone wants to find, both end users and librarians. But until systems better integrate browse, search, and ask functions, it’s highly unlikely that finding will be the simple, mindless task we imagine an end user’s version of search to be. Rosenfeld thinks the answer to better finding is web design based on analytics. Studying users’ behavior and understanding what they are trying to accomplish is a well-traveled path to creating better user experiences. The more we know about our users’ behavior when they search our systems, the better we can do at anticipating their needs and structuring search systems that facilitate their finding. This is especially true for our complex library websites where enabling finding is a challenge. As I’ve written previously, I think what we all want is to “create,” and both searching and finding are means to that end. I prefer “search first, find, and then create.”