Category Archives: Simplicity vs. Complexity

Use this category for items that relate to an ongoing discussion of finding a balance between giving users simplicity and expecting them to deal with complexity.

Must Scheduling be Sisyphean?

I was planning to post last week about something interesting I’d read in the library or higher ed news and literature, but I haven’t kept up with my reading as much as usual recently. The task that’s been occupying my time? Scheduling our English Comp library instruction sessions. It’s not the most glamorous or fun part of my job, but it’s one of the most important. Every semester the scheduling process seems to drag on and on, and I find myself thinking that there has to be a better way. But once the schedule is set my grumpiness fades away, conveniently forgotten until the beginning of the next semester. I always intend to spend time between semesters researching scheduling alternatives, but there’s usually a project that’s so much more interesting that it elbows scheduling out of the way.

We use Google Calendar to keep track of the library’s schedule (not just instruction, but reference, meetings, etc.), and I’m reasonably satisfied with it. It’s the process of scheduling classes and librarian instructors that I think could use some tweaking. In the past I’ve waited until a few days into the semester to get the final list of classes from the English Department (sometimes sections are added or canceled at the last minute, depending on enrollment). Then I’ve taken the class list and our calendar and slotted all of the sections into our library classroom schedule. And then I’ve tentatively assigned instruction librarians to the schedule, trying to make sure that no one is responsible for too many early morning, evening or weekend sessions. Once the instruction librarians have approved their schedules, each of us has contacted the English instructors for the library sessions we’re teaching. Occasionally there’s a bit of horse-trading when an English instructor requests a date change, but usually not too much.
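For the curious, here’s a rough sketch (in Python, with entirely made-up section names, slot labels, and librarians) of the balancing act I do by hand: give each session to whichever librarian currently carries the lightest comparable load, so that nobody ends up with all the early morning, evening, or weekend duty.

```python
from collections import defaultdict

# Hypothetical session list: (section, slot), where a slot is either
# "off-hours" (early morning, evening, weekend) or "regular".
sessions = [
    ("ENG101-01", "off-hours"), ("ENG101-02", "regular"),
    ("ENG101-03", "regular"),   ("ENG101-04", "off-hours"),
    ("ENG101-05", "regular"),
]
librarians = ["Librarian A", "Librarian B", "Librarian C"]

load = defaultdict(int)       # total sessions assigned to each librarian
off_hours = defaultdict(int)  # off-hours sessions assigned to each librarian
schedule = []

# Greedy pass: hand each session to whoever carries the lightest
# comparable load at that moment, so off-hours duty stays even.
for section, slot in sessions:
    if slot == "off-hours":
        pick = min(librarians, key=lambda lib: (off_hours[lib], load[lib]))
        off_hours[pick] += 1
    else:
        pick = min(librarians, key=lambda lib: (load[lib], off_hours[lib]))
    load[pick] += 1
    schedule.append((section, slot, pick))

for row in schedule:
    print(row)
```

Of course the real work is in the exceptions, the date changes and last-minute sections, which is exactly the part no greedy loop will solve.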

This semester we tried something a bit different and asked the English faculty when in the semester they’d like their library session to be scheduled, emphasizing that we’d like their students to come to the session with a research topic in hand that they can use to practice searching for library and internet resources. I got a preliminary list of classes from the English department and contacted faculty a few days before classes began, but there were still a handful that I wasn’t able to get in touch with until the second week of classes. About two-thirds of the instructors responded with their preferred dates, and I was able to give most of them their first choice (I’d asked for 3 possibilities). I put the remainder of classes in our schedule as before and contacted those instructors to let them know. We also decided we’d try asking the instruction librarians to pick the classes they’d like to teach, so each of us chose our sections once the schedule was set.

I do think that scheduling went a bit more smoothly this semester, but it’s hard to know exactly why. We have significantly fewer sections of English Comp this spring than we had in the fall (64 rather than 126), which definitely impacts scheduling. But in some ways I feel like the amount of time spent scheduling hasn’t changed; it’s just been spread out more evenly: I’m fielding emails from faculty and putting sessions into the calendar in dribs and drabs over the first two weeks of the semester rather than in a couple of big, multi-hour scheduling binges. We’ll see if this method can hold up in the fall.

How does your library schedule instruction sessions? Are there any tips or tricks for streamlining the process that you can share?

Something Is Better Than Nothing

As you read and learn more about design, a basic principle appears again and again: design for simplicity. In fact, one hallmark of great design is that it makes the complex simple. That said, as Garr Reynolds put it in a recent presentation, simplicity should not be confused with simplistic. Simplistic is about dumbing things down because it is easier for us. Simplicity is about creating clarity where there previously was confusion. The latter best serves the end user.

I got to thinking about this after attending a recent webcast presentation sponsored by Library Journal and Serials Solutions. The point of the webcast, Returning the Researcher to the Library, was to share ideas about how librarians could create a better return on their investment in electronic resources. With all the money we spend on electronic resources, who doesn’t want to create greater awareness about their availability and gather evidence that documents how students and faculty use the library’s e-resources for their research? The presenters shared some good ideas and research findings. One of the speakers shared her library’s experience with a recently implemented catalog overlay – you’d know it from its graphic/visual search functionality. After examining search logs, the presenter pointed out that searches getting zero results in the old catalog did get results in the new catalog. What was the difference? The simplicity of the new overlay.
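As an aside, if your system will export its search logs, the zero-result tally the presenter described is a small scripting job. Here’s a minimal sketch; the file name and the column names (query, result_count) are hypothetical stand-ins for whatever your catalog actually exports.

```python
import csv
from collections import Counter

zero_hits = Counter()
total = 0

# Assumes a CSV export with one row per search and columns
# "query" and "result_count"; both names are hypothetical.
with open("search_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        total += 1
        if int(row["result_count"]) == 0:
            zero_hits[row["query"].strip().lower()] += 1

print(f"{sum(zero_hits.values())} of {total} searches returned nothing")
for query, n in zero_hits.most_common(10):
    print(f"{n:5d}  {query}")
```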

A good question was asked: was there any analysis of the results from the searches in the new catalog? In other words, there were results, but were they relevant? Other than one example involving a search that looked more like something a librarian, rather than an end user, would concoct, the answer was no – there was no analysis of the results. All we really know is that the new, simpler interface provided some results whereas the old, complicated interface provided no results. That led to the conclusion that from the user’s perspective “it’s better to find something than nothing”. Do you agree with that? Isn’t it possible that the something you’ll find is so irrelevant or worthless that it may be worse than finding nothing? Or the something found may be only one minuscule sample from a much greater body of information that will be completely ignored. “Oh great. I found something. Now I’m done with my research”. What you miss can often be much more significant than what you find. The logs only show that there were zero-result searches in the old catalog. They tell you nothing about whether or not the searcher tried again or went and asked for help. In some cases finding nothing may lead the searcher to re-think the search and achieve improved results. Maybe you think I’m guilty of wishful thinking here.
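To make the missing step concrete: the analysis the questioner was after could be as modest as having someone judge the top hits for a sample of those recovered searches and reporting precision at k. A toy sketch, with wholly invented relevance judgments:

```python
def precision_at_k(judgments: list[bool], k: int = 10) -> float:
    """Fraction of the first k results a human judged relevant."""
    top = judgments[:k]
    return sum(top) / len(top) if top else 0.0

# Hypothetical judgments for one recovered query: a judge marked
# which of the first ten hits actually fit the searcher's topic.
sample = [True, False, False, True, False, False, False, False, False, False]
print(precision_at_k(sample))  # 0.2 -- "something," but mostly noise
```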

I suppose what mostly had me puzzled was the suggestion that simple search interfaces, rather than instruction for research skill building, are the ultimate solution to better searching and research results. It’s true that at large research institutions it will be difficult to reach every student with instruction, and there are some strategies to tackle that problem. But here’s my issue with the assumption that simple search interfaces are the solution. I don’t care how simple the interface is: if a student lacks the ability to think critically about the search problem and construct a respectable search query, it doesn’t matter what sort of simple overlay you offer; the results are still likely to be poor. Garbage in is still garbage out. That’s why library instruction still has considerable potential for improving student research in the long run.

That said, I find it difficult to argue against the potential value of catalog and database search systems that will find something that can at least get someone started in their research. These simplified systems also offer potential for resource discovery, and we certainly want students and faculty to become aware of what may now be hidden collections. Despite the shortcomings, we need to explore these systems further. At least one system I examined at ALA allows librarians to customize the relevancy ranking to continually fine-tune the quality of the search results. But let’s not proceed to dismantle library instruction just yet. We need to constantly remind ourselves that creating simplicity is not the same as making search systems simplistic. Research is an inherently complex task. Instruction can help learners to master and appreciate complexity. Then, on their own, they can achieve clarity when encountering complex research problems that require the use of complicated search systems. That, I think, is what we mean when we talk about lifelong learning.
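Back to that customizable relevancy ranking for a moment. I don’t know how any particular vendor implements it, but the general idea can be sketched as a weighted score over match fields, with the weights kept somewhere staff can adjust as they review real searches. The field names and weights below are purely illustrative, not any vendor’s actual configuration:

```python
# Illustrative, librarian-adjustable weights: a title match counts
# three times as much as an abstract match.
WEIGHTS = {"title": 3.0, "subject": 2.0, "abstract": 1.0}

def score(record: dict, terms: list[str]) -> float:
    """Weighted count of term occurrences across a record's fields."""
    total = 0.0
    for field, weight in WEIGHTS.items():
        text = record.get(field, "").lower()
        total += weight * sum(text.count(t.lower()) for t in terms)
    return total

record = {
    "title": "Pluto and the Kuiper Belt",
    "subject": "planetary science",
    "abstract": "Trans-Neptunian objects and the status of Pluto.",
}
print(score(record, ["pluto"]))  # 4.0: the title hit dominates
```

Raising or lowering a weight and re-running the same sample searches is the fine-tuning loop: review results, adjust, review again.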

Feeling Lost In A World Of Search Zombies

Maybe I’m getting more removed from mainstream search. I know that some aspects of online searching can be complex, and depending on the uniqueness of some disciplinary databases (think about using financial screening tools in NetAdvantage or ValueLine Research Center) search can reach the extremes of complexity. But I would never have thought to associate the word “complex” with three basic search functions: formulating a search question, evaluating the results, and revising the search strategy. True, these basic skills are hardly intuitive for college students, but they certainly seem within students’ ability to learn – and I know that many have. So I was surprised to read this in a recent Jakob Nielsen column:

How difficult is it to perform a search on Google? I’m not talking about the challenge of formulating a good query, interpreting the results, or revising your search strategy to reap better results. Those are all very complicated research skills, and few people excel at them.

Complicated research skills? If you take away those basic skills, what is left to a search? Have we created a generation of search zombies who listlessly tap away at the keyboard with no strategy at all, just hoping they’ll find some information, and then mindlessly settle for whatever their first Google page yields? On the positive side, this suggests to me that librarians are among the few professionals who do excel at these tasks. While it’s great to know we have an increasingly rare skill, I’d feel much better if, as a profession, we were making greater progress in helping more people to develop these basic search skills, or getting more recognition for what we can do.

This leaves me with two thoughts. First, if excellence in navigating the complexity of search (and mind you that Nielsen isn’t talking about library databases – he’s just referring to search engines) is a rarefied skill, why the heck can’t we leverage our expertise to raise our profile in society? You would think that the ability to cut through the web wasteland would be a prized skill that people would seek out. Second, if everyone other than librarians lacks these skills, then the state of searching and the public’s research ability must be far worse than we might have imagined. Perhaps the “good enough” (or is it now “barely good enough”) mentality has finally turned the masses into search zombies. What’s the cure for that?

Open and Closed Questions

Another way to introduce students to the idea of complexity in the research process is through open and closed questions. In Second-hand Knowledge: An Inquiry into Cognitive Authority, Patrick Wilson describes closed questions as matters which (for now) have been settled beyond practical doubt, and open questions as those on which doubt remains.

I suggest to my students that one way to focus their research is to pay attention to clues that suggest where the open questions are and to concentrate their efforts there. Wilson points out that previously closed questions can become open when new information comes to light. In class, you can illustrate this and attempt some humor with the line, “When I was your age, Pluto was a planet!” Then proceed to explain how the planetary status of Pluto became an open question with the discovery of the Trans-Neptunian objects Quaoar, Sedna, and Eris. Then follow this up with an example of an open question in the subject matter of the class you are teaching.

The term “research” is ambiguous. For some it means consulting some oracle – the Internet, the Library, the encyclopedia – finding out what some authority has said on a topic and then reporting on it. Fine, sometimes that’s what research is. That kind of research can be interesting, but it can also be pretty boring. What makes higher education thrilling is discovering live controversies and trying to make progress on them. Academic libraries are not only storehouses of established wisdom; they also reflect ongoing debates on questions that are unsettled, in dispute, very open, and very much alive.

Why Students Want Simplicity And Why It Fails Them When It Comes To Research

The research process, by its very nature, can be both complicated and complex. For students it presents a gap between the known and unknown. They get a research assignment, usually broadly defined by the instructor, and then need to identify a topic without necessarily knowing much of anything about the subject. Then, to further complicate matters, the student must navigate unfamiliar resources, perhaps encountering new and unusual concepts along the way. A defining quality of a complex problem is that right answers are not easily obtainable. Excepting those students who are passionate about the subject matter and research project, most students would prefer to simplify their research as much as possible. The problem, as a new article points out, is that applying simple problem-solving approaches to complex problems is a contextual error that will lead to failure. I think this theory may better explain why students take the path of least resistance in their academic research than our usual beliefs: that they are just lazy, that they have adapted to their instructors’ acceptance of “good enough” research, or that the blame lies with us for serving up search systems that are too complex.

The Cynefin (pronounced ku-NEV-in) Framework can help us understand why students apply simple approaches to complex problems, and how that is a formula for poor research results. Cynefin is a Welsh word that signifies the many factors in our environment and experience that influence us in ways we can never understand. In their November 2007 Harvard Business Review article “A Leader’s Framework for Decision Making”, David Snowden and Mary Boone explain how the Cynefin Framework can help us better match our process for problem solving to the actual context of any particular problem. In other words, as a decision maker – and being an effective researcher requires making any number of decisions (what database to use, what search terms to use, which results to explore, etc.) – one must understand the context of the situation in order to think clearly about choosing the appropriate course of action. Snowden and Boone help us understand how to make better decisions in multiple contexts; some might call this situational leadership.

The four main contexts are simple, complicated, complex and chaotic, but here I’ll deal with just simplicity and complexity. Simple decisions have their place; it depends on the context of the problem situation. We resolve them by using patterns and processes that have delivered past success. In other words, we approach simple problems by using personal best practices. The right answer is clear, evident and without dispute. There is no uncertainty. The danger lies in what the authors call “entrained thinking”. When managers and leaders approach a problem, the natural reaction is to use familiar strategies and methods to seek the one right solution – the ones we have trained ourselves to use because they typically succeed. While those entrained methods may work well in simple contexts, they may lead to disastrous results when the context is complex. The point of the article is that managers and leaders must first analyze the situation at hand to determine its true context, and then use decision-making strategies that effectively fit that context. In some situations that are extremely complex, the authors say that no leader may be able to devise an effective solution and that those involved in the situation must allow a solution to emerge. Great leaders recognize these dilemmas and are able to construct an environment that fosters the discussion from which ideas emerge.
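For readers who like the framework spelled out, its core move (classify the context first, then pick the matching decision approach) fits in a few lines. The decision sequences for the complicated and chaotic contexts come from Snowden and Boone’s article; this post only walks through simple and complex, and the one-line summaries are mine, not quotes.

```python
# The four Cynefin contexts and the decision sequence Snowden and
# Boone prescribe for each; summaries are paraphrases, not quotes.
APPROACH = {
    "simple":      "sense, categorize, respond: apply known best practice",
    "complicated": "sense, analyze, respond: bring in expert analysis",
    "complex":     "probe, sense, respond: experiment and let a solution emerge",
    "chaotic":     "act, sense, respond: stabilize first, then reassess",
}

def choose_approach(context: str) -> str:
    """Match the decision-making style to the context, never the reverse."""
    if context not in APPROACH:
        raise ValueError(f"unknown context: {context}")
    return APPROACH[context]

# A student's entrained habit is to treat everything as "simple";
# a challenging research assignment usually is not.
print(choose_approach("complex"))
```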

Students come to our academic institutions after 15 or so years of research methods that may have always worked in their previous simple contexts. I need to know the names of Britney Spears’ children…I use Google to find the answer. I need to know what year the War of 1812 started…I use Ask.com to find the answer. I need to know the reasons the American Revolution began…I use Wikipedia to find the answer. In these simple contexts there is always a right answer that can be easily obtained. If these strategies have served our students well, what do we think they’ll do when they get their first challenging research assignments? Right! They’ll apply the decision-making process that has previously led to great success. So what can we do about this? How can we help our students understand that when it comes to college-level research they must first examine and understand the context of the decisions they will need to make before taking any action?

I propose that we add “identify and understand the context of the research problem and choose a decision-making style that matches that context” to that long list of information literacy skills that many of us include in some planning document. And it should be near the top of the list. There are times when a research question has but one correct answer and the simple context demands a simple research method. Go ahead and search Google. But when the research challenge is vague, involves uncertainty, and requires navigating some complex issues, then students need to recognize it and overcome their temptation to seek out simple solutions. I’d like to think that if we can get students to think in terms of context it might help them increase the effectiveness of their research skills. This skill could prove valuable not only for achieving academic success, but also for the many decisions our students will need to make in their post-college careers.