Category Archives: Just Thinking

Use this category for raising questions, thinking out loud, or reflecting on writing that doesn’t fit a specific topic.

Small Steps, Big Picture

As I thought about composing a blog post this week, I felt that familiar frustration of searching not only for a good idea, but a big one. I feel like I’m often striving (read: struggling!) to make space for big picture thinking. I’m either consumed by small to-do list items that, while important, feel piecemeal, or puzzling over how to make a big idea more precise and actionable. So it feels worthwhile now, as I reflect back on the semester, to consider how small things can have a sizable impact.

I’m recalling, for example, a few small changes I’ve made to some information evaluation activities this semester in order to deepen students’ critical thinking skills. For context, here’s an example of the kind of activity I had been using. I would ask students to work together to compare two sources that I gave them and discuss what made each source reliable or not and whether one source was more reliable than the other. As a class, we would then turn the characteristics they articulated into criteria that we thought generally make for reliable sources. It seemed like the activity helped students identify and articulate what made those particular sources reliable or not and permitted us to abstract to evaluation criteria that could be applied to other sources.

While effective in some ways, I began to see how this activity contributed to, rather than countered, the problem of oversimplified information evaluation. Generally, I have found that students can identify key criteria for source evaluation such as an author’s credentials, an author’s use of evidence to support claims, the publication’s reputation, and the presence of bias. Despite their facility with naming these characteristics, though, I’ve observed that students’ evaluation of them is sometimes simplistic. In this activity, it felt like students could easily say evidence, author, bias, etc., but those seemed like knee-jerk reactions. Instead of creating opportunities to balance a source’s strengths/weaknesses on a spectrum, this activity seemed to reinforce the checklist approach to information evaluation and students’ assumptions of sources as good versus bad.  

At the same time, I’ve noticed that increased attention to “fake news” in the media has heightened students’ awareness of the need to evaluate information. Yet many students seem more prone to dismiss a source altogether as biased or unreliable without careful evaluation. The “fake news” conversation seems to have bolstered some students’ simplistic evaluations rather than deepened them.

In an effort to introduce more nuance into students’ evaluation practices and attitudes, then, I experimented with a few small shifts and have so far landed on revisions like the following.

Small shift #1 – Students balance the characteristics of a single source.
I ask students to work with a partner to evaluate a single source. Specifically, I ask them to brainstorm two characteristics about a given source that make it reliable and/or not reliable. I set this up on the board in two columns. Students can write in either/both columns: two reliable, two not reliable, or one of each. Using the columns side-by-side helps to visually illustrate evaluation as a balance of characteristics; a source isn’t necessarily all good or all bad, but has strengths and weaknesses.

Small shift #2 – Students examine how other students balance the strengths and weaknesses of the source.
Sometimes different students will write similar characteristics in both columns (e.g., comments about evidence used in the source show up on both sides), helping students recognize how others might evaluate the same characteristic as reliable when they see it as unreliable, or vice versa. This helps illustrate the ways different readers might approach and interpret a source.

Small shift #3 – Rather than develop a list of evaluation criteria, we turn the characteristics they notice into questions to ask about sources.
In our class discussion, we talk about the characteristics of the source that they identify, but we don’t turn them into criteria. Instead we talk about them in terms of questions they might ask of any source. For example, they might cite “data” as a characteristic that suggests a source is reliable. With a little coaxing, they might expand, “well, I think the author in this source used a variety of types of evidence – statistics, interviews, research study, etc.” So we would turn that into questions to ask of any source (e.g., what type(s) of evidence are used? what is the quantity and quality of the evidence used?) rather than a criterion to check off.

Despite their smallness, these shifts have helped make space for conversation about pretty big ideas in information evaluation: interpretation, nuance, and balance. What small steps do you take to connect to the big picture? I’d love to hear your thoughts in the comments.

Questioning the Evidence-Based Pyramid

As a first year health sciences librarian, I have not yet conducted a systematic review. However, as a speech-language pathologist, I learned about evidence-based medicine and the importance of clinical expertise combined with clinical evidence and patient values. As a librarian, I’m now able to combine these experiences, allowing me to see evidence-based medicine more holistically.

In the past month, I attended two professional development courses. The first was a Systematic Review Workshop held by the University of Pittsburgh. The second was an Edward Tufte course titled “Presenting Data and Information”. While these are two seemingly unrelated subjects, I left both reconsidering how we literally and figuratively view evidence-based medicine.

One of my biggest takeaways from the Systematic Review workshop was that a purpose of systematic reviews is to search for evidence on a specific topic in order to limit bias. This is done by searching multiple databases, reviewing grey literature, and having multiple team members screen papers and resolve disputes. One of my biggest takeaways from the Tufte course was that space should be used well to effectively arrange information and that displayed content should have integrity. In his book Visual Explanations, Tufte poses the following questions to test the integrity of information design (p. 70):

  • Is the display revealing the truth?
  • Is the representation accurate?
  • Are the data carefully documented?
  • Do the methods of display avoid spurious readings of the data?
  • Are appropriate comparisons and contexts shown?

When I think about visualization of evidence-based medicine, the evidence-based pyramid immediately comes to mind. It is an image used in many presentations related to evidence-based medicine:

EBM Pyramid and EBM Page Generator, copyright 2006 Trustees of Dartmouth College and Yale University. All Rights Reserved. Produced by Jan Glover, David Izzo, Karen Odato and Lei Wang.

While there is a lot of information in this image, I don’t think it is very clear. I have spoken to librarians (both in and outside the health sciences) who agree. I think this is a problem. I don’t think all librarians need to immediately know what cohort studies are, but I do think they should understand where cohort studies fit within the visual.

From what I have gathered and discussed with other professionals, quality of evidence increases and bias decreases as you go up the pyramid. The pyramid is often explained in a hierarchical way; systematic reviews are considered the highest standard of evidence, which is why they are at the top. There are usually fewer systematic reviews (since they take a long time and gather all the available literature about one topic), so the apex also indicates the least quantity. So let’s take a look at each of the integrity questions about information design and investigate this further:

Is the display revealing the truth?

Is it? How do we know if this truthfully represents the quantity of each type of study/information? I believe that systematic reviews are probably the fewest in quantity and expert opinions the most. That makes logical sense given the difficulty of producing and disseminating each type of information. However, what about the types of research in between? Also, is one type of evidence inherently less biased than the ones below it? Several studies suggest that systematic reviews may be systematic but are not always transparent or completely reported, and can become outdated. This includes systematic reviews published in Cochrane, the highest standard of systematic reviews. While there are standards, they are very frequently not followed. However, following these standards can be very challenging and paradoxical. It’s very possible that a cohort study can be designed in a way that is much more systematic and informed than even a systematic review.

Is the representation accurate?

When I see the word “representation”, I think about visual representation – the pyramid shape itself. There is an assumed hierarchy here, not just of evidence but of superiority. This is a simplistic and elitist way of thinking about the information rather than an informative and useful one. If you think about it, a systematic review cannot be conducted without supporting RCTs, case reports, etc. Research had to start somewhere. If this were seen as more of a scholarly conversation, I wonder whether there would still be a place for hierarchy.

I have learned that the slices of the pyramid represent the quantity of publications at each level of evidence. However, this is not something that can be easily understood by looking at the visual alone. Also, if the sizes of the slices represent quantity, why should they? Quality is indicated in this version by the arrow going up the pyramid, which helps to represent the ideas of quality and quantity together. However, if evidence-based medicine wants to prioritize quality, maybe the sizes of the slices should represent the quality, not the quantity, of evidence. Viewed from that perspective, the systematic review slice should be the biggest because it is ideally the highest quality. Or should the slices represent the amount of bias? This is all quite unclear.

Are the data carefully documented? Do the methods of display avoid spurious readings of the data?

I don’t believe that any data are actually represented here. Rather, it feels like something we are simply told so that we believe it. I understand this is a visual model, but the image has circulated so widely that it is taken as the truth. I don’t think one can avoid spurious readings of the data, because no data are represented here.

Are appropriate comparisons and contexts shown?

I do think that this pyramid provides a visual way to compare information; however, I don’t think contexts are shown. Again, should the size of each level of evidence refer to quantity or quality? Is the context meant to indicate research superiority? If not, perhaps a pyramid isn’t the best shape. By definition, a pyramid has an apex at the top, indicating superiority. Maybe a different shape or representation could provide alternate contexts.

So, how should evidence-based medicine be represented?

I have presented my own perceptions sprinkled with perceptions from others. I’m a new librarian, and my opinion has value. However, I also think this concept needs to be re-envisioned collectively with healthcare practitioners, researchers, librarians, and patients.

Another visualization that has been proposed is the Health Care Literature Wedge. It would look like a triangle with the apex facing right, indicating progressive research stages. I do think there are other shapes or concepts to consider. Perhaps concentric circles? Perhaps a sort of spectrum? 3D, maybe? I really don’t know. Another concept to consider is that systematic reviews are intended to reduce bias pertaining to a research question. Instead of reducing bias, maybe we can look at systematic reviews as incorporating increased perspectives? How could this change the way evidence-based medicine is visualized?

I think the questions posed by Tufte can help to guide this. And I’m sure there are other questions and models that can also help. I would love to hear other epistemologies and/or models, so please share!

References

  1. Chang, S. M., Bass, E. B., Berkman, N., Carey, T. S., Kane, R. L., Lau, J., & Ratichek, S. (2013). Challenges in implementing The Institute of Medicine systematic review standards. Systematic Reviews, 2, 69. http://doi.org/10.1186/2046-4053-2-69
  2. Garritty, C., Tsertsvadze, A., Tricco, A. C., Sampson, M., & Moher, D. (2010). Updating Systematic Reviews: An International Survey. PLoS ONE, 5(4), e9914. http://doi.org/10.1371/journal.pone.0009914
  3. IOM (Institute of Medicine). (2011). Finding What Works in Health Care: Standards for Systematic Reviews. Washington, DC: The National Academies Press. Retrieved from http://www.nationalacademies.org/hmd/Reports/2011/Finding-What-Works-in-Health-Care-Standards-for-Systematic-Reviews.aspx
  4. McKibbon, K. A. (1998). Evidence-based practice. Bulletin of the Medical Library Association, 86(3), 396–401.
  5. The PLoS Medicine Editors. (2007). Many Reviews Are Systematic but Some Are More Transparent and Completely Reported than Others. PLoS Medicine, 4(3), e147. http://doi.org/10.1371/journal.pmed.0040147
  6. Tufte, E. R. (1997). Visual Explanations: Images and Quantities, Evidence and Narrative. Cheshire, CT: Graphics Press.

 

Personal Development As Professional Development

Like many of us, I was dismayed by the results of the last US presidential election, and at one year in I’m even more concerned for the nation and the people who live here. One of the things I resolved to do in the aftermath was to make time for some training that I’d long been interested in but hadn’t prioritized. Over the course of this year I’ve taken a bystander intervention workshop as well as a 5-week self-defense course, both facilitated by a local organization that focuses on violence prevention programs for marginalized communities. I also attended a one-day medical first aid training session offered by my university, and a one-day mental health first aid training held at my local public library and provided by the NYC Department of Health.

I consider these workshops to be more for my own personal than professional development: they were programs I attended on my own time rather than work time, and I’ve felt generally safer and more aware since, which I appreciate. But I definitely think these experiences have been useful for my work in the library, too. As a workshop participant I’m focused on listening to and learning the content, but I also pay attention to how the facilitators run the program. Do they lecture, use slides or handouts, or show video clips? For longer trainings, how often do they intersperse opportunities to participate in an activity (and breaks) with sitting and listening? How do they handle groups with folks who are reluctant to answer questions, or folks who take up more than their share of conversational space? I’ve learned so much about strategies for effective workshops from watching successful (and less-successful) facilitators work, strategies that I can bring to my work when I teach, lead a meeting or workshop, or give a presentation.

Most valuable, I think, is the opportunity these programs have given me to think about my community, both narrowly — family, friends, colleagues — and broadly, in my neighborhood and city. I’m more introvert than not, and talking about or working through sometimes sensitive topics with a group of people I’ve never met before is somewhat daunting to me. But for all of my hesitation I’ve appreciated the opportunity to listen to and learn from my fellow participants, diverse in age, experience, and background.

I went to these trainings because I wanted to learn strategies to deal with multiple kinds of potentially scary situations, but I’m grateful that they also provided me the chance to build empathy. The end of the semester is approaching with speed, the political situation continues to be disturbing, and everyone is stressed. I was struck last week by a Twitter thread by a social worker that reminded me how important it is, especially right now, to start with empathy. Let’s commit to being gentle with ourselves, our colleagues, our students, and our communities in this busy time of year.

Digging for Gratitude

A little over a year ago, I took a flight to Los Angeles to interview for my job at UCLA – it was the night before the election. At the time, natives and their allies were fighting to re-route the Dakota Access Pipeline. Towards the end of my flight to LA, I found out that the gentleman in the aisle seat of my row was from North Dakota and thought natives were “making a big deal” out of it. I woke up the next morning to learn that my less preferred candidate had won the election, and I cried in disbelief. I had no idea how I was going to get through my interview.

A year later, I am in my position at UCLA, and recent news of the Keystone Pipeline’s 210,000-gallon oil spill came to light just days before Thanksgiving, a holiday based upon the false notion of unity between natives and colonizers. I don’t mean to be a Debbie Downer, but I just wanted to place this article in its appropriate historical context in my life as a first-year librarian. While I am beyond grateful for my job, my amazing colleagues, and the sunny skies around me, I started in this profession during what I believe is a grave time in global history.

I approached librarianship as a career because I loved being able to provide individuals with information. However, as I mentioned in my first post, I also embraced the critical possibilities within the profession. I would be lying if I said I have been able to sustain my enthusiasm for deneutralizing the library; between moving across the country, starting a new job, and the current political climate, I am emotionally exhausted.

The good news is I have still found outlets that affirm my place in this field. So here is a list of what has kept me going. I want to share this for anyone else feeling a lack of hope and/or motivation to stick with the fight:

  • Multiple students have approached me with a research question that focuses upon a marginalized population.
  • The UCLA Medical Education Committee held a retreat to discuss diversity, inclusion, and equity in medical education. This included speakers who used words such as “racism”, “oppression”, and “microaggressions”.
  • I have been able to collaborate with amazing South Asian women librarians for an upcoming chapter in Pushing the Margins: Women of Color and Intersectionality in LIS. On top of it, my co-authors and I were able to share our experiences about being South Asian women in librarianship in a panel at a symposium at UCLA. And even better, I was able to meet and listen to the other incredible authors that will be included in this book!
  • My colleagues and I were able to create an in-person and virtual exhibit to highlight Immigrants in the Sciences in response to the DACA reversal and the White nationalist march in Charlottesville.
  • UCLA’s Powell Library held a successful Conversation Cafe for International Education Week.
  • I attended a fulfilling professional development opportunity about systematic reviews.
  • I have shared tears and memories with several other LIS students through the ARL IRDW and Spectrum Scholar program.
  • I was able to visit Seattle for the first time and attend my first (of many) Medical Library Association conference.
  • I gained a mentor and friend.
  • Every time I teach, I learn something new about active learning, teaching methodology, and how to teach to specific audiences. Most importantly, I feel like I am truly in my element.
  • I met the Librarian of Congress! #swoon
  • I inherited two precious cats (librarian status achieved).
  • I’m way less clueless about being a librarian than I was when I started in April!
  • And now I am able to share my first-year experiences through ACRLog!

This is not an exhaustive list; however, it proves that in less than 8 months of working in my position, I have been blessed to create, pursue, attend, and feel a part of unique opportunities within my profession, especially at my institution. So while I might feel disillusioned and hopeless because of the world and its inequities, I have to admit that there have been several upsides.

Thank you for reading, and I hope you too can discover these golden nuggets amongst the rubble around us.

Reflecting on Reference Services

A colleague recently invited me to speak in an LIS graduate class she teaches on information services. I was delighted to have the chance to talk with her students; it was even more of a treat since I attended the same graduate program for my MLIS, and the information services course was the very first course I took in my program (mumble-mumble) years ago.

The students in the course are varied in their career goals, and not all are aiming for academic librarianship or public services work. So while I did speak about how my coworkers at City Tech and I think about reference work in the library at our large, public, technical and professional degree-granting college in New York City, I also tried to contextualize reference services not just within the organization of the library, but also within the college, university, and city.

As I’m sure is not unusual for colleges like City Tech, reference for us is not just about answering questions about staplers and printers, or helping students navigate databases and the catalog to find sources for their research projects. Reference at City Tech also involves questions about the college and university. The library is the only place on campus that is open for many hours in the evenings and on weekends (and we don’t even have overnight hours). We’re also one of the few spots on campus with a person sitting at a highly visible desk (our reference desk is just inside the library entrance), one that features a sign directing folks to ask for help (ours reads “Ask a Librarian”). At our reference desk we get all the questions: about technology, logging into wifi, the learning management system, registering for classes, filling out financial aid forms, etc.

So lots of what we do at the reference desk at my college looks like answering questions, though it also involves sending students to other places on campus. And that has led to discussion among our library faculty: do we still need a traditional reference desk when traditional reference questions are not always the kinds of questions we get?

Lots of academic libraries have shifted to reference by appointment only, or personal librarians, or other models, but at this point we don’t feel that those models will best serve our students at City Tech. Most of our students have come straight from the NYC public high schools, where they may not have had a school librarian. Many are in low-income households, or are in the first generation of their families to attend college. Some have library anxiety — City Tech’s library is only two floors in the middle of a building and can seem so small and unassuming to me, but I have heard students say that they found it to be big and confusing when they first got to the college. Having a staffed reference desk can help the library feel like a welcoming place for students, especially new students.

We schedule a library faculty member at the reference desk during all hours that the library is open while classes are in session, and most hours during semester breaks. That said, we have made some changes over the past couple of years. Moving a technical support staff member to a slightly different location allowed us to reduce staffing by library faculty at the reference desk from two librarians to one. This arrangement definitely serves students better, and relieves librarians of having to spend lots of time reviewing details of our printing system with students (as I alluded to in a post last year). This change has also proved helpful in accommodating some expected and unexpected staffing shortages this semester.

However, there is still some tension in managing information services in relation to everything else that my colleagues and I want librarians and the library to do with our campus community. I’m not quite sure how things will change for us in the future — while we are interested in doing more course-integrated instruction and other information services work with City Tech students, faculty, and staff, it’s unclear whether we’ll need to shift reference, too.