Forgive this less-than-well-thought-out post. I’ve been thinking a lot lately about assessment and the librarianly love of numbers in assessment, and I’m troubled by the way some academic libraries tend to measure how well they are supporting the academic mission of the institution.

Librarians keep a lot of statistics and measure a lot of things: gate count, reference transactions, instruction sessions, website hits, visits to a specific tutorial or research guide, e-resource usage, etc. We are big on numbers. I have no problem whatsoever with measuring things like this, and in many cases I think it’s essential. What I do have a problem with is the unsupported interpretations we often make based on these numbers and the direction they’re going in.

Reference desk transactions went down. This is a bad thing! We need to try to get them back up! Really? Why? Do you know why they went down? You probably have some theories, but do you know for sure? Is it because you’re less approachable, or is it because an increase in instruction sessions has helped students become more independent researchers? You need to look at the larger ecosystem beyond the reference desk to figure out why this happened and whether it’s a good or bad thing.

The tutorial I created has received more hits than any other. It must be really useful! Oh yeah? Or is the tutorial for a class that has a lot of sections? Did an instructor require that students visit it? Are the people visiting it staying for a long time or just for a few seconds? Are they getting anything out of it? You can’t say that a web hit = someone getting something out of that page.

We’re teaching more library sessions than ever before. Students will be more information literate when they graduate! Maybe. But how do you know that? Teaching more doesn’t necessarily = learning more. If the instruction you’re providing isn’t course-integrated and reinforced at later points in students’ college careers, it might be going in one ear and out the other. How can we determine that what we’re teaching is actually making our students information literate?

Sidenote: Years ago, a professional colleague complained that students in her information literacy sessions were not as engaged as they used to be and reasoned that the caliber of students at her school had declined. The question I wanted to ask at the time, but didn’t, was: have you considered that maybe the way you teach doesn’t work for the current crop of students? We come to unsupported conclusions all the time, not just when trying to analyze statistics. Don’t just assume it’s “them.” Maybe it’s you.

Statistics can tell us a lot of things, but they can also be manipulated to support just about any position. Without actually knowing why something increased or decreased, we should be hesitant about making any judgments.

We often take these assumptions right up to Administration, using these numbers as evidence that we are doing a great job, deserve more funding, etc. This reveals another flawed assumption: the idea that these numbers matter to administrators outside of the library. What do university administrators care about? Retention. Student success. Accreditation. Student satisfaction with the University. Etc. They don’t care about the number of information literacy sessions the library taught unless you can show how those sessions contributed to student success (e.g., students used higher-quality sources in their papers and earned better grades as a result). They don’t care about the number of reference transactions unless you can show that reference support helped improve retention. Sure, they may nod their heads and say “great job!” but you’re not going to really get them excited and “on board” until you tie what the library does to the University’s goals and provide data that demonstrates how what you do contributes to those goals.

I don’t have all the answers on exactly how to measure the library’s contribution to the larger goals of the University, but I do know that we’re doing our students a disservice when we make assumptions about how what we do affects them based solely on a bunch of numbers. And if we want to promote libraries to the people who hold the purse strings, we need to focus more on demonstrating how we contribute to their “bottom line” than to our own.