Reflections on library assessment and the Library Assessment Conference

I wanted to write about the Library Assessment Conference as soon as I returned, but unfortunately, life got in the way. I got back barely a week and a half before I was set to leave my job and, not surprisingly, there was a lot of wrapping up projects and getting things to a good place to hand over to colleagues. My last day was August 15th, and after spending six days riding bikes in Sunriver, Oregon, I finally have some time to take a breath and reflect.

I went to the Library Assessment Conference two years ago and, for some reason, spent most of the time feeling like I should be sitting at the kids' table. I don't know if it's because the conference is so ARL-heavy or what, but I didn't really feel like I belonged there. This time, I felt completely different. Sure, there were still a lot of people presenting on things I could never do at a less well-resourced institution, but I felt more like we were all struggling with a lot of the same meaty assessment issues, whether we were at small community colleges or huge ARL libraries.

Lisa Hinchliffe and I presented some of the results of our survey of community college libraries about the elements that have helped and hindered them in their journey towards building a culture of assessment. It was a quick sketch of our findings and I look forward to publishing the results (eventually) and also having more time to present them at ACRL 2015.

Here are a few of the most interesting takeaways from the conference for me:

One evangelist or expert does not a “culture of assessment” make

The always insightful Library Loon writes occasionally about something called “Coordinator Syndrome.” As a newly former coordinator, I can say without question that I experienced it. I think administrators have good intentions when they create coordinator positions, but those positions are often set up to fail. Administrators recognize that x is important and that no one in the library currently has expertise in x. So they hire someone to be in charge of x, but they give this person no support, only a mandate to infuse a culture of x into every part of the institution. As if one person (and a new person at that) could achieve this. The x could be instruction, assessment, scholarly communications, or a whole host of other things. The simple fact is that no one person can create a culture of anything. They can bring expertise and provide support, but there has to be a general understanding and acceptance that x is everyone's responsibility. One person can't do the heavy lifting on their own.

In light of this, I was very impressed with how the libraries at Northwestern have structured their efforts to foster an assessment culture. They started with an assessment committee whose charge was to be evangelists for assessment, made up of various stakeholders from across the institution; having influential people on the committee lends credence to the charge. When they had the funding, they hired an assessment librarian whose primary job was to act as a consultant, sharing expertise to help librarians actually DO assessment work. The assessment librarian wasn't there to be a cheerleader for assessment; that was the committee's job. The assessment librarian was there to help librarians do assessment work once they'd already decided it was worth their time.

This, my friends, is how a coordinator position should be framed.

We are (nearly) all grossly underprepared to do assessment work

One theme threaded through several talks (these two stuck out for me) was the notion that librarians come out of library school ill-prepared to design assessments and analyze the results. Most of us stumble and bumble our way toward better assessment work, but largely through trial and error and watching what others are doing. I look back on my very early assessment work and cringe. What was I actually even measuring???

That said, I question whether library school is the best place for students to learn to do assessment work or research. These are the sorts of skills that library school students don't think they'll need (how many of us got a lot out of the required “research class” in library school?) until they do. And the moment they actually need these skills is the best time to learn them, because, as with our own students in information literacy sessions, the learning won't stick unless it's practiced in a real situation. I think libraries, and the organizations that support practicing librarians, need to do more to train and support librarians in doing assessment, and libraries specifically need to give librarians the time to learn how to do these things right. Hiring an assessment librarian who acts as a knowledgeable consultant is one solution, but not every library can afford an assessment expert. There are immersion programs and assessment conferences, but those aren't enough to give people the nuts-and-bolts skills they need. I favor the idea of assessment mentoring programs, though I'm not sure what they would look like in practice. Still, mentoring makes more sense than one-off training, because people need feedback and support at their points of need.

This is something we struggled with on the Orbis Cascade Alliance Assessment Team over the past two years. We know there is a lot of assessment expertise within the Alliance (hell, we have Steve Hiller, my assessment idol), but figuring out the best way to get that expertise out of the heads of those who have it and into the heads of those who don’t is the challenge. We planned and executed a Library Assessment and User Experience Unconference (LUAU) where folks could share their assessment projects and ideas and have conversations around them (and for which we were just awarded the 2014 ACRL Oregon Award for Excellence), but, again, it’s a one-time thing and not the sort of point-of-need mentoring that many really need. Still, every little bit helps, and I know I came away from both the LUAU and the Library Assessment Conference with great new ideas.

Data privacy is the elephant in the room

If there was one thing that became abundantly clear at the conference, it's that data is king. It's funny, because I thought there was a general trend toward mixed-method qualitative research at the last Library Assessment Conference, but this year it was all about hard data. Using student data was at the heart of each keynote talk. One keynote speaker even said that we should be collecting transaction-level data regularly even if we don't yet know what we might use it for.

Whether we like it or not, predictive analytics have become a major part of the higher education game. Student affairs and admissions folks have been using predictive analytics systems for ages in making admissions decisions and in advising. At Portland State, advisors can see the best time for students to take a “bottleneck course” in order to succeed in it. They know that students coming in with a certain high school GPA are best served by taking certain combinations of classes at specific points in their college careers. They know which students are most likely to leave the institution and can provide those students with more intensive support.
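
To make the mechanics concrete, here is a minimal sketch of how a retention model like the ones advisors rely on might work. This is not any institution's actual system; the features, data, and numbers are all hypothetical.

```python
from sklearn.linear_model import LogisticRegression
import numpy as np

# Hypothetical training data: each row is a past student described by
# [high school GPA, credits attempted in term 1, library visits in term 1].
X = np.array([
    [3.8, 15, 12],
    [2.1, 12,  0],
    [3.2, 15,  5],
    [2.5,  9,  1],
    [3.9, 16, 20],
    [2.0, 12,  2],
])
# 1 = returned the following year, 0 = left the institution.
y = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Score a current student; an advising system would surface something
# like this probability and flag low scores for more intensive support.
new_student = np.array([[2.4, 12, 3]])
print(model.predict_proba(new_student)[0][1])  # estimated P(retention)
```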

Part of me thinks this is great. We need to learn more about how our patrons use our resources and services. We can use what we learn about specific populations for targeted outreach. This could be so beneficial!

The other part of me is kind of horrified by some possible implications of predictive analytics. If we know what types of people are likely to succeed in college, why not admit only that type of person? And what if that type of person tends to be white or Asian and have money? It's one thing to use predictive analytics to help guide a student through school so that they can be successful, but predictive analytics are already being used in admissions decision-making at many schools.

Predictive analytics are also rather creepy. HP uses predictive analytics to determine which of its employees are most likely to quit and what steps it can take to retain them. How would you like your employer to use your characteristics this way? Feels a little intrusive, doesn't it? The privacy implications are huge. Do our students know that we are collecting this information about them? Do they get the chance to say “no”? Libraries have always been passionate about safeguarding patron data. If we collect transaction-level data and correlate it with student demographic data, even if we get rid of student ID numbers, can we really guarantee that people won't be identifiable in the data? With all the data breaches that have happened on campuses across the country, can we guarantee the security of this data? And when students discover that we are collecting this data (perhaps when they are contacted because they have never used the library and those who use the library are 5 times more likely to graduate), how comfortable will they feel checking out a book or accessing an article on a controversial or potentially embarrassing subject? What is the potential impact on intellectual freedom?
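
The re-identification worry is easy to demonstrate. Here is a small sketch (the data and column names are invented) that measures k-anonymity: the size of the smallest group of records sharing the same combination of quasi-identifiers. If k is 1, at least one “anonymous” record describes exactly one person.

```python
import pandas as pd

# Hypothetical "anonymized" circulation records: student IDs removed,
# but quasi-identifiers (major, class year, checkout hour) retained.
transactions = pd.DataFrame({
    "major": ["History", "Nursing", "History", "Physics"],
    "class_year": [2016, 2017, 2016, 2015],
    "checkout_hour": [14, 9, 14, 22],
})

# k-anonymity: the smallest number of records sharing the same
# combination of quasi-identifier values.
quasi_identifiers = ["major", "class_year", "checkout_hour"]
k = transactions.groupby(quasi_identifiers).size().min()
print(f"k = {k}")
# Here k == 1: the lone Physics student from the class of 2015 who
# checked out a book at 10 p.m. is unique, so anyone who can join this
# table against enrollment data can re-identify that student.
```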

Yes, there are a lot of safeguards in place to protect students (FERPA, IRB, etc.), but I still think that libraries should hold themselves to even higher standards than admissions staff or advisors, simply because of our ethical obligations as librarians. This doesn’t mean that I don’t think we should collect data, but that with every project, we should weigh the potential benefits for students against the privacy risks and do everything in our power to ensure the security and anonymity of the data we collect.

No one talked about data privacy in presentations at the Library Assessment Conference, but there was a discussion on Twitter that sparked thoughtful posts from Andrew Asher and Barbara Fister. I think it would be great to have a panel on data privacy and ethics at the next LAC.

Librarians are doing amazing, creative assessment work

Take a look at the PowerPoint slides and posters from the conference. People are doing amazing things with assessment to learn more about our patrons, inform decision-making, make the case for changes or funding, and demonstrate the value of what the library does. Libraries are doing amazing work analyzing data they already have, using data visualization tools and doing longitudinal analyses. One library is using GIS data in ways I'd never thought possible, like showing how students use library spaces. Another library used data to determine which classes it should be providing instruction in and which sessions are redundant. All of the work coming out of the Assessment in Action program participants is so exciting. There's a lot to learn from projects undertaken by other libraries, and I came away from the conference with so many ideas I might use in the future.

LAC is a great conference at which to present

As a presenter, I love how well-supported we are by the ARL staff at the conference. It's nice to know I don't have to worry about anything other than being prepared to talk. I've had a lot of terrible experiences at other venues, ranging from having to do DIY tech support just before my talk, to not being told until I arrived that I had to bring my own laptop, to not being told until I arrived that I couldn't use my laptop (with my Mac Keynote presentation), to presenter-unfriendly setups (like having the computer somewhere you can't actually see your slides without looking back at the screen). The ARL staff tell you what they need from you ahead of time and what the affordances and limitations are, and they make sure everything is loaded and ready to go before you even enter the room. Very nice!

All in all, it's a conference I would recommend to any academic librarian interested in assessment. I came home energized and inspired, though in this liminal state between jobs, there isn't much I can put into practice just yet. For those of you who attended the Library Assessment Conference, what was your biggest takeaway? Which presentations inspired you?

7 Comments

  1. Dawn H.

    Thanks for articulating your concerns about student data and privacy. I currently work in a split position for a small college in the library and eLearning departments, and I sometimes struggle with the ethics of mining student usage data. We use the data to design better learning objects and experiences, but I do think those of us who teach and work in online environments should inform our students that most of their activities are trackable, from the links they click on to how long they spend on a content page.

  2. I have had reasonable luck getting library-school students to understand that assessment is A Thing by baking it into projects and other work they do for me. I’m not teaching assessment methods so much — I am whoa unqualified to do that, so that’s what our three-credit Research Methods course and our occasional one-credit Assessment course (neither of which I teach) are for — as assessment consciousness, the awareness that in the Real World(tm) they’ll have to show impact as well as assess-to-improve.

    Another upside is that baking this kind of requirement into a project makes the project more well-rounded and real-worldish. I’m dead sure students feel that even when they don’t articulate it right away.

    So it can be done, I believe. It’s just something that needs a reminder boost several places in the curriculum. Thinking solely in terms of “a dedicated course” is the curricular equivalent of Coordinator Syndrome, in a way!

  3. Hey Dorothea! I do the same. With a project proposal that my students have to do for my class, they have to show how they will measure impact/efficacy. You’re right that it’s getting them into the “we will need to show how this is worthwhile” mindset that’s so critical. But I think the actual down-and-dirty assessment skills are the ones people worry about, and those are still better learned on the job with the support of one’s employer and professional orgs.

  4. Ellen Hoffmann

    Insightful report. I am a retired university librarian and follow very little of what we used to call library literature, but always enjoy your posts.

  5. Meredith, thanks for this great overview and your thoughtful comments, not to mention the links. I teach the medical library courses, as well as info org and reference, at Texas Woman’s University SLIS, and have relatively recently been convinced that assessment needs to be part of the curriculum. I attended an unconference at TLA on the topic, and ended up inserting a course module on assessment into my medical library management course (taught online). I was fortunate enough to have been able to convince the TLA speaker, Sheila Hatchell, to present a talk for students enrolled in this and all SLIS courses. If you have not heard of her, I recommend checking out the Minnesota DOT study site: http://www.dot.state.mn.us/library/Library-ROI-Study.html

    I found myself nodding in response to your comment about library school, but I’m hoping that incorporating a module on the topic (with readings, interview of a librarian involved in assessment, and a reflection) will convince students of the value of assessment, helping to create that culture in advance. And me – I have a lot to learn!

  6. Sarah Aerni

    Hi Meredith,
    I worked in the library field as an Assessment Coordinator (at the University of Pittsburgh) for a few years and attended the first LAC in Charlottesville, VA in 2006. Presenting there was a great experience for me too and I’m glad to read your review of this year’s conference and see how things have evolved over the last 8 years.

    In terms of how librarians and library school students should learn about assessment, I feel strongly that we should be able to work with people who specialize in data analysis to learn it. I think there’s a place for that in library school, by taking classes that would be cross-registered with departments like statistics or applied data analysis. It would be similar to how I took a course in the law school on copyright law while doing my master’s degree. Courses in statistics and applied data analysis departments are already prepared to teach students coming in from a wide range of fields. It helps make the presence of librarians known if they join these courses too, but I found so few students in my cohort who were willing to step up to the challenge of a data-driven course (and so few who were quantitatively inclined to start with). But that’s where I’d want to learn those skills (both quantitative and qualitative) to put to use when I’m out in the field practicing. Of course, there will be much more to learn on the job and from others, but having some confidence to get started, based on coursework from the master’s degree, is the place to start, in my opinion.

    Thanks for continuing to write your thoughtful blog and best wishes for your new job.
    Sincerely,
    Sarah Aerni
