“Survey Says…”

Four years ago, I gave my first user survey to students. Still new to my position as librarian/technology integrator, I had inherited a traditional library that had fallen out of use, and I was desperate to understand what students wanted from their library. I posted a QR code linking to the survey in the library and got back 69 responses from students who came to the library (out of a student body of 680). I knew this wasn’t capturing data from non-users, but I couldn’t quite figure out a way around that.

Flash forward to now. In December and January, I designed a new survey and received 318 responses (out of a student body of 710) from library users and non-users alike.

[Image: whiteboard brainstorm of potential questions for the student survey, titled “How Are We Doing?”]

Goals Inform the Survey Design

The goal of the survey was to gauge how our library service is currently meeting the needs of our users. And for those infrequent- or non-users, we wanted to learn a bit more about the reasons why they don’t come to the library more frequently, or at all. One afternoon, we—the library assistant and I—sketched out a brainstorm of potential questions on the whiteboard. (Feel free to look at the questions we used and adapt them to your needs!) We shared this brainstorm with our student library advisory committee for their feedback.

[Image: screenshot of a question in Google Forms showing the “Go to section based on answer” feature.]

Google Forms

I built the survey with Google Forms. The “Go To Section Based on Answer” feature allowed me to create a decision tree format so that the infrequent- or non-users had a separate set of questions from the regular users.
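
For anyone who would rather script this kind of setup (or rebuild the same survey year after year), the branching can also be created with Google Apps Script instead of clicking through the Forms editor. The sketch below is only an illustration of the decision-tree idea, not the survey we actually used: the question wording, section titles, and the submit shortcut are assumptions made for the example.

  function buildBranchingSurvey(): void {
    // Create a new form; the title is a placeholder.
    const form = FormApp.create('Library User Survey (sketch)');

    // The routing question sits on the first page of the form.
    const usage = form.addMultipleChoiceItem()
      .setTitle('How many times have you visited the library this year?');

    // Two sections: one for regular users, one for infrequent/non-users.
    const frequentSection = form.addPageBreakItem()
      .setTitle('Questions for regular library users');
    const infrequentSection = form.addPageBreakItem()
      .setTitle('Questions for infrequent or non-users');

    // Respondents who finish the regular-user section submit the form
    // instead of continuing into the non-user section.
    infrequentSection.setGoToPage(FormApp.PageNavigationType.SUBMIT);

    // Each choice jumps to the section that matches the answer, mirroring
    // the "Go To Section Based on Answer" setting in the Forms editor.
    usage.setChoices([
      usage.createChoice('10+ times', frequentSection),
      usage.createChoice('5-10 times', frequentSection),
      usage.createChoice('0-5 times', infrequentSection),
    ]);
  }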

Testing Phase

We tried the survey out a few times ourselves, answering as a frequent user and as a non-user, to see how the mechanics worked. Then we asked our student library advisors to take it for a test drive and tell us roughly how long it took to complete. We had fewer questions for infrequent- or non-users, so it made sense that completion times would vary. We deleted these early “test” responses before opening the survey to the general student body.

Administering the Survey

Back to that nagging question of how to capture data from our non-users. With five years of collegial capital built up, I was ready to call in a favor from my most data-oriented and detail-oriented department: the science department! These folks love data and teach most of our students for all four years, and I hoped they would be willing to help us out. They generously agreed! I also knew that, because these colleagues are so detail-oriented, I would need to craft a brief email with context for them, as well as for students: the purpose and goals of the survey, its length (5 minutes or less), that it would take less time for infrequent- or non-users, that responses were anonymous, and that we were looking for honest, thoughtful feedback.

[Image: feedback from students about the librarians.]

Results and Implications

While we fell short of a 50% response rate, 318 responses out of 710 students (roughly 45%) is a large enough sample to draw meaningful conclusions. We are still unpacking some of the fine points of the data, but the broad brushstrokes are easy to read:

  • 50% of respondents have visited the library 10+ times this year; 16% have visited the library 5-10 times this year.
  • 104 students answered that they have visited the library 0-5 times this year. The top reason cited was “no time,” followed by “my friends don’t go there.” Asked “What would bring you to the library more often?”, students again cited lack of time, followed by therapy dogs and food!

We have a busy, vibrant high school library where we strive to provide excellent service, a welcoming environment, and support for our users’ academic, social-emotional, and personal needs. But now we have actual data from our users about how they perceive our physical space and environment, our collection, and the service they receive from the librarians. Overall, the way that we manage the library meets their needs. The survey data will have implications for our budget, collection development, how we manage the number of students in the space, and more. I will also share a summary of this data in our upcoming school newsletter and with my evaluator and building principal.

“Taking the pulse” of our students every 3-4 years and letting that user data inform the direction of the library helps its stewards (aka school librarians) maintain a relevant, student-centered high school library.

[Image: narrative feedback from the student survey about students’ favorite parts of the library.]

Author: Iris Eichenlaub

Iris Eichenlaub is the Librarian/Technology Integrator at Camden Hills Regional High School in Rockport, Maine. She is the 2017 Knox County Teacher of the Year, and was named an Inspiring Educator in 2017 by the Maine Education Association. Iris serves on the board of the Maine Association of School Libraries as the chair of professional development. Follow the story of the Edna St. Vincent Millay Library via Facebook (@ESVMLibrary or https://www.facebook.com/ESVMLibrary) or Instagram (@ESVM_Library or https://www.instagram.com/esvm_library).



Categories: Advocacy/Leadership, Blog Topics, Collection Development, Student Engagement/ Teaching Models

2 replies

  1. Wonderful! I love the comments best. Congratulations!

  2. This is exactly what we are looking to do here! Thank you so much!
