Gathering Evidence of Library Impact: A Practical Guide to Asking a Good Question
Artefacto
In our previous blog post, we discussed the benefits of a one-question survey. We looked at why qualitative, open-ended questions are often the best way to gather meaningful information from your users and measure library impact.
But not all feedback is impact evidence. To gather meaningful impact evidence from your users, you need to know how to ask a good question.
And that's easier said than done. That's exactly what this guide is here to help with. We'll talk about what makes an open-ended question genuinely useful for gathering impact evidence and the common mistakes that cause even well-intentioned questions to fall short.
Before you write your question
Before you start writing your question, it’s worth taking a step back and being clear about why you are asking it in the first place. This might sound obvious but when you skip this step you can end up with a question that gathers a general impression rather than impact evidence.
A good starting point is to think about the outcomes you're hoping your service has achieved. What difference do you want to make to your users? What would success look like? Keeping the outcomes at the front of your mind can steer you towards a question that actually captures valuable impact evidence, rather than general feedback.
It’s also worth remembering that your users are giving up their time to respond to your questions. Being intentional about what and why you ask shows that you respect that, and you’ll most likely get meaningful answers in return.
Writing the Question
Without venturing too far into methodology, it is worth keeping a few principles in mind when writing questions to gather impact evidence from your audience.
Keep it clear, short and singular. One of the most effective things you can do is ask one thing at a time. Double-barrelled questions, those that combine two enquiries into one, create confusion and often produce unclear answers. For example, when asking about digital resources, it is tempting to combine two valid questions into one: “Did you find our online resources easy to use and did they help you find what you needed?” But respondents will not know which part to answer. It’s better to ask them separately, starting with “How easy, or not easy, was it to find what you were looking for using our online resources?”, and following up on the other point afterwards.
Use plain, accessible language. Avoiding jargon removes barriers and signals respect for your respondents' time. No one should need specialist knowledge to understand what you are asking. A question like “Don't you think having free access to OPAC is really valuable?” not only leads the respondent, but may also alienate those unfamiliar with what “OPAC” means in this context. A more accessible version such as, “How, if at all, do you use the library's free digital resources, such as eBooks or online databases?”, is clearer, more neutral and open to a wider range of people.
Watch out for wording that nudges responses in a particular direction. Leading questions are a common pitfall in evidence gathering, precisely because they assume a positive outcome has already occurred. This can feel flattering to ask, but it influences how people respond. When asking about general library use, for example, “How has visiting the library improved your life?” assumes that it has. A stronger alternative is: “What, if anything, has changed for you since you started using the library?” This open framing invites honest reflection, including the possibility that nothing changed at all. That, too, is valuable information.
A few further things to bear in mind:
- Avoid absolute words like “always,” “never,” or “every time.” They can feel unrealistic, and may cause respondents to disengage.
- Be specific about timeframes where relevant. “Since attending” or “in the past three months” helps ground the question and makes it easier to answer accurately.
- Test your question on a colleague first. If they hesitate or ask for clarification, it is a sign the wording needs another draft.
Types of impact questions
Impact questions come in different shapes depending on what you're trying to find out. Are you curious whether someone's behaviour has actually shifted after attending a workshop or event? Wondering whether a specific activity led to a meaningful outcome? Trying to learn whether knowledge and skills have genuinely grown or whether someone has taken what they learned and used it in their real life? Each of these calls for a different kind of question, and the way you phrase it makes all the difference between a rich, honest answer and a polite but empty one.
Here are four types of impact questions, with an example of what works well and what tends to fall flat.
Before/after behavioural change questions invite someone to compare what they do now with what they did before. A good version might ask: “How has your approach to searching for information changed since the session?” This gives the person something concrete to reflect on. A weaker version, like “Did the session change the way you search for information?”, is easy to answer with a simple yes or no, but reveals very little.
Specific outcome questions ask someone to recall a real moment when they put something into practice. Asking “Tell me about a specific time recently when you used a library resource to help with something” requires an actual episode from the person's life. Compare that to “Has using the library been useful?”, which again tends to produce a quick yes or no response rather than a story.
Questions about skills and capabilities get at what someone is now able to do that they couldn't before. "What kinds of searches can you do now that you wouldn't have known how to do before attending the session?" puts the focus on new abilities. “Do you feel more confident using the library?” sounds similar, but confidence isn't the same as capability and it's easy to say yes without being able to demonstrate anything.
Questions about real-life applications explore whether skills have transferred into someone's everyday life, whether in study, work or other day-to-day situations. A question like “How have you used the library's resources since the session?” opens up the full picture of where learning has landed. Asking “Are you using the library more now?” only counts visits, not whether anything has genuinely changed in how the person finds or uses information.
Time spent writing a thoughtful question really pays off. A well-worded impact question can make a real difference to the quality of evidence you gather. Ask one thing at a time, avoid leading language and keep your question clear and focused on the outcomes that matter. A single well-crafted question can tell you far more about the difference your library makes than a whole survey of vague ones ever could.