Why Run Usability Tests?
Usability tests can help you:
- Identify when and where users succeed and fail
- Gain understanding of users' behaviors and goals
- Evaluate the effectiveness of UI customizations
You will see first-hand what researchers really do when they conduct online research. You may find that what they do is different from what they say they do, or from what they think is the "right" way to do things.
Creating the Test
Define the goals for the study. What are you trying to learn? What questions are you trying to answer? Have a clear objective for each question or task in the test.
- Create tasks around user goals. What do users typically do on the site?
- Create tasks that include any features that you have customized (even if they are not common tasks).
- If you have any concerns that something on the site is not working, make sure to include it in your test.
- Test out the sequence of tasks to make sure that one task doesn't bias the outcome of another.
- Write tasks that do not use the same labels that appear on the screen, so you don't lead participants to the answer.
- Once your list of tasks and questions is ready, work through each one yourself to make sure it works as intended.
- Run a pilot study: before running your test with all of your participants, run it with a single participant to confirm that everything works and that the tasks are not confusing.
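One way to guard against the sequence bias mentioned above is to vary the task order across participants, for example with a cyclic Latin square so every task appears in every position equally often. A minimal sketch (the task names here are illustrative placeholders, not from an actual script):

```python
def latin_square_orders(tasks):
    """Cyclic Latin square: participant i starts at task i and wraps around,
    so each task appears in each position exactly once across participants."""
    n = len(tasks)
    return [[tasks[(i + j) % n] for j in range(n)] for i in range(n)]

# Hypothetical task list for illustration only
tasks = ["find a peer-reviewed article", "email an item", "renew a loan"]
for i, order in enumerate(latin_square_orders(tasks), start=1):
    print(f"Participant {i}: {order}")
```

With more participants than tasks, simply cycle through the generated orders again.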
The tasks that you create can be open-ended or specific, depending on what you want to learn.
Here is an example of an open-ended task:
Think of a current (or recent) project that involves online research. Spend five minutes using this site to do that research.
Here are some ideas for specific tasks:
- You have an assignment to write a 3-5 page paper on the economic impact of climate change. Find a peer-reviewed article that you could use in your paper.
- (For faculty) Create a reading list for a course you are teaching and email it to yourself.
- (For librarians) You are helping a student find material for a paper on developments in autism treatment over the past 3 years. The instructor told them that they can use any type of document except for newspaper articles. Show me how you would do that on this site.
For more tasks, see Primo Usability Guidelines and Test Script.
Recruiting Participants for the Usability Test Sessions
Define what user groups will use the Primo interface (for example: undergraduates, graduates, faculty/instructors, and librarians). In addition, consider including participants from various fields of study (for example: English and Physics). Each user group represents users who may have different goals, abilities, and expectations while using Primo.
How Many Should You Recruit?
It depends. We recommend starting with at least three users from each of your target user groups. After running several tests, step back and think about whether you are still learning something new or if you are hearing the same issues repeated. As long as you are still discovering new findings, continue to test with more users.
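The "are we still learning something new?" judgment can also be framed with the problem-discovery model from the Nielsen Norman Group article linked below: if each usability problem surfaces in any single session with probability p, then n sessions are expected to reveal 1 - (1 - p)^n of the problems. A quick sketch (p = 0.31 is NN/g's classic average; your study's actual rate will differ):

```python
def proportion_found(n_users, p=0.31):
    """Expected share of usability problems uncovered after n_users sessions,
    assuming each problem appears in any one session with probability p."""
    return 1 - (1 - p) ** n_users

for n in (1, 3, 5, 10):
    print(f"{n:2d} users -> {proportion_found(n):.0%} of problems")
```

This is why a handful of users per group often uncovers most issues, while additional sessions yield diminishing returns.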
Running the Tests
- In addition to the test's moderator (the person who asks the participant questions and gives them tasks), have a second person act as the note taker to capture observations.
- Make sure the testing environment is similar to the participant's real environment. You might even consider meeting them where they normally do their online research.
- Before starting the test, make participants feel comfortable. Explain that you are evaluating the site, not them personally; there is nothing they can say or do that is wrong. It's also OK to ask a couple of non-study questions to establish rapport and trust.
- Sit next to them (not across from them), so you can observe how they use the screen.
- Be objective in how you form questions; do not ask leading questions. For example, do not ask “How easy was it to …”. A better way to phrase it would be “How easy or difficult was it to …”
- Ask open-ended questions which allow more explanation (rather than Yes/No questions). For example, "Please tell me more about ..." or "What did you expect to happen when ..."
- Be neutral; avoid showing a positive or negative reaction whether or not a user completes a task.
- Record the session so you can share it with others on your team who cannot observe the sessions. Showing a video clip of a user struggling is more compelling than describing it in words.
- After each session, debrief with the note taker and any observers about what they observed.
Analyzing Test Results and Recommending Changes
- Be objective - take both positive and negative feedback into account
- Pay more attention to what users do rather than what they say they do
- Describe what you saw/heard/observed, before deciding how to respond
- Do not jump to conclusions after a couple of sessions
- Rank and prioritize issues by severity and risk, choose the most important ones to address, and identify issues that may need further testing
Test iteratively. Make small changes, and then test again.
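When ranking issues by severity and risk, one common approach is a simple severity-times-frequency score for sorting the backlog. A hypothetical sketch (the issues and scale are illustrative, not a prescribed Primo workflow):

```python
# Each observed issue: (description, severity 1-4, participants affected)
issues = [
    ("Could not find the sign-in link", 4, 5),
    ("Missed the 'Exclude this' filter option", 3, 3),
    ("Confused by wording on the save dialog", 2, 1),
]

def priority(issue):
    _, severity, affected = issue
    return severity * affected  # crude impact score: worse + more frequent = higher

for desc, sev, hits in sorted(issues, key=priority, reverse=True):
    print(f"score {sev * hits:2d}  severity {sev}  users {hits}  {desc}")
```

Whatever scoring you use, revisit it after each testing round, since fixes can shift which issues matter most.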
Additional Resources
- Turn User Goals into Task Scenarios for Usability Testing – Nielsen Norman Group
- How a Complete Novice Learned User Testing in 10 minutes
- Usability testing hints, tips, and guidelines
- Talking with Users During a Usability Test
- Usability testing in the Academic Library – Digital Commons
- Why You Only Need to Test with 5 Users - Nielsen Norman Group
- Templates and Downloadable Documents
Primo Usability Test Script for Librarians
- Introduce yourself
- Reassure the participant that you are testing the site, not them
- Encourage the participant to 'think out loud'
- Ask them to sign a consent form/recording release (if applicable)
- Start recording the session (if applicable)
Record Basic Information
- Participant's name
- Date of session
- Moderator’s name
- (For librarians and faculty) What is your role?
- (For students) What is your year in school and your field of study or major?
1. Think of a current (or recent) project that involves online research. Spend 5 minutes using this site to do that research.
This general task gets the user comfortable and in the mindset of doing online research. It also lets you observe more closely how the user conducts online research on their own.
2. You have a short writing assignment on the spread of the Zika virus. Find an article from a peer-reviewed journal that you can use for your paper.
Check whether users have any issues locating articles from a peer-reviewed journal (a common user goal).
Do users notice and/or use the 'articles' option in the search form's autosuggestion list?
Do they use Advanced Search? If so, do they experience any issues with the form?
Do they use the result filters? If so, do they experience any issues with using filters?
3. Now limit your results to only articles written by Mark Fischer.
In Primo, filters scroll independently from the list of results and we wanted to know if that poses any usability issues. So, we set up a task where a user would use the 'Creator' filter, which requires scrolling to view.
4. Imagine that one of the result items is perfect for a friend. How would you email it to them?
Tests the ease of use of emailing result items.
5. Run a new search on early childhood education programs. Adjust your results so they do NOT contain any newspaper articles.
Tests 'Clear filters' and 'Exclude this' functionality.
6. Now find 3 peer-reviewed articles on this topic and add them to a list that you can get back to later. Use Primo to make this list.
Tests the usability of the 'save' functionality.
7. This site allows you to access information on library material you have checked out. For the purposes of this test, assume you have an item coming due; please renew that item.
Do users know where to go to log in?
Do they experience any issues with signing into their library account?
Are users able to renew an item?
8. Go to the list of articles you just made and add a label to each item.
Tests whether users can locate the list of saved items and use the 'label' functionality.
Post-Test Survey Questions
These questions could be asked verbally, or you could ask them to fill out a survey form.
What did you like most about the site? Why?
What would improve the site? Why?
Stop the recording (if applicable).
Give them their incentive, such as a gift card (if applicable).
Thank them for participating in the test.