Writing usability test questions sounds easy at first. You just have to ask users what they think about your product, right?

But as you start brainstorming questions, you’ll quickly discover it’s not so easy after all. For starters, people interpret your questions differently depending on how they’re worded. Plus, writing questions that yield objectively useful product insights is challenging. Which users should you ask? What type of product experience do they need? With so many variables to consider, it’s easy to see why creating the right usability test questions takes time.

To help you figure out what types of questions to ask participants, we put together 23 examples of usability test questions you should definitely ask—along with why—as well as 11 you definitely shouldn’t.

Why choosing the right usability test questions matters

It’s one thing to ask the opinion of users—they’re usually willing to share—but it’s another thing entirely to ask questions that get to the heart of their experience.

Usability testing is a form of research that helps you understand your users on a deeper level: their needs, their expectations, and how they use your app or website.

Usability tests help you:

  • Learn more about how users feel about your website or app. You’ll be able to determine whether these platforms offer value, are effective at helping users complete tasks, and more.
  • Uncover how well users know your website or app. You’ll learn whether users use your platform the way you intend them to or whether they’ve found workarounds because the platform isn’t intuitive.
  • Determine what issues users are encountering. You’ll get a better understanding of what frustrates users and what stops them from using your website or app more than they already do.
  • Design better workflows that get users to a solution faster. Based on the insights, you’ll be able to make changes to your website or app that improve the user experience.

Before you ask your first question, define the objective of the usability test. For example, decide whether you want to learn about a particular workflow, how easy your app is to use, or how you’ve formatted and laid out information. A clear objective, set early on, helps you create the right types of questions.

Your usability test questions have to be specific, relevant, thought-provoking, clear, and concise, and they should guide the conversation rather than lead it. This is easier said than done. These questions also have to get to the heart of why users use your products the way they do. Not all users are conscious of the ‘why’ behind their actions, so your questions have to help them talk through their usage patterns.

In addition, your usability test can also include an exercise component where users log onto your app or website and talk through what they’re doing. As they navigate the system, observe their actions and ask thoughtful follow-up questions as to why they complete certain tasks the way they do. Their responses will give you deeper insights into how well users understand your products and their pain points.

Without this approach, the data you collect won’t always match your objective or give you insights that you can use.

To help guide usability testing, there are four types of usability test questions:

  1. Screening questions
  2. Pre-test questions
  3. In-test questions
  4. Post-test questions

We’ll look at examples of usability test questions for each of these shortly. Before we do, keep in mind that to stimulate conversation and get meaningful feedback, you should include a mix of open-ended questions with follow-ups and multiple-choice questions. The more details you capture, the better.

Screening questions

At this stage of your usability test, you have to decide what types of users to include in the process. Depending on the objective of the test, create a list of criteria that defines an ideal participant. Do you want new users who’ve just started using your platform, or do you want power users with extensive experience?

Defining your ideal participants ensures that the data you collect comes from users most likely to give valid and relevant insights.

1. How old are you?

If part of your call for participants includes asking them their age range, use this screening question to verify that they are, in fact, within the desired range.

Confirming age is important because depending on the kind of usability testing you’re doing, participants in the wrong age range could skew your results. Since these results aren’t an accurate depiction of the user experience, they can’t be used in your analysis.

2. What’s your income level?

If your app includes payment tiers or you offer paid services on your website, ask participants to confirm their income level. This ensures that your usability tests target people who have the income to support using your product.

Consider segmenting participants based on the tiers they use. For example, group together users on the basic plan and group together users on the premium plan. Each tier includes different app features, so the experiences of paying users will differ. Segmenting them allows you to focus on the specific features users have access to.

3. What’s your highest level of education?

Depending on the content on your app or website, tracking education level allows you to ensure participants understand the industry you target. Let’s say your target audience includes law professionals, like clerks and lawyers. While a non-law professional might find a use for the information on your website, if they don’t actively work in the industry, they won’t understand the terms you use on your website or the processes you offer in your app.

By confirming participants’ education level, you avoid having to take time out of your test to explain terms or industry jargon.

4. When was the last time you used the website or app?

Whether you’re conducting usability tests with active, existing users or new users, this question will help you identify the right users. If you’ve recently made updates to each platform, then target users who’ve used the latest version. If you want to test how easy it is to use your products, then look for new users who have less experience with it.

5. How often do you use the website or app?

Depending on your objective, you’ll need users with different levels of experience. For example, new, moderate, and power users offer different types of insights. The amount of experience each user group has determines how deep their insights will be.

New users can talk about the onboarding process, while moderate users can talk about the general day-to-day features and the workflows they use the most.

Power users have used every aspect of your app or website, so their feedback is deeper than any other user group. They may even have suggestions that you haven’t considered but make sense to incorporate into one of your platforms.

6. How much time do you spend on the website or app?

Again, this question confirms which of your participants are active users. The more time participants spend on your platforms, the more experience they have using them.

You can also segment this group to get information from new or experienced users. New users can talk about how intuitive your platforms are, and experienced users can talk about the features they use.

How you choose to segment your participants depends on the objective of your usability test. The test questions you ask will also differ based on each group.

Questions to avoid

These are questions that lead participants instead of guide them. Leading questions skew your results and contribute to missed insights.

  • Is this feature slow or irrelevant? This is leading because any question that gives positive or negative suggestions influences participants’ answers.
  • Do you like the newer version of the app better? This leading question implies the new version is ‘better’ than the older version. Users will ultimately say the newer version is better, even if the older version had features they preferred.
  • Do you save a lot of time using the website? This question implies that it should take very little time to complete tasks on the website. A user might not be eager to share that it takes them a long time to find what they need.

Pre-test questions

With your preliminary list of participants confirmed, you’re ready to administer pre-test questions. These questions are meant to gauge the participants’ experience and their usage patterns. Similar to screening questions, pre-test questions eliminate participants who don’t fit the criteria you’ve established and can risk skewing your results.

Your pre-test questions should be designed to uncover whether or not participants know enough about your website or app to give the type of insights needed for you to make improvements or changes.

7. What do you use the website or app for? How often?

If you offer services on your website or app, this question helps you understand which services and related features users use the most. If you’re testing certain features, this question helps you identify the participants who use them.

Also, if your plan for your usability test is to focus on specific sets of users — like power users vs. new users — this question confirms their usage patterns. If a potential participant classifies themselves as a power user but only opens your app once a week for a few minutes, it’s safe to remove them from your list since they can’t give you the insights you’re looking for.

8. Which features do you use most?

This question will help you identify which features users rely on the most and which ones they use the least. Based on the answers participants give, ask ‘why’ to find out what it is about these features they like and don’t like. The data you collect will help you decide whether to keep features, remove features, explain to users why they’re useful, or make adjustments to meet user needs.

9. How satisfied are you with the available workflows?

When you designed your website or app, you had specific workflows in mind. This question uncovers whether these workflows are intuitive or whether users have come up with their own.

The questions up to this point have been open-ended. This question offers an opportunity to use Likert scale questions. These are questions that ask participants to share how they feel about something. The results of this question help you easily compare users’ experiences. This, in turn, helps you do different kinds of analysis. For example, you can compare different pages on your site or different sections based on a standard review formula.

This approach also helps clear up any vague, generalized responses users gave to earlier questions.

Use the following scale to track responses:

  • 1- Satisfied
  • 2- Somewhat satisfied
  • 3- Neither satisfied nor dissatisfied
  • 4- Somewhat dissatisfied
  • 5- Dissatisfied

Once each participant answers, follow up with ‘why’ to allow them to explain their thoughts in more detail.
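Once responses are collected, the comparison across pages or sections that this scale enables is simple arithmetic. A minimal sketch in Python, assuming hypothetical page names and responses on the 1-5 scale above (where lower averages mean more satisfied):

```python
from statistics import mean

# Hypothetical Likert responses on the article's 1-5 scale
# (1 = Satisfied ... 5 = Dissatisfied, so lower averages are better)
responses = {
    "checkout page": [1, 2, 1, 3, 2],
    "search page":   [4, 3, 5, 4, 2],
}

# Average each page's scores so they can be compared on a standard formula
for page, scores in responses.items():
    print(f"{page}: {mean(scores):.2f}")
# checkout page: 1.80
# search page: 3.60
```

In this made-up sample, the search page would be the one to investigate with follow-up ‘why’ questions.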

10. What other apps did you use or research before choosing this app?

This question helps you verify which competitors users compared you to. Are users comparing you to competitors you’ve identified, or are there new ones you should be aware of?

The list of competitors you generate from this question will help you figure out how to offer something better or completely different from the competition.

11. Why did you choose this website or app?

This question will identify which features and workflows triggered conversion. You may have your thoughts on what features are the most valuable to users, but their ideas might surprise you.

Use the data from user responses to update your advertising and promotions. For example, if the majority of users say your price is better than the competition’s, lead with this in your advertising campaigns. Potential users who are looking for an app, product, or service like yours and are comparing based on price are more likely to stop to find out more if you call out your pricing.

Questions to avoid

  • Have you used X product before? Unless the purpose of the usability test is to compare against a specific competitor, this question doesn’t offer any value or give you any insight other than yes or no.
  • Do you visit the website or app daily? This question also results in a yes or no answer that doesn’t give any insights you can analyze.
  • Do you use X feature? This question might inadvertently eliminate people who may not use a specific feature but still have valuable insights that will benefit your research.

In-test questions

This stage of usability testing is the heart of your research and yields the information you’ll use to uncover bottlenecks, pain points, and opportunities. The goal with these usability test questions is to have a conversation with participants vs. following a formal question-and-answer format. A casual conversation will help the usability test flow naturally and put participants at ease. The more at ease they are, the better the chances of them sharing their honest opinions.

12. When you log on, what’s the first thing you do? Is there another way to complete this task?

This question helps you identify how users use your app. Are they using it the way you intended, or have they created their own workflows because something is missing? This is also a great follow-up question to dig into why users use your app the way they do and to uncover suggestions they have for making it better.

13. How do you use X feature?

If you offer multiple features on your website, this question will help you get a deeper understanding of each one. The results of this question may prompt you to update your website navigation to simplify the options available in the menu.

You may find that while users like certain features and find it hard to find others, a simple solution might be to offer an onboarding tutorial to new users. This way, it’s clear early on where to find information on your website or in your app.

14. What parts of the website or app do you use the most? Why?

This is a great open-ended question because participants are free to talk about what matters to them. This question will also give you tremendous insights into what the majority of users find the most helpful about your website or app.

You can also use the answers to this question to verify updates currently planned in your product roadmap. If updates are in line with user preferences, you have a better chance of appealing to your target audience.

15. What parts of the website or app do you use the least? Why?

This question will tell you what users don’t like and can do without. Rather than see the answers to this question as a negative, look at the responses as an opportunity to get feedback on how to make these parts of your platform better. The information you get from power users is especially helpful here because they have the most experience with your product and can offer the most insight into what changes to make.

16. Do you like the interface? Is it easy to use?

65% of people are visual learners, so having a visually appealing app interface or a professionally designed website will help keep users logged on longer. Ask follow-up questions about what specifically users like about the interface and what they would change.

Also, discuss how many steps it takes users to get what they need in your app or website. Determine what the ideal number is and compare it to what users share. If it takes longer than you expect for users to complete tasks, the information you get from this question will help you simplify your interface.

17. What do you think about how information and features are laid out?

This also relates to how easy your app or website is to use. Use the information from this question to make changes to your platforms so it’s clear to users where to find what they need.

18. What do you think of X page? How easy is it to find?

This question is geared towards users who’ve used your app extensively. Review each page you want to collect data on to find out specifics on what users like and don’t like about them.

Questions to avoid

Avoid asking questions that result in yes or no answers. If you do ask one, follow it up with ‘why’ or ‘why not’ to get to the insights you need.

  • Do you like X feature? This question doesn’t tell you what users specifically like, or dislike, about a feature.
  • Do you like X page? Similar to the question above, this question doesn’t tell you what users like or don’t like about different pages of your website.
  • Is the user interface easy to use? This question doesn’t give users the option to explain their experience navigating the website or app.

Post-test questions

This is the final stage of the interview, and it’s a chance for you to ask follow-up questions you missed or to ask for clarifications. These usability test questions can be more general, but you should continue to use open-ended questions to maximize the amount of information you receive.

19. Overall, what’s your experience been with the website or app?

This question gives users the option to freely share insights that haven’t come up yet in your questioning. Ask them for specific examples based on what they share.

20. If you could change one thing about the website or app, what would it be? Why?

You can ask this question to both new and experienced users since they have different experiences. The insights you get from this question will help you make changes to almost every part of your product. For example, you can change the onboarding process for new users or change the menu options on your website or app for more experienced users.

21. What one thing are you most excited about with the website or app? Why?

Part of your response to user questions during the testing session might be to tease about updates that are coming to your website or app. Use your usability test to gauge the interest level of users. Their response, provided they’re the types of users these updates are meant for, will help you determine whether to move forward or make adjustments before launching the updates.

22. Why will you continue to use this website or app? What will stop you from using this website or app in the future?

These questions help you do two things:

  • Pinpoint exactly what makes your users the most satisfied about your platforms.
  • Identify what causes them frustration and what you need to revisit to reduce churn risk.

The most important of these two is reducing churn. The information you gather here helps you enhance your roadmap and plan for changes you know will improve the user experience.

23. How likely are you to refer this website or app? Why or why not?

This question lets you calculate your Net Promoter Score (NPS), which tells you how loyal your customers are and how likely they are to share your product with others. Ask users to score how likely they are to recommend your product on a scale of 0-10. Users who score:

  • 9-10 are considered promoters and are your most loyal users. They’re most likely to refer you to other people.
  • 7-8 are considered passives because they’re a little less engaged. If an app were to come along with a lower price or newer features, this group of users would likely churn.
  • 0-6 are considered detractors. They aren’t happy and are at risk of leaving or sharing negative feedback about your app or website with other people.

To get your NPS, subtract the percentage of detractors from the percentage of promoters. The result indicates how many promoters you have compared to detractors. The higher the number, the better, because you have more happy, loyal customers than unhappy ones.
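That subtraction is easy to automate once you have the scores. A minimal sketch in Python, assuming a hypothetical list of 0-10 responses:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical sample: 5 promoters, 3 passives, 2 detractors
scores = [10, 9, 9, 10, 9, 8, 7, 8, 5, 3]
print(nps(scores))  # 30
```

Note that passives (7-8) drop out of the score entirely; they only affect the percentages through the total count.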

Questions to avoid

  • Is there anything else you want to add? This question doesn’t guide participants, so they aren’t likely to share helpful insights. You’ll be tempted to read a lot into the answers, but don’t; users are just sharing random thoughts. The point of testing is to be rigorous and specific about what you ask.
  • Will you keep using this website or app? This leads to a yes or no answer that doesn’t tell you anything you haven’t already gathered from previous questions.

Use transcription to manage and organize usability test questions

Depending on how often you use usability tests and the number of questions you ask, a transcription service like Rev will help you save time. Instead of interviewers worrying about taking notes and capturing quotes, they can focus on asking questions and conversing with participants.

Once you’ve recorded the audio that captures the answers to your usability test questions, upload the file to Rev. Within 12 hours, you’ll have a transcript to review and results to analyze. With Rev, it doesn’t matter if there was noise in the background of your recording, if more than one person was speaking at a time, or if there were accents; the transcription has a 99% accuracy guarantee. Plus, with the transcript editor, you can review the final transcript before sharing it with your team.

Rev’s transcript editor in action

For usability tests where recording equipment isn’t available or isn’t working properly, Rev also gives you access to a voice recorder app. With the app, you can record high-quality audio interviews and seamlessly upload them to Rev for a full transcript. Plus, with the automated transcription feature, you can get a rough draft of your usability test interview within five minutes. This is handy when you have a quick turnaround of your analysis and need access to transcripts quickly.

Learn from your usability testing questions

The more usability tests you run, the more you’ll learn about the process. Use each of your tests as a learning opportunity and to improve the process. The goal with usability tests and the questions you ask should be to have a smooth flow and accurate results that you can use to enhance or change your website or app.
