How to Interview Users like a Pro | Do’s and Don’ts of User Research Interviews

User interviews are a critical tool for discovering user needs, evaluating product experiences, and identifying opportunities for innovation. But shoddy interview techniques can seriously undermine the validity of your insights. User research can feel intimidating, especially when you are new to it, but following a few fundamental guidelines can make a big impact on your data quality.

Whether you are conducting user interviews, contextual inquiries, or usability testing, the Do’s and Don’ts in this guide will show you what separates the user research pros from the amateurs: how to structure your interviews, how to word your questions, and how to respond to users in a way that produces impactful and trustworthy research insights.

To start off, it is important to keep in mind a few key principles that guide user research best practices.

Principles of Good User Research

The following principles can be utilized in general to help you make decisions before and during user interviews. The Do’s and Don’ts below all follow these general best practices for user research.

1)    Our main goal is to learn.

During the interview, you may feel competing goals pulling at you (such as selling users on a concept or explaining how to use a product), but these need to be pushed aside to gather the insights we want. Every part of our research plan should contribute to us learning from and about users.

2)    Users are honest with people they like.

Because we are all human beings, we cannot just jump into a session firing off questions and expect to get in-depth answers. If we want people to be open and honest with us, we have to build the right kind of rapport and environment: neutral, attentive, and inviting. The Do’s and Don’ts below will help you build that specific type of rapport with your interviewees.

3)    Objective data is the best data.

The way we present ourselves and our designs, the way we phrase questions, and the way we respond to people will all affect how they respond to us. It is important to conduct our interviews in a way that allows us to learn users’ true experiences, attitudes, and behaviors, not to influence them to tell us what we want to hear.  

4)    The user is the master, the VIP.

This principle alone will get you very far in user and UX research. When in a research environment, treat users as the star of the show. This means you talk less, and the focus is not about you or your product, but about the participant, their process, their experience, and their perspective. Remember that they are the experts in their own usage and their own thoughts.

Now that we have covered the principles that guide user research interviews, let’s talk about specific strategies to implement them. Below are six do’s and don’ts of conducting user interviews.

The Do’s and Don’ts of User Research Interviews

  • Do be neutral and curious about all user feedback, especially negative feedback. Don't be defensive or try to sell your product to the interviewee.

  • Do start with open, broad questions and narrow in as you go according to the user’s responses. Don't start with closed questions that lead the user down a predetermined path.

  • Do time the reveal of any key information and designs to occur after general questions have been asked. Don't show designs or reveal information to users before you learn their unbiased thoughts and experiences.

  • Do allow users to form and express their own opinions and perspectives. Don't lead users with overly positive or specific language.

  • Do reflect users’ questions back on them to understand why they asked and what they expect. Don't answer users’ questions about how the product or design works before learning why they asked and what they would expect.

  • Do probe deeply into users’ comments and behaviors to identify core needs and root causes. Don't accept users’ surface-level comments as the full response and move on.

1. ABC = Always Be Curious

DO be neutral and curious about all user feedback, especially negative feedback. DON’T be defensive or try to sell your product to your interviewee.

Negative feedback and criticism can be uncomfortable to sit through, but it can also lead to some of your most impactful learnings. Responding with open curiosity towards negative feedback will tell users that you are on their side and genuinely want to hear their experience. Conversely, defending or even explaining your product after criticism is a quick way to ruin your rapport with the interviewee, shut them down, and destroy your opportunity to gather valuable data.

Especially as a new researcher, you may feel an itch to explain, defend, and sell the experience of your product. This is a natural, human experience that even experienced researchers have to fight! Try to separate yourself emotionally from your company, product, and designs. Your goal is NOT to make users like your product or explain why certain things were done a specific way. In fact, doing either of these things will only turn your interviewee off.

I often hear this creep in through comments like, “We had to design it like that because…” or “You could do it this way to avoid that problem.” Statements like these tell the user that you do not really care what they think or feel and signal that you are more interested in influencing them than learning about their true experience.

To gather the most valid data, make sure to respond to all user feedback, regardless of whether it’s positive or negative, with a genuine, neutral curiosity. Below are a few examples of how to respond to users’ negative feedback.

How to respond when users give negative feedback about your product in an interview?

  • “That’s so interesting. Could you tell me more about that?”

  • “That is really good for us to know. Could you walk me through what you are talking about?”

  • “Thank you for sharing that! I would love to hear more about your experience.”

  • “Oh no! That doesn’t sound ideal. Could you tell me more about it so we can learn how to improve?”

This final example makes the most sense for highly emotional comments and frustrated interviewees. Try to match their intensity level, but step it down slightly. So if they are royally ticked, you might intensify this statement (e.g., “Oh no! That sounds very frustrating. Could you tell me more about it?” or even, “It sounds like we are failing you a bit there. Could you explain what went wrong?”).

These kinds of responses from the researcher will build empathy with your users and show them that you are on their side. In turn, they will be more honest and open with you: win-win! The more you practice this style of communication, the easier it will become.

2. Open -> Focused

DO start with open, broad questions and narrow in as you go according to the user’s responses.
DON’T start with closed questions that lead the user down a predetermined path.

Any question you ask can be open-ended or closed-ended. For closed questions, you provide a set of specific responses, often yes or no, for your research participants to choose from. Closed questions are often used in surveys because they can be answered very quickly and provide a structure to the responses. Open questions provide less, if any, structure for how they should be answered, allowing a broader range of responses.

An example of an open question is, “What do you think of this idea?” If we wanted to be even more open, we could just present the idea and be silent. (That’s a tactic that actually does work!) Closed versions of this same question could be, “Do you like this idea?” with the user responding yes or no, or “Is this idea good or bad?”

Which type is better – open or closed questions?

You will commonly hear people say that open questions are better than closed questions. While that can sometimes be true, in reality they just serve different purposes, and using either for the wrong reasons will actively work against your research goals. That said, there are a few reasons that open questions are typically preferred in user research.

Why interview experts recommend open questions over closed questions:

  1. In user interviews you have the opportunity to converse with participants to get longer, more in-depth responses from them. If you have decided to conduct user interviews over other more quantitative methods, it will generally be because you need to gather deep, qualitative insights, which would be difficult or even impossible through closed questions. If you truly need to ask a series of closed questions (this is not very common), surveys would be an easier and cheaper way to do that.
  2. Open questions allow users to respond according to what is most important to them, instead of what is most important to the researcher. This means you can differentiate between organic feedback (something the user brings up on their own and is likely important to them) and prompted feedback (something that may or may not be important to them, but they mention it because you asked about it specifically). This distinction is incredibly useful in interpreting your findings.
  3. It is much easier to accidentally be leading and biased when asking closed questions versus open questions. This is because closed questions are more specific, and the more information you include in the question, the more you are influencing your participants’ responses before they have a chance to come up with their own. Closed questions in an interview take a lot of skill and forethought to avoid leading the interviewee.

When to use closed-ended questions in a user interview

Closed questions can help us probe deeper, link from one topic to the next, and provide some structure and uniformity to our data across interviews. However, it is imperative that you always start with open-ended questions at the beginning of each new topic or idea to maximize learning. After you ask one or two closed questions, you will typically need to open it back up again to keep the interview flowing.

Here is an example of a user interview script that starts open, gets more focused, then opens up again:

  1. (Open, broad) Can you tell me about your experience using our website?

  2. (Open, narrower) Why do you use our website?

  3. (Open, specific) What is the first thing you do when you open our website?

  4. (Closed linking question) Have you ever used Feature X? Why or why not?

  5. (Open, broad) What has that experience been like?

If users never mention Feature X in the first three questions, that is good to know. It may mean the feature is not very memorable or important to them. But what would you and your team do with that data? Probably not much – if you ended the interview after those three questions, you would likely need a second interview to learn more before you could make improvements to the product.

Additionally, what if instead you had skipped question 4 and gone straight to asking, “What do you think of Feature X?” and the interviewee had never used the feature before? Maybe they would say they are not familiar with it and you move on, but maybe not. People want to be nice and sound natural in conversation, so asking them an open question about something they have no significant opinion of may facilitate them reporting a quickly formed, loosely held opinion that is of little real importance. This is called the query effect. In the above situation, asking the closed question first helps you avoid this conundrum.

In summary, start with open questions in your interview to give users control to guide the interview and tell you what is most important to them. If you have more focused or specific questions, always save them for after open questioning. Use closed questions strategically throughout the interview to transition between topics and gather targeted data.

Use the other tips in this list to carefully word your closed questions to ensure you are not falling into common traps that give closed questions a bad name.

Asking a closed question is not the end of the world, but doing it too often can cause bias creep: the path your interview takes becomes heavily skewed by your series of too-specific questions, leading the user to talk about things they do not actually find important. The risk is that you gather a lot of data about topics of little real relevance to users and miss out on important discoveries.

What if I accidentally ask a closed question in a user interview?

If you notice while you are speaking that your question will be closed when you meant for it to be more open, you can append it with something like “or not” or “if at all.” This will help you get out of asking a closed, leading question. You can also append the question or follow it up with, “and why?” So, “Which design do you prefer?” becomes, “Which design do you prefer, and why?” This prompts users to give a more in-depth response than the purely closed version. Another way of handling it is simply to follow up with an open question.

3. Progressive Disclosure

DO time the reveal of any key information and designs to occur after general questions have been asked.
DON’T show designs or reveal information to users before you learn their unbiased thoughts and experiences.

Showing users a design or an idea can frame their thinking for the rest of the interview, making it harder (or impossible) for them to think outside the box. That is where progressive disclosure comes in. With progressive disclosure, we time the introduction of our own ideas, concepts, and designs to come after we have gathered users’ original feedback and product ideas. This gives us the chance to gather the most unbiased feedback we can at every point in the interview.

A discussion guide using progressive disclosure might flow like this:

Section 1: Open, discovery questions about users’ daily behaviors and workflows.
Section 2: Discussion about their experience with your product and similar products.
Section 3: Questions to discover their pain points and ideas for improvements.
Section 4: Present new design sketches and get their feedback on them.

Ordering the interview sections this way allows users to bring up what is important and meaningful to them at every given point. By the time you get to Section 4, you will see whether or not users naturally bring up the concepts from your new designs. If you were to instead start with Section 4, you would influence your users to think in terms of the designs you showed throughout the entire session. They might mention your concepts frequently, but not because those concepts are inherently meaningful to them.

Additionally, if you were to ask them the questions in Section 4 (presenting your designs/concepts) before those in Section 3 (discovering their frustrations and potential improvements), they would most likely only give you the same ideas you already showed them. When we put the sections in the correct order (like the example above), users will focus their suggestions around their biggest pain points, and the improvements they discuss will give you key insights into their goals and how they think.

In order to get the best quality data at each point in the interview, be very intentional about when and how you introduce new concepts and present any designs or visuals. Before any section where you will be showing or guiding the user in any way, ask yourself if there are any general questions you need to get out of the way first.

4. Avoid Leading Users

DO allow users to form their own opinions and express their own perspectives.
DON’T lead users with overly positive or specific language.

Be intentional with the way you present concepts, ask questions, and respond to participants to avoid leading and biasing them. Leading questions and prompts will generally include an opinion or unfounded assumption in their wording. There are three key characteristics that make a question leading.

Characteristics of Leading Questions (You Should Avoid)

  1. They insert an opinion by including either only positive or only negative language.

This tells the user that you have taken a side on the issue. If they disagree with you, it will be socially awkward for them to express that. Even if they happen to agree with the assumption in your question, this sets up the expectation that there are right and wrong answers (which is not the case). Look for words like “easy,” “difficult,” “better,” or “worse” used as the sole descriptor in the sentence.

Leading: “What do you think of the improvements we made to our product?”
Improved: “What do you think of the changes we made to our product?”

  2. They force the user to disagree with you to give criticism or negative feedback.

It is impolite in typical conversation to disagree with someone or criticize something they are showing you. To get the most honest feedback, word your questions so that users can respond in any possible way without having to disagree with you.

Leading: “How easy was that?”
Improved: “How easy or difficult was that?”

  3. They state your own inferences or assumptions as facts.

Rephrasing facts as questions is a great technique for probing deeper during an interview. Where you get into trouble is when you mix up your own assumptions with what the user actually said or did, and instead put words in their mouths or thoughts in their heads. Not only can this ruin a question, it can ruin your rapport with participants.

I call this mistake research-splaining: when the researcher states their own assumptions as facts, as if explaining to the user what they are experiencing, rather than learning about the truth. Research-splaining is one of the most insidious mistakes one can make as a researcher. Remember that we conduct research because we don’t have the answers. Practice separating observations from inferences, and when you are in an interview, use observations to probe into users’ behavior and comments. Stay neutral, curious, and try to avoid making strong inferences until the analysis phase of your research.

Leading: “You had no idea what was going on – can you tell me about that?”
Improved: “You just said you ‘weren’t sure’ – can you tell me more about that?”

Leading: “Why were you so frustrated with that?”
Improved: “You let out a few sighs there – what was going on?”

Leading: “You thought that feature was really helpful – what was helpful about it?”
Improved: “You mentioned that feature several times – what do you think of it?”

Notice that the improved versions of these questions are always more open and less specific than the leading versions. They provide optionality and freedom for the interviewee to respond honestly.

What if I accidentally ask a leading question?

It happens, even to experienced researchers! If you realize part of the way through your question that it is leading, try adding “or not” to the end. For example, “Does that make sense?” becomes, “Does that make sense, or not?” In similar fashion, if you accidentally include a polarizing word, like “easy,” you can include an “or (opposite).” So, in this case, “What made that easy?” becomes, “What made that easy or difficult?”

Another technique is to follow up with additional questions that allow them to express other opinions. If you accidentally asked, “Was that easy to understand?” then you might follow up with, “Was there anything you didn’t understand?” and, “Is there anything we could improve to make it more understandable?”

 

5. Reflect Users’ Questions

DO reflect users’ questions back on them to understand why they asked and what they expect.
DON’T answer users’ questions about how the product or design works before learning the reason for their question.

Users naturally ask lots of questions during interviews and product testing. It is incredibly helpful to hear what sort of questions people have about a certain topic or workflow because you will need to address those during the design process.

But remember that in a research setting, your goal is to learn, not to train and teach. When users or interviewees ask a question, they are giving you a very valuable insight into their thought process, goals, behaviors, and pain points. If you answer a participant’s question, you will miss the opportunity to probe and learn more.

This is a little contrary to typical conversation patterns. In normal interactions, it might seem odd or even rude to reflect someone’s question back on them. If someone asked you what you wanted for lunch and you responded, “That’s very interesting – what would you expect us to have for lunch?” you would probably get a perplexed look.

But that is exactly the sort of interaction you want to have with participants in an interview. Here are a few ways to respond to users’ questions during an interview.

How to respond to users’ questions during interviews

  1. Use a simple probing question to learn more. Common ones are, “That’s an interesting question, can you tell me more about that?” or, “Why is that important for you to know?” You could even use something as simple as, “Why do you ask?” – just make sure you do not sound defensive when you say it. Your goal is to understand why they are asking the question, not challenge them.
  2. Ask them to answer their own question, tell you what they would want, or what they expect to happen. You can compare their response to how the product actually works, which shows whether a system is immediately intuitive or not. If you instead tell users the “answer” to their question, you cannot reliably backtrack to determine how well it matched their expectations. “What do you think/expect?” is a simple way to do this.
  3. Tell them you don’t know. This response is commonly used with questions about upcoming features and products. If users ask me whether a feature will be released soon, or if a bug will be fixed, I always reply that I’m not sure, even if I do know. Let them know you will take their question as a note back to your development team.
  4. Defer to the end of the interview. There may be some questions you do want to answer, but answering during the session may influence their responses and bias your data. For example, if a user describes a pain point and asks if there is an easier way to complete the task, you might want to give them that information, but not before you learn everything you can about their experience and expectations. Here is a type of response I use often: “That’s a great question. Would it be ok to just get your take on it for now – then at the end I can answer your questions?”

Almost no questions truly need to be answered during the interview itself. If you can make your participants’ lives easier by answering a few of their questions, note them and come back to them at the end.

Overall, avoid the urge to demonstrate, explain, and train your interviewees. Remember your main goal: learn! Everything else can wait until after the interview portion is over. If you set this as an expectation from the start of the session, your users will understand. Try including this line in your introduction at the beginning:

“It is really helpful for us to know what questions you have during the (interview/activity), but just know that I might not answer them right away. In fact, I will probably respond by asking you to answer them. This is because we need to learn what you think, expect, and want. So please vocalize any questions you have, and if there is anything I can help you with, I will make a note of it for discussion at the end.”

6. Probe to Root Cause

DO probe deeply into users’ comments and behaviors to identify core needs and root causes.
DON’T accept users’ surface-level comments as the full response and move on.

Imagine you come back to your product team after conducting research and say, “Users like to open their phones first thing in the morning.” Their first question would be, “Why? What do they do that for?” If you neglected to probe deeper into this observation, it wouldn’t matter if you had observed it in 100,000 users – it isn’t very useful on its own. That is because it is a surface-level observation, not an insight or root cause.

Many comments users make are like this, and your job as a researcher is to explore deeper to understand the reasons behind their comments and behaviors. People do not look at their phones in the morning for the sake of looking at their phones; there are one or more user needs buried in there. These needs could be satisfied in a multitude of ways, but only if we discover them. And designing based on surface-level learnings can lead to a shallow experience.

This is the crux of user research: delving into users’ comments and behavior to identify core needs, motivations, and pain points. The best designers and engineers in the world couldn’t build an excellent user experience without these inputs. Until we observe users saying or doing something, it is really just an assumption. Our goal with user research is to shorten our list of assumptions and lengthen our list of objective findings and data-informed insights.

Often, we need to peel back multiple layers to get to the true need. This could mean asking four or five “why” questions to dig deeper after a user comment or observation. Sometimes those questions may feel very basic, but we still need to ask them, even when we think the answer is obvious – users can surprise you! The fewer mental steps we have to take to go from observation to insight, the more reliable our findings will be. Here is an example of how this might play out:

Example of Probing Questions in a User Interview

Researcher: What is the first thing you do in the morning?
User: First thing, I open my phone.

R: Why do you open your phone?
U: I need to look through my email and my schedule for the day.

R: What are you looking for in your email and schedule?
U: I look for if anyone has emailed me since I stopped working yesterday, and when my first meeting is.

R: Why is that important to you?
U: Because if I don’t have any concrete meetings scheduled for the morning, I like to go for a jog. Otherwise, I go jogging around lunchtime.

In this example, if we had stopped after the first few questions, we would have an incomplete understanding of the motivations and purpose behind the user behavior. Some people will divulge their true user need early in the questioning process, and some will require more rounds of questions to get there.

How do I know when to stop probing in user interviews?

You will reach a point in probing where you have learned a core user need or pain point, and probing any further would result in broad answers that are not particularly useful to your product application. A core user need expresses a value or goal that is intrinsically important to users, meaning that it drives their behaviors and attitudes. If you have received an answer and you aren’t sure whether to stop or keep going, ask yourself: is this attitude or goal the cause of their behavior, or the result of it?

True user needs are generally the causes of behavior, not the other way around. Below are a few examples of surface-level observations and their corresponding user needs.

  • Surface-level observation: Check phone to determine when my first meeting starts.

  • Core user need: Find the best time in the day to go for a jog.

  • Surface-level observation: Check my credit card transactions daily and weekly.

  • Core user need: Make sure my dining out stays in budget so I can meet my savings goal.

  • Surface-level observation: Email my assistant 2-3 times per day.

  • Core user need: Stay informed on my top tasks and meetings so I know how to best use my time.

Going one step too far during your exploration may not always help you make specific design decisions, but it can often still be enlightening for overall product vision. In the examples above, if you continued to ask why, you might get answers like, “I need to exercise to be healthy and feel good,” “I want to buy a house so I have a nice place to live with my family,” or “I have to be as productive as possible to maximize my profits and live the life I want.” These are all excellent motivations that help you ensure your product is providing value to users and helping them achieve their broad goals.

 

Following the tactics and principles laid out in this guide will help you gather high-quality, reliable, trustworthy, and most importantly, decision-worthy insights.


Need some help crafting your perfect interview plan, or mastering your user interview techniques? I offer consulting, mentoring, and research services. See my research services or reach out to julia@usermoxie.com to learn more!

 

For a tutorial tailored specifically to usability tasks, see our guide, How to Write Effective Usability Tasks.

 

What did you think of these user interview tips? What other tips do you have about interview techniques? Let me know in the comments below!
