Summary: User interviews have become a popular technique for getting user feedback, mainly because they are fast and easy. We use them to learn about users’ perceptions of our designs, not about their usability.
A user interview is a UX research method during which a researcher asks one user questions about a topic of interest (e.g., use of a system, behaviors and habits) with the goal of learning about that topic. Unlike focus groups, which involve multiple users at the same time, user interviews are one-on-one sessions (although occasionally several facilitators may take turns asking questions).
Interviews give insights into what users think about a site, an application, a product, or a process. They can point out what site content is memorable, what people feel is important on the site, and what ideas for improvement they may have. They can be done in a variety of situations:
- before we have a design, to inform personas, journey maps, feature ideas, workflow ideas
- to enrich a contextual inquiry study by supplementing observation with descriptions of tools, processes, bottlenecks, and how users perceive them
- at the end of a usability test, to collect verbal responses related to observed behaviors
First and foremost, think of an interview as a type of research study, not a hype session or an informal conversation. Then, consider the following tips to make user interviews most effective.
Ask product stakeholders what they want to learn. From their desires, determine the main goal, ensuring that it’s realistic. Too broad a goal, like “learn about users,” is likely to make interviews fail, because it will not focus your questions in a direction relevant to your design needs. A concise, concrete goal related to a specific aspect of the users’ behavior or attitudes can bring the team to consensus and direct how you’ll construct the interview.
People are more likely to remember, talk, and let their guard down if they feel relaxed and trust the interviewer and the process. Here are some tips for an effective interview.
- Have a video call or phone call (or at least some interaction) with the user before the interview itself.
- Before the interview day, and also at the start of the actual interview, explain the reason for the interview, and how the data from it will be used.
- Make the user feel heard by taking notes, nodding, making frequent eye contact, offering acknowledgments like “I see,” and repeating the words the user said.
- Let users finish their thoughts. Do not interrupt them.
- Don’t rush the user. Pause. Slow down your pace of speech. Talking slowly has a calming effect and indicates that you are not anxious and that you have time to listen.
- Start with questions that are easy to answer and that are unlikely to be interpreted as personal or judgmental. For example, instead of “What was the last book you read?” try “What do you like to do in your spare time?” The latter is open-ended, while the former assumes the user read a book recently; those who did not may feel stupid.
- Show some empathy by asking related questions. But recall that it is difficult to act sympathetic without also being leading or making assumptions. For example, imagine that a user said he could not reach the customer-support team. You can show some concern by asking the user to elaborate: “You couldn’t reach support. Can you tell me more about that?” You could even try a question like “How did that make you feel?” but only if the user did not already indicate how he felt. For example, if the user already verbally or even nonverbally expressed frustration when recalling the event, then asking how he felt would seem as though the interviewer had not been listening. As an empathetic human being, you might want to say, “That must have been frustrating,” or “I’m sorry your time was wasted like that.” But those would be leading points. Instead, asking a question that relates to the users’ feelings can show that you are listening and feel for their plight. At the absolute end of the interview, you can express some of these more apologetic sentiments.
- Be authentic, and don’t fake empathy. Acting can make you appear disingenuous. It is better to be yourself; don’t say something if you don’t genuinely feel it.
Keep in mind that there’s a big difference between rapport and friendship. The user does not have to really like you, think you’re funny, or want to invite you out for a cup of coffee in order to trust you enough to be interviewed.
While we will likely think of questions while sitting with the user, do bring to the interview a list of questions you aim to have answered. A question list ensures that you will:
- be able to get the team’s feedback about your questions before the interview
- remember everything you wanted to know and ask users about as many of the right topics as possible during the interview
- construct clear, non-leading questions better than you would in the moment
- overcome your own stress or fatigue by having questions on hand to refer to
Of course, the whole reason we are doing interviews is because we don’t already know or feel completely confident about what people will say. Yet, anticipating answers to the best of our ability can help us better prepare for the interview.
Think about what you would do if you hit a dead end — in other words, if the user did not have a response for your question. Are there ways in which you can help the user to find an answer? For example, imagine we are working on a travel approval website, and that a participant was recruited because she has requested travel within the last 6 months. Let’s pretend that some of the research goals of the interview are:
- Do people remember how they requested travel?
- What’s memorable about the process?
- What do users feel is easy about requesting travel now?
To begin, ask users if they can recall a time when they requested travel. Prepare additional questions in case they can’t remember a relevant event right away.
- In each question, ask for just one thing. Instead of “Do you use a navigation system, and if so, which one?” try “How often do you use a navigation system?” then follow up with “Which one or ones do you use?”
- Jog the memory by asking about specific events rather than about general processes. Remembering an incident will nudge the user’s memory and enable them to talk about precise occurrences.
- After you ask about an event (e.g., a conflict in a travel request), wait a few moments to give the user an opportunity to think about that event. Then begin asking questions about the event, such as, “When did that happen?” or “What were you expecting to happen?”
Ideally, our questions should elicit rich, unbiased answers from the interviewee.
- Leading questions prime the user by inadvertently suggesting a response. For example, a question like “Why do you enjoy using the Approvals application so much?” suggests that the user uses the product and enjoys using it. A better question might be “Why do you use the Approvals application?”
- Closed questions elicit “yes” or “no” answers. For example, if an interviewer asks, “So, you use the Approvals product each week?” then the participant could sincerely respond with just a “yes” and not elaborate. A better question might be “Can you tell me about how you use the Approvals application?”
A caveat: while closed questions are less likely to elicit wordy answers, they are easier for users than open-ended questions. Sometimes, you can precede an open-ended question with a closed one to ease the user into a topic or protect users from feeling stupid when they don’t remember an event.
For example:
- “Do you remember when that happened?”
- “Yes.”
- “When was it?”
(This type of question sequence is okay during a user interview, but is less appropriate in a usability test, where we want to limit interaction with the user as much as possible.) Vague, ambiguous questions are difficult to understand and often confuse participants. They can also make people feel uncomfortable or guilty for not understanding what you mean. To figure out if a question is too vague, consider informally testing it with random people to see if they understand what you mean.
Some participants like to talk and give very long answers to questions. Others need prompting in the form of follow-up questions to deliver the same amount of information. Be ready to address both situations.
Some researchers confuse the user-interview method with the usability-testing method. While the methods do have some commonalities and a user-testing session may include an interview at the end, the differences are many and important. Some of these differences are summarized in the table below.
| | Interview | Usability Test |
| --- | --- | --- |
| Whether a design is easy to use | No | Yes |
| What makes a design easy or difficult | No | Yes |
| Whether people believe they would use a design | Yes | No |
| Whether people would use a design | Maybe | Maybe |
Unlike behavioral data that captures how participants interact with a design, data from interviews is self-reported — it reflects users’ perceptions and feelings about a process, a site, or an interaction. Like any self-reported data (including that from focus groups and surveys), interview data is tenuous because:
- Human memory is flawed, so people don’t recall events fully or accurately.
- Participants don’t know exactly what is relevant for the interviewer, so they sometimes leave out details. They usually don’t think minor interactions are important enough to bring up.
- Some people are proud or private, others are shy and easy to embarrass. Thus, not everybody will share every detail with a stranger.
Interviews are a quick and easy way to get a sense of how users feel, what they think, and what they perceive to be true. Do them, but complement them with observation-based research to attain an accurate and thorough sense of what users really do and greater confidence in the information you collect.