Discovery interviews are a universally useful, flexible, and simple method to understand entire user journeys, build personas, and ground all work that follows.
Transcript
[00:00:00] Hello, everybody. If you are interested in data, design, product development, or any of the sorts of things I talk about on this podcast, subscribe to my newsletter, Seven Things, at seventhings.thomasessl.com.
[00:00:17] When beginning work on any project, it is critical to gain a realistic and accurate impression of the process or user journey you are trying to innovate in. This starts with some secondary research, such as online research or stakeholder interviews. But to get an accurate look into a user journey and identify where we can really make the biggest difference,
[00:00:40] we have to conduct discovery interviews.
[00:00:51] Hello, and welcome to Product Nuggets. I'm Thomas. As I've mentioned before, there is a time and place for every research method, but this one really is at the core of design research. As the title implies, you get all the insights you didn't know existed and answers to the questions you didn't know how to ask.
[00:01:09] These insights are called unknown unknowns. You're basically having a purposeful conversation where you provide the prompts for users to share their reality and what they perceive as important about an experience, service, or problem space. These conversations are focused on bringing clarity to a specific user journey for a specific user group.
[00:01:30] What I aim for as an output in the end is a complete process diagram that maps the user journey out in detail, marked up with problems, or pain points, and opportunities, or gain points, for each part of the process. This will help me highlight, prioritize, and select what problems I want to solve for our users
[00:01:51] with the technology that my team and I are about to build. When is the time to do this? Well, this is one of the first research activities I perform on a new project, and it gets started as soon as I believe I have a rough idea of the problem space I'm exploring. On a completely new project,
[00:02:08] I might even spend the first two weeks just interviewing users and mapping their experience. However, I never really stop: each user interaction is an opportunity to add granularity to your knowledge or to track changes in that reality over time. So each time I conduct user testing, I add a discovery session to it.
[00:02:26] This also helps set the context for any exercise that follows. In the previous episode, I mentioned that it can be done on its own or as the first half of a multimodal session, where it can be followed by an exercise such as a usability test or ethnographic research. You can also use the exercise to get feedback on the process map you've been creating based on the interviews you're conducting. To answer specific questions,
[00:02:53] it can be helpful to have a draft of that process map to talk over and have users point out to you what's not quite right yet. Now, for any kind of research activity, there is not just the activity itself; there are a few operational things you need to do around it, before and after.
[00:03:10] So what does that look like for discovery interviews? As so often, you'll start with the objective: thinking about what it actually is that you're trying to achieve with the product you're building, and then tying that to a more specific ask in terms of the insights you're trying to generate with your research.
[00:03:30] What are the specific research questions you're trying to explore in the sessions you're about to set up? All of this should be tied to a desired outcome for the user. So if your application is already quite advanced in your thinking, you might tie it to a specific user journey, such as signing up to a product or using a core feature.
[00:03:49] If you're still very early on in the discovery, you might keep it a little broader and keep the main objective in mind. If you think about my favorite example of a flight booking app, that would probably be describing a pleasant journey from point A to point B and everything that entails, and that would
[00:04:09] obviously be broken down into many different things in the end. So depending on where your product is in terms of maturity, that's how you have to frame the focus of the conversation. It's important that you have the right focus for the right user and that you use it to guide the overall conversation. This is really important, because with many research methods it can be really tempting to just
[00:04:33] follow threads and investigate areas that are simply interesting. But don't forget: we're not trying to go after everything that is interesting around our problem space. We're trying to look at specific issues in the user journey that we can fix, and being really clear about our research objectives upfront really helps to stay focused and on track.
[00:04:53] Once you have your overall objectives and the specific questions you want to ask, you can start writing an interview guide. This is basically a bullet-point list of the kinds of questions you want to ask and their specific phrasing. A lot of thought goes into creating this interview guide, because you want to make sure the conversation flows well, that you're not biasing your participants, and a whole range of other things that we'll get into in a minute.
[00:05:19] This interview guide is the core asset of your preparation. And while you're creating it, you'll be recruiting participants. Those should be representative of the kinds of people who will be using your product in the end. So make sure it's not stakeholders or anyone else just offering an opinion.
[00:05:38] They should be true representatives of your target user audience. Once you've got all this in place, I would recommend running a pilot, where you take your interview guide and sit down with a test subject to run through the interview process itself. It sounds a little silly to test
[00:05:55] a test, but this can really prevent you from making mistakes and polluting the data you're hoping to generate down the line. Then it's of course time to actually conduct the interviews themselves. And once you've done that, you will have to synthesize your insights. Now, how do I actually structure the session itself?
[00:06:15] I try to make sure that it generally flows smoothly. It shouldn't be too casual, because you don't want to influence your participants, but since we're all humans in a room, you still have to make sure the conversation flows nicely as you go along. I'll say a little more about that in a minute.
[00:06:34] I typically introduce the whole conversation with some kind of setup section where, first of all, I introduce myself and might say one or two lines about the project I'm working on to set the context, trying not to give too much away. And then there are a few oddities in user research that I prepare my participants for.
[00:06:54] So first of all, I tell them that this is not a test of them; there are no right or wrong answers here. Really, I'm just trying to get some insight into their past experience, whether that be good or bad. There is nothing they can do wrong here whatsoever. I ask them to be frank and honest about what's bad as well.
[00:07:11] I think it's fair to say that most humans try not to insult you in a conversation; that's just common human nature. But even though you're not looking for insults, you're asking people to be really frank and honest about their experiences, so I ask them for it. And I say that the more honest they are about the shortcomings of a given process or product at this point in time, the more they're helping me, because now is the time to try and work on these problems and fix them.
[00:07:40] It will be much more expensive down the line, when we've already invested quite a bit into the development of a product based on wrong assumptions. So, please be honest. If we have agreed to record the session, then of course I remind them of this and say that it is for note-taking purposes only.
[00:07:55] And then I also warn them about how I will be behaving, because I will be trying not to bias my participants with affirmative statements or body language. So I prepare them: look, in this conversation I'm going to remain fairly neutral. Don't be put off by that; it's just part of the process.
[00:08:15] And I'm doing this because I don't want to influence you. Then we get to the actual conversation itself. I typically start a little more broadly and casually, keeping things more natural and flowing at this point to help participants adjust a little and get used to the whole situation, but also to avoid jumping from one question to another.
[00:08:36] What I'm trying to do is establish a narrative that starts broad, almost feeling like small talk, but that then gradually funnels down to some very specific questions that I'm really interested in. So, for example, I might first ask about their week and what they have been working on, and then gradually
[00:08:55] try to direct the conversation towards the activities I'm actually curious about. Within a couple of minutes, I'm already deep in the interview, and my participants hardly even notice it, because it still feels so conversational. It makes them feel at ease, and it guides us naturally to the core of the subject I want to be talking about.
[00:09:14] Then I ask all the typical questions I might have about their experience with a given process, or their experience with competitors of something I might want to develop. I ask what was good or what was bad about it, and anything else you can imagine. This is the core body of the conversation, and it's really up to you to structure it.
[00:09:34] And at the very end, I always encourage everyone to keep space for a more open conversation that can be directed by the participant. So I might say something like: over the course of this conversation, has anything come up that you thought was interesting, that you want to talk about or share with me, that might be valuable?
[00:09:55] And then I just listen to them. That's a very good question to ask as a conclusion. And then my final question would be: which other person might be experienced in the topic we've been talking about? This is a really good way to acquire further test participants, if that's something you've been struggling with.
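The session structure described above — a setup section, a broad warm-up, the core questions, and an open close with a referral question — can be sketched as a simple interview guide. This is a minimal illustration in Python; every phase name and prompt here is an invented example, not the host's actual guide.

```python
# Illustrative interview guide mirroring the session structure from the
# episode: setup -> warm-up -> core -> open close. All prompts are examples.

interview_guide = {
    "setup": [
        "Introduce yourself and say one or two lines about the project.",
        "Explain: this is not a test; there are no wrong answers.",
        "Ask for frank, honest feedback, good and bad.",
        "Remind them the recording is for note-taking only.",
        "Warn them you will stay deliberately neutral.",
    ],
    "warm-up": [
        "How has your week been?",
        "What have you been working on lately?",
    ],
    "core": [
        "Tell me about the last time you booked a flight.",
        "Describe to me a time when something went wrong.",
        "How do you feel about that experience?",
    ],
    "open close": [
        "Has anything come up that you'd like to talk about?",
        "Who else might be experienced in this topic?",
    ],
}

# Print the guide as the bullet-point list the episode describes.
for phase, prompts in interview_guide.items():
    print(phase.upper())
    for prompt in prompts:
        print(f"  - {prompt}")
```

Note how the core prompts are open-ended ("tell me about...", "describe to me...", "how do you feel about...") rather than yes-or-no questions, in line with the bias-avoidance advice that follows.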
[00:10:38] More important than almost anything else in this conversation is getting answers that are untainted by bias and truly reflective of a participant's experience and opinions. So, how do you remove bias? Here are a few tips. First of all, you can avoid adjectives in your questions.
[00:10:56] An example of this might be asking, "How annoying is this activity for you?" By saying the word "annoying", you're already implying that it is annoying to some degree, and therefore biasing participants. Another check you can do is to avoid yes-or-no questions, because most questions that can be answered with yes or no include your own ideas and opinions about something. To use the previous example
[00:11:24] again: if I ask, "Is this annoying to you?", that is basically doing the same thing; it's biasing participants to think about annoyance. Good questions for these kinds of interviews tend to be open-ended, and just fairly open in general. So instead of asking, "How annoying did you find this?", you'd ask, "How do you feel about this experience?" Or, even better,
[00:11:46] you can say, "Tell me about the last time you did X," or, "Describe to me a time when you did such and such." Those are all open-ended questions that work in almost any situation. Another way to get to something like an objective truth, if there is such a thing, is to drill down into emotional responses.
[00:12:07] If a participant describes their experience and keeps using emotional words, like "this was really wonderful", "I love this", or "this was really frustrating to me", that is a good time to drill deeper and say something like, "Can you elaborate on that?" or "Why is that the case?" What you're looking for is to get to some kind of objective root cause. If they say, "Oh, this was really frustrating,"
[00:12:29] then at the end of the day it might be because, say, the loading time of a website was too slow, and that's something very objective and actionable, something you can actually try to fix. That's going to be much more helpful than the general emotional sentiment. And the last thing that's important in terms of avoiding bias is everything about yourself: your tone of voice and your body language.
[00:12:52] These heavily influence people. For example, if your participant is in the middle of making an argument and you're nodding along and smiling, that is a kind of positive reinforcement for them to think, "Oh, what I'm saying at the moment, that's the right thing to say.
[00:13:08] That's what they want to hear." But there is no right thing to say, because you're just trying to get their experience, be it good or bad. By reinforcing them positively, they'll naturally continue saying things along the lines of what you've been nodding to. So try to remain as neutral as you can; be like a robot in this scenario, if it helps to think of it that way. Once you've done your research,
[00:13:31] what do you do with it? Why is it useful? What I typically do is take notes throughout the conversation and, if we've recorded it, listen to it again. And then I make a mass of post-its: everything that I thought was interesting goes on a post-it, on a wall or a virtual whiteboard such as Miro.
[00:13:50] You can use different colors if you find it helpful, say green for good things and red for bad things, but that's not so critical. You're just trying to get all the insights onto the same board. Then I try to group things, affinity-mapping them into themes, and see what kinds of themes are emerging.
[00:14:07] I look out for recurring comments and for the size of themes. Both of these can be indicators of some kind of priority for a certain topic: maybe it's something I can prioritize for development, or something I actually need to do more research on in the following sessions. And once I've got my themes and some indication of priority, I can further prioritize them for
[00:14:32] build, maybe by value or feasibility, or I can even try to get some quantitative validation by creating a survey that asks specifically about the things I've uncovered. But in a nutshell, this is what I typically do: I collect all the post-its and see what themes emerge from them.
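The synthesis step just described — tagging each interesting observation with a theme and using theme size as a rough priority signal — can be sketched in a few lines. All the notes and theme names below are made up for illustration, not data from any real study.

```python
from collections import Counter

# Each post-it becomes a (note, theme) pair after affinity mapping.
# These examples are invented to mirror the flight-booking scenario.
notes = [
    ("slow page load on checkout", "performance"),
    ("couldn't find the search box", "navigation"),
    ("loading spinner never ends", "performance"),
    ("liked the seat map", "seat selection"),
    ("menu labels were confusing", "navigation"),
    ("timeout while paying", "performance"),
]

# Theme size = number of notes in the theme; bigger themes and recurring
# comments are candidates for development priority or follow-up research.
theme_sizes = Counter(theme for _, theme in notes)

for theme, size in theme_sizes.most_common():
    print(f"{theme}: {size} note(s)")
```

In this toy example, "performance" would surface first, which is exactly the kind of objective, actionable root cause (slow loading times) the episode recommends drilling towards.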
[00:14:50] Then I use them for prioritization and to decide what further research is necessary. All right, this is the process that I use to conduct discovery user interviews, and you can see it's really not all that complicated. But there are a few frequently asked questions that come up time and time again.
[00:15:09] So I'll go through those real quick. What happens if participants ask you a question? Don't be tempted to answer it; just throw it back at them: "Well, what do you think?" Answering a question like that will, again, bias them, make them agree or disagree, and you don't want that kind of conversation.
[00:15:27] And if they are unsure about something, that is a really good point to just throw the question back at them and see how they feel about the situation; it'll probably uncover some interesting insights. Another question is: how do you make sure that what you understand is actually what they're saying, or what they think they're saying? How do we make sure that nothing gets lost in translation?
[00:15:50] I would say it is okay to paraphrase things for this purpose. So if somebody has just told me about their experience, I might say something like, "Just to make sure that I understood you correctly, what I'm hearing is blah, blah, blah." You paraphrase, and then you ask them: is that reflective of what you meant?
[00:16:09] How many people do you need for user interviews? I typically talk to five to six people per user journey and user group, but that depends on the variance of the feedback you're getting. Normally, after five or six interviews I start to see diminishing returns, where people mostly repeat what the person before them said, but sometimes that's not the case.
[00:16:33] If your insights haven't petered out yet, continue with more interviews; you will notice when it starts becoming a bit of a futile effort. How do you find users? If you're already running a company and have an existing user base, then it's easy, right? You just recruit your test users from there.
[00:16:53] Otherwise, I recommend doing it one by one. This can be done by putting messages on community boards, Reddit threads, Facebook groups, et cetera — wherever your specific user group might hang out. Or you can pay for them: there are companies out there that recruit users for you and schedule interviews for you, especially in the consumer technology space.
[00:17:16] That can be a very good avenue to take, because you can get a very broad range of users in front of you. What's the difference between user interviews and stakeholder interviews? Well, besides the obvious difference between users and stakeholders, in terms of format a stakeholder interview is much more focused on the strategic goals of a project, while user interviews are more focused on the user's personal experience in a given problem space.
[00:17:43] Should you reuse participants? In other words, should you invite someone you've already interviewed in again to discuss a similar topic? My answer is yes, but not exclusively. The huge benefit is this: say you're having interviews with users who give you feedback, and you then respond to that feedback.
[00:18:04] If you make changes to your product or direction in line with the feedback they gave you, that is a really rewarding feeling for them, because you're clearly listening and taking their responses seriously. And so they'll be much more eager to take part in sessions again and try to be very helpful.
[00:18:23] They'll also become some of your strongest product advocates in the community. However, you don't want to keep asking the same two or three people time and time again; you want to mix them with a stream of fresh insights and participants as well. Can you ask for satisfaction scores, NPS, or whether they would buy a product?
[00:18:46] I would only ask these kinds of questions to get insight as a kind of cue for a discussion afterwards, but never to actually take the result seriously. I've mentioned before that people are really bad at anticipating their future behavior, and they also want to be nice. Both of those things compounded mean that if you ask whether they would buy a product, you're probably not going to get a correct answer.
[00:19:12] It may be truthful in the moment, but it won't be predictive of what they'll actually do in the future. If that is something you're after, you can use a range of other methods. You could do some kind of vapor test, setting up a website and a social media ad campaign, and trying to get responses to that in a quantifiable way.
[00:19:32] But in qualitative interviews, that question is not really going to give you much insight, except as a way to have a conversation about the motivations behind the response. So if you ask, "Would you buy this product?" and they say no, you can say, "Okay, why is that?" Or if they say yes: "Okay, what swayed you?"
[00:19:50] You kind of continue having a conversation like this, but I would never use their actual response as data to anticipate future events. How many people can you have in the room for a testing session? I would recommend not more than two besides the participant. So that would be you, and maybe a note taker or somebody else on your team who wants more exposure to users, or a stakeholder, but they really have to be quiet.
[00:20:17] The only people talking should be the person interviewing and the participant themselves. I've talked about how to influence stakeholders with user feedback in episode five, so please check it out if that's something you're interested in. And lastly, how do you get better at doing user research?
[00:20:35] I know it's quite a lot to take in, and there's a lot to think about, but don't get overwhelmed. You're not going to get it perfectly right the first time around, and frankly, I probably still make a lot of mistakes. Just take the time to reflect on each session and how you can improve. If you're recording sessions for note taking, then listen to those recordings.
[00:20:53] Watch those recordings also as a means to think about how you can improve. By listening to yourself as much as you do to the participant, you will learn a lot and be a research expert in no time.
[00:21:11] And this is it for today. I really hope you enjoyed this episode, and I wish you the best of luck for your upcoming user interviews. If you've liked this episode, please rate and subscribe. I would also love your feedback: you can reach me on Twitter at @thomas_essl, or send me an email to
[00:21:29] hello@thomasessl.com. This show is produced by myself, the music is from Blue Dot Sessions, and any opinions expressed are my own. Thank you for listening. Until next time.