[00:00:00] Hey, before we get started, I want to let you know about another project of mine. It's my newsletter called Seven Things. Every Saturday morning, I send out a list of seven things from across product development, startups, design, and data that I was curious about that week. Sign up at seventhings.thomasessl.com.
[00:00:19] And now on with the show.
[00:00:26] In a perfect world, every product team would have a full-time design researcher at their disposal. Sadly, this isn't always the case. But even without a dedicated researcher, research is always a must if you want to build successful products and businesses.
[00:00:47] Hello and welcome to Product Nuggets. I'm Thomas Essl, and today I'll start my crash course on user research. This is something that anyone on a team can do, and that every team should do, not just at the start or to validate an idea at the end, [00:01:00] but all the time. Becoming really good at research may take practice, but anyone can do it.
[00:01:05] You don't need to know how to code, you don't need to learn design tools, and you don't need to study business. You just need to pick up a few methods to build empathy with your users and keep practicing and exercising them time and time again. In this episode, I'll be glossing over a number of things, because I want you to get started as quickly as possible and get some decent results without having to study all aspects of user research.
[00:01:30] First, I know that getting started can be a bit overwhelming. If you just Google how to start user research, you're thrown into an absolute ton of methods and voices that sometimes agree and sometimes disagree with each other, and it's quite confusing. That's why I'm not going to force the supposedly proper way onto every product team.
[00:01:51] Instead, I want you to follow a sort of 80/20 approach, and I'll try to help you build the foundations quickly so you [00:02:00] can get stuck in and build on that.
[00:02:06] I really love user research and actually think it's tremendous fun. Even after many years, it never fails to amaze me how unexpectedly different people behave, think and react. And the longer I've been doing this, the stronger my conviction becomes about the importance of user research. It's always challenging my assumptions about what works and what doesn't.
[00:02:30] And it teaches me a ton about how people's minds work more generally. It's just really fascinating. Beyond that, the purpose of user research is to get a picture of your users' product experience that is grounded in reality, not wishful thinking or assumptions, so that you can build on and improve that reality.
[00:02:51] The number one and number two guiding questions I continually try to answer are: what is the problem we are trying to solve with our product? And, [00:03:00] sometimes, when it's not clear: who is experiencing this problem, in other words, who are our users? As such, all research is fundamentally about learning about existing human problems.
[00:03:13] Yeah, but what about building products for the future, I hear you ask. Still, my view is that anything you are researching, designing and building should address a need that exists today. That doesn't mean that the solution can't be something that's never been there before, something really exciting, maybe even right out of a sci-fi movie. But the problem that it is solving needs to be real today.
[00:03:37] Different methods help you fill in different knowledge gaps, but there is nothing that you can't get more detail on: from early product ideas and the fundamental value of your product or product-market fit, to the desirability of individual features and how they are organized, purely aesthetic sensibilities and tastes, and granular feedback on individual interactions, words, animations, [00:04:00] buttons and the like.
[00:04:01] I think about research in three categories: discovery, testing, and validation. Discovery is what I often refer to as the unknown unknowns, where you're really trying to understand the problem landscape better and asking a lot of open-ended questions that you wouldn't necessarily anticipate the answers to. Testing, of course, is putting something that you've created in front of users to get responses and to see what works and what doesn't.
[00:04:29] And validation, or quantitative analysis, is about validating at a larger scale what you have seen in your one-to-one or small-group interactions with users, so things like analytics, for example. As you can imagine, there is an absolute ton of different methods that you can use to cover these three buckets.
[00:04:52] But I promised you an 80/20 approach, and that's what this really is, because the vast majority of research [00:05:00] sessions and needs that I've had in the past can be covered by the very short list of methods that I'll go through with you now. There are discovery interviews, which are basically one-to-one conversations with users in a controlled environment to identify opportunities and dig into those unknown unknowns that I mentioned earlier.
[00:05:22] There are ethnographic observations, in which I visit users in the context in which they interact with a product or service. Those are particularly helpful in areas where the context of a service is really important, such as a hospital for healthcare services, for example. Validating discovery at scale can be done with surveys; check out my earlier episodes on those.
[00:05:49] Then of course there is user testing, or usability testing, in which I put something that I've made, or that my team has made, in front of users and see how they [00:06:00] interact with it, how they get on, and what can be improved. I use card sorting for determining taxonomies, or how content and information need to be organized.
[00:06:10] And I use analytics and in-product tests, such as A/B testing, to quantifiably determine whether the work that we've been putting into a product has paid off and has actually improved the experience in a measurable way. Lastly, just one word on focus groups, which is: don't do them. It's been shown time and time again that focus groups don't really work, because all the participants in the group will influence each other.
[00:06:40] And you end up with what is referred to as groupthink. If you're looking to get input from several participants in a short period of time, I still strongly encourage you to either have one-to-one sessions, like I mentioned before, or use more reliable quantitative methods, but [00:07:00] not this sort of halfway-in-between state where you're getting maybe more results, but of very poor quality.
[00:07:07] If I counted correctly, those are six methods that will get you through the vast majority of your research needs. You can also combine, let's say, two of these methods, and indeed a big chunk of the research I do day to day is in the format of one session split into two segments: discovery interviews and testing.
[00:07:26] That's because discovery is always worthwhile doing, and I try to always test something that I or my team made. Every session that you have with users in front of you is an opportunity to test something, so I try to test every time. That may be a prototype, or it may be a diagram, a service blueprint, or a revenue model.
[00:07:47] There is always something you can test. As I said, there are many more methods, but trust me: if you have a rough understanding of the above, you'll already have a huge advantage over most other folks out there. [00:08:00] All you need to do is make sure you somehow cover the three buckets of discovery, testing and quantitative validation.
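To make the quantitative validation bucket a bit more concrete, here is a minimal sketch of how you might check an A/B test result with a two-proportion z-test. The function name, conversion counts and sample figures are illustrative assumptions, not numbers from the episode.

```python
# A minimal sketch (not from the episode) of checking an A/B test result
# with a two-proportion z-test. All names and numbers are illustrative.
from statistics import NormalDist


def ab_test_p_value(conversions_a, users_a, conversions_b, users_b):
    """Two-sided p-value for the difference in conversion rates between A and B."""
    rate_a = conversions_a / users_a
    rate_b = conversions_b / users_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    pooled = (conversions_a + conversions_b) / (users_a + users_b)
    std_err = (pooled * (1 - pooled) * (1 / users_a + 1 / users_b)) ** 0.5
    z = (rate_b - rate_a) / std_err
    return rate_a, rate_b, 2 * (1 - NormalDist().cdf(abs(z)))


# Hypothetical example: variant B converted 58 of 1,000 users, A converted 45 of 1,000.
rate_a, rate_b, p = ab_test_p_value(45, 1000, 58, 1000)
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  p-value: {p:.3f}")
```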
[00:08:21] When it comes to things to do or avoid, I've put together a list that applies to most of these methods. Here it goes. Even with this reduced set of methods at your disposal, you still need to choose carefully which method serves your research question best. You've got to be really clear about the objective of your research and why you're doing it in the first place.
[00:08:43] And then, which method is most suited to answer the questions that arise from that? For example, I was recently asked to validate the market value of a product in one-to-one user testing sessions, but that of course didn't really work, because it didn't give me the [00:09:00] necessary scale to answer that question.
[00:09:03] So I suggested other methods to approach this problem. Be clear that the format of your research matches your objective. Then put effort into choosing the right participants. Recruit users, not stakeholders, not your mom or your friends or your boss, but the people you are actually designing for. Recruit users who are most suited to give you unbiased feedback.
[00:09:29] And unless your product focuses on a very narrow persona, try and get a wide spread of different profiles. How many people should you test with? For qualitative methods, such as interviews and user testing, five to eight people per research topic or question is really all you need. Beyond that, responses start to get quite repetitive and you're only making marginal gains with each session.
[00:09:55] For quantitative methods, as my dad always used to say, measuring [00:10:00] in percent starts at 100. The more helpful answer is: as many as it takes for you to get a representative sample size. Unfortunately, that is very dependent on exactly what your product is trying to do or what you are trying to research, but just be conscious of it.
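For a rough sense of what a representative sample can mean, here is a minimal sketch of the standard sample-size formula for estimating a proportion from a survey. The confidence level, margin of error and function name are illustrative assumptions, not recommendations from the episode.

```python
# A minimal sketch (assumptions mine, not the episode's) of the standard
# sample-size formula for estimating a proportion from a survey.
from math import ceil
from statistics import NormalDist


def survey_sample_size(margin_of_error=0.05, confidence=0.95, expected_proportion=0.5):
    """Minimum respondents to estimate a proportion, assuming a large population."""
    # z-score for the chosen confidence level, e.g. roughly 1.96 for 95%.
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    p = expected_proportion  # 0.5 is the most conservative (largest) choice
    return ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)


print(survey_sample_size())                      # about 385 for ±5% at 95% confidence
print(survey_sample_size(margin_of_error=0.03))  # about 1068 for ±3%
```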
[00:10:17] Then, keep research focused on your overall goal, not on everything that is interesting. It is very easy to get carried away with this, and it's true for surveys as well as one-to-one testing sessions. People either start to talk about areas that you are really interested in, or, when you're making a survey, for example, you come up with all these other things that would be really interesting. But interesting
[00:10:43] isn't really the point; you need to get the insight that helps you solve a problem. So don't mix those two things together, both to get the depth that you need and to avoid confusing participants by jumping around topics too much. Think of user research as a scientific experiment: what is your [00:11:00] hypothesis, what are your assumptions and your one key question, and how can you get to the root of it? That said, within the topic that you're researching, be flexible and follow the energy.
[00:11:11] What I mean by that is: don't try to rigidly get answers to the questions that you've prepared. This is particularly true for interviews. Guide your users within reason, but if there's real energy in a particular area and the participants are getting really excited to talk about one particular aspect of your product,
[00:11:30] it may be worth just following down that avenue for a little bit, to see what you can discover that you hadn't previously thought about. If this doesn't prove to be valuable in a matter of minutes, then guide them gently back to your actual conversation agenda. The next point is one of the most important ones, which is: reduce bias.
[00:11:49] Don't, quote unquote, validate your assumptions, but do everything in your power to remain open-minded and neutral. This can be about how you present yourself and your body [00:12:00] language: not agreeing with participants too much, or disagreeing, or showing your opinion in any kind of way. But of course, it's also about the questions that you're asking. The easiest way to check your questions is to see whether they can be answered with yes or no.
[00:12:14] This is true for surveys and in conversations, and I've actually gone into a bit more depth on this in my second episode on surveys, so you can check that out there. For quantitative methods, look for bias in your data. Are you looking at results across all demographics equally, for example, or are you segmenting them and making sure that they are appropriately represented?
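As one way to check that kind of representation, here is a minimal sketch that tallies responses per segment before you trust the overall numbers. The segment labels and data are invented for illustration.

```python
# A minimal sketch (segments and data are invented) of checking whether
# responses are spread across segments rather than dominated by one group.
from collections import Counter

responses = [
    {"segment": "new users", "satisfied": True},
    {"segment": "new users", "satisfied": False},
    {"segment": "power users", "satisfied": True},
    {"segment": "power users", "satisfied": True},
    {"segment": "power users", "satisfied": True},
    # ...in practice, load your real survey or analytics export here
]

counts_per_segment = Counter(r["segment"] for r in responses)
for segment, count in counts_per_segment.items():
    satisfied = sum(1 for r in responses if r["segment"] == segment and r["satisfied"])
    print(f"{segment}: {count}/{len(responses)} responses "
          f"({count / len(responses):.0%} of sample), {satisfied}/{count} satisfied")
```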
[00:12:39] The next point: embrace silence. One easy way to avoid leading participants is to allow for silence. This can be really powerful. Long pauses make people uncomfortable, but here we can use that to our advantage. Participants will try and fill that silence, and they'll do so by digging deeper and deeper into their minds to find something to talk about.
[00:13:00] Given the conversation you're having, whatever they say will be related to your topic, and that deep insight, where participants really think about an issue rather than offering the most superficial observations they can come up with in the moment, is really key. So use silence to your advantage. At the risk of getting a little too specific here, this is a very common mistake that I keep seeing: never ask people to anticipate their own behavior.
[00:13:25] Those are questions like, would you do such and such, or how much would you spend on this particular product or service? Humans are not malicious, they're just horrendously bad at anticipating their own reactions in an imagined future scenario. There is absolutely no value in ever asking those questions.
[00:13:46] My next point is: trust the process. I've said this before, but remember that your questions and everything around your research, including your experience and that of your participants, is already somewhat primed [00:14:00] towards what you are doing. When it feels like you are wasting precious time, stick with the process; the value will become apparent once you analyze your findings.
[00:14:08] And this is also true for individual sessions. There have been countless times where I was about to cut a session short because it didn't seem to be turning up much valuable insight, but then the conversation suddenly shifted to reveal invaluable insights that I would never have anticipated. So never abort a session.
[00:14:26] And finally, involve others. This helps generate empathy in your team and helps get stakeholders on board, simply by inviting them along to user testing sessions to take notes, for example. On the latter point, I've dedicated a whole episode to influencing stakeholders with user research, so you can go back
[00:14:45] and check that one out. The more you involve others, the less time you have to spend on writing reports, getting buy-in, and all those other things that don't really move your team forward. Sharing your findings throughout the process is, of course, another way to [00:15:00] involve others continually and to generate visibility for the work that you're doing and the insight that you're generating.
[00:15:06] Over time, that also helps build that empathy with everyone who needs to have it.
[00:15:16] You can keep these guidelines in mind whenever you carry out research. I hope you find them useful. Please let me know about anything I've missed, points you disagree with, or anything you have more questions about. You can reach me on Twitter at @thomas_essl, or send me an email to hello@thomasessl.com.
[00:15:37] Product Nuggets is produced by me, and the music is by Blue Dot Sessions. Any opinions expressed are my own. Thanks so much for listening. Until next time.