Research Bias (Part 1)
Published 2/5/2018
In today's episode, we start a discussion about the authority and persuasive ability of research and how bias can have major effects without us realizing it.
Today's episode is sponsored by Linode.
In 2018, Linode is joining forces with Developer Tea listeners by offering you $20 of credit - that's 4 months of FREE service on the 1GB tier - for free! Head over to https://spec.fm/linode and use the code DEVELOPERTEA2018 at checkout.
Transcript (Generated by OpenAI Whisper)
When you hear the phrase "research-based," do you have an immediate sense of trust? If you're like most people, you probably do, because research seems to be the gold standard for proving something. The reality is that research varies in quality, and in how effective it can be at proving one thing or another. This week we're going to talk about research. All three episodes this week are going to be about research, and we're going to use this as a way of framing a discussion about user research.

As developers, we're going to have people who consume whatever it is that we're creating. 99.9% of the time, probably, if you're listening to this show, you're either writing code that constitutes a product someone is going to be using, or you're writing code that another person will be reading. Maybe you're working on a library and you actually have developers who are using your code. Either way, at the other end of this chain there's someone who is affected by the work that you're doing, and there's a whole field dedicated to this subject: user research. In these three episodes, we're going to talk about some of the more concerning aspects of research, the ways it can steer you in the wrong direction.

My name is Jonathan Cutrell, and you're listening to Developer Tea. My goal in this show is to help driven developers uncover their career purpose, do better work, and have a positive effect on the people they influence. This show covers a wide variety of topics, and research is an excellent one, because for you to be a great developer, for you to practice good work habits, you should be doing some kind of research. Maybe you're doing research without realizing it, and that's one of the things we're going to talk about in these three episodes. We need to understand what can go wrong when we're researching.
We need to understand how things can get thwarted, how our perceptions can feel correct when in actuality they're heavily biased for one reason or another. We're going to talk about the types of biases that can affect research, and we'll start by giving you an intuition for how research can go wrong. We'll cover specific biases in today's episode, and we may continue talking about biases specifically related to research as we move into Wednesday and Friday.

How can research go wrong? Let's think about this for a second. Imagine that you are conducting an interview. An interview is a kind of research: you're researching the person who may end up in this job. Let's say you're in charge of ordering the questions, and you start the interview with an intentionally difficult question. The candidate fails miserably at answering it, and then subsequently aces the rest of the questions. We would like to believe that this kind of thing is easily evaluated, that we can weigh all questions equally. This isn't true. The ordering of questions, for example, is incredibly important. If a person fails miserably on the first question, they are much more likely to be distracted throughout the remainder of that interview. Furthermore, first impressions make a huge impact on the way we perceive other people. Even when we intentionally try to roll back the perception we've gained from a first impression, we still can't escape that lower-level brain response: the first impression really is stronger than subsequent impressions. So how do you handle this? Where should you put this question?
You may think you should put it at the very end, but unfortunately, the things that happen at the end of something are easier to remember; they leave a lasting impression. Should we put it in the middle? Where should we put the most difficult question? I'm not going to prescribe a perfect answer for you, but this is the kind of thing that shows you that research can be biased.

Another example: imagine you bring in a candidate that everyone already believes is good, or, on the flip side, a candidate that everyone already believes is going to fail. It's very possible, and perhaps even likely, that the questions and the tone of the interview will follow some kind of confirmation bias. This is maybe the first bias we can discuss: confirmation bias is when a preconceived notion is then confirmed by the research that follows, by the questions that follow.

Let's use a different scenario to illustrate this. Suppose you're looking for reasons not to support an old browser, reasons that IE8 should not be supported, and you Google, for example, "why we shouldn't support IE8." What you've done with this small piece of ad hoc research is search for other people, other opinions, that align with your own. By framing it as "why I shouldn't support IE8," you're already setting yourself up to find reasons why you shouldn't. If you instead ask the question "should I support IE8?", you might have a better outcome, but then you start running into other biases, like culture bias. In this particular scenario, we're talking about a subculture, specifically web developer culture. By and large, the opinion of web developers is that support for IE8 isn't really necessary.
This is also supported by the fact that if you don't have to write specific code for old browsers, your job becomes a little bit easier. When you Google this question, you're not really using data; you're using opinions. You have to ask: whose opinions are you actually seeking? Whose opinions are actually going to be found in your Google search? How can we reverse this kind of bias? Maybe it makes more sense to use data, usage data specifically, to make your decision. In this scenario, instead of framing your question as "should I support IE8 or not," you could decide your own criteria up front. For example: what percentage of my user base needs to be on IE8 to justify the extra work necessary to support it? This removes the opinionated question of whether or not IE8 deserves to be supported.

We're barely scratching the surface on research. Really, everything we've discussed so far is a very informal version of research. We're going to talk a little bit more about slightly more formal versions of user research right after we talk about today's awesome sponsor, Linode.

Today's episode is sponsored by Linode. With Linode, you can get started with only $5 a month, and you can have a Linux server with a gigabyte of RAM up and running in just a few minutes. You already know this if you've been listening to Developer Tea; we've been going through some of the cool things about Linode in this month's episodes, or I guess in this year's episodes, it is now February. I want to point out that if you want to get a taste of the kinds of things that Linode does for developers, you can go to their docs. They have development docs: guides and tutorials that you can get started with for free. You don't even have to use Linode to understand how Linode is contributing to the development community. You can learn from these.
You don't even have to go and sign up for a Linux server through Linode; you can go and learn from these tutorials. For example, here are some of the tutorials that Linode has available: how to install R on Ubuntu and Debian, how to install Go on Ubuntu, how to install Java on CentOS 7, how to use Nightmare.js to automate headless browsing, how to install Node.js, how to use Scrapy (a Python library) to extract data from HTML tags, and how to set up a task queue with Celery and RabbitMQ. They have Ruby on Rails tutorials. They have Perl, Node.js, Java, and continuous integration tutorials. This doc base is continuing to grow. This is just one of the many ways that Linode is supporting you as a developer. It's not just about getting a server up and running and then leaving you on your own; they have 24/7 support available. And these guides and tutorials are not limited to development. They have tutorials specific to security, upgrades, and backups. They have tutorials specific to game servers. They have tutorials specific to the Linode platform itself. I encourage you to go and check out Linode, not just because they have excellent pricing on their servers, though $5 a month is one of the best investments you can make in your career, but because they have excellent support for developers. You can get started today and get $20 worth of credit just for being a Developer Tea listener. Head over to spec.fm/linode and use the code DEVELOPERTEA2018 for $20 worth of credit at checkout. Thank you again to Linode for sponsoring today's episode of Developer Tea.

So we're talking about various types of biases that affect research. Again, we have this compression that happens in our brains: when we hear the word "research," a lot of us translate that to proof. We translate it to "this has been proven." It's not just someone's opinion; it's actually been tested. There's some evidence to show that this is the case.
And the unfortunate reality, as we've already mentioned, is that a lot of research falls prey to bias. Bias is very difficult to identify, and it's very difficult to avoid altogether. But being aware of biases, and of ways to counter them, can help you perform better research. We can learn about this through the lens of user research. Here's something that happens all the time, and this particular bias is especially prevalent for web developers and for people who want to write an app. You have an idea. You've probably had an idea as a developer, and you want to start creating an app around it. So you go and talk to your friends, your family, or even your co-workers, and you tell them about your idea. You pitch it, maybe in an elevator-pitch style, and once you've told them about it, you ask: "What do you think?" This is a trap. This is a huge trap that many people unfortunately fall into, because the truth is, in almost every case, when you take the time to present an idea to another person, unless the idea is wildly terrible, and most of us don't have wildly terrible ideas, the other person is going to agree with you. They're going to encourage you. They're going to be positive about the thing you've just presented to them. Now, why is this? This is called acquiescence bias, and it happens all the time. Part of the reason for it is, once again, fundamentally baked into our survival. Our ability to cohabitate with other people, to get along with other people, is fundamental to our ability to survive. We have to be able to be friendly with one another. We have to be able to live in harmony with other people. And so we have a built-in bias against creating conflict when it's not necessary.
Now, conflict obviously happens. But when someone presents an idea to you and you have very little buy-in, when their choice has very little effect on your life and you bear no consequence if they fail, you're very likely to agree with them. You're very likely to encourage them. You're not very likely to provide negative feedback that dissuades them from their idea. Why? Because that would create conflict: you'd be critiquing something that person is obviously excited about. So we have to be aware of the acquiescence bias, because it can create problems in our business. We could go down a road thinking, well, five people, or ten people, or a hundred people that I talked to said this was a good idea, and therefore I'm going to move forward with it. And it may end up being a flop, or it may end up being a mediocre idea at the end of things. So how can we avoid this? One way is to ask more specific questions. Ask questions like, "How much would you pay for this?" Because the truth is, I can think something is a good idea and simultaneously not find it good enough to use. There are plenty of apps in the App Store, plenty of developer services, programming languages, or development boot camps for that matter, that seem like a good idea, but I wouldn't put my money on the table for them. I wouldn't spend the time to learn them. Asking these more specific questions can lead you beyond "is this a good or bad idea" and toward the viability of the idea in the market. Here's another very specific question: if you don't want to ask somebody how much they would pay, you can ask them to describe a moment or a situation in their lives where this would help solve a problem.
If you're doing user research, for example, and you ask a user directly, "Do you think this is a good feature?" and they say yes, you can go further: "How would you use this feature? How often would you use it? Have you run into scenarios using this app or this product where you needed something like this?" Crafting your interview questions, your research questions, so that you avoid simple answers and instead elicit more complicated, more thoughtful answers is going to result in a much higher quality of response. More importantly, it's going to help eliminate some of the acquiescence bias, because you're getting past the agreement phase and moving into a more analytical phase. You're not looking for a relational answer; you're looking for a thoughtful, use-case answer. This is one of many different things that can happen over the course of user research, and we're going to explore more of them this week. And of course, we continue exploring bias on this show all the time. We talk about neuroscience. We talk about psychology. We talk about behavioral economics and the choices that people make. And we're going to continue talking about all of these things on this show, because I truly believe that as you continue in your career as a developer, these are the subjects that help you move from being a coder to being a truly great developer. Your skills then expand from software design to truly solving problems with other people: being able to collaborate better, being able to vet your ideas. These are the ways that you move up in your career. These are the ways that you level up. These are the ways that you connect with that career purpose and ultimately have a greater impact. Thank you so much for listening to today's episode. I hope you're enjoying this discussion on bias and on research.
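As a footnote to the IE8 discussion earlier in the episode, the "decide your criteria before you look at the data" idea can be sketched in a few lines of code. Everything here is hypothetical: the 2% threshold and the usage numbers are made up for illustration; the point is only that the decision rule is fixed up front, so the data, not your opinion (or Google's), makes the call.

```python
def should_support(usage_pct: float, threshold_pct: float) -> bool:
    """Return True when a browser's share of real traffic meets the
    threshold you committed to *before* consulting the data."""
    return usage_pct >= threshold_pct

# Hypothetical analytics export: browser -> percentage of sessions.
usage = {"Chrome": 61.3, "Safari": 19.8, "Firefox": 4.1, "IE8": 0.7}

THRESHOLD = 2.0  # chosen in advance, so the result can't be cherry-picked

for browser, pct in usage.items():
    verdict = "support" if should_support(pct, THRESHOLD) else "drop"
    print(f"{browser}: {pct}% -> {verdict}")
```

With these made-up numbers, IE8 falls below the threshold and gets dropped, but the same rule would force you to support it if your own traffic said otherwise, which is exactly the bias-resistant property the episode is describing.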
I'm grateful for the people who have subscribed to this show. In fact, I took a look at the analytics, and we've had a listener in every state in the United States, and in every inhabited continent on the planet. That's really exciting. So for those of you who are willing to be challenged on a regular basis, with three episodes a week, I encourage you to subscribe so you don't miss out on future episodes just like this one. Thank you so much for listening to today's episode, and until next time, enjoy your tea.