Cognitive Biases That Can Kill Your Product
Published 1/11/2016
In today's episode, we discuss two cognitive biases that could kill your work, whether you are creating a startup product or working for a well-established business.
Mentioned on or relevant to today's episode:
Today's episode is sponsored by Hired.com! If you are looking for a job as a developer or a designer and don't know where to start, head over to Hired now! If you get a job through this special link, you'll receive a $4,000 bonus - that's twice the normal bonus provided by Hired. Thanks again to Hired for sponsoring the show!
And lastly...
Please take a moment and subscribe and review the show! Click here to review Developer Tea in iTunes.
Transcript (Generated by OpenAI Whisper)
Hey everyone and welcome to Developer Tea. My name is Jonathan Cutrell, and in today's episode we're going to be talking about two cognitive biases that can affect your code or your product negatively. Today's episode is sponsored by Hired.com. If you are a developer or a designer and you're looking for a job and you don't know where to start, Hired.com may be the perfect place for you. We will talk more about what Hired does later in the episode, as well as a special offer that they have for Developer Tea listeners. But first I want to jump straight into these cognitive biases. You know, I've talked about cognitive biases in the past, and I'm definitely going to be talking about more of these in the future, as well as things like logical fallacies. These are things that are really interesting to me, and they help frame the way that we can think better, the way that we can make better decisions based on research and based on logic. So I would like to invite you to set aside your preconceived notions about the way that you should think and the way that you should make decisions, and instead approach this from the perspective that you're trying to learn something rather than that you already know something. The reality is, it can be very difficult to escape these biases, and in fact some of them are completely inescapable. So it helps to know that we actually can fall victim to these. Everyone falls victim to some kind of bias at some point, but being aware of the bias helps you make better decisions, and you can make decisions that go against what you think or feel you should do because of that bias. Now, the biases that I'm going to talk about today are both listed on Wikipedia. Of course, I will include a link in the show notes to the entire list as well as the specific ones that we're going to talk about today, and I would recommend that you go and take a look at some of these.
All of these have been established by research. These are not just things that people thought of and put on Wikipedia; they've actually been established by studies, by research that shows significant correlations, significant backing for the existence of these biases, at least in the people who were part of those studies. So these biases are worthwhile things to consider: even if they aren't present in everyone, they have been shown to be present in some people, so it's worthwhile to take a look at them. Now, you may be asking yourself, Jonathan, why are you talking about cognitive biases? Why is this starting to sound like a show about psychology or about mental health or something? But the truth is, as developers, our main tool is our brains. Our primary tool is the way that we think, the way that we make decisions, the way we take in the world around us. And especially if you are at a startup, or if you are creating a product that users are engaging with, then understanding the psychology of your own decisions, as well as the psychology of your users' decisions, is an incredibly important part of your job. So as a developer, you should understand these biases, because they are going to be important for making good decisions about how you spend your time, what you put your energy into, and what you create. So let's talk about the first cognitive bias that we're going to discuss today. It is called the Semmelweis reflex, and it's named after a Hungarian doctor, Ignaz Semmelweis. I'm not very good at pronouncing Hungarian names, but Dr. Semmelweis actually discovered the importance of washing your hands between seeing patients.
In fact, a lot of patients were dying from what they called childbed fever, and he did a lot of research to figure out what was going on. Ultimately, he came up with the idea that he and his colleagues were carrying cadaverous particles, because they also performed a lot of autopsies in the same hospitals where these patients were lying sick. And so what he came up with was the idea that cadaverous particles were on their hands and were being transmitted to the patients in the hospital. Now, this was before germ theory ever came along, before Pasteur and the other scientists who were responsible for uncovering the massive amount of information around germs. The importance of washing hands wasn't a common belief, it wasn't a common norm, and so people didn't expect that you could actually get somebody sick by touching something and then touching that person. In particular, in this case, touching a cadaver and then touching an already compromised patient, a patient who was already sick. So the Semmelweis reflex is actually the response from the people around Dr. Semmelweis, those who were responding to him as if he were going crazy. In fact, he ended up in an asylum late in his life because people thought he had gone crazy, even though he had evidence to show that the childbed fever his patients were experiencing went down tenfold. He had that evidence, and people still didn't believe him, because it wasn't an established norm that you could transmit disease just by touching something. So this cognitive bias can present itself in many different ways, but the basic idea here is that we have a tendency to reject a belief, to reject evidence, if it contradicts our existing norms, our existing beliefs, or our existing paradigms.
And this obviously applies to the world of programming. Very simply, imagine somebody coming along and saying that they have found a programming language that is demonstrably better in every possible way than your favorite programming language. Now, that may be difficult to swallow, that may be difficult to accept, but if they can show evidence to back up their claim, then it only makes sense for us as developers to investigate whether or not that claim holds validity. Similarly, if somebody tells you that there is a bug in your application and they show you that that bug exists, but you choose to ignore that bug because your application seems to be working fine, or perhaps you think that your application is bug free, well, they have shown you evidence that you can't deny or ignore. And if you choose to ignore it, then you are participating in the Semmelweis reflex. So the better way of handling this is to always be open to new information. Don't allow your preconceived notions to affect whether or not you choose to accept the new evidence. This is difficult, because many times we receive a lot of information over time, even evidence over time, that builds up a repertoire of information in our minds, and it solidifies over and over the things that we already believe. And especially if we receive the same evidence over and over, that makes us less likely to appreciate or even validate any evidence that comes in that contradicts what we already believe. So be aware of this bias. The amount of evidence that you have seen in the past doesn't necessarily have any bearing on the evidence that you receive now or in the future. Now, does that mean that you shouldn't use your experience, and all of the things that you have learned over the years, to inform your decision making? Absolutely not. You definitely should be using that information, and you can also use that information to filter the evidence that is coming in. But remember your cognitive bias.
Remember that just because evidence goes against what you already believe does not make it invalid. Now, with that, we're going to talk about today's sponsor, Hired.com. On Hired, software engineers and designers can get five or more interview requests in a given week. Each offer has salary and equity offered right up front, and they have full-time and contract opportunities on Hired.com. Job seekers can view the interview requests and accept or reject them before they ever even talk to the company that the interview is with. Hired works with over 3,000 companies. That's a lot of companies: 3,000 companies, from startups all the way up to large publicly traded companies. There are employers from 13 major tech hubs in North America and Europe, and the best part for you is that it's totally free. Now, if you get a job through Hired, they normally give a $2,000 bonus as kind of a thank you for using their service. But if you use the special link in the show notes, you can double that bonus to $4,000. That's just for being a Developer Tea listener. So let me say that again for those who didn't hear it: you would get a $4,000 bonus right away when you get a job through Hired, if you use the special link in the show notes. You can find those at spec.fm. Thank you so much to Hired.com for sponsoring today's episode. So we're talking about cognitive biases in today's episode. I've chosen two to discuss, and of course there are many others that you can go and research online. There is a lot of information about cognitive biases, because these are kind of universal things. They apply to so much of our lives, because they're about the way that we think, and the flaws in the way that we think as it relates to logic, to what we've established to be logical thinking that we can all kind of agree on.
So the first bias was the Semmelweis reflex, the idea that we reject new evidence if it opposes what we already believe, our existing beliefs or existing paradigms, or perhaps evidence that we have already seen in the past. The second bias that we're going to talk about today is called the neglect of probability bias. Studies of the neglect of probability bias showed that people make illogical decisions when dealing with situations, decisions that have little or nothing to do with probability. In one example, a person suggests wearing your seatbelt half of the time and not wearing it the other half, to avoid a potential situation of being stuck in a burning or a sinking car. Now, statistics show that you're much more likely to die as a result of not having your seatbelt on than you are to die from a car sinking or burning. And it's not really feasible to determine if and when you are going to get into an accident, much less what type of accident you will get into. So it's not really logical to choose to wear your seatbelt half of the time and not the other half. That decision would reflect the neglect of probability bias. And we do this all the time, particularly when we're talking about threats to our lives. Think of people who decide that they don't want to fly on an airplane. They're more afraid of flying on an airplane than they are of driving, even though they are statistically hundreds of times more likely to die in a car crash than in an airplane crash. So how does this apply to development? How does it apply to programming? Well, we can make similar decisions about the users of our applications. For example, we may spend an inordinate amount of time testing edge cases that occur in one or maybe two instances out of every 10,000 users. If we were to look at those statistics, we might make much more informed and logical decisions that use our time and energy in much better ways.
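As a rough way to put numbers on that idea, here's a minimal sketch in Python. The user counts and incidence rates are entirely made up for illustration, since the episode doesn't give concrete figures; the point is only that a quick back-of-the-envelope calculation can reveal where the testing effort actually pays off.

```python
# A minimal sketch of letting probability inform where testing effort goes.
# All numbers here are hypothetical, chosen only for illustration.

def expected_users_affected(incidence_per_10k: float, total_users: int) -> float:
    """Estimate how many users will actually hit a given scenario."""
    return incidence_per_10k / 10_000 * total_users

TOTAL_USERS = 50_000

# An edge case hit by roughly 2 in every 10,000 users:
edge_case_impact = expected_users_affected(2, TOTAL_USERS)        # 10.0 users

# A common flow used by roughly 9,500 in every 10,000 users:
common_flow_impact = expected_users_affected(9_500, TOTAL_USERS)  # 47500.0 users

# The comparison suggests where the bulk of the testing energy should go.
print(f"edge case: ~{edge_case_impact:.0f} users; common flow: ~{common_flow_impact:.0f} users")
```

Even a crude estimate like this makes the trade-off visible: a week spent on the 10-user edge case is hard to justify while the 47,500-user flow is under-tested.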
Another example is when we engage in over-engineering. This happens in many different forms, for example, over-optimizing a web server on the off chance that the server will suddenly be hit overnight by millions of people and needs to be scalable for that reason. This is an example of neglect of probability, because it's very unlikely that, unless you have a really specific push, a really specific advertising campaign, your site is suddenly going to blow up overnight. When a user experiences a one-in-10,000 fringe case, we should provide a very simple catch to push the person in the right direction. But we shouldn't spend an inordinate amount of energy trying to custom tailor the perfect error response message to every single fringe-case error, because that energy is not really providing a lot of return. Another example is when you have one or two users asking for a feature that is incredibly unlikely to be demanded by a larger number of users of your product. In that case, you should delay the decision for how to handle that feature until much later, until a significant portion of users are asking for it. Usually it is better to keep the software simple and lose one or two customers with highly specific needs than to blow up your software to the point that it grinds to a halt and you can't make progress or fast changes to it. Your product doesn't have to support every single user request. This is part of the cognitive bias we're talking about: the probability of how many people this is actually going to affect. It may feel incredibly important because one person or two people or even ten people are asking for it, but you have to look at the probability that your entire user base, or a significant portion of your user base, is going to need or even care about that particular feature.
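To make the "simple catch for fringe cases" idea concrete, here is a small hypothetical sketch in Python (the function name and messages are invented for illustration): handle the one failure users commonly hit with a specific, helpful message, and route all of the rare fringe failures through a single generic catch, rather than hand-crafting a tailored response for every possible error.

```python
# Hypothetical sketch: one specific handler for the common failure,
# one simple generic catch for all the rare fringe cases.

def import_user_file(path: str) -> str:
    """Read a user-supplied file, returning its contents or a user-facing message."""
    try:
        with open(path, encoding="utf-8") as f:
            return f.read()
    except FileNotFoundError:
        # Common case: worth a specific, helpful message.
        return "We couldn't find that file. Double-check the name and try again."
    except OSError:
        # Fringe cases (permissions, disk errors, odd filesystems, ...):
        # one generic catch that points the user in the right direction,
        # instead of a custom-tailored message for every possible error.
        return "Something went wrong reading the file. Please try again or contact support."
```

The design choice mirrors the probability argument: the specific message is written once for the failure thousands of users will hit, while the long tail of one-in-10,000 errors shares a single, good-enough fallback.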
Now, you should note that a part of your user base may be representative of a silent majority. You have to be aware of that, and you should be doing user testing to uncover these things, but you don't have to support every single user request. To be successful, you must provide value to a statistically significant portion of your target user base. You won't make everyone happy, but you must make enough people happy to not sacrifice the core of your user base. Especially, don't sacrifice the core of your user base for the one or two fringe cases, the one or two in ten thousand, those one or two people who want really specific things or who are very demanding of your customer service, for example. Be certain that when you are planning your energy, when you are deciding what you're going to do (if you do sprints, for example, when you're deciding what user stories you're going to accept in the next one- or two-week sprint), make sure you look at probabilities: look at what this is going to affect, and the probability that this code is actually going to matter. If you start looking at probabilities, then you can help yourself avoid the negative effects of the neglect of probability bias. I hope you've enjoyed this episode of Developer Tea, and I hope you will go and look at these different cognitive biases. If you would like for me to talk about a specific cognitive bias or a specific logical fallacy, I would love to hear from you. You can always email me at developertea@gmail.com. Even better than emailing me, you can join the Spec Slack community by going to spec.fm/slack. Of course, becoming a member of that Slack community is totally free, and it always will be. So go and check it out at spec.fm/slack. That link and all of the other links for today's episode can be found in the show notes at spec.fm. Thank you so much to today's sponsor, Hired.com.
If you are a developer or a designer and you're looking for a job, Hired may be the perfect place for you to start. Not to mention, if you use the special link in the show notes, you could get a $4,000 bonus if you get a job through Hired.com. Thank you so much for listening to today's episode. Make sure you subscribe in whatever your favorite podcasting app is. And until next time, enjoy your tea.