We carry our minds with us through our careers. Our perceptions of the world around us are shaped by our biases and predispositions. In today's episode, we're talking about models of thinking and how we can identify biases that affect our working habits and relationships.
If you're enjoying the show and want to support the content, head over to iTunes and leave a review! It helps other developers discover the show and keeps us focused on what matters to you.
This is a daily challenge designed to help you become more self-aware and a better developer so you can have a positive impact on the people around you. Check it out and give it a try at https://www.teabreakchallenge.com/.
Whether you’re working on a personal project or managing your enterprise’s infrastructure, Linode has the pricing, support, and scale you need to take your project to the next level. Get started on Linode today with a special $20 credit for listeners of Developer Tea.
P.S. They're also hiring! Visit https://www.linode.com/careers to see what careers are available to you.
Transcript (Generated by OpenAI Whisper)
Think about a recent, simple decision that you made. How rational was your decision-making process? This question is really difficult, if not impossible, to answer, and most likely the answer that you provide doesn't have enough information to really be accurate in the first place. We talked about this in the last episode: our striving for information. Sometimes we strive for information even when we don't necessarily do anything with it. In today's episode, I want to talk about two more biases, something we talk about a lot on this show, two more biases that can change the way that you approach your work and your relationships. My name is Jonathan Cutrell, and you're listening to Developer Tea. My goal on this show is to help driven developers find clarity, perspective, and purpose in their careers.

As we've said probably a hundred times on the show before, understanding the way our minds work can be an incredibly powerful tool. We carry our minds with us throughout the rest of our careers and the rest of our lives, and understanding how our minds work, how they change, and how we as humans perceive the world around us can help us find more effective ways to approach our work and our lives, based on what we know about our biases and predispositions. Of course, people probably blow out of proportion the effect of biases on our interactions with the world, because it's not going to be the same for everyone; some biases are more pronounced in some people than in others, for example. So take everything that you're hearing about biases in this episode with a big helping of salt. But with that said, biases are not some special mechanism; really, they're models of thinking, ways that our thinking is distorted from reality. By understanding them, we can recognize the times when we might actually be experiencing these distortions. So we're going to talk about two of these biases today. We'll talk about the first one, take a quick sponsor break, and then come back and talk about the second one.

The first bias that I want to talk about today starts with a basic story, and we're going to put you into this story. Imagine that you're working on a team and you are tasked with improving the reliability of a server. To improve the reliability of the server, you have a certain amount of resources; let's say ten working days, and you get to decide what to do with that time. We're going to present two different scenarios, two different error rates and solutions. In the first scenario, you have an error rate of eight percent, which is really high, but you can cut it in half: you can go from eight percent down to four percent by implementing a particular solution, let's say a load balancer. In the second scenario, you have an error rate of two percent, which is much lower, but you can totally eliminate those errors with some solution, let's say upgrading the server. Let's say that both of these solutions cost the same amount of your resources. Which scenario would you prefer? Which scenario would you choose? Interestingly, a lot of people end up choosing the scenario where they go from two percent to zero percent, rather than going from eight percent to four percent.
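To make the arithmetic concrete, here's a minimal sketch in Python. The error rates and the equal-cost assumption are the hypothetical numbers from the story above, not real measurements:

```python
# A minimal sketch of the two server-reliability scenarios described above.
# The rates and the equal-cost assumption are hypothetical.

def absolute_reduction(before: float, after: float) -> float:
    """Absolute reduction in error rate, in percentage points."""
    return before - after

scenario_a = absolute_reduction(8.0, 4.0)  # cut a high error rate in half
scenario_b = absolute_reduction(2.0, 0.0)  # eliminate a low error rate entirely

print(f"Scenario A removes {scenario_a} points of risk")  # 4.0
print(f"Scenario B removes {scenario_b} points of risk")  # 2.0

# At equal cost, scenario A removes twice as much risk, yet many of us
# are drawn to scenario B simply because it reaches zero.
```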
And the reason this is interesting is because the absolute reduction in risk, the share of people actually experiencing this error, is four percentage points in one scenario and only two percentage points in the other. This bias exists because humans desire certainty. By taking something all the way down to zero, we are willing to pay more. We're willing to experience more pain, to expend more of our resources and our energy, to take something from a very low number to zero than to take something from some higher number down to a merely lower number, even if the difference between that high and low number is much greater than the difference between the very low number and zero, as in this example. The way this tends to play out is mostly an economics question: if we have a solution that totally eliminates a problem (and this goes for hundred-percent solutions as well), we're usually willing to pay quite a bit more than if a solution almost totally eliminates it. This is called the zero-risk effect; you can also Google something called the possibility effect, which is essentially the same mechanism at work in our brains. As a developer, this might cause you to spend an inordinate amount of time on very low-return tasks because they take you from 99.9 to 100, or from 0.05 to zero. This sense of completion may also have something to do with our desire for closure or resolution, or with freeing up some mental space so we can focus on one thing rather than many.

At the end of the day, this bias doesn't necessarily have to be a bad thing. For example, let's imagine that you have a bunch of different debts, and one of those debts has a higher interest rate than the other; let's say one has an 8% interest rate and the other has a 5% interest rate, but the balance on the 5% loan is very low. Maybe you only have 100 or 200 dollars left to pay on that one, and you have a couple thousand dollars on the higher-interest loan. Rationally speaking, it makes more sense to pay your 8% loan first, because you're going to end up paying less in overall interest. But psychologically, you may stay more motivated if you pay off one of those two loans quickly, namely the 5% loan. These two methods of paying off loans are called the snowball method, paying off the smallest loans first and eventually the biggest ones, and the avalanche method, paying off the highest-interest loans first. The avalanche method is the rational method. But when you take human behavior into account, you realize it's quite possible that the motivation of seeing debts being paid off may help you continue to enact good behaviors beyond simply paying off the debt. Additionally, as you focus on fewer and fewer payments, your mental capacity to understand the debt may increase.

So I want the message to be loud and clear here: our biases are not always bad. Biases sometimes have good reasons for existing. Sometimes they help us in ways that we wouldn't intuitively expect. But when they go wrong, unless you know about the bias, unless you understand what's going on in that circuitry in your head, you may not know why things went wrong. You may mindlessly spend that extra effort to take something from 1% to 0%, rather than taking something from 20% to 10%.
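As a rough illustration of why the avalanche method wins on paper, here's a sketch in Python. The balances, rates, and monthly budget are hypothetical, and real loan amortization is more involved; this only illustrates the interest gap between the two orderings:

```python
# A rough comparison of the snowball and avalanche payoff strategies,
# using hypothetical balances and rates like the ones in the episode.

def total_interest(debts, monthly_budget, by_balance):
    """Simulate paying debts in order; return total interest paid.

    debts: list of [balance, annual_rate] pairs.
    by_balance=True pays the smallest balance first (snowball);
    otherwise the highest rate first (avalanche).
    """
    debts = [list(d) for d in debts]  # work on copies
    interest_paid = 0.0
    while any(balance > 0 for balance, _ in debts):
        # Accrue one month of interest on every remaining balance.
        for d in debts:
            if d[0] > 0:
                monthly = d[0] * d[1] / 12
                d[0] += monthly
                interest_paid += monthly
        # Direct the whole budget at the current target debt first.
        remaining = [d for d in debts if d[0] > 0]
        remaining.sort(key=(lambda d: d[0]) if by_balance else (lambda d: -d[1]))
        budget = monthly_budget
        for d in remaining:
            payment = min(budget, d[0])
            d[0] -= payment
            budget -= payment
            if budget <= 0:
                break
    return interest_paid

debts = [[2000.0, 0.08], [200.0, 0.05]]  # hypothetical: $2,000 at 8%, $200 at 5%
print(f"Snowball interest:  ${total_interest(debts, 300, by_balance=True):.2f}")
print(f"Avalanche interest: ${total_interest(debts, 300, by_balance=False):.2f}")
# The avalanche ordering pays slightly less interest overall, which is why
# it's the "rational" choice, even though the snowball's quick win of a
# zeroed-out loan can be better for motivation.
```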
We're going to take a quick sponsor break, and then we're going to come back and talk about the bias you may have about your co-workers. But first, let's talk about today's awesome sponsor, Linode. With Linode, you can get root access to a Linux server for as low as $5 a month. If you need something more, you can get a dedicated CPU or even a GPU compute plan, which is suitable for things like AI and machine learning or video processing. And all of this is on top of native SSD storage with a 40-gigabit internal network and industry-leading processors. Linode also has a new single-page-app cloud manager. You can find it at cloud.linode.com, and by the way, it's open source, so you can go and make it better if you want to contribute. Linode is a company of developers building products for developers, so you can expect these products to be well thought out and more intuitive than products that are not as developer-focused. You can also get a $20 credit as a new Linode customer by heading over to Linode.com slash Developer Tea. That's Linode.com slash Developer Tea, and use the code DeveloperTea2020, that's Developer Tea and then the numbers 2020, at checkout. Thanks again to Linode for sponsoring today's episode of Developer Tea.

So we're back to the core of the show. We've talked about bias so many times on this podcast, and we've even talked in passing about the biases that we're covering in today's episode. But this is one that we haven't really focused hard on, because it's a nuanced bias. It's called outgroup homogeneity bias. So what does that mean? Well, it means that we see people we perceive as outside of our group, outside of our tribe, outside of the people we identify most with, as more homogenous, in other words, less diverse, than the people who are a part of our group. So developers will tend to see the broader community of developers as more diverse than, perhaps, the broader community of designers. And you can intuitively understand this one: if you imagine the types of people that you expect to be designers, or accountants, or doctors, it's very likely that you don't imagine all of the diverse possibilities for who might fill those roles the way you do when you think about developers. Now, that's not to say that every group has equal diversity; certainly that isn't the case for software engineering. But as a general rule, we are biased to believe that other groups are less diverse than they actually are. And this can have some really interesting consequences. For example, you may hear that someone is a part of a different group, and this isn't just about jobs, of course; it's about any other grouping factor that you may identify with versus not identify with: someone of a different political party, someone from a different place or of a different nationality, or even someone who went to a different school than you did or plays a different sport than you do. These are all ways of creating in-groups and out-groups. And the more distance that you have between you and this out-group person, the more likely you are to create archetypes and stereotypes. And we can come up with some reasonable hypotheses as to why this is.
In our minds, if we don't interact with a certain group of people very often, then according to our survival brain, it's unnecessary to learn all the nuanced details about those people. Instead, it makes more sense to put in a placeholder. Think of it as a sketch or a wireframe, whereas for your in-group, you have a high-definition photograph of what those people are like. This isn't necessarily the same thing as stereotyping, because it's difficult to create stereotypes based solely on whether you identify as part of someone's group or not. Instead, this is more about relationships. It's about how your brain optimizes for understanding the relationships it predicts you'll need to make. But here's what our brains didn't really predict, what our level of evolution couldn't predict: we encounter people who are in our out-groups far more regularly now than we did even 50 years ago, and certainly much more regularly than we did a thousand years ago. And so our preconceived notions about what a given person is like are very often going to be wrong. They're going to be limited. And for that reason, we may make judgments ahead of time about what our relational dynamics with this person should be like.

So what do we do about this? Well, there's not really any prescriptive way to combat bias. But by understanding that this bias exists, by understanding that you view out-groups as having much less diversity than your in-groups do, it makes sense for you to build relationships with those people and teach your brain about the diversity in those groups. Before meeting with someone who is a part of your out-group, remind yourself that this person is an individual, and that in-groups and out-groups, these social constructs that we have, don't necessarily define the way that an individual will be. This is very important for managers. It's very important for people who are in hiring roles. It's important even for people who are in sales roles. The more exposure that you have to mixed groups, to people who are not necessarily in your day-to-day in-group, the more likely it is that your assumptions about those people can come back to haunt you and make your relationships, at the very least, less than optimal. Unfortunately, we can't really eliminate biases selectively. We can't choose not to have them. And here's a bonus one: something called the bias bias, the idea that if you're simply aware of a bias, you can somehow escape its effects. This is simply not true. So instead, we need to come up with alternative ways, alternative routes for our behaviors, finding ways to counteract these biases and sometimes ways to simply appreciate them.

Thank you so much for listening to today's episode of Developer Tea. Thank you again to Linode for sponsoring this episode; head over to Linode.com slash Developer Tea. Also, by the way, Linode is hiring! Head over to Linode.com slash careers to check out the open positions. Today's episode was produced by Sarah Jackson. My name is Jonathan Cutrell, and until next time, enjoy your tea.