Plato's Cave and Defeating Biases
Published 8/14/2017
In today's episode, we talk about what you can see - and what you cannot.
Today's episode is brought to you by Linode.
Linode provides superfast, SSD-based Linux servers in the cloud starting at $5 a month. Linode is offering Developer Tea listeners $20 worth of credit if you use the code DEVELOPERTEA2017 at checkout. Head over to spec.fm/linode to learn more about what Linode has to offer to Developer Tea listeners!
Transcript (Generated by OpenAI Whisper)
If you had an education similar to mine, then you probably at some point read the allegory of the cave. It's been a long time since I read this particular piece of literature, but it was written by Plato, and the allegory essentially goes like this: there are prisoners locked in a cave, and all they can see is shadows on the wall. They give the shadows names, and those shadows essentially represent their entire reality. Plato's whole point is that the philosopher, the one who can think philosophically rather than only through their senses, has climbed out of the cave and can see what is causing those shadows. They can see the people and the light behind them. In today's episode, we're going to talk about things that might be acting as caves for us.

You're listening to Developer Tea. My name is Jonathan Cutrell, and my goal on this show is to help you become a better developer. Sometimes that means becoming, quite simply, a better thinker or a more well-reasoned person. While my goal on this show is not to stir up controversy or to make you feel incredibly uncomfortable, sometimes it means having a moment where you think, maybe I should change the way that I look at this. Maybe I should change the way that I think about these subjects. Hopefully this is one of those episodes. Again, my goal is not to make you feel frustrated or uncomfortable or hopeless, but rather to provide you with a sense of hope for the future and for your career, and to always inspire you to step upward, to become better at what you do. Quite honestly, a lot of that process of becoming better at what you do is an inward process: looking inward at yourself and bettering who you are. I believe that one of the ways we can better who we are is wrapped up in this discussion today.

So we're talking about this idea, the allegory of the cave, but perhaps more importantly, we're talking about what limits our vision, and what limits it in a way that we actually can't even perceive. In the allegory of the cave, for example, the prisoners are not aware of anything other than the cave, right? In some sense, they don't even know that the cave exists. And there are things that we, as developers and really just as people living in this period of the world, can't really see that limit our vision. I'm not trying to get too philosophical here; these are simply biases. We've talked about biases on this show before, but these are biases that we may not recognize exist, biases that limit the way we think in a kind of top-down, filtering way. It's almost as if our minds are running inside of some kind of virtual machine, and we don't realize that the virtual machine has specific limits. So if you think about this concept of biases that limit us, it's important to recognize what that can actually do: it can quite realistically change the way you think about any subject. Now, if you're a philosophy major who stumbles on this episode and wants to pick a fight about the meaning of the allegory of the cave, I certainly don't mean to pick apart that particular piece of literature, because that's not what this episode is about. Instead, we're talking about this concept of biases, and really one specific bias: what you see is all there is.
I'm taking this phrase directly out of Daniel Kahneman's book, Thinking, Fast and Slow. You all know that I read this book earlier this year, and I'm continually seeing it applied in very practical ways. I highly recommend the book, of course. But what you see is all there is: there are so many implications of this bias, and some of the most important ones for us in the year 2017 relate to the prominence of a particular viewpoint, a particular subject, or a particular set of facts. What this bias does is create a perception narrative. It creates a picture of the world, because all you see is a subset of reality. All you see is the things you come in contact with on a day-to-day basis, and that shapes your picture of reality. This is the way the human mind works; everyone has this same kind of bias. The problem is that very often, in fact pretty much all the time, that subset of reality, the picture you've taken in through the things that are available to you, is incomplete.

As it turns out, this is the source of a lot of the problems we have at a cultural level, but also at a personal level, even in our work and our day-to-day lives. What you see is all there is: the limited vision, the cave that is surrounding you. This can cause a lot of problems, because you have this incorrect perception of reality, and the truth is, the things that you can't see are still very close to you. In other words, the superset of reality that you're unable to see, the part beyond your cave, is still affecting your life while you're acting inside of your subset. You have an altered picture of reality, and your life is affected by the actual reality. So what does that mean? Effectively, it means that because of the things you're unable to perceive, your perception is skewed. You have a bias toward the things you've experienced, because you can't experience all of reality. Now, this sounds kind of hopeless, doesn't it? It sounds really difficult to overcome. But as it turns out, there are ways you can fight against what-you-see-is-all-there-is, this kind of bias. We're going to talk about those right after we talk about today's sponsor, Linode.

Most of you listening to the show have listened to previous episodes; Linode has been a sponsor for quite a while, and we've talked about them before. Linode has such a great offering. They have 24/7 customer support, and their plans start at just $5 a month. Okay, $5 a month, that's so cheap: $5 a month for a gigabyte of RAM on a server. Of course, you can go up to $10 a month for a two-gigabyte server, and you can get these servers running in under a minute. If you don't have a Linux server in the cloud, Linode is a fantastic option. They have eight data centers, and they have high-memory plans if you're looking for something more powerful; those start at 16 gigabytes for $60 a month, which breaks down to $3.75 per gigabyte of RAM. Now remember, Linode's servers all run on SSD storage, they have a 40-gigabit internal network, and they're running on Intel E5 processors. Pretty much anything you can do with Linux, you can do on Linode. So go and check it out. They have a seven-day money-back guarantee. Go and check it out: spec.fm/linode.
Use the code DEVELOPERTEA2017. That's DEVELOPERTEA2017 to get $20 worth of credit when you check out, at spec.fm/linode. Thank you again to Linode for sponsoring today's episode of Developer Tea.

So how do we overcome this bias of being kind of held in by our own perspective? This is really a meta-bias, because it creates other specific biases that we aren't going to cover in today's episode. How can we expand our vision, or how can we overcome this problem of not being able to see the things that are blinding us? One of the first things we have to do is recognize that our minds are largely trying to protect us. They're lazy machines that don't want to do too much thinking. So we have mechanisms in our brains that are trying to protect us with as little energy as possible. This is why we create, for example, categories that we can easily put people into. This gives us a very quick algorithm for deciding whether a person is going to give us some kind of positive advantage or is going to be a potential threat. Of course, this is a gross oversimplification of what the brain does to protect us, but it's a fundamental understanding we need in order to talk about biases, because in order to change our biases, we must be willing to change the way we think. And in order to change the way we think, we have to be willing to let go of some of our beliefs, or change our conclusions, change the way we think about something. Sometimes this is resisted because of some kind of internalized protection mechanism. So in order to grow, you have to be willing to question your previously established assumptions, your previously established beliefs, or your previously established perspective of the world, because once again, those are all made up of what you can see. So let's start there and say the first step is being willing to recognize, and then adjust, the way you see the world.

The second piece of this puzzle is allowing yourself to entertain a state that you previously didn't consider plausible. This is a mental exercise. When you hear a given phrase, for example "pigs flying" (this is a specific example used in Daniel Kahneman's book), your mind initially creates the mental picture of a pig with wings on it. Now, there may be other things called to your mind, other associations, perhaps a person you know who used to use that phrase all the time. But ultimately, all of us have this moment in our minds where we have pictured, and then judged, the plausibility of that picture. So in a functional sense, we give our minds the space, for that split second, to believe that pigs flying is plausible, and then subsequently we judge whether it actually is. So in order to avoid reinforcing or creating new biases that are irrational or unreasonable, we have to constantly reintroduce this act of judging the plausibility of some reality. It may be that you have already encoded a series of realities as not plausible in your mind. If you're trying to decode yourself out of some of these biases, it's important to take the time and the energy to question whether those encodings you have stored in your brain are actually reasonable,
whether they're actually rational, whether you intentionally put those encodings in your brain, or whether they're there as a result of some other process or some other experience in your life.

Ultimately, it's not possible for us to defeat every bias that we have. It's not even possible for us to know every bias that we have. What is possible is to recognize that we are biased. Even the people who do this research, the people who are strongly aware of the presence of bias, even people like Daniel Kahneman, have to recognize that they have biases that affect the way they think and the way they act. It's important for us to recognize that so we can actively work against those biases, because the reality is, most of them end up marginalizing either us, another person, or another group of people. In your day-to-day work, your biases may actually decrease the quality of your code. They may actually undermine your opportunities in your career. So work actively to defeat biases, or at least to be aware of them. Be aware of the possibility that your conclusion could be wrong, always be open to new evidence and to other opinions, and recognize that you have the ability to fail, to think in a strongly biased way without even realizing it.

Thank you for listening to today's episode of Developer Tea. I realize this was a little bit more of a philosophical discussion, and I highly recommend that you take the time to do some more reading and research on cognitive biases. I think this is so important, and it's something that is underserved both in the education system and in professional development. Recognizing that internal way of thinking, how we actually go about perceiving the world around us, can be hugely beneficial, and it's worth spending your time and energy on. Thank you so much for listening.

Thank you again to Linode for sponsoring today's episode of Developer Tea. With Linode, you can get a server up and running in just a few minutes, and you can keep it running for $5 a month on their lowest plan; that's a gigabyte of RAM. You can get $20 worth of credit, which is equal to four months of that one-gigabyte plan, just for being a Developer Tea listener. Head over to spec.fm/linode and use the code DEVELOPERTEA2017 at checkout. If you're enjoying Developer Tea, make sure you subscribe in whatever podcasting app you use. Of course, you can always find the episodes at spec.fm. If you have any questions for me, you can always reach me at developertea@gmail.com. Thank you so much for listening to today's episode, and until next time, enjoy your tea.