« All Episodes

What Does It Take To Change Your Mind?

Published 6/3/2020

Changing our minds about a belief or issue we stand by is not easy, but as humans it is required of us if we are to grow as people and as professionals.

In this episode of Developer Tea, we're walking through three different visualizations to help us better understand and become more comfortable with when and why we decide to change our minds.

🙏 Today's Episode is Brought To You By: ZeBrand

Setting up your brand with ZeBrand only takes 5 minutes and you’ll have instant access to your assets to start showcasing your product today at https://zebranding.com/

🎟Upcoming Events

Wanna get your hands on the hottest #JavaScript tech?

Reserve a spot at #JSNationLive happening on June 18-19, and see speakers / instructors like Tobias Koppers (Webpack), Orta Therox (TypeScript), Minko Gechev (Angular), Matteo Collina (Node.js) and many others.

Check them out and reserve your spot at live.jsnation.com

Transcript (Generated by OpenAI Whisper)
What does it take to change your mind? That's what we're talking about in today's episode of Developer Tea. My name is Jonathan Cutrell. My goal on this show is to help driven developers like you find clarity, perspective, and purpose in your careers. What does it take to change your mind? This is a very difficult question to answer because most of us haven't really thought about it before. And beyond that, it's very difficult to remember when our minds actually change. Researchers aren't entirely sure why this is the case, but we want to remain consistent, and so we tend to block out the moments when we believed we were wrong and adjusted in response. You can kind of prove this to yourself. Try to remember the last time you had a significant change in something you previously believed pretty strongly, a belief you once held strongly that you no longer do. If you can remember one of these moments in time, then good for you, kudos, because significant change is difficult. In fact, most people who have a significant change like this are likely to wrap some piece of their own ego or identity around the change itself. In other words, when we change our minds, we like to wear it as a badge of honor. And there isn't anything shameful about this; we're not trying to make anybody feel bad on this episode. These are just features of being human. We don't like to change our minds, but it's critical. Changing your mind is the hallmark of learning. It's the evidence of learning, especially later in life, once you've learned something and have to unlearn it to learn something different. So in today's episode, I want to create some mental pictures of different ways that your mind might change. None of these is scientifically accurate to how minds actually change; that's difficult to nail down.
And researchers are certainly split on the subject, but thinking about the different models of ways that your mind might change may help you recognize when you're falling into one model or another, and whether or not you actually think that's a good place to be. So we're going to start with the most basic mindset: the purely rational mindset. When we say rational, I want you to focus on the root word ration, in other words, a measured portion. A purely rational person is going to ration their beliefs based on hard data only. So the mental picture I want you to have in mind is a perfect scale. In this case, we're talking about a scale that has two sides. Whenever there's nothing on either one of those sides, it's perfectly balanced, and the scale represents that balance. And of course, if you put equal weight on each side, the scale will show the same balance; one side will cancel out the other side. In this visualization, I want you to imagine that facts are like small weights. As you learn new facts, you might put a weight on one side of the scale or on the other side of the scale. Now, a person with the rational mindset makes decisions assuming that they have all of the available information. Of course, this is completely ludicrous, because we almost never have all of the information. But sticking with the visualization, imagine that you have all of the available facts, and you can assign them to one side of the scale or the other. A rational person, once those facts are assigned, will choose the belief that aligns with the side that has the most weight, the side that has the most facts supporting it. The visualization compresses every fact to have the same weight, but in practice, a rational person would believe something once it has sufficient evidence to outweigh any other belief. And we know that humans are much more complicated than this and that we are not perfectly rational creatures.
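The "perfectly rational scale" described above can be sketched as a toy program (my own illustration, not something from the episode). Each fact is a unit weight dropped on one side of the scale, and the believed position is simply whichever side is heavier at any given moment:

```python
# Toy sketch of the "perfectly rational scale" visualization.
# Each fact is a unit weight: +1 supports a position, -1 opposes it.

def rational_belief(facts):
    """Weigh all facts and return the side the scale tips toward."""
    total = sum(facts)
    if total > 0:
        return "for"
    if total < 0:
        return "against"
    return "undecided"

def belief_over_time(stream):
    """Feed facts in one at a time and record the belief after each."""
    facts = []
    history = []
    for fact in stream:
        facts.append(fact)
        history.append(rational_belief(facts))
    return history

# The belief swings whenever the balance tips by even a single fact.
print(belief_over_time([+1, -1, -1, +1, +1]))
# ['for', 'undecided', 'against', 'undecided', 'for']
```

Running the stream above shows the belief flipping with every slight imbalance, which is exactly the socially unworkable flip-flopping the episode goes on to describe.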
On top of the fact that we certainly don't have all of the information, a purely rational person would adopt a belief when there's only slightly more evidence to support it. This means that as new information is gathered, as we learn more, we would constantly be flip-flopping back and forth between two different beliefs. Going back to our original statement about how we don't really want to think about changing our minds that much: we want to stay consistent with what we say, and other people want us to be consistent in what we say. It would be totally socially unacceptable to act in this perfectly rational way. Flip-flopping between two beliefs so easily and so quickly would make you seem unreliable. And so instead, our models look very different; the way we change our minds looks very different. We're going to talk about some different models, some different visualizations, right after we talk about today's sponsor, ZeBrand. ZeBrand helps you launch a product with minimal visual brand design resources, without hiring internal designers, outsourcing to costly agencies, or finding questionable freelancers. With ZeBrand, you can launch your product without investing too much time and money in the branding, which means your team can give 100% focus to product development. By asking some simple questions about your product, ZeBrand's AI-based algorithms create a unique, tailored brand toolkit full of branding and marketing essentials, including fonts, color palettes, pitch deck templates, and much more. ZeBrand is easy to use and is free to get started, and the results are very likely to be better than if you were to try to do this on your own. Go and check it out: head over to zebranding.com. Setting up your brand with ZeBrand takes five minutes, and you'll have instant access to your assets so you can start showcasing your product today. That's zebranding.com. Thanks again to ZeBrand for sponsoring today's episode of Developer Tea.
I also want to take a quick moment to remind you about JS Nation Live. This is the biggest JavaScript conference in the cloud. It's on June 18th through the 19th of this year. You'll have three-hour workshops, speaker video Q&As, remote networking, and a remote after party. The best part is that registration is free, so you quite literally have nothing to lose. Head over to live.jsnation.com, that's live.jsnation.com, to register today. So we're not rational. Hopefully that's not a surprise or a shock to you. And asking other people to be rational is probably not a good idea. Even if we were more rational, it wouldn't necessarily end well, because other people may not value that rationality. For example, if I were a purely rational mind changer, if I changed my beliefs every time I had evidence that outweighed other evidence, even by the slightest amount, people would see me as unreliable. And so we have to think about mind changing, about belief making, about learning, in a different form, with a different model in mind, not just for ourselves, but when we're talking to other people as well. So I'm going to give you a couple more visualizations of ways that people might actually change their own minds. One visualization is an automatic sifter. Now, this is a very special sifter, because instead of sifting things based on their size, like a kitchen sifter does, it sifts them based on whether or not you already agree with them. So this sifter takes in information and sifts it. Your mind actually works this way to some degree; this is a bias called confirmation bias. When we sift information in this particular way, we end up finding and ultimately retaining information that tends to agree with the viewpoints we already hold. Using this kind of sifting method with information makes it incredibly difficult to change someone's mind.
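The "automatic sifter" can be sketched the same way (again, my own toy illustration, not from the episode): instead of filtering by size, it keeps only the information that already matches your existing position.

```python
# Toy sketch of the confirmation-bias "sifter": information that
# disagrees with the prior position never makes it through.

def confirmation_sift(prior, incoming):
    """prior: +1 or -1, your existing position on an issue.
    incoming: list of (claim, stance) pairs, where stance is +1 or -1.
    Returns only the claims whose stance matches the prior."""
    return [claim for claim, stance in incoming if stance == prior]

info = [
    ("study A supports it", +1),
    ("study B contradicts it", -1),
    ("op-ed supports it", +1),
]

print(confirmation_sift(+1, info))
# ['study A supports it', 'op-ed supports it']
```

Notice that the contradicting claim is silently dropped before it can ever tip the scale, which is why the episode's advice of blinding yourself to the source of information can help: it removes the stance signal the sifter relies on.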
One way to try to combat this confirmation bias is to make it harder to know whether the information you are consuming is something you're likely to agree with or not. For example, if you know that you tend to agree with the slant of a particular journalism outlet, then it might make sense to gather news from multiple other sources beyond that outlet, or even, if it were possible, to remove the name of the outlet altogether so that you don't know where the news is coming from. Of course, this comes with its own complications and risks, but the point still stands: if you can avoid sourcing information that you already have a predisposition to agree with or disagree with, then you're less likely to fall prey to confirmation bias. Now, I want you to imagine another visualization. Imagine that you have a particularly strangely shaped hole, maybe like a keyhole, and you have facts falling all around this keyhole, falling right on top of it, but none of them fitting. Finally, one particular fact fits in the keyhole. The interesting thing is that there's only one that will actually fit, and that particular fact, the one that fits in the keyhole, is the one that matters. If you've ever heard the term key issues, as it turns out, this is kind of the point of a key issue. These are issues that tend to matter in an irrationally large way to the person who is forming that belief. Once again, when I speak of rationality here, this is not necessarily a bad or good thing; it means that this person is treating this particular fact as special, even though there's no specific reason to do so. This often happens when a particular fact or issue is easy to visualize. A very practical example of this for developers is the idea that we are fixing, let's say, a visual bug on the homepage.
We believe that this has a really strong impact on the success of our work, but in fact, we may have been better off spending our time on an interior page, maybe the checkout, making the checkout more efficient. That could have had a greater impact toward our particular goals, but because the homepage is more visible, we may have an irrational belief that the homepage just matters more. This is the problem with key issues. Finally, I want you to picture seventh-grade science: the volcano that you created with vinegar and baking soda and probably a little bit of food coloring. Let's imagine that you theoretically would change your mind if you could fill up that bottle all the way to the top with vinegar, but unfortunately, it came preloaded with the baking soda. This is what happens when we become victim to the backfire effect. Here's how the backfire effect works. When we encounter information that we disagree with, not only do we reject it, as we do with confirmation bias, but we may also end up entrenching ourselves in our existing beliefs even further. Even if we haven't encountered any more information that supports our existing beliefs, simply encountering information that we disagree with might cause us to entrench ourselves even further in those existing beliefs. The science here is fairly simple: when we encounter some kind of dissonance, when we have to face the possibility of having been wrong about what we believed before, we experience negative emotions, and our natural response to negative emotions is to avoid them. One way we can avoid them is to reject that new information entirely and retreat to where we had the good emotions we had previously. It's much easier for us to keep feeling good about our current beliefs and reject new information than to feel bad and change our minds. So how do you get the baking soda out of the bottle? How do you avoid the backfire effect?
Well, a lot of this is in the way the information is presented, and there's more to dive into than we can cover in a single episode of this podcast. But the basic explanation is to find a way to own the information yourself. Rather than being forced to give up your own belief, try to find a way to create the new belief for yourself, to accept the new belief for yourself. This is easier said than done, and it puts the onus on the person who is presenting the information, so it's harder for us to do this for ourselves. This brings me to my final point: we will always become better at things with practice. When we practice, we become familiar with the feelings that come along with mindset change. If you can practice changing your mind, then the threatening feeling of being inconsistent, that socially normative behavior that we want and crave so badly, will tend to dissipate, because we know it's not a threat. When we practice it enough, we can become comfortable with it. Thanks again for listening to today's episode of Developer Tea, and thank you again to today's sponsor, ZeBrand. Head over to zebranding.com to get started today. And of course, another reminder to register at live.jsnation.com. This is a free registration for the largest remote JavaScript conference in the world, essentially. Go and check it out at live.jsnation.com. Thank you so much for listening to this episode. This episode and every other episode of Developer Tea can be found at spec.fm. Today's episode was produced by Sarah Jackson. My name is Jonathan Cutrell, and until next time, enjoy your tea.