Investigating Your Invisible Systems
Published 7/17/2025
This episode focuses again on the fundamental principle that your systems are perfectly designed for the outcomes you are experiencing, regardless of whether those systems were intentionally or accidentally created.
Here are the key takeaways from the episode:
- Uncover how your systems, whether intentionally or accidentally designed, are perfectly configured for the outcomes you experience. Design implies that choices have been made in setting up a system, but your intent matters less than the outcomes the system actually produces.
- Learn why your intent is less important than the actual outcomes when evaluating your systems. If your intent was the sole factor, everyone would achieve their desired results. Instead, systems should be judged by the outcomes they generate.
- Discover the concept of "accidental design," where unseen factors influence system behaviour. This can arise from Goodhart's law, where a measure becomes a target and changes behaviour, or from environmental factors, such as how your workspace affects your thinking and heart rate.
- Explore how "invisible systems" – the unexamined rules and assumptions that govern your daily life – profoundly influence your actions and results. These are forces changing your behaviour that you likely haven't evaluated, such as automatically accepting all meeting invites.
- Understand that human behaviour, including your own, can be an outcome of your systems. This perspective offers the highest leverage opportunity for change, as modifying the underlying system is more effective than relying on temporary motivation or addressing knowledge gaps in isolation.
- Realise that system boundaries are often arbitrary, and a system's design must account for all factors influencing its outcomes. For example, a quality assurance system cannot be considered good if it fails due to a "talent" issue; the talent pool and hiring procedures are part of the overall system affecting the outcome. Ignoring such factors because they fall outside perceived boundaries of responsibility can lead to irreducible or expensive risks.
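The Goodhart's law takeaway can be made concrete with a toy sketch. Everything here is invented for illustration (the numbers, the coverage model, the idea of using test coverage as the measure); the point is only that once a measure becomes the target, behaviour shifts to move the number without improving the underlying outcome.

```python
from dataclasses import dataclass

@dataclass
class Test:
    lines_covered: int
    has_assertions: bool  # a test without assertions can't catch bugs

def coverage(tests, total_lines):
    """The measure: fraction of lines executed by the test suite."""
    return sum(t.lines_covered for t in tests) / total_lines

def defect_catching_power(tests):
    """The outcome we actually care about: lines guarded by real assertions."""
    return sum(t.lines_covered for t in tests if t.has_assertions)

TOTAL_LINES = 1000

# Before coverage becomes a target: tests are written to catch bugs.
honest = [Test(100, True), Test(150, True)]

# After "80% coverage" becomes the target: the fastest way to move the
# number is assertion-free tests that merely execute code.
gamed = honest + [Test(300, False), Test(250, False)]

print(coverage(honest, TOTAL_LINES))      # 0.25
print(coverage(gamed, TOTAL_LINES))       # 0.8  -- target met
print(defect_catching_power(honest))      # 250
print(defect_catching_power(gamed))       # 250  -- no real improvement
```

The measure doubled while the outcome it was supposed to track didn't move at all, which is exactly the accidental design the episode describes.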
You are encouraged to investigate the invisible parts of your systems and write down the assumed rules that govern your life, even if you haven't evaluated their truth or helpfulness.
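One way to do that exercise, if you think in code, is to write the assumed rule down literally. This is a hypothetical sketch: the fields, thresholds, and decision rule are all invented, not a recommendation. The point is that the "always accept meeting invites" rule, once written down, becomes something you can evaluate and change.

```python
from dataclasses import dataclass

@dataclass
class Invite:
    organizer: str
    has_agenda: bool
    hours_this_week: float  # meeting hours already accepted this week

def invisible_rule(invite):
    """The unexamined rule: accepting every invite shows respect."""
    return True

def explicit_rule(invite, weekly_budget_hours=10.0):
    """A made-up explicit alternative, now open to evaluation."""
    if invite.hours_this_week >= weekly_budget_hours:
        return False          # protect focus time once the budget is spent
    return invite.has_agenda  # decline agenda-less meetings by default

busy_week = Invite("a colleague", has_agenda=False, hours_this_week=12.0)
print(invisible_rule(busy_week))  # True  -- busyness reinforced
print(explicit_rule(busy_week))   # False -- outcome now matches intent
```

Whether these particular thresholds are right is beside the point; the invisible system has become visible, so it can finally be judged by its outcomes.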
📮 Ask a Question
If you enjoyed this episode and would like me to discuss a question that you have on the show, drop it over at: developertea.com.
📮 Join the Discord
If you want to be a part of a supportive community of engineers (non-engineers welcome!) working to improve their lives and careers, join us on the Developer Tea Discord community by visiting https://developertea.com/discord today!
🧡 Leave a Review
If you're enjoying the show and want to support the content, head over to iTunes and leave a review! It helps other developers discover the show and keeps us focused on what matters to you.
Transcript (Generated by OpenAI Whisper)
Hey everyone and welcome to Developer Tea. My name is Jonathan Cutrell. My goal on the show is to help driven developers like you find clarity, perspective, and purpose in their careers. Today's episode is going to be focused on clarity in particular, and perhaps some on perspective, but mostly about clarity. In this episode, we're going to be talking more about systems. We've recently been talking about how your systems are perfectly designed for the outcomes that you're experiencing. All right, so go back and listen to that episode if you haven't yet. That is a very important kind of fundamental principle to walk into this episode with, because there's a little bit of an edit. Maybe not an edit, a clarification that we should make about systems. We said that your systems are designed perfectly for the outcomes. It's important to recognize, and perhaps we did talk a little bit about this in that episode, that your system exists whether you intentionally design it or accidentally design it. So let's talk a little more about what we mean by intentionally versus accidentally designing something. The implication of design is that you've made some kind of choices, right? You've set up your system in a particular way. And the false belief is that when we make decisions, our intent, right, whatever we are trying to do with the system, is the thing that matters the most. This is not the case. If all that mattered was our intent, then all of us would be exactly what we want to be. All of us would have figured out how to make all the money that we want to make. I intend to do that. Well, I imagine you do too, right? So why do our intentions not necessarily match up with our outcomes? Because what we've designed, our system, doesn't necessarily accomplish the things that we intend to accomplish, right? So our intent is less important than the outcomes. We need to judge our systems by the outcomes.
We can also investigate whether the choices that we've made, the intentional parts of our design, are the principal components, the primary factors that are determining those outputs, those outcomes. Or if it's something else. It's very possible, I would say even likely, that if you have an intent, if you've intentionally designed a system, and you're not getting what you want, there's very likely something happening that you're not really seeing. There's some part of your system design that is accidental. So what do we mean by accidental design? The simplest example of this is inspired by Goodhart's Law: the idea that with some system you've created a measure, and that measure has now become a target. And so now you've changed behaviors around the target. Where previously it may have been a good measure, it is no longer a good measure, because there are new behaviors introduced. You've encouraged or discouraged some kind of behavior in order to change that target. You've created an accidental change, an accidental design factor. You've accidentally designed your system in a way that encourages or discourages a particular behavior. Gamification is a typical term that you hear in relation to this subject. There are also other accidental design decisions, or non-decisions maybe. For example, a very, very common one is people's workspace. Your workspace, whatever is around you, is going to have some kind of impact on the way you think. So if, for example, your workspace has more open space, then there's a chance, and this is based on some psychological research, there's a chance that you're going to have a little bit lower heart rate, because you can see something that is a further distance away. Right? There are some biological factors and some kind of psychosocial factors or something like that at play there.
Now, this is not my area of study, but the important kind of principle here is that the way you design your environment actually makes a big difference. Speaking of systems and designing environments, if you haven't read James Clear's book, Atomic Habits, hopefully, I mean, certainly if you listen to this podcast regularly, you've probably come across this book. James talks about how important our habits are and how the way our systems are designed determines so much about our ability to adopt new habits. Habits are, in many ways, kind of expressions of the systems that we have around us. And James even says that we don't rise to the level of our ambition. I'm paraphrasing. We don't rise to the level of our goals, I think, may be the wording he uses. We fall to the level of our systems. Our systems are the things that are there without us necessarily intervening. It doesn't require as much willpower to follow a system. James talks a lot about environment, for example, designing your environment so that you're getting the right cues and you're hiding the wrong cues. Go and read the book. I'm not going to give away the entire kind of model of thinking because that book is worth reading for sure. Add it to your reading list if you haven't read it yet. The important takeaway here is that our systems are not necessarily just the things we put together intentionally. Our systems are also not aware of our intent. We could say that I want to, let's say, increase the quality of my code base. That's my intent. And in order to increase quality, I've decided that the best thing that we can do is to build a system. And that's what I'm going to do. The first thing that I'm going to do is to have all of the product managers read the code base front to back. And I'm going to set up an expectation that all new product managers need to read the code base front to back. Yeah, this is a system, right? This is some kind of system that I've set up.
And it has various triggers and it has various stocks and flows that you might be able to diagram out. Is it going to be a system that's going to generate the output, the outcome, that I want to generate? My guess is it probably wouldn't be very effective. Hopefully that's your guess too. But my intent may be fully intact. Right? So our intent we can set aside in terms of evaluating whether the system is good or bad. We can use our intent to kind of measure the outcomes. If we want something, we want some outcome, then we implement a system in order to get that outcome that we want. We should be able to measure whether or not we are actually getting it. We can compare our intended outcomes to what we're actually getting. But the next step isn't to throw our hands up and say, well, we should try harder. It's not to throw our hands up and say, well, somebody's not doing it right, somebody's not following the system, because the system is foolproof. We should be investigating what it is that the system is generating and, most importantly for this episode, the invisible parts of our system. Invisible things that are affecting our system. You know, we keep saying system over and over, and if you're a software engineer, this may feel like it's limited to managers. Or you may feel like this is limited to people. Or you may feel like this is corporate speak, or not necessarily germane to your software engineering. Your IDE, your local setup is a system. You have designed your IDE to encourage some behaviors and discourage other behaviors. Are you getting what you want out of it? When you think about what you're designing, in your environment especially, those are the systems that are going to encourage or discourage certain outcomes for you. Most of the time, when we talk about these invisible systems, we discount that our behaviors can be included in the outcomes of a system.
I want to really focus in on this because if you don't take anything else away from this episode, this is the key point to take away. Pause it, get something to write this down. The outcomes from your system are not necessarily just measurable dollar amounts. It's not necessarily the flow of candidates through your hiring pipeline. It's not just the pass rates of your test suite. Those aren't the only kinds of outcomes. The outcomes that you have from a system can be human behavior, your own behavior included. You can design systems for yourself. If you start thinking about your own behavior through the lens of this first assertion from a few episodes ago, that our systems are perfectly designed for the outcomes that we're experiencing, then my behavior is an outcome that I'm experiencing. I'm responsible for it. What is the systematic change that I can make? And this is really some of the core fundamental ideas that James explores in his book as well. What is the systematic change that I could make such that my behavior changes in a way that I want? We don't have to necessarily fight ourselves over doing or not doing the right thing. If we can evaluate that we want to change our own behaviors, the highest leverage opportunity we have is to look at our systems. The invisible things that encourage us to behave in one way or another, that is where our highest leverage opportunity is. Not in getting some kind of motivation. Motivation is helpful for a very short amount of time. Most of the time it's not necessarily a knowledge gap. There may be a knowledge gap, but it's not necessarily a knowledge gap. How do you avoid having knowledge gaps in the future, though? You can read that one book to close that knowledge gap, and then a year from now, you've got another knowledge gap to figure out. What is the system? What is the engine? What is the generator?
What is common about many strung-together events, many strung-together behaviors in your life? These are the systems that you fall to. So I want you to focus on identifying these. Now, here's the most important exercise I want you to do in response to this episode. If I could have you do one exercise, I would have you get out a piece of paper and write down the invisible systems. Don't write down your typical commitments that you make, like, oh yeah, I wake up at a certain hour in order to go to the gym. Those are the more visible systems that you've set up in your life. You know, oh, I have these meetings that are on my calendar. Those are visible systems. I want you to write down the assumed rules that govern your life. These are rules that you more or less are taking for granted. You believe that you need to behave in a certain way. Maybe you haven't even evaluated whether that's true, but you've taken these rules for granted. These rules are roughly mapping to some kind of invisible system in your life. Right? Or set of systems. We don't necessarily have to, you know, identify specific atomic systems in your life. It's all kind of at the same layer. You can think about it that way. These are forces that are changing your behavior that you probably haven't evaluated in some time. Right? Think about forces that encourage you to always accept meetings. Right? This is a rule that you may have: anytime somebody has invited you to a meeting, it's important for you to show respect by accepting their invite. Right? So you live by this rule, and maybe you haven't really evaluated whether that rule is producing the outcomes that you want. I imagine that most of you are pretty busy. So accepting all invites is going to reinforce your busyness. Right? What are the outcomes that you really want? Do you want to continue being more and more busy?
Or do you want to examine that invisible system, that invisible rule that you've just assumed is true, or assumed is helpful, or assumed is important? Dig that up, reevaluate, and determine what systems you are following accidentally. And then you can start to think about what are the things that you're doing, what is the design, the accidental design of systems, that is affecting your behavior day to day. Thank you so much for listening to today's episode of Developer Tea. You've heard a couple of plane noises in the background. It's one of the changes that has come with some of our production differences in moving to recording video. Again, we're going to be releasing videos. This is the third episode where we actually have a video recorded. It's not yet released to iTunes. If you're listening right after the day that this is recorded, of course, you may be watching me say this right now, and then you'll know that this is not the case anymore. But we're recording videos. We're getting all of our kind of assets put together. I do have a full-time job; the podcast is not my full-time job. So thank you all for being patient and waiting for this video to come out. In the meantime, go and subscribe to Developer Tea in whatever podcasting app you currently use. And let your friends know about this. Once this thing is out on video, you can share the video with your friends just as easily. Thanks so much for listening. And until next time, enjoy your tea.