Today, we’re going to explore how the way things are presented can have huge implications for our decisions, without us ever realizing it. How a simple change of wording can dramatically influence medical outcomes. What accounts for an 82% difference in organ donation rates, and how much of this operates at a subconscious level, beyond our conscious experience. This episode is going to focus on drilling down and understanding a specific cognitive bias, a mental model, to help you start building that mental toolkit that we’ve talked about in previous episodes. When we did the interview with Shane Parrish, and when we did the interview with Michael Mauboussin, both of those episodes dig down, drill in, and explain the concept of making better decisions by building a toolkit of mental models: different ways of understanding the world, ways of understanding reality. And if you want to get to the fundamentals of why you should build that toolkit, and why it’s important, I highly recommend checking out both of those interviews. The mental model that we’re going to focus on today is framing bias. Framing bias, along with priming, which we covered in the last episode, and anchoring, which we’re going to cover in the next episode, are all cognitive biases that you want to know, understand, and be aware of, so that you can add them to your mental toolbox, be a more effective decision maker, and understand reality more effectively. I wanted to start out with a quote from the book Nudge by Richard Thaler. Great book, very focused on describing framing and its implications.
QUOTE: “The false assumption is that almost all people, almost all the time, make choices that are in their best interest, or at the very least are better than the choices that would be made by someone else.” End quote.
One of the things we’re going to discover about the framing bias is that often, when we make choices, we think that we’re making them based on logic, based on morality, based on rationality. But in many cases, the entire basis for why we made the decision is the frame. And by the frame, I mean simply the way the question was worded. The framing effect, or the framing bias, is a cognitive bias in which people react to a particular choice in different ways, depending on how that choice is presented. There are three particular books that I really like that talk about framing, explain it, and drill down into it. The first is Nudge by Thaler and Sunstein, which I quoted from a moment ago. The second is a book that we’ve talked about in the past, Thinking, Fast and Slow by Daniel Kahneman. Again, that book is very dense, very technical, but also incredibly rich in information. It’s not the best starter book if you want to go down this path and learn about a lot of these topics, but it’s unquestionably a book you must read if you ever want a deep understanding of how some of these biases work. Lastly, Think Twice by Michael Mauboussin. Again, a previous podcast guest, someone we’ve talked about. If you want to get a slice or a view of how Michael thinks about the world, definitely listen to the interview we did with him. Think Twice is an amazing book that covers a number of different cognitive biases and especially drills down and explains the framing bias very effectively.
So, we’re going to look at a few different examples of how the framing bias can shape and impact our decision making, or shape and impact the decision making of people that we often consider experts. Remember, in a past podcast episode, we talked about the authority bias when we went through the “Weapons of Influence” series. With the authority bias, many times we think that people in authority have a special view on the world: that they know more than we do, that they make better choices than we do. In reality, authority often doesn’t matter; it doesn’t make that big of a difference. Authority gives us a sense of confidence, a sense of certainty, but it’s often falsely placed confidence, falsely placed certainty. And you’ll see that in a number of these examples. Let’s drill into the first example.
The first example was a study conducted by Kahneman and Tversky in conjunction with Harvard Medical School. We’re talking about serious experts here. This study is a classic example of the concept of emotional framing. The participants in this study were physicians, so they’re not students. These were doctors, these were practicing physicians. They were given statistics about two different treatments for lung cancer. One option was surgery, the other option was radiation. The statistics they were given on five-year survival rates clearly favored surgery. But there was a little bit of a twist: surgery is slightly riskier than radiation in the short term. The actual statistic was that the one-month survival rate for surgery is 90%. Or, to look at it another way, there’s 10% mortality in the first month after surgery. But remember, the data that the doctors were given clearly showed that surgery was the better option long-term, for all of the patients. The results: 84% of the physicians chose surgery when they were told that the one-month survival rate for surgery was 90%. When the physicians were told instead that surgery has a 10% mortality rate in the first month - and again, these are two sides of the same coin; 90% survival obviously implies 10% mortality, but the doctors were only told one of those two sentences - only 50% of them chose surgery. A 34-percentage-point difference in the outcome. Surgery was clearly the optimal procedure, clearly the best choice, in all instances.
But just a slight tweak of the frame, a slight tweak of the wording, meant that among the doctors in the second case - the doctors who were presented with the fact that surgery has a 10% mortality rate in the first month - 34 percentage points fewer recommended surgery.
So, from 84% down to 50%. That’s a massive change in something that seems so obvious, right? If there’s a 90% survival rate, clearly that means there’s a 10% mortality rate. But the way our brains are wired, the way the human mind is structured, presenting something - or, as we would say, framing something - in a different way changes our response, even though the two framings are logically equivalent, logically exactly the same thing. “This procedure has a 90% survival rate” versus “this procedure has a 10% mortality rate” - the first one even sounds better, sounds safer. I’d rather have a procedure with a 90% survival rate. But they’re the same thing. And these doctors at Harvard Medical School were influenced simply by that framing. Only 50% chose the optimal procedure when they were told it had a 10% mortality rate, whereas 84% chose it when they were told it had a 90% survival rate. As Kahneman says, QUOTE: “Medical training is evidently no defense against the power of framing.” Unquote. The scary implication here is that most of us passively accept the way that problems are framed, and therefore we rarely have the opportunity to discover that our decisions and our preferences are what Kahneman and Tversky call frame-bound rather than reality-bound. I.e., the way the question is framed and presented changes the way we feel about it, and changes the ultimate decision we make.
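To see how mechanical the equivalence is, here’s a minimal sketch in Python. The language choice is mine for illustration; the only inputs are the 90%/10% statistic and the 84% and 50% choice rates quoted above:

```python
# The two frames describe the exact same statistic.
one_month_survival = 0.90                       # frame 1: "90% survive the first month"
one_month_mortality = 1.0 - one_month_survival  # frame 2: "10% die in the first month"
assert abs(one_month_mortality - 0.10) < 1e-9   # logically identical

# Yet the reported choice rates diverged sharply depending on the frame:
chose_surgery_survival_frame = 0.84   # physicians told "90% survival rate"
chose_surgery_mortality_frame = 0.50  # physicians told "10% mortality rate"
swing = chose_surgery_survival_frame - chose_surgery_mortality_frame
print(f"Swing from wording alone: {swing:.0%}")
```

Nothing about the underlying medical reality changes between the two branches; only the sign of the description does.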
This is not the only example of framing having major implications for the way that experts feel and think about life-and-death outcomes. Another example is what Kahneman and Tversky call the Asian disease problem. In this study, Kahneman and Tversky had respondents look at an imaginary disease outbreak that is expected to kill 600 people. They were presented with two alternative solutions, phrased slightly differently. And this gets into one of the core tenets of something called prospect theory, which we’ll talk about in detail in a future episode of the podcast. It’s something that Kahneman and Tversky created and discovered, and in many ways it’s one of the things they’re best known for. So: there’s a disease that threatens the lives of 600 people. The first frame was a choice between A and B. In program A, 200 people’s lives will be saved. In program B, there’s a 1/3 probability that all 600 people will be saved and a 2/3 probability that no one will be saved. Okay. So, program A guarantees saving 200 lives. Program B offers a 1/3 chance of saving all 600 lives and a 2/3 chance of saving no one. A substantial majority of respondents chose program A. They chose the certainty of saving 200 lives. Now, it’s important to note that statistically, those outcomes are identical: the expected value is the same for both, because 200 is 1/3 of 600. So, really, we’re looking at whether people prefer the safe choice or the gamble.
And this will come into play when we look at the second frame. The second way the same decision was posed: if program A is adopted, 400 people will die. If program B is adopted, there’s a 1/3 probability that nobody will die, and a 2/3 probability that 600 people will die. If you think about it, program A in the first frame and program A in the second frame are identical, and so are the two versions of program B. Yet in the second frame, a large majority of people chose to gamble. They chose program B. This ties back to the same concept, the same idea of framing, but it gets at something else. When outcomes are framed as gains, people prefer the sure thing over the gamble. This is also known as being risk averse. That’s why, when the frame is presented as saving 200 lives versus gambling to save 600, people prefer the sure thing: they’re risk averse, they want to lock in the 200 lives they can save. However, when outcomes are framed as losses, people are risk-seeking. They tend to reject the sure thing and accept the gamble. When the same exact question is phrased as option A, 400 people die, versus option B, a 1/3 chance of saving all 600 people and a 2/3 probability of all of them dying, people vastly prefer the gamble.
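Since this is just arithmetic, the equivalence of the two frames can be checked in a few lines of Python. This is a sketch I’m adding for illustration; the probabilities come straight from the problem as stated:

```python
TOTAL = 600  # people threatened by the disease

# Frame 1 (stated as lives saved)
ev_a_frame1 = 200                    # program A: 200 saved for sure
ev_b_frame1 = (1/3) * TOTAL          # program B: 1/3 chance everyone is saved

# Frame 2 (stated as deaths)
ev_a_frame2 = TOTAL - 400            # program A: 400 die for sure
ev_b_frame2 = TOTAL - (2/3) * TOTAL  # program B: 2/3 chance everyone dies

# All four expected values come out to the same 200 lives saved.
for ev in (ev_a_frame1, ev_b_frame1, ev_a_frame2, ev_b_frame2):
    assert abs(ev - 200) < 1e-9
```

The preference reversal happens even though every expected value in the table is 200; the only thing that moves is whether the description counts the saved or the dead.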
Previously, these same conclusions had been reached in a number of different contexts involving money - looking at how people behave in the financial markets. This is tied to the concept of loss aversion, which we touched on in the interview with Michael Mauboussin. The fascinating thing is that the same tendency shows up when we’re talking about health outcomes, when we’re talking about people’s lives. As Kahneman says, QUOTE: “It is somewhat worrying that officials who make decisions that affect everyone’s health can be swayed by such a superficial manipulation, but we must get used to the idea that even important decisions are influenced, if not governed, by System 1.” End quote. Again, we talked about System 1 in the last episode; this isn’t a perfect description, but roughly speaking, think of System 1 as your subconscious, rapid decision-making mind. So the Asian disease problem is a great example of how the same exact outcome can be framed in two separate ways. It almost seems silly talking about it, because logically it’s so obvious that if you save 200 of the 600, the other 400 will die. Or think about the experiment with Harvard Medical School: a 90% survival rate is the exact same thing as a 10% mortality rate. But explaining it in a different way, changing the frame, substantially changes the way people act. And it’s very important to remember that when people are facing good outcomes, they’d rather be risk averse - they’d rather lock in the sure thing, save those 200 people. But when it’s framed as a negative outcome, even when it’s the same situation - when it’s framed as condemning 400 people to die - they prefer the gamble of trying to save everyone. So, in both of those scenarios, the situations were actually identical.
But changing the frame changed the way a substantial majority of respondents selected the outcome that they preferred.
Now we’re going to look at another example. You may have heard of this one; it’s a very often-cited, very common example of how framing can have a substantial impact on another medical outcome.
A study originally published in 2003 looked at organ donation rates in a number of different countries. The researchers compared countries that were demographically and culturally similar, to see why they had these massive gaps. Two of the comparisons they looked at specifically were Austria versus Germany, and Sweden versus Denmark. The organ donation rate in Austria is nearly 100%. But the organ donation rate in neighboring Germany was only 12%. What factor could explain the 88% gap between those two outcomes? An 88% gap in organ donation rates between two countries that, by and large, are very similar, whose inhabitants behave very similarly, live very similar lives, and have very similar traditions, morals, standards, cultural practices, and so on. Similarly, Sweden had an 86% organ donation rate. Denmark’s? 4%. These are massive gaps, and these are life-changing outcomes. Imagine an entire population of organ donors, versus a population where only 4% donate their organs. This is a life-and-death matter for the many, many people waiting for organ donations. And the thing causing it was so, so simple. It was a framing effect. Again. These enormous differences are caused simply by the fact that in Austria and Sweden, the countries with extremely high organ donation rates, everyone is opted in to organ donation by default. All you have to do is check a box that says “I no longer want to be an organ donor.” Conversely, in Germany and Denmark, you have to opt in to being an organ donor. That’s it. That’s the only difference. A simple checkbox: whether people are opted in to donating their organs by default or not.
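The default-effect logic is simple enough to sketch as a toy model. This is purely illustrative: the function name and the assumption that roughly 10% of people ever go out of their way to change the default are mine, not figures from the study:

```python
def donor_rate(default_is_donor: bool, share_who_change_default: float) -> float:
    """Share of the population that ends up registered as organ donors."""
    if default_is_donor:
        # Opt-out country (e.g. Austria): only those who act leave the donor pool.
        return 1.0 - share_who_change_default
    # Opt-in country (e.g. Germany): only those who act join the donor pool.
    return share_who_change_default

# Same population, same (assumed) 10% willingness to touch the checkbox:
print(donor_rate(True, 0.10))   # opt-out default -> high donor rate
print(donor_rate(False, 0.10))  # opt-in default  -> low donor rate
```

The population’s underlying willingness to act never changes; flipping which state is the default flips which large majority you get.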
As Kahneman puts it in Thinking, Fast and Slow, QUOTE: “That is all. The single best predictor of whether or not people will donate their organs is the designation of the default option that will be adopted without having to check a box.” End quote.
It’s that simple. That’s the crazy thing about the framing bias. In situations that are totally obvious, totally transparent if you think about them logically, people make wildly different decisions - or societies produce vastly different outcomes - based on something as simple as taking two seconds to check a box. And these outcomes have huge, dramatic consequences for the societies they occur in. Or, thinking about the medical examples: simply the way something is phrased can change a decision that materially impacts someone’s life. That’s why framing is so dangerous - we often don’t understand how the frame is shaping the way we think about the problem. Here is another great quote, from Thinking, Fast and Slow, where Kahneman sums this up nicely.
QUOTE: “Your moral feelings are attached to frames, to descriptions of reality rather than to reality itself. The message about the nature of framing is stark: framing should not be viewed as an intervention that masks or distorts an underlying preference. At least in this instance, and also in the problems of the Asian disease and of surgery versus radiation for lung cancer, there is no underlying preference that is masked or distorted by the frame. Our preferences are about framed problems, and our moral intuitions are about descriptions, not about substance.” End quote.
That’s very important, the way he’s phrased that: our moral intuitions are about descriptions, not about substance. The way we viscerally feel about the option of saving 200 lives versus condemning 400 people to die - despite the fact that they’re the same thing - shows that our emotional and moral preferences attach to the frames themselves, as opposed to the underlying reality. Thinking about the ways this might impact our lives on a day-to-day basis, Thaler, in the book Nudge, has another great quote.
QUOTE: “The thesis is that seemingly small features of social institutions can have massive effects on people’s behavior. Nudges are everywhere, even if we do not see them. Choice architecture, both good and bad, is pervasive and unavoidable, and it greatly affects our decisions.” End quote.
He uses a few terms in there that we haven’t touched on before. “Nudges” is the word Thaler and Sunstein use in the book Nudge to describe some of these frames, alongside another term they call “choice architecture.” The interesting thing is that you can structure the choice architecture in your own life in a way that helps you make better decisions. You can think about, and be consciously aware of, the frame. The sooner you become aware of it, the sooner you boil a question down to the logic behind it, the sooner you can see through the illusion of the frame. You can see through the false choices that the frame creates, and make much more effective decisions. Similarly, there are many, many ways you can think about framing things more effectively to achieve what you want to achieve. If you’re presenting information to people, if you’re trying to convince someone to do something, think very carefully about how you have framed the situation, because the frame itself - just the wording of the situation - can have a dramatic impact on how people will react to it, on the decision people will make, and on the way they’re going to feel about making that decision.
Think back to the example of the Harvard Medical School doctors. Just a simple twist of phrase: “I think this project has an 80% chance of succeeding,” versus “there’s a 20% chance this project is going to end in failure.” If you’re sending an e-mail to your boss, if you’re proposing something, if you’re pitching investors, if you’re teaching students - whatever it may be - think very carefully about the frames you’re using, because the frames can have a serious impact on how people react and the decision they ultimately make down the road.