Today you’re going to learn why con artists always wear lifts in their shoes, how a normal person can administer lethal shocks to an innocent research subject, why 95% of nurses were willing to give deadly doses of a dangerous drug to their patients, and much more.
This is the fifth episode in a six-part series on the Science of Success, titled Weapons of Influence, based on the bestselling book Influence by Robert Cialdini. Each of these weapons of influence is deeply rooted and verified by experimental psychology research, and you're going to get a ton of amazing examples. Last week, we talked about what made a guy named Joe Girard the greatest car salesman of all time, how Tupperware grew their sales to 2.5 million dollars a day, why uglier criminals are more likely to go to jail, and much more. If you haven’t checked out that episode yet, listen to it after you listen to this one.
This week we’re going to talk about the authority bias. This bias can create some astounding effects in the real world, and as some of these research studies show, it can often impact life and death decisions. Authority bias is one of the most adaptive and ingrained biases, partly because much of the time, listening to authorities is beneficial and the right thing to do. Just like the other weapons of influence, however, our minds can play tricks on us, and those automatic Click, Whirr responses that we talked about in the episode on the biological limits of the mind can misfire at the worst possible times. Here’s how Cialdini describes the authority bias in Influence.
QUOTE: We rarely agonize to such a degree over the pros and cons of authority demands. In fact, our obedience frequently takes place in a Click, Whirr fashion with little or no conscious deliberation. Information from a recognized authority can provide us a valuable shortcut for deciding how to act in a situation. Conforming to the dictates of authority figures has always had genuine practical advantages for us. Early on, these people, parents, teachers, etc, knew more than we did. And we found that taking their advice proved beneficial. Partly because of their greater wisdom, and partly because they controlled our rewards and punishments. As adults, the same benefits persist for the same reasons, though the authority figures are now employers, judges, and government leaders. Because their positions speak of greater access to information and power, it makes sense to comply with the wishes of properly constituted authorities. It makes so much sense, in fact, that we often do so when it makes no sense at all. END QUOTE.
Longtime listeners will know that I’m a huge fan of Charlie Munger, Warren Buffett’s billionaire business partner. Here’s how he describes the authority bias, and in particular a study using flight simulators and the authority bias.
QUOTE: They don’t do this in airplanes, but they’ve done it in simulators. They have the pilot do something where an idiot co-pilot would know the plane was going to crash. But the pilot’s doing it, and the co-pilot’s sitting there. And the pilot is the authority figure. 25% of the time the plane crashes. I mean, this is a very powerful psychological tendency. UNQUOTE.
I think one of the most important things that Cialdini said is that authority bias is adaptive. What do I mean when I say it’s adaptive? I mean it has an extremely positive evolutionary benefit. It’s incredibly rewarding and beneficial, especially when we’re growing up, to listen to authority figures. They control our rewards and punishments. They know what’s going on. They provide us with wisdom. And most of the time, it makes a ton of sense. But occasionally, it completely misfires. Just like the other weapons of influence, this is something that, on the surface, seems relatively obvious. Yes, authorities can exert influence over people, but when you look at some of the manifestations and the ways that authority bias plays tricks on our mind, it’s fascinating. Let’s dig into some of the research examples.
Of course the most well-known example of the authority bias in action is the infamous Milgram experiment, using electric shocks. In this experiment, ordinary people were asked to deliver increasingly dangerous electric shocks to a test subject, who was in fact a paid actor and was not receiving real shocks. The results were shocking, and defied much of what people thought about human behavior at the time. Here’s how Cialdini describes the experiment in depth.
QUOTE. Rather than yield to the pleas of the victim, about two-thirds of the subjects in Milgram’s experiment pulled every one of the thirty shock switches in front of them, and continued to engage the last switch, 450 volts, until the researcher ended the experiment. More alarming still, almost none of the 40 subjects in this study quit his job as teacher when the victim first began to demand his release. Nor later, when he began to beg for it. Nor even later when his reaction to each shock had become, in Milgram’s words, quote “definitely an agonized scream”. The results surprised everyone associated with the project, Milgram included. In fact, before the study began, he asked groups of colleagues, graduate students, and psychology majors at Yale University, where the experiment was performed, to read a copy of the experimental procedures and estimate how many subjects would go all the way to the last 450 volt shock. Invariably, the answers fell in the 1-2% range. A separate group of 39 psychiatrists predicted that only about one person in a thousand would be willing to continue to the end. No one then was prepared for the behavior pattern that the experiment actually produced. UNQUOTE.
Here’s how Milgram himself said it.
QUOTE. It is the extreme willingness of adults to go to almost any lengths on the command of an authority that constitutes the chief finding of this study. UNQUOTE.
The Milgram experiment is the bedrock of the authority bias. And also, one of the most controversial and talked about studies in psychology. Cialdini elaborates more on the importance and the significance of the Milgram experiment by saying,
QUOTE. In the Milgram studies of obedience, we can see evidence of strong pressure in our society for compliance with the requests of an authority. Acting contrary to their own preferences, many normal, psychologically healthy individuals were willing to deliver dangerous and severe levels of pain to another person, because they were directed to do so by an authority figure. The strength of this tendency to obey legitimate authorities comes from the systematic socialization practices designed to instill in members of society the perception that such obedience constitutes correct conduct. UNQUOTE.
And again, the person in this experiment wasn’t actually receiving electric shocks. What they did was have an actor play the test subject, while the actual subject was the person administering the shocks. A researcher in a white lab coat stood by, basically saying “continue to shock them”, “shock them at a higher level”. The actor wasn’t actually being shocked, but the person administering the shocks fully believed the shocks were real. The person being “shocked” would beg for release, saying “please stop shocking me”, and the subject would keep going, because the authority was telling them to do so.
Many of you have probably heard of this experiment. The Milgram experiment is very, very talked about. If you’ve read even some rudimentary psychology research, I’m sure you’ve run into it, heard it talked about, or uncovered it. But you can’t have a conversation about the authority bias without the Milgram experiment featuring prominently in the discussion. At the time, it was totally groundbreaking, and even today the findings are astounding.
So let’s look at a few other examples. One of them is about symbols of authority. Cialdini cites a number of actors who play TV roles, from doctors to Martin Sheen playing the president on The West Wing, as examples of how people defer to authorities who have no actual substance, but only the appearance and the trappings of authority. We talked about this in the previous episode when we covered the liking bias. Celebrity endorsements play on the connection between the authority and liking biases: you have celebrities who don’t have any credentials or credibility to be talking about particular things, who just happen to be actors playing a particular role. But the symbol of that authority alone is enough to impact people on a subconscious level, and to drive that behavior. Here’s how Cialdini puts it.
QUOTE. The appearance of authority was enough. This tells us something important about unthinking reactions to authority figures. When in a Click, Whirr mode, we are often as vulnerable to the symbols of authority as to the substance. Several of these symbols can reliably trigger our compliance in the absence of the genuine substance of authority. Consequently, these symbols are employed extensively by those compliance professionals who are short on substance. Con artists, for example, drape themselves with the titles, the clothes, and the trappings of authority. They love nothing more than to emerge elegantly dressed from a fine automobile and introduce themselves to their prospective marks as doctor or judge or professor or commissioner someone. They understand that when they are so adorned, their chances for compliance are greatly increased. Each of these types of authority symbols, titles, clothes, and trappings, has its own story and is worth a deeper look. UNQUOTE.
That ties into another research study, which I find really funny. It’s a crazy example that again ties back into the liking bias, where we talked about how important physical attractiveness can be. People perceived the same person to be more than 2.5 inches taller simply when his title was changed from “student” to “professor”. This is a study they conducted in 1992. Here’s how Cialdini describes it.
QUOTE. Studies investigating the way in which authority status affects perceptions of size have found that prestigious titles lead to height distortion. In one experiment, conducted on five classes of Australian college students, a man was introduced as a visitor from Cambridge University in England. However, his status at Cambridge was represented differently in each of the classes. To one class, he was presented as a student. To a second class, a demonstrator. To another, a lecturer, and to yet another, a senior lecturer. To a fifth, a professor. After he left the room, each class was asked to estimate his height. It was found that with each increase in status, the same man grew in perceived height by an average of a half-inch, so that as the professor he was seen as 2.5 inches taller than as the student. Another study found that after winning an election, politicians became taller in the eyes of the citizens. UNQUOTE.
A crazy corollary of this study is, of course, the reason why con artists also wear lifts in their shoes: so that they can appear taller, because it works both ways. Again, this kind of ties back into the concept of the liking bias.
The next experiment is something I like to call the Astrogen experiment. After they conducted this experiment, they surveyed a different group of 33 nurses, and only two indicated that they would have done what happened in the experiment, which you’re about to find out. That shows just how massive the gap is between what we think we would do and what we actually do. It ties back into the same thing: the power of the subconscious mind, the power of all of these weapons of influence, the power of the Click, Whirr responses that are biologically built into our brains. Here’s how Cialdini describes the research.
QUOTE. A group of researchers composed of doctors and nurses with connections to three Midwestern hospitals became increasingly concerned with the extent of mechanical obedience to doctor’s orders on the part of nurses. One of the researchers made an identical phone call to 22 separate nurses’ stations on various surgical, medical, pediatric and psychiatric wards. He identified himself as a hospital physician and directed the answering nurse to give 20mg of a drug, Astrogen, to a specific ward patient. There were four excellent reasons for the nurses’ caution in response to this order. One, the prescription was transmitted by phone, in direct violation of hospital policy. Two, the medication itself was unauthorized. Astrogen had not been cleared for use, nor placed on the ward’s stock list. Three, the prescribed dosage was obviously and dangerously excessive. The medication containers clearly stated that the maximum daily dose was only 10mg, half of what had been ordered. Four, the directive was given by a man the nurse had never met, seen, or even talked with before on the phone. Yet, in 95% of the instances, the nurses went straight to the ward medicine cabinet, where they secured the ordered dosage of Astrogen and started for the patient’s room to administer it. It was at this point that they were stopped by a secret observer, who revealed the nature of the experiment. The results are frightening, indeed. That 95% of regular staff nurses complied unhesitatingly with a patently improper instruction of this sort must give us all, as potential hospital patients, great reason for concern. What the Midwestern study shows is that the mistakes are hardly limited to trivial slips in the administration of harmless ear drops or the like, but extend to grave and dangerous blunders. Additional data collected in the Hofling study, the study we’re talking about, suggested that nurses may not be conscious of the extent to which the doctor sways their judgement or actions.
A separate group of 33 nurses and student nurses were asked what they would have done in the experimental situation, contrary to the actual findings: only two predicted that they would have given the dose. UNQUOTE.
Again, this highlights the massive gap between how we perceive ourselves and our behavior, and how our behavior actually is. We have this conscious interpretation that, of course, something as obvious as liking or social proof or authority isn’t going to really impact my decisions. I’m smarter than that. I’m not going to fall prey to something so silly, right? I mean, it makes me think of the experiment we talked about last episode about judges, and how they can fall prey to one of the most starkly obvious biases imaginable, physical appearance. It’s astounding. In this research study, only two out of 33 nurses thought that they would have done it. But in reality, 95% of them were willing to administer an unauthorized and dangerous dose of medicine on the orders of a person they had never met and never spoken to, simply because he referred to himself as a doctor.
This next experiment I find particularly hilarious. I call it “Give that man a dime”. They conducted a number of variants on this, but I like this one the best because the authority figure himself was actually around the corner when this request took place. I’ll let Cialdini explain the experiment for you.
QUOTE. Especially revealing was one version of the experiment in which the requester stopped pedestrians and pointed to a man standing by a parking meter 50 feet away. The requester, whether dressed normally or as a security guard, always said the same thing to the pedestrian, quote, “You see that guy over there? He’s over parked but doesn’t have any change. Give him a dime.” The requester then turned a corner and walked away, so by the time the pedestrian reached the meter, the requester was out of sight. The power of his uniform lasted, however, even after he was long gone. Nearly all of the pedestrians complied with his directive when he wore the guard costume, but fewer than half did so when he was dressed normally. UNQUOTE.
When you think about it on the surface, it doesn’t seem like anything crazy, bizarre, or weird is happening, right? Yeah, I mean, if you see someone in a security guard outfit, they’re probably an authority, and you should probably listen to them. But the reality of this bias is that a total stranger simply wearing a different set of clothes drastically changes the way that people react to them. Right? That’s really a great example, and a concrete way to think about the authority bias. Nothing about that person changed, except for the clothes that they were wearing. And those clothes materially impacted the way that people reacted to their instruction to give that man a dime. It changed the way that people behaved toward and perceived that person, simply by changing their clothes. Something that, in reality, had no impact on their credibility. No impact on their authority. No impact on whether or not someone should have complied with their request.
In another research study, which I call the suited jaywalker, they had somebody cross the street against the light. In half of the cases, the person jaywalking was in a freshly pressed suit and tie, looking very nice and very formal. In the other half, they just had him wearing a work shirt and trousers. What they really wanted to measure was how many pedestrians standing on that street corner would follow the jaywalker. What they discovered was that three and a half times as many pedestrians were willing to jaywalk following the suited man as were willing to follow the man dressed in regular, everyday clothes. Again, it’s a similar instance: just changing your clothing, just changing your appearance, can communicate at a subconscious level that “hey, this is somebody of authority. This is somebody we should listen to. This is someone whose advice we should take, someone whose model we should follow.”
So, what are some of the learnings from this episode? What are some of the learnings from this research? There are a number of major drivers of the authority bias. The first is that the authority bias is adaptive. It’s ingrained in us since childhood. And frequently, it has very positive effects. Here’s a quick quote by Cialdini on this.
QUOTE. In addition, it is frequently adaptive to obey the dictates of genuine authorities, because such individuals usually possess high levels of knowledge, wisdom, and power. For these reasons, deference to authorities can occur in a mindless fashion as a kind of decision making shortcut. END QUOTE.
Again, this is the same learning that we’re getting from many of the different weapons of influence. These are things that are evolutionarily beneficial. These are things that are positive traits and positive characteristics, but occasionally they just have these wacky misfires that end up with people doing ridiculous things. The second learning is that symbols of authority, however vacuous, have the same effect as actual authority. We talked about celebrity endorsements, and we talked about the research studies that back that up. There are a couple of different ways that manifests itself. We talked about titles and how they have a massive impact. Think back to the Astrogen experiment: how a total stranger on the phone, just by using the word “doctor”, was able to drive those nurses to administer a potentially lethal dose of medicine. Here’s how Cialdini elaborates on it a little bit more.
QUOTE. Titles are simultaneously the most difficult and the easiest symbols of authority to acquire. To earn a title normally takes years of work and achievement, yet it is possible for somebody who has put in none of this effort to adopt the mere label and receive a kind of automatic deference. As we have seen, actors in TV commercials and con artists do it successfully all the time. UNQUOTE.
Another one of these vacuous symbols of authority is clothing. Clothing alone can create compliance and the illusion of authority. Think back to the jaywalker and the “give that man a dime” experiment. Right? Here’s how Cialdini sums it up.
QUOTE. A second kind of authority symbol that can trigger our mechanical compliance is clothing. Though more tangible than a title, the cloak of authority is every bit as fake-able. UNQUOTE.
I think one of the last big learnings about authority, and we see this one across the weapons of influence, is that people massively underestimate how much the authority bias actually influences them.
When we think back to the Astrogen experiment, only two out of the 33 nurses said they would have done it, but in reality, when actually tested in an experiment, 95% of them did. Here’s how Cialdini explains it.
QUOTE. People were unable to predict correctly how they or others would react to authority influence. In each instance, the effect of such influence was grossly underestimated. This property of authority status may account for much of its success as a compliance device. Not only does it work forcefully on us, it does so unexpectedly. UNQUOTE.
So how can we defend against the authority bias? Something that we naturally underestimate, something that can really operate at a subconscious level. Again, the defenses against a lot of the weapons of influence stem back to the same ideas: awareness, asking the right questions, being self-aware and understanding what thoughts are going through your mind, what things you’re thinking about, and the way that you’re feeling. Being able to tap into that and say, “Hey, something seems amiss”, right? “Why am I complying with this person’s request?” But Cialdini specifically cites two questions that he suggests we ask as a way to combat the authority bias.
The first question he suggests we ask is: “Is this authority truly an expert?” Right, and this asks us to boil it down and really think about whether they actually know what they’re talking about. What makes them a real expert? In many of the research instances we’ve cited, it’s patently obvious that if you pause for one moment, you can think, “Okay, no, this person isn’t an expert, so I shouldn’t let their opinion or their comment bias me unnecessarily.” The second question, which we really only need to answer if the person actually happens to be an expert, is: “How truthful can we expect this expert to be?” Especially given the situation and its context. Right? What that tries to tap into is that even if authorities are true experts, the most knowledgeable and the most experienced, do they really have our best interest in mind? Or are they, in this particular instance, trying to manipulate us, trying to drive us to perform a certain action or do a certain thing? So try to keep those questions in mind. Is this authority really an expert? Does somebody crossing the street in a suit know more about crossing the street than anybody else? If a person calls me on the phone and says they’re a doctor, how do I know that they’re really a doctor? One, is this person really an expert? And two, if they really are an expert, how truthful can I really expect them to be? Again, the way that you tap into the automatic subconscious processing going on in your mind is to develop the presence and the ability to understand and see what thoughts are taking place in your mind.
Meditation is an amazing tool for doing that, which we’ve got an upcoming episode on, which is going to be awesome.