The Science of Success Podcast


Why An Almost-Empty Cookie Jar Is More Valuable Than A Full One

February 23, 2016 by Austin Fabel in Weapons of Influence, Influence & Communication

This is the FINAL episode in a six-part series on "The Science of Success" titled WEAPONS of INFLUENCE, based on the best-selling book “Influence” by Robert Cialdini. Each of these weapons of influence is deeply rooted in and verified by experimental psychology research (of which you'll get a ton of amazing examples).

So what are the 6 weapons of influence?

  • Reciprocation

  • Consistency & Commitment

  • Social Proof

  • Liking

  • Authority

  • Scarcity

Today you’re going to learn about Scarcity Bias, and what happens when you take people’s cookies away; how changing a single phrase drove six times more sales; and why open outcry auctions turn your brain into mush. Like many of the weapons of influence, this is something we intuitively know and understand, but often don’t realize how powerful it is or how much it impacts our decisions at a subconscious level throughout daily life.

Thank you so much for listening!

Please SUBSCRIBE and LEAVE US A REVIEW on iTunes! (Click here for instructions on how to do that!)

EPISODE TRANSCRIPT

Today, you’re going to learn what happens when you take people’s cookies away, how changing a single phrase drove six times more sales, and why open outcry auctions turn your brain into mush.
This is the final episode in a six-part miniseries on the Science of Success titled Weapons of Influence, based on the bestselling book Influence by Robert Cialdini. Each of these Weapons of Influence is deeply rooted in and verified by experimental psychology research, which you’re gonna get a ton of amazing examples of. If you’re just now tuning in to this episode, definitely go back and listen to the rest of the series, because there is some amazing content in there.
Last week we talked about why con artists wear lifts in their shoes, how a normal person can administer lethal shocks to an innocent research subject, why 95 percent of nurses were willing to give deadly doses of a drug to their patients, and much more. If you haven’t checked that episode out yet, listen to it after you listen to this one.
I actually can’t believe that Weapons of Influence is already coming to an end. It’s been such a fun miniseries, and I love the book Influence by Robert Cialdini, so it’s been great for me to go back, really dig into some of these research examples, and really learn about them, and it’s been awesome to share it with everybody on the podcast. But just because Weapons of Influence is ending doesn’t mean there’s nothing to look forward to: we’ve got some really, really exciting content, an awesome interview, some really deep dives, and some cool subjects coming up in the next couple of weeks. Stay tuned and get excited. This week we’re going to talk about the scarcity bias. Like many of the Weapons of Influence, this is something that we intuitively know and understand, but often don’t realize how powerful it is or how much it impacts our decisions at a subconscious level throughout our daily lives.
Here’s how Cialdini describes the scarcity bias. Note how he describes something called psychological reactance theory; this is a key part of the scarcity bias, and also something that Charlie Munger touches on by another name: he calls it deprival super-reaction syndrome. Anyway, here’s how Cialdini describes it: “According to the scarcity principle, people assign more value to opportunities when they are less available. The use of this principle for profit can be seen in such compliance techniques as limited-number and deadline tactics, wherein practitioners try to convince us that access to what they’re offering is restricted by amount or time.
The scarcity principle holds for two reasons. First, because things that are difficult to obtain are typically more valuable, the availability of an item or experience can serve as a shortcut cue to its quality. Second, as things become less available, we lose freedoms.
According to psychological reactance theory, we respond to the loss of freedoms by wanting to have them, along with the goods or services connected to them, more than before.
The scarcity principle is most likely to hold true under two optimizing conditions: First, scarce items are heightened in value when they are newly scarce. That is, we value those things that have become recently restricted more than those that were restricted all along. Second, we are most attracted to scarce resources when we compete with others for them. Compliance practitioners’ reliance on scarcity as a weapon of influence is frequent, wide-ranging, systematic, and diverse. Whenever this is the case with a weapon of influence, we can be assured that the principle involved has notable power in directing human actions.”
One of the most interesting things that Cialdini mentions in that quote is the fact that we want scarce things even more when we are competing with other people for those goods, and we’ll dig into a couple of pieces of research that showcase that. But let’s dig into the research now and look at how the scarcity principle can impact your behavior.
Let’s start out with an experiment that showcases the scarcity principle at work on kids as early as age two. A study in Virginia had researchers take two toys and place them in a room divided by a Plexiglas barrier. For half the kids, the barrier was one foot high, posing no obstacle to the child’s ability to access the toy. For the other half of the kids, the barrier was high enough that they were obstructed from reaching the toy without going around it.
With the small one-foot barrier, children showed no preference for either toy. However, as you would expect, once the barrier went up, children went for the obstructed toy three times faster than for the easily accessible toy. As the researchers said, “the boys in this study demonstrated the classic terrible twos response to a limitation of their freedom: outright defiance”.
I think the fascinating thing about the two-year-old Plexiglas experiment is the fact that the behavior starts to manifest itself at such an early age, right? And this ties back to the theme that we’ve heard again and again: these biases are built into our minds. They’re ingrained in our brains by our society, by evolution, by all kinds of different factors, very, very deeply ingrained, and that’s why they have such a powerful effect on shaping human behavior.
The next study takes a look at how we perceive items that are banned, limited, and restricted from us, and this result has been repeated across several other banned items with the same results. But in this particular study, it was Dade County, Florida. The government imposed a ban prohibiting “the use and possession of laundry and cleaning products containing phosphates.” Cialdini described how the residents of Dade County reacted in two parts. “First, in what seems a Florida tradition, many Miamians turned to smuggling. Sometimes with neighbors and friends in large ‘soap caravans’, they drove to nearby counties to load up on phosphate detergents. Hoarding quickly developed, and in the rush of obsession that frequently characterizes hoarders, families boasted of having twenty-year supplies of phosphate cleaners.”
That behavior looks pretty ridiculous and shows the lengths people will go to once they perceive something as scarce, but that’s only really scratching the surface. The underlying subconscious shift that people had towards phosphate cleaning products after the ban is, to me, the most striking finding. This passage also helps to explain the concept of psychological reactance that we talked about at the top, and how it underpins the scarcity principle. “The second reaction to the law was more subtle and more general than the deliberate defiance of the smugglers and hoarders. Spurred by the tendency to want what they could no longer have, the majority of Miami consumers came to see phosphate cleaners as better products than before. Compared to Tampa residents, who were not affected by the Dade County ordinance, the citizens of Miami rated phosphate detergents as gentler, more effective in cold water, better whiteners and fresheners, and more powerful on stains.
After passage of the law, they had even come to believe that phosphate detergents poured more easily. This sort of response is typical of individuals who have lost an established freedom, and recognizing that it is typical is crucial to understanding how psychological reactance and the principle of scarcity work.
When something becomes less available, our freedom to have it is limited, and we experience an increased desire for it. We rarely recognize, however, that psychological reactance has caused us to want the item more; all we know is that we want it. To make sense of the heightened desire for the item, we begin to assign it positive qualities.”
This is an extremely important finding and a very, very relevant distinction that Cialdini makes in that piece of research: psychological reactance theory. The fact that the freedom of having that detergent was taken away is what, at a subconscious level, makes us want it even more. But what happens is we start inventing conscious justifications for it; we start inventing an image that changes the traits and characteristics of the item that we want. This is all taking place at a subconscious level, and consciously these justifications make a ton of sense and we believe them: “oh yeah, phosphate cleaners, they’re better in cold water, they’re better fresheners and whiteners, and they even pour more easily.” All these things sort of bubble up to the conscious mind, and we believe that those are the reasons why we’re mad that they took away the phosphate cleaners. But the real reason, the real thing at work here, is the scarcity principle. The fact that it was taken away creates the subconscious desire to have it back, that visceral two-year-old response of “you can’t take away my toys”, and we consciously develop all kinds of fake justifications for why we actually wanted it.
This is something you really want to be attuned to and really understand, because it happens all of the time. Our subconscious makes a decision, often because of a psychological bias, often because we’ve been influenced by one of these Weapons of Influence, and consciously we make up a completely different justification for why we made that decision, why we happen to like this thing more than others, or why we happen to buy this thing more frequently than another.
The next study that we’re gonna look at takes place in a more commercial context: how do buyers respond when what they want suddenly becomes scarce? I like to call this one “where’s the beef?” This experiment showed how a subtle turn of phrase, and the way that information was presented, in this case as exclusive information about an impending scarcity, drove more than six times the amount of sales. Robert Cialdini explains here, “The company’s customers, buyers for supermarkets and other retail food outlets, were called on the phone as usual by a salesperson and asked for a purchase in one of three ways. One set of customers heard a standard sales presentation before being asked for their orders.
Another set of customers heard the standard sales presentation plus information that the supply of imported beef was likely to be scarce in the upcoming months. A third group received the standard sales presentation and the information about the scarce supply of beef; however, they also learned that the scarce-supply news was not generally available information. It had come, they were told, from certain exclusive contacts that the company had. Thus the customers who received this last sales presentation learned that not only was the availability of the product limited, so too was the news concerning it: the scarcity double whammy”.
So, you probably see what’s gonna happen next, right? Cialdini continues, “The results of the experiment quickly became apparent when the company salespeople began to urge the owner to buy more beef because there wasn’t enough in the inventory to keep up with all the orders they were receiving. Compared to the customers who only got the standard sales appeal, those who were also told about the future scarcity of beef bought more than twice as much. The real boost in sales, however, occurred among the customers who heard of the impending scarcity and the exclusive information. They purchased six times the amount that the customers who had received only the standard sales pitch did. Apparently, the fact that the news about the impending scarcity was itself scarce made it especially persuasive.” I love the phrase scarcity double whammy. This experiment is such a simple and powerful demonstration of how broad-reaching the impact of the scarcity principle can really be.
When the information about the impending scarcity was given to the customers, they doubled their beef orders. That alone is a fascinating finding, right? You double your sales just by leveraging the scarcity tactic. But as soon as that information itself somehow became scarce, they had six times more sales. That one really made me pause and think. It’s amazing how much scarcity can drive human behavior: the scarcity itself more than doubled sales, but the fact that the news of the scarcity was itself scarce information drove six times more. It’s incredible.
This next experiment is one of my favorites, and we’re gonna look at it in three different parts. I call it the cookie experiment. The first part of the experiment was simple enough. People were shown a jar of cookies. It either had ten cookies in it or it had two cookies in it, and they were asked to rate the cookies across a number of factors. Unsurprisingly, when there were only two cookies in the jar, they were rated “as more desirable to eat in the future, more attractive as a consumer item, and more costly than the identical cookies in abundant supply”. Then the experimenters mixed things up a bit. They kept the part of the experiment where people had the jar with two cookies in it, but the people with the jar of ten cookies had the jar taken away and then replaced with the jar that only had two cookies.
The goal of this particular twist was to measure how people reacted to a change in scarcity, instead of just a constant scarcity condition. Cialdini explains, “In the cookie experiment the answer is plain: the drop from abundance to scarcity produced a decidedly more positive reaction to the cookies than did constant scarcity. The idea that newly experienced scarcity is the more powerful kind applies to situations well beyond the bounds of the cookie study. For example, social scientists have determined that such scarcity is a primary cause of political turmoil and violence.” The researchers weren’t done having fun with cookies yet. They wanted to dig even deeper, and so they looked at how subjects react to cookie scarcity created from different sources. Cialdini elaborates here: “Certain participants were told that some of their cookies had to be given away to other raters in order to supply the demand for cookies in the study. Another set of participants was told that the number of their cookies had to be reduced because the researcher had simply made a mistake and given them the wrong jar initially.
The results showed that those whose cookies became scarce through the process of social demand liked the cookies significantly more than those whose cookies became scarce by mistake. In fact, the cookies that became less available through social demand were rated the most desirable of any in the study. This finding highlights the importance of competition in the pursuit of limited resources: not only do we want the same item more when it is scarce, we want it most when we are in competition for it.” This is a key distinction and one that underpins an important learning about scarcity: we want things more when we’re in competition for them, not just when they’re scarce.
Here’s the last fascinating bit from this series of cookie experiments (who would have thought you could learn so much from cookie jars?). The one thing that held constant throughout the research: at no point did the subjects say the cookies tasted any better. They only rated them as more desirable and more attractive, and said that they would pay a higher price for them. Cialdini concludes, “Therein lies an important insight: the joy is not in the experiencing of a scarce commodity but in the possessing of it”.
It turns out that we like having our cake more than eating it, as long as it’s scarce enough. I found the cookie experiment fascinating, and I think there are so many different takeaways from it. I’m really amazed at what the researchers were able to pull out just from using a few jars of cookies and measuring how scarcity impacts the way people perceive them. But there are two things I really think it’s important for you to draw out from the cookie experiment. One, obviously, is the idea that people wanted the cookies more when they were competing with other people for them; that’s what made them want them the most. And when you tie that back to the idea of the biological limits of the mind, which we talked about in an earlier podcast, there’s very much a visceral, almost evolutionary feel to that, right? The idea that in the wild, in the times before society existed, people were competing for resources, and if somebody else has more resources than you, you want them even more; you’re more fueled to go get them. And I think the other fascinating thing is that at no point did they actually rate the cookies any better. The enjoyment of the cookies themselves was unchanged, but the scarcity bias materially impacted their desire for the cookies.
I think that’s the part that is really, really critical: the cookies didn’t taste any better, but the possession of the cookies, just because they were scarce, is what made people want them so much; it’s what people really cared about the most.
Lastly, I wanted to include a quote about open outcry auctions. Open outcry auctions are a great example of not only scarcity but also many of the other Weapons of Influence, social proof and so on, and how they come together. I’ll give you this quote from Charlie Munger, where he talks about how multiple biases can compound together in what he calls a lollapalooza effect to basically multiply the power and the influence of all of these different biases. “Finally, the open outcry auction. Well, the open outcry auction is just made to turn the brain into mush: you’ve got social proof, the other guy is bidding, you get reciprocation tendency, you get deprival super-reaction syndrome, the thing is going away. I mean, it’s just absolutely designed to manipulate people into idiotic behavior.” Charlie Munger is the billionaire business partner of Warren Buffett, and he and Buffett are both famous for saying that they avoid open outcry auctions like the plague. Open outcry auctions are just an interesting example because they really demonstrate that all of these biases don’t exist in a vacuum. As we wrap up the Weapons of Influence series, that’s something I really want you to take home and think about: we’ve seen a number of instances and cases where the biases blend together and interact. There are instances where liking and social proof tie together, instances where authority and social proof, or authority and liking, tie together, or scarcity and authority tie together. In the real world, things are never as neat and as simple as they are when we’re just talking about an individual bias.
In the real world, all of this stuff interplays and intertwines and mixes together, and there are a lot more cognitive biases that we’re doing future episodes on; we’re going to drill down and talk about those as well. These happen to be some of the biggest and most powerful ones, but in real life it’s much messier, and the reality is that all this stuff can compound. When these things merge together it’s not just additive, it’s multiplicative. It really stacks up, and it can produce absurd results and crazy outcomes, and the more biases you have stacking together, the more you get ridiculous human behavior. We’ve seen throughout this series a number of crazy, wacky, absurd research findings where simple little turns of phrase, or tweaks, or all kinds of minor changes can make huge impacts.
If you haven’t gone back and listened to some of the other episodes, after you wrap this one up you should really check out the whole series, because it all ties together and it is all so important. But as we finish this series, the thing that I want you to think about is the fact that in the real world all this stuff is mingled together. That makes these biases even harder to spot when they compound, but it also gives you the opportunity to dig deep and understand each of them individually, and then how they work together, so that you can formulate a way to be aware of these biases and combat them so they don’t impact your decision making in a negative fashion.
So, what have we learned about the scarcity bias? I think we’ve learned quite a bit, and this quote from Cialdini sums it up nicely. One of the challenges in dealing with the scarcity bias is that, as a 2005 study showed, it’s a very physical bias. “Part of the problem is that our typical reaction to scarcity hinders our ability to think. When we watch as something we want becomes less available, a physical agitation sets in, especially in those cases involving direct competition. The blood comes up, the focus narrows, and the emotions rise. As this visceral current advances, the cognitive, rational side retreats. In the rush of arousal, it is difficult to be calm and studied in our approach.”
So, there are really a couple of takeaways about scarcity that I wanna make sure you understand. There are two primary reasons that the scarcity bias is so powerful. The first is because things that are difficult to obtain are typically more valuable, so at a subconscious level it’s kind of like a mental shortcut: something that’s scarce is typically valuable. “Okay, this thing’s scarce, so it must be valuable.” But that’s not always the case, right? That’s why we see these crazy outcomes. But that’s one of the underpinnings, one of the reasons why the bias operates. The second is that as things become less accessible, we lose freedom, and that ties back to psychological reactance theory. It goes back to that example of the two-year-olds when we have our freedom taken away, and the detergent example is an amazing study of how that takes place. When we get those freedoms taken away, that’s when that really physical emotion and the scarcity bias kick in. And there are two conditions that really set the stage for the scarcity bias to be the most powerful.
The first is that scarce items are heightened in value when they are newly scarce. That leads back to the cookie jar experiment: when something has recently become scarce, we want it even more and we rate it and think about it as more favorable and more desirable. The second is that when we’re in competition with other people for a particular resource, it makes us even more prone to want whatever that is, to want whatever we can’t have because somebody else has it, when somebody else is competing for it. So those are two conditions that, if either or both are present, really amp up and magnify the impact of the scarcity bias. And both the detergent example and the cookie jar experiment showcase how powerful those can be.
And I think the other thing that I really want you to take away from this, thinking back to the detergent experiment, is that when people had the detergent taken away they rated it as more favorable, better at cleaning, all of these things, when in reality the reason they wanted it was because it had been taken away; they consciously invented all these justifications for why they wanted it. That’s a very insidious, very dangerous behavior, one that you should take great care to be aware of. Really ask yourself: what’s the real reason that I feel a certain way or think a certain thing? Is the reason I’m telling myself a justification that I made up, instead of the actual reason?
So, how do we defend against the scarcity bias? I’ll start with a quote from Cialdini. “Should we find ourselves beset by scarcity pressures in a compliance situation, then our best response would occur in a two-stage sequence. As soon as we feel the tide of emotional arousal that flows from scarcity influences, we should use that rise in arousal as a signal to stop short. Panicky, feverish reactions have no place in wise compliance decisions. We need to calm ourselves and regain a rational perspective. Once that is done, we can move to the second stage by asking ourselves why we want the item under consideration. If the answer is that we want it primarily for the purpose of owning it, then we should use its availability to help gauge how much we would want to expend for it. However, if the answer is that we want it primarily for its function, that is, we want something good to drive, drink, or eat, then we must remember that the item under consideration would function equally well whether scarce or plentiful. Quite simply, we need to recall that the scarce cookies didn’t taste any better.”
And I think one of the most important parts of what Cialdini says is the importance of maintaining a calm, rational perspective. I’ve referenced Charlie Munger several times, and I may do a future podcast just about him; he’s such a fascinating individual and an incredibly successful businessman, but also so wise about psychology and how it impacts human decision making. If you look at him, and you look at Warren Buffett, the reason they’ve been so successful, and they’ll say this many times, is partially because of their huge focus on rationality and really trying to be as objective as possible. In one of the earlier episodes of the Science of Success, we talked about the ideas of accepting reality and the reality of perception, and the sooner you have a totally objective, rational acceptance of the way reality is, the faster you can recognize things like the scarcity bias, and the faster you can keep any of these Weapons of Influence from seeping into your thoughts and impacting your decision making.
We’ve seen countless examples of how powerful, how insidious, how dangerous these biases can be, and the best way to combat them is to cultivate that rationality, to cultivate that awareness, to cultivate the ability to both see and understand your own thoughts. Think back again to the detergent example and ask: why do I really like this thing? What’s really driving my behavior? Am I deluding myself into thinking one thing when the reality is something different?
That’s it for this episode on scarcity, and that’s it for the Weapons of Influence miniseries. It’s been an absolute blast to do this miniseries, but I’m also super excited about some upcoming episodes that we have. So stay tuned, because it’s going to be awesome.

 


Why Co-Pilots May Ignore Instinct and Let A Plane Crash

February 18, 2016 by Austin Fabel in Weapons of Influence, Influence & Communication

This is the FIFTH episode in a six-part series on "The Science of Success" titled WEAPONS of INFLUENCE and based on the best-selling book “Influence” by Robert Cialdini. Each of these weapons of influence is deeply rooted in and verified by experimental psychology research (which you will get a ton of amazing examples of).

So what are the 6 weapons of influence?

  • Reciprocation

  • Consistency & Commitment

  • Social Proof

  • Liking

  • Authority

  • Scarcity

This week we're going to talk about the Authority Bias. This bias can create some astounding effects in the real world, such as: why con artists wear lifts in their shoes; how a normal person can administer lethal shocks to an innocent research subject; why 95% of nurses were willing to give deadly doses of a dangerous drug to their patients; and much more.

Thank you so much for listening!

Please SUBSCRIBE and LEAVE US A REVIEW on iTunes! (Click here for instructions on how to do that!)

EPISODE TRANSCRIPT

Today you’re going to learn why con artists always wear lifts in their shoes, how a normal person can administer lethal shocks to an innocent research subject, why 95% of nurses were willing to give deadly doses of a dangerous drug to their patients, and much more. 

This is the fifth episode in a six-part series on the Science of Success, titled Weapons of Influence and based on the bestselling book Influence by Robert Cialdini. Each of these weapons of influence is deeply rooted in and verified by experimental psychology research, which you will get a ton of amazing examples of. Last week, we talked about what made a guy named Joe Girard the greatest car salesman of all time, how Tupperware grew their sales to 2.5 million dollars a day, why uglier criminals are more likely to go to jail, and much more. If you haven’t checked out that episode yet, listen to it after you listen to this one.

This week we’re going to talk about the authority bias. This bias can create some astounding effects in the real world, and as some of these research studies show, it can often impact life and death decisions. Authority bias is one of the most adaptive and ingrained biases, partially because much of the time, listening to authorities is beneficial and the right thing to do. Just like the other weapons of influence, however, our minds can play tricks on us, and those automatic Click, Whirr responses that we talked about in the episode on the biological limits of the mind can misfire at the worst possible times. Here’s how Cialdini describes the authority bias in Influence.

QUOTE: We rarely agonize to such a degree over the pros and cons of authority demands. In fact, our obedience frequently takes place in a Click, Whirr fashion with little or no conscious deliberation. Information from a recognized authority can provide us a valuable shortcut for deciding how to act in a situation. Conforming to the dictates of authority figures has always had genuine practical advantages for us. Early on, these people (parents, teachers, etc.) knew more than we did, and we found that taking their advice proved beneficial, partly because of their greater wisdom and partly because they controlled our rewards and punishments. As adults, the same benefits persist for the same reasons, though the authority figures are now employers, judges, and government leaders. Because their positions speak of greater access to information and power, it makes sense to comply with the wishes of properly constituted authorities. It makes so much sense, in fact, that we often do so when it makes no sense at all. END QUOTE.

Long time listeners will know that I’m a huge fan of Charlie Munger, Warren Buffet’s billionaire business partner. Here’s how he describes the authority bias, and in particular a study using flight simulators and the authority bias. 

QUOTE: They don’t do this in airplanes, but they’ve done it in simulators. They have the pilot do something where an idiot co-pilot would know the plane was going to crash. But the pilot’s doing it, and the co-pilot’s sitting there. And the pilot is the authority figure. 25% of the time the plane crashes. I mean, this is a very powerful psychological tendency. UNQUOTE.

I think one of the most important things that Cialdini said is that authority bias is adaptive. What do I mean when I say it’s adaptive? I mean it has an extremely positive evolutionary benefit. It’s incredibly rewarding and beneficial, especially when we’re growing up, to listen to authority figures. They control our rewards and punishments. They know what’s going on. They provide us with wisdom. And most of the time, it makes a ton of sense. But occasionally, it completely misfires. Just like the other weapons of influence, this is something that, on the surface, seems relatively obvious. Yes, authorities can exert influence over people, but when you look at some of the manifestations and the ways that the authority bias plays tricks on our mind, it’s fascinating. Let’s dig into some of the research examples.

Of course the most well-known example of the authority bias in action is the infamous Milgram experiment, using electric shocks. In this experiment, ordinary people were asked to deliver increasingly deadly electric shocks to a test subject, who was in fact a paid actor and was not receiving real shocks. The results were shocking, and defied much of what people thought about human behavior at the time. Here’s how Cialdini describes the experiment in depth.

QUOTE. Rather than yield to the pleas of the victim, about two-thirds of the subjects in Milgram’s experiment pulled every one of the thirty shock switches in front of them and continued to engage the last switch, 450 volts, until the researcher ended the experiment. More alarming still, almost none of the 40 subjects in this study quit his job as teacher when the victim first began to demand his release. Nor later, when he began to beg for it. Nor even later, when his reaction to each shock had become, in Milgram’s words, quote, “definitely an agonized scream”. The results surprised everyone associated with the project, Milgram included. In fact, before the study began, he asked groups of colleagues, graduate students, and psychology majors at Yale University, where the experiment was performed, to read a copy of the experimental procedures and estimate how many subjects would go all the way to the last 450-volt shock. Invariably, the answers fell in the 1-2% range. A separate group of 39 psychiatrists predicted that only about one person in a thousand would be willing to continue to the end. No one, then, was prepared for the behavior pattern that the experiment actually produced. UNQUOTE.

Here’s how Milgram himself said it.

QUOTE. It is the extreme willingness of adults to go to almost any lengths on the command of an authority that constitutes the chief finding of this study. UNQUOTE.

The Milgram experiment is the bedrock of the authority bias. And also, one of the most controversial and talked about studies in psychology. Cialdini elaborates more on the importance and the significance of the Milgram experiment by saying,

QUOTE. In the Milgram studies of obedience, we can see evidence of strong pressure in our society for compliance with the requests of an authority. Acting contrary to their own preferences, many normal, psychologically healthy individuals were willing to deliver dangerous and severe levels of pain to another person because they were directed to do so by an authority figure. The strength of this tendency to obey legitimate authorities comes from the systematic socialization practices designed to instill in members of society the perception that such obedience constitutes correct conduct. UNQUOTE.

And again, the person in this experiment wasn’t actually receiving electric shocks. What they did was have an actor play the test subject, while the actual subject was the person administering the shocks, and then they had a researcher in a white lab coat basically saying “continue to shock them”, “shock them at a higher level”. The actor wasn’t actually being shocked, but the person administering the shocks by all rights believed they were delivering real shocks, while the person supposedly being shocked was begging for release and saying “please stop shocking me”, and they would keep doing it because the authority was telling them to do so.

Many of you have probably heard of this experiment. The Milgram experiment is very, very widely talked about. If you’ve read even some rudimentary psychology research, I’m sure you’ve run into it, heard it talked about, or uncovered it. But you can’t have a conversation about the authority bias without the Milgram experiment featuring prominently in the discussion. At the time, it was totally groundbreaking, and even today the findings are astounding.

So let’s look at a few other different examples. One of them is about symbols of authority. Cialdini cites a number of actors who play TV roles, from doctors to Martin Sheen playing the president on The West Wing, as examples of how people defer to authorities who have no actual substance, but only the appearance and the trappings of authority. We talked about this in the previous episode when we talked about the liking bias. Celebrity endorsements harp on the connection between the authority and liking biases, and the fact that you have celebrities who don’t have any credentials or any credibility to be talking about particular things; they just happen to be an actor playing a particular role. But the symbol of that authority alone is enough to impact people on a subconscious level, and to drive that behavior. Here’s how Cialdini puts it.

QUOTE. The appearance of authority was enough. This tells us something important about unthinking reactions to authority figures. When in a Click, Whirr mode, we are often as vulnerable to the symbols of authority as to the substance. Several of these symbols can reliably trigger our compliance in the absence of the genuine substance of authority. Consequently, these symbols are employed extensively by those compliance professionals who are short on substance. Con artists, for example, drape themselves with the titles, the clothes, and the trappings of authority. They love nothing more than to emerge elegantly dressed from a fine automobile and introduce themselves to their prospective marks as doctor or judge or professor or commissioner someone. They understand that when they are so adorned, their chances for compliance are greatly increased. Each of these types of symbols of authority (titles, clothes, and trappings) has its own story and is worth a deeper look. UNQUOTE.

That ties into another research study, which I find really funny, a crazy example that again ties back into the liking bias, where we talked about how important physical attractiveness can be. People perceived the same person to be more than 2.5 inches taller simply when his title was changed from “student” to “professor”. This is a study they conducted in 1992. Here’s how Cialdini describes it.

QUOTE. Studies investigating the way in which authority status affects perceptions of size have found that prestigious titles lead to height distortion. In one experiment, conducted on five classes of Australian college students, a man was introduced as a visitor from Cambridge University in England. However, his status at Cambridge was represented differently in each of the classes. To one class, he was presented as a student. To a second class, a demonstrator. To another, a lecturer, and to yet another a senior lecturer. To a fifth, a professor. After he left the room, the class was asked to estimate his height. It was found that with each increase in status, the same man grew in perceived height by an average of a half-inch, so that as the professor he was seen as 2.5 inches taller than as the student. Another study found that after winning an election, politicians became taller in the eyes of the citizens. UNQUOTE.

A crazy corollary of this study is, of course, the reason why con artists also wear lifts in their shoes: so that they can appear taller, because it works both ways. Again, this kind of ties back into the concept of the liking bias.

The next experiment is something I like to call the Astrogen experiment. After they conducted this experiment, they surveyed a different group of 33 nurses, and only two indicated that they would have done what happened in the experiment, which you’re about to find out. That shows just how massive the gap is between what we think we would do and what we actually do. It ties back into this same thing: the power of the subconscious mind, the power of all of these weapons of influence, the power of the Click, Whirr responses that are biologically built into our brains. Again, when a different group of nurses was surveyed, only 2 out of 33 said they would have done what happened in this experiment. Here’s how Cialdini describes the research.

QUOTE. A group of researchers composed of doctors and nurses with connections to three Midwestern hospitals became increasingly concerned with the extent of mechanical obedience to doctors’ orders on the part of nurses. One of the researchers made an identical phone call to 22 separate nurses’ stations on various surgical, medical, pediatric, and psychiatric wards. He identified himself as a hospital physician and directed the answering nurse to give 20mg of a drug, Astrogen, to a specific ward patient. There were four excellent reasons for the nurse’s caution in response to this order. One, the prescription was transmitted by phone, in direct violation of hospital policy. Two, the medication itself was unauthorized. Astrogen had not been cleared for use, nor placed on the ward’s stock list. Three, the prescribed dosage was obviously and dangerously excessive. The medication containers clearly stated that the maximum daily dose was only 10mg, half of what had been ordered. Four, the directive was given by a man the nurse had never met, seen, or even talked with before on the phone. Yet, in 95% of the instances, the nurses went straight to the ward medicine cabinet, where they secured the ordered dosage of Astrogen and started for the patient’s room to administer it. It was at this point that they were stopped by a secret observer, who revealed the nature of the experiment. The results are frightening indeed. That 95% of regular staff nurses complied unhesitatingly with a patently improper instruction of this sort must give us all, as potential hospital patients, great reason for concern. What the Midwestern study shows is that the mistakes are hardly limited to trivial slips in the administration of harmless ear drops or the like, but extend to grave and dangerous blunders. Additional data collected in the Hofling study, the study we’re talking about, suggested that nurses may not be conscious of the extent to which the doctor sways their judgement or actions. A separate group of 33 nurses and student nurses were asked what they would have done in the experimental situation; contrary to the actual findings, only two predicted that they would have given the dose. UNQUOTE.

Again, this highlights the massive gap between how we perceive ourselves and our behavior, and how our behavior actually is. We have this conscious interpretation that of course something as obvious as liking, or social proof, or authority isn’t going to really impact my decisions. I’m smarter than that. I’m not going to fall prey to something so silly, right? I mean, it makes me think of the experiment we talked about last episode about judges, and how they can fall prey to one of the most starkly obvious biases imaginable: physical appearance. It’s astounding. But in this research study, only two out of 33 nurses thought that they would have done that. In reality, 95% of them were willing to administer an unauthorized and potentially deadly dose of medicine ordered by a person they had never met and never spoken to, simply because he referred to himself as a doctor.

This next experiment I find particularly hilarious. I call it “Give that man a dime”. They conducted a number of variants on this, but I like this one the best because the authority figure himself was actually around the corner when this request took place. I’ll let Cialdini explain the experiment for you.

QUOTE. Especially revealing was one version of the experiment in which the requester stopped pedestrians and pointed to a man standing by a parking meter 50 feet away. The requester, whether dressed normally or as a security guard, always said the same thing to the pedestrian, quote, “You see that guy over there? He’s over parked but doesn’t have any change. Give him a dime.” The requester then turned a corner and walked away, so by the time the pedestrian reached the meter, the requester was out of sight. The power of his uniform lasted, however, even after he was long gone. Nearly all of the pedestrians complied with his directive when he wore the guard costume, but fewer than half did so when he was dressed normally. UNQUOTE.

When you think about it on the surface, it doesn’t seem like anything crazy, bizarre, or weird is happening, right? Yeah, I mean, if you see someone in a security guard outfit, they’re probably an authority, so you should probably listen to them. But the reality of this bias is that just because a total stranger happens to be wearing a different set of clothes, it drastically changes the way people react to them. Right? That’s really a great example, and a concrete way to think about the authority bias. Nothing about that person changed except the clothes they were wearing. And those clothes materially impacted the way that people reacted to their statement to give that man a dime. It changed the way that people behaved and perceived that person, simply by changing their clothes, something that, in reality, had no impact on their credibility, no impact on their authority, and no impact on whether or not someone should have complied with their request.

In another research study, one that I call the suited jaywalker, they had somebody cross the street. They had somebody jaywalk. In half of the cases, the person jaywalking was in a freshly pressed suit and tie, looking very nice and very formal. In the other half, they just had him wearing a work shirt and trousers. What they really wanted to measure was how many pedestrians standing on that street corner would follow the jaywalker. What they actually discovered was that three and a half times as many pedestrians were willing to jaywalk following the suited man as were willing to follow the person dressed in regular, everyday clothes. Again, it’s a similar instance in that just changing your clothing, just changing your appearance, can communicate at a subconscious level that “hey, this is somebody of authority. This is somebody we should listen to. This is someone whose advice we should take, someone whose example we should follow.”

So, what are some of the learnings from this episode? What are some of the learnings from this research? There are a number of major drivers of the authority bias. The first is that the authority bias is adaptive. It’s ingrained in us since childhood. And frequently, it has very positive effects. Here’s a quick quote by Cialdini on this.

QUOTE. In addition, it is frequently adaptive to obey the dictates of genuine authorities because such individuals usually possess high levels of knowledge, wisdom, and power. For these reasons, deference to authorities can occur in a mindless fashion as a kind of decision-making shortcut. UNQUOTE.

Again, this is the same learning that we’re getting from many of the different weapons of influence. These are things that are evolutionarily beneficial. These are positive traits and positive characteristics, but occasionally they just have these wacky misfires that end up with people doing ridiculous things. The second learning is that symbols of authority, however vacuous, have the same effect as actual authority. We talked about celebrity endorsements, and we talked about the research studies that backed that up. There are a couple of different ways that this manifests itself. We talked about titles and how they have a massive impact. Think back to the Astrogen experiment, and how a total stranger on the phone using the word ‘doctor’ was able to drive those nurses to administer a potentially lethal dose of medicine. Here’s how Cialdini elaborates on it a little bit more.

QUOTE. Titles are simultaneously the most difficult and the easiest symbols of authority to acquire. To earn a title normally takes years of work and achievement, yet it is possible for somebody who has put in none of this effort to adopt the mere label and receive a kind of automatic deference. As we have seen, actors in TV commercials and con artists do it successfully all the time. UNQUOTE.

Another one of these vacuous symbols of authority is clothing. Clothing alone can create compliance and the illusion of authority. Think back to the jaywalker and the “give that man a dime” experiment. Right? Here’s how Cialdini sums it up. 

QUOTE. A second kind of authority symbol that can trigger our mechanical compliance is clothing. Though more tangible than a title, the cloak of authority is every bit as fake-able. UNQUOTE.

I think one of the last big learnings about authority, and we see this learning across the weapons of influence, is that people massively underestimate how much the authority bias actually influences them.

When we think back to the Astrogen experiment, only two out of the 33 nurses said they would have done that, but in reality, when actually tested in an experiment, 95% of them did. Here’s how Cialdini explains that:

QUOTE. People were unable to predict correctly how they or others would react to authority influence. In each instance, the effect of such influence was grossly underestimated. This property of authority status may account for much of its success as a compliance device. Not only does it work forcefully on us, but it does so unexpectedly. UNQUOTE.

So how can we defend against the authority bias? It’s something that we naturally underestimate, something that can really operate at a subconscious level. Again, the defenses for a lot of the weapons of influence really stem back to the same ideas: awareness, asking the right questions, being self-aware and understanding what thoughts are going through your mind, what things you’re thinking about, and the way that you’re feeling. Being able to tap into that and say, “Hey, something seems amiss”, right? “Why am I complying with this person’s request?” But Cialdini specifically cites two questions that he suggests we ask as a way to combat the authority bias.

The first question he suggests we ask is, “Is this authority truly an expert?” This asks us to boil it down and really think: do they actually know what they’re talking about? What makes them a real expert? In many of the research instances we’ve cited, if you pause for one moment and think, it’s patently obvious: “Okay, no, this person isn’t an expert, so I shouldn’t let their opinion or their comment bias me unnecessarily.” The second question, which we really only need to answer if the person actually happens to be an expert, is “How truthful can we expect this expert to be?”, especially given the situation and the context. Right? What that tries to tap into is that even though authorities, if they’re true experts, may actually be the most knowledgeable and have the most experience, do they really have our best interest in mind? Or are they, in this particular instance, trying to manipulate us, or trying to drive us to perform a certain action or do a certain thing? So try to keep those questions in mind: is this authority really an expert? If somebody crosses the street just because they’re wearing a suit, do they know more about crossing the street than anybody else? This person who just called me on the phone and said they’re a doctor, how do I know that they’re really a doctor? Is this person really an expert? And two, if they really are an expert, how truthful can I really expect them to be? Again, the way to tap into that automatic subconscious processing that’s going on in your mind is to develop the presence and the ability to understand and to see what thoughts are taking place in your mind.

Meditation is an amazing tool for doing that, which we’ve got an upcoming episode on, which is going to be awesome.


Why Ugly Criminals Are 2X As Likely To Go To Prison

February 09, 2016 by Austin Fabel in Weapons of Influence, Influence & Communication

This week we are continuing our new miniseries within "The Science of Success" called "Weapons of Influence". This is the fourth episode in a six-part series based on the best-selling book Influence by Robert Cialdini. If you loved the book, this will be a great refresher on the core concepts. And if you haven't yet read it, some of this stuff is gonna blow your mind.

So what are the 6 weapons of influence?

  • Reciprocation

  • Consistency & Commitment

  • Social Proof

  • Liking

  • Authority

  • Scarcity

Each one of these weapons can be a powerful tool in your belt - and something to watch out for when others try to wield them against you. Alone, each of them can create crazy outcomes in our lives and in social situations, but together they can have huge impacts.

Today’s episode covers the fourth weapon of influence: Liking Bias. In it, we'll cover what made Joe Girard the greatest car salesman of all time; how Tupperware grew their sales to $2.5 million per day; why uglier criminals are more than TWICE as likely to go to jail; and much more.

Thank you so much for listening!

Please SUBSCRIBE and LEAVE US A REVIEW on iTunes! (Click here for instructions on how to do that!)

EPISODE TRANSCRIPT

Today, you’re going to learn what made Joe Girard the greatest car salesman of all time, how Tupperware grew their sales to $2.5 million a day, and why uglier criminals are more than twice as likely to go to jail, as well as much more.

Because the Science of Success has taken off like a rocket ship since launch, with more than 80,000 downloads, we made the front page of New and Noteworthy on iTunes, and much more, I wanted to offer something to my listeners. I’m giving away my three favorite psychology books to one lucky listener. Just text ‘smarter’, that’s s-m-a-r-t-e-r, to 44222 to be entered to win, and if you’ve been listening and loving the podcast, please leave us an awesome review and subscribe on iTunes. It helps spread the word so more people can master the science of success.

This is the fourth episode in a six-part series on The Science of Success titled Weapons of Influence, and based on the bestselling book, Influence, by Robert Cialdini. Each of these weapons of influence is deeply rooted in and verified by experimental psychology research, which you’re about to get a ton of amazing examples of.

Last week we talked about why news coverage makes school shootings more likely by a factor of 30 times, which is crazy; how someone can get stabbed to death in front of 38 people and no one does a thing; and why you should always point at the dude in the blue jacket and tell him to help you. The topic we covered last week was the concept of social proof and how it is so powerful that it can literally override someone’s desire to live. If you haven’t checked out that episode yet, listen to it after this one.

Today, we’re going to talk about the liking bias. Liking bias sounds pretty straightforward, but some of the research is pretty astounding. You’ll be amazed to learn what impacts our perceptions of what we think we like, and how easily those perceptions can be manipulated in a way that materially impacts our decision making. Here’s how Cialdini describes the liking bias: “People prefer to say ‘yes’ to individuals they know and like. Recognizing this rule, compliance professionals commonly increase their effectiveness by emphasizing several factors that increase their overall attractiveness and likeability.” If you’re unfamiliar with the term ‘compliance professionals’, we talked about that in the first episode of Weapons of Influence and it’s essentially a term that Cialdini uses to describe somebody who is wielding these weapons of influence to convince other people to comply with their requests. 

There are a few primary drivers of the liking bias. One of the biggest culprits is physical attractiveness. As Cialdini notes: “Physical attractiveness seems to engender a halo effect that extends to favorable impressions of other traits such as talent, kindness, and intelligence. As a result, attractive people are more persuasive both in terms of getting what they request and in changing others’ attitudes.” 

The second major driver of the liking bias is similarity. As Cialdini says: “We like people who are like us, and we are more willing to say ‘yes’ to their requests, often in an unthinking manner.” That actually brings up an interesting point. If you remember, in the last episode we talked about the idea of social proof, and about how whenever there’s front page coverage of a suicide there is an unexplained uptick of more than 50 related suicides. The factor that drives that more than anything, and we go into much more detail on it in the previous episode of the podcast, is that when people see somebody similar to them doing something, it drives them toward that behavior. There’s a similarity, and a crossover, between the liking bias and social proof, but if you want to dig deeper into that concept, the previous episode does a great job of explaining it.

The third thing that really drives the liking bias is familiarity. Familiarity breeds liking in an insidious and subconscious fashion. Here’s what Daniel Kahneman says in his book Thinking Fast and Slow, which is another fabulous book about psychology, by the way: “A reliable way to make people believe in falsehoods is frequent repetition because familiarity is not easily distinguished from truth. Authoritarian institutions and marketers have always known this fact.” 

The fourth major way that liking bias works is via Pavlovian association, or mere association, as it’s sometimes called. Here’s what Cialdini has to say about it: “The linking of celebrities to products is another way advertisers cash in on the association principle. Professional athletes are paid to connect themselves to things that can be directly relevant to their roles: sports shoes, tennis rackets, golf balls, or wholly irrelevant: soft drinks, popcorn poppers, panty hose. The important thing for the advertiser is to establish the connection. It doesn’t have to be a logical one, just a positive one. What does Tiger Woods really know about Buicks, after all?” 

Okay, now let’s dig into some of the research examples that support and demonstrate some of these different manifestations of the liking bias. The first example is Tupperware parties. Tupperware parties aren’t quite as popular or as frequent today, but in the late ‘80s and early ‘90s they were a huge social phenomenon. You still see it today; people throw different socially themed parties to sell things, and the reason that this sort of sales methodology is still around is because it’s so incredibly powerful. I’ll let Cialdini describe it here: “In fact, consumer researchers who have examined the social ties between the hostess and the party goer in home party sales settings have affirmed the power of the company’s approach. The strength of that social bond is twice as likely to determine product purchase as is the preference for the product itself. The results have been remarkable. It was recently estimated that Tupperware sales now exceed $2.5 million a day. Indeed, Tupperware’s success has spread around the world to societies in Europe, Latin America, and Asia, where one’s place in a network of friends and family is more socially significant than in the United States. As a result, now less than a quarter of Tupperware sales take place in North America. What is interesting is that the customers appear to be fully aware of the liking and friendship pressure embodied in the Tupperware party. Some don’t seem to mind; others do, but don’t seem to know how to avoid these pressures.” I think that’s a really critical distinction to draw out of that quote: people are consciously aware of the bias, and consciously aware of this sort of awkward obligation to purchase the Tupperware, and it still works. If you’ve ever been to a Trunk Club show you’ve seen the same thing, and there are a lot of other social sales settings and home party sales settings that people use to bring the liking bias to bear and drive sales. Tupperware showcases how a guerrilla, underground marketing strategy, driven by a psychological bias that’s rooted in science, grew the organization to more than $2.5 million a day in sales.

The next example is the greatest car salesman of all time, a guy named Joe Girard, who was actually named the greatest car salesman of all time by The Guinness Book of World Records. So, I didn’t just make that title up. That’s something he was awarded by The Guinness Book of World Records. The question is: How exactly did Joe achieve that, right? Obviously he had to sell a lot of cars, but what did he leverage, or what tools did he use, to sell so many vehicles? I’ll let Cialdini tell the story: “There is a man in Detroit, Joe Girard, who specializes in using the liking rule to sell Chevrolets. He became wealthy in the process, making hundreds of thousands of dollars a year. With such a salary we might guess that he was a high level GM executive, or perhaps the owner of a Chevrolet dealership, but no. He made his money as a salesman on the showroom floor. He was phenomenal at what he did. For 12 years straight he won the title of number one car salesman, and averaged more than five cars and trucks sold every day that he worked. He’s been called the world’s greatest car salesman by The Guinness Book of World Records.” The quote continues later: “Joe Girard says the secret of his success was getting customers to like him. He did something that, on the face of it, seems foolish and costly. Each month he sent every one of his more than 13,000 former customers a holiday greeting card containing a printed message. The holiday greeting card changed from month-to-month: Happy New Year, Happy Valentine’s Day, Happy Thanksgiving, and so on, but the message printed on the face of the card never varied.”

I’m gonna pause and interrupt the quote for a second because this is a critical thing to pay attention to, and it’s so simple, so transparent, and almost a no-brainer when you think about it, but pause for a second and ask yourself: What do you think the card that he sent said every month? The quote continues: “The card read: ‘I like you’. As Joe explained it: ‘There’s nothing else on the card. Nothin’ but my name. I’m just telling ‘em that I like ‘em.’ I like you. It came in the mail every year, twelve times a year like clockwork. ‘I like you’, on a printed card that went to 13,000 other people. Could a statement of liking, so impersonal, obviously designed to sell cars, really work? Joe Girard thought so, and a man as successful as he was at what he did deserves our attention. Joe understood an important fact about human nature: We are phenomenal suckers for flattery.” Again, this highlights a very similar principle, which is the fact that people can be totally aware of the liking bias. It can be totally transparent and yet it still drives behavior. It still influences the way that people think. It still gets into your mind and impacts your thinking, and that’s one of the core lessons across all the weapons of influence. None of these things are total shockers, right? I mean, liking bias, that’s not something that takes a rocket scientist to come up with. Congratulations, if you like somebody you’re more likely to want to interact with them, do business with them, listen to them, etcetera. Great, but when you look at how it impacts people’s behavior, when you look at how something as simple as a printed card that just says ‘I like you’ drove Joe Girard to become the greatest car salesman of all time, according to The Guinness Book of World Records, that’s a lesson that’s worth paying attention to. There’s something in there that’s worth digging into and really figuring out: What other manifestations of the liking bias are taking place in your life? What other ways has the liking bias shaped your decision making? What are some of the ways that you can use the liking bias to achieve the goals that you want to achieve?

Let’s look at another example: physical attractiveness and the judicial system. This comes from a study in 1980. Researchers rated the physical attractiveness of 74 defendants in court cases. They came back several months later, after the court rulings had been made, and looked at how those trials had fared and whether physical attractiveness played a role in the outcome of the cases. Here are the results from Influence: “When much later the researchers checked court records for the results of these cases they found that the handsome men had received significantly lighter sentences. In fact, attractive defendants were twice as likely to avoid jail as unattractive defendants. In another study, this one on the damages awarded in a staged negligence trial, a defendant who was better looking than his victim was assessed an average amount of $5,623, but when the victim was the more attractive of the two, the average compensation was $10,051. What’s more, both male and female jurors exhibited the attractiveness-based favoritism.” I don’t think there’s an example of something that we think of as more objective, more rational, more bias-free than the judicial system. Obviously, there are a lot of issues with the judicial system, which we are not gonna get into, and obviously everyone makes mistakes and humans are fallible, but at some level, I think subconsciously especially, we hold the judicial system in high regard. And yet when you look at the research, physical attractiveness has that sizeable an impact on court cases. It’s staggering.

Another study, which I don’t have in front of me, but which I think we’ve actually mentioned before on the podcast, looked at when the judge had last eaten. Right after the judge had eaten, like after taking a lunch break or having breakfast, their sentences were much lighter and much easier, but right when they were coming up on lunch time, or getting to the end of the day, their sentences were much harsher. Again, it blows my mind that something that should be so objective and so rational can be swayed by something as base as physical attractiveness. Personally, I think if you were to ask most of the people listening to this podcast, “Hey, does physical attractiveness impact the way that you feel about people?” we’re taught from the age of two to say, “No, of course not,” right? Don’t judge a book by its cover. Well, even in the judicial system, highly educated judges are making decisions at a subconscious level based on physical attractiveness, and based on the liking bias.

Another example is something called mirroring and matching. This is actually something you can try at home, and if you are a follower of Tony Robbins at all, he advocates this and talks about it a lot. Mirroring and matching is really fascinating, and I’ll give you an example you can try and then we’ll talk about the research. The way to build rapport with people is to mirror and match their behavior, which basically means: if somebody’s talking in a certain tone, match their tone of voice. If somebody’s sitting a certain way, sit the same way they do. If somebody has their arms crossed, cross your arms. If they’re leaning forward, lean forward; etcetera. You’ve heard that stat that X percentage of communication is nonverbal. What that really means is that mirroring and matching, doing exactly what someone does physically, is a way to subconsciously create a connection with somebody and build rapport with them without ever saying anything. One of the ways you can try that is: If you’re ever at a restaurant, or at a bar, pick out somebody, like a total stranger- and this is an exercise I think Tony Robbins came up with- and just start mirroring and matching everything that they do. When they take a sip of their water, take a sip of your water. When they scratch their head, scratch your head. Everything they do, mirror exactly, and what will happen is a lot of times that person will come up to you randomly and be like, “Hey, do I know you from somewhere?” because their subconscious has picked up on some sort of similarity between the two of you. They like you at some level because you’ve been mirroring and matching them, because you’ve been doing physically the same thing as them.

So, I’ll just read this brief quote from Influence where they talk a little bit about how mirroring and matching ties into the liking bias: “Many sales training programs now urge trainees to ‘mirror and match’ the customer’s body posture, mood, and verbal style, as similarities along each of these dimensions have been shown to lead to positive results.” Here’s another quote: “A 1970 study conducted at the University of Pennsylvania by a guy named Dr. Ray Birdwhistell”- quite the name- “concluded that 93% of our communication takes place nonverbally and unconsciously.” Mirroring and matching is part of the way, or part of the reason, that that takes place. 

Alright, now let’s take a look at a research example that talks about familiarity. Familiarity can be an extremely powerful bias. It’s something that Cialdini draws on, and that Daniel Kahneman, who we talked about before, calls the ‘mere exposure effect’. Drawing again from Thinking, Fast and Slow, here’s a fascinating experiment about familiarity that Kahneman describes, where words and images were shown rapidly and participants were later asked to rate whether they were good or bad. Here’s how Kahneman puts it: “When the mysterious series of ads ended, the investigators sent questionnaires to the university communities asking for impressions of whether each of the words ‘means something good or something bad’. The results were spectacular. The words that were presented more frequently were rated much more favorably than the words that had been shown only once or twice. The finding has been confirmed in many experiments using Chinese ideographs, faces, and randomly shaped polygons. The mere exposure effect does not depend on the conscious experience of familiarity. In fact, the effect does not depend on consciousness at all. It occurs even when the repeated words, or pictures, are shown so quickly that observers never become aware of having seen them. They still end up liking the words or pictures that were presented more frequently. As should be clear by now, System 1 can respond to impressions of events of which System 2 is unaware. Indeed, the mere exposure effect is actually stronger for stimuli that the individual never consciously sees.” Wow, that’s pretty crazy. Think about that. If you see an image more frequently, you’ll like it more. You’re more familiar with it and that drives you to like it more, but the crazy thing is if you see it only at a subconscious level, you actually have a stronger positive association with it. This is a really, really dangerous way that liking bias can manifest itself. At a subconscious level, the more you’re exposed to something- that’s why Kahneman calls it the mere exposure effect- the more times you see it, even subconsciously, the more you like it, and the more it can drive your behavior. It doesn’t matter what it is. They did it with words, faces, Chinese characters, randomly shaped polygons, all kinds of different things, and the effect still held. It was even more powerful when they showed the stimuli at such a speed that people were not consciously aware of them. It never ceases to amaze me that the human mind can be manipulated, or impacted, by something like that. It’s fascinating. If you don’t think about it, if you don’t understand it, it can impact you, but there are ways that you can still combat that and defend against that, and that’s one of the things that Cialdini talks about in Influence, and we’ll talk about it in the learnings and recap section of this episode. That particular experiment is, to me, maybe the most powerful, the most interesting, experiment in this episode. 

The next piece of the liking bias is something that on the surface sounds very similar to familiarity, and there are similar undertones, but now we’re gonna talk about Pavlovian association. The Pavlov experiment is the one where he rings the bell while he’s feeding the dogs, conditions them that way for a while, and then rings the bell without feeding them and they salivate. The way that’s typically taught, or the way people react to it, is: “Okay, cool. So, the bell rang and the dog salivated. Congratulations.” But what does that really mean? It means that any two completely unrelated phenomena can be linked together, and that link can drive your perception and the way that you think and feel about a particular object. One of the most obvious manifestations of Pavlovian association is when you see an advertisement with a celebrity endorsement. Often the celebrity has nothing to do with the product they’re endorsing, but the celebrity endorsement itself is what drives those sales and what drives people to like that particular product. If you like Peyton Manning and he’s endorsing something on TV, at a subconscious level you draw the association, the connection, between those two things, and you like whatever he’s endorsing more. In Influence they cite a number of examples of TV doctors, actors who play doctors on TV, doing commercials where they advocate certain medicines, or certain medical procedures, or whatever it might be. It has a huge positive impact on the sales of that particular procedure or product, which is totally ludicrous when you think about it: just because they play a doctor on TV doesn’t mean they have any medical credibility, but because of the Pavlovian association from seeing that actor on television playing a medical expert, people are driven to believe what they have to say, and to listen to what they have to say. 

I want to tie this in with a quote from Charlie Munger, who’s somebody I’m a huge fan of, and somebody we’ve talked a lot about on the podcast. He really hammers home how widespread, and how relevant, Pavlovian association is, and how much it impacts huge swaths of our society and our everyday lives without our having any knowledge or realization of it. “Practically three quarters of advertising works on pure Pavlov. Just think how pure association works. Take Coca-Cola, where we’re the largest shareholder. They want to be associated with every wonderful image: heroics, the Olympics, music, you name it. They don’t want to be associated with presidents’ funerals. The association really works at a subconscious level, which makes it very insidious. The Persians really did kill the messenger that brought the bad news. Do you think that is dead?” I love the analogy of Coca-Cola advertising. If you think about any advertisement they’ve ever done, it’s all about happiness and ‘make the world a better place’, and ‘let’s all be happy’, and Open Happiness, all that stuff. They’re not running advertisements with presidents’ funerals, because those have a very negative, very sad association, but the reality is that whatever they’re advertising with, the association they’re drawing doesn’t necessarily have anything to do with what they’re actually selling. It’s just like the dog and the bell. Two completely unrelated phenomena, and just through repeating them over and over and over again, as the Kahneman experiment shows, you can link those things together and make people feel, and really believe, that there’s a positive association there.

One other thing I wanted to touch on briefly is the impact of flattery and compliments, and how those tie into the liking bias. They did a study in 1978, and they found that, quoting from Influence, “Positive comments produced just as much liking for the flatterer when they were untrue as when they were true.” I mean, that’s something that’s pretty simple and straightforward, but again it’s so transparent, and it’s so obvious. You can give someone a compliment that isn’t even true and it will increase, at a subconscious level, their liking towards you and how they feel about you.

So, let’s tie this up. Let’s wrap this up and talk about some of the key learnings about the liking bias. I know we touched on a bunch of research, and some of this research is mind-blowing, but there are really four or five core drivers of the liking bias. We talked about physical attractiveness, and how that impacts the supposedly objective judicial system. We talked about similarity, and how similar others, and mirroring and matching, can drive a subconscious connection, a subconscious liking bias. We talked about familiarity, and how merely seeing something, and being more familiar with it, even at a subconscious level, makes you like it more. We talked about Pavlovian association, and how just connecting two unrelated things, again and again and again, can drive somebody to like something. And we touched briefly on the power of praise and flattery, even when it’s totally transparent and totally obvious. 

How can we defend against the liking bias? Cialdini cites two ways to potentially catch ourselves, or defend against, falling prey to this bias. The first thing he recommends is to focus on finding, and being aware of, the feeling that we’ve come to like something, or someone, more quickly and more deeply than we would have expected to. If you just met somebody and suddenly you’re thinking, “Oh my gosh, I love this guy,” or like, “We are new best friends and we just met,” maybe there’s something at play there. Maybe that should be a trigger to just press pause and think, “Hold on a second. I need to pull back, and I need to think about this a little more deeply. Why have I suddenly jumped in and become so- why have I started liking this thing so much so rapidly?” Again, as we talked about in previous Weapons of Influence episodes, the way to cultivate the mental awareness to be able to flag those thoughts in your mind and catch on to them, is with tools like meditation, which we will talk about in a future episode. 

The second thing that Cialdini recommends is simply recognizing when we like something more than the facts, or the data, really warrant; that recognition is one of the best ways to combat the bias. Again, there’s no perfect solution, but it really stems from self-awareness and trying to be objective. Even if you can just catch yourself liking something more than you should, or liking something for no reason that you can rationally determine, flagging that thought in your mind is enough to start building the awareness, slowing down, and saying, “Hold on a second. Why am I falling prey to this bias?”

That wraps up our episode on the liking bias. 

 

February 09, 2016 /Austin Fabel
Weapons of Influence
Weapons of Influence, Influence & Communication

Why You Should Always Ask the Guy in the Blue Jacket for Help

February 02, 2016 by Austin Fabel in Weapons of Influence, Influence & Communication

Based on the bestselling book “Influence” by Robert Cialdini, this is the third week of our six-part "Weapons of Influence" miniseries within "The Science of Success". If you loved the book, this will be a great refresher on the core concepts. And if you haven't yet read it, some of this stuff is gonna blow your mind. 

So what are the 6 weapons of influence?

  • Reciprocation

  • Consistency & Commitment

  • Social Proof

  • Liking

  • Authority

  • Scarcity

Each one of these weapons can be a powerful tool in your toolbelt - and something to watch out for when others try to wield them against you. Alone, each of them can create crazy outcomes in our lives and in social situations - but together - or combined - they can create huge impacts.

Today’s episode covers the third weapon of influence - Social Proof. In it, we'll cover:

  • How social proof can over-ride people’s will to live

  • Why news coverage makes mass shootings more likely

  • Why TV shows use canned laughter

  • How someone could be stabbed in front of 38 people without any help

  • How you should ask for help in a dangerous situation

Thank you so much for listening!

Please SUBSCRIBE and LEAVE US A REVIEW on iTunes! (Click here for instructions how to do that!).  

EPISODE TRANSCRIPT

Today you’re going to learn why news coverage makes school shootings more likely by a factor of more than 30 times, which is pretty insane; how someone can get stabbed to death in front of 38 people and no one does a thing; and why you should always point at the dude in the blue jacket and tell him to help you. If you missed last week’s episode about weapons of influence, don’t worry. I’ll explain the series now, but you should absolutely go back and listen to it. 
For those of you who were here last week this is going to serve as a quick refresher on the topic. This is the third episode in a six part series based on the bestselling book Influence by Robert Cialdini. If you love that book you’re going to find this to be a great refresher on the core concepts, and if you haven’t read it yet some of this stuff is going to blow your mind.
So, what are the six weapons of influence? “Reciprocation”, which we talked about two weeks ago. Highly recommend you go back and listen to that episode, as well as the second one, which is “consistency and commitment tendency”, which we talked about last week; “social proof”, which we’re going to talk about today; “liking”, “authority”, and “scarcity”. Each one of these weapons can be a powerful tool in your tool belt, and something to watch out for when others try to wield them against you. Alone each of them can create crazy outcomes in our lives, and in social situations, but together or combined, they can result in huge impacts.
In episode one we talked about the biological limits of the human mind. If you haven’t listened to that episode yet you should really go back and check it out after you listen to the Weapons of Influence series, or even just after you listen to this particular podcast, because it explains how these automatic click-whirr responses get triggered when cognitive biases, like social proof, come into play. It explains how some of these evolutionarily beneficial traits and behaviors can sometimes result in crazy, ridiculous outcomes. 
In episode one we talked about the example of the mother turkey taking care of a polecat, which is one of those examples, and in the last two episodes of Weapons of Influence we’ve gone through dozens of research studies and examples that show how tiny little tweaks in behavior can result in substantial differences in outcome solely based on activating, or triggering, cognitive biases. 
The Weapons of Influence series- and this is again the third part; we’re going to talk about “social proof”- is really going to dig into the meat of some of the most powerful cognitive biases that can impact your mind, and we’re going to learn how these can be used to manipulate you if you don’t know how to defend against them, and how they can be part of your arsenal if you learn how to harness them. Here’s how Cialdini describes the impact of these weapons of influence: Quote, “Each principle has the ability to produce a distinct kind of automatic mindless compliance from people, that is, a willingness to say ‘yes’ without thinking first.” 
Today we’re going to talk about “social proof”. It’s so powerful it can literally override someone’s desire to live. Have you ever been in a situation where you didn’t know what was going on? Maybe in a foreign country, or a new city, and you get caught up in something and think, “What am I supposed to do next? What am I supposed to do here?” Do you ever have that tendency to look around and see what other people are doing? They probably know what to do, so you follow them; get in line; etcetera; right? That’s “social proof”, and sometimes social proof can be totally conscious. If you’re in a foreign country and you go somewhere and you don’t know where to stand; you don’t know where to line up; you don’t know how to eat your food; you don’t know what the customs are; you look around and you figure out, “How’s everybody else doing it?” and you consciously imitate them. That’s a conscious example of social proof, but there are also a number of ways social proof can manifest itself totally subconsciously. Like I said at the top, it is an incredibly powerful phenomenon. In many cases it can literally override the desire to live. Here’s how Cialdini describes it in Influence: “This principle states that we determine what is correct by finding out what other people think is correct. The principle applies especially to the way we decide what constitutes correct behavior. We view a behavior as correct in a given situation to the degree that we see others performing it.”
This week is going to get a bit darker than some of the other weeks as we look at some of the crazy things social proof can motivate people to do. As I said before, it’s literally so powerful that in some instances it can result in people committing suicide.
Here’s another quote from Influence: “Work like Phillips’s helps us appreciate the awesome influence of the behavior of similar others. Once the enormity of that force is recognized it becomes possible to understand one of the most spectacular acts of compliance of our time, the mass suicide at Jonestown, Guyana.” If you’ve ever heard of Jonestown, it’s the incident where a huge cult of people all drank cyanide-laced Kool-Aid and killed themselves. That’s something we will talk about in a minute, and it is a striking and haunting example of the ridiculous power of social proof.
One of the simplest experiments, and it’s a little bit more uplifting than some of these other ones, is what I call “The Dog Terror Experiment”. It was conducted in 1967 on nursery school age children who were chosen specifically because they were terrified of dogs, and the experiment was really basic. Essentially they had these children, who were really scared of dogs, watch a little boy play with a dog, have a lot of fun, and be happy for 20 minutes a day. Just watching that produced such a drastic change in these children that after only four days 67% of them were willing to climb into a playpen and play with a dog, despite being literally terrified of dogs four days earlier. That shows you the power of similar others- and similarity is one of the key drivers of social proof- just watching somebody who is really similar to you doing something can subconsciously change your perception. It can overcome phobias; that’s how powerful social proof is as a phenomenon.
The next instance of social proof isn’t necessarily an experiment, but it demonstrates a concept called “pluralistic ignorance”. It’s something that’s pretty shocking, but you may have heard of it if you’ve done much reading about psychology: the infamous incident of Kitty Genovese. I’ll read you this quote from Influence: “For more than half an hour 38 respectable law-abiding citizens in Queens watched a killer stalk and stab a woman in three separate attacks in Kew Gardens. Twice the sound of their voices, and the sudden glow of their bedroom lights, interrupted him and frightened him off. Each time he returned, sought her out, and stabbed her again. Not one person telephoned the police during the assault. One witness called after the woman was dead. That was two weeks ago today, but assistant chief inspector Fredrick M. Wilson, in charge of the borough’s detective activities and a veteran of 25 years of homicide investigations, is still shocked. He can give a matter-of-fact recitation of many murders, but the Kew Gardens slaying baffles him. Not because it is a murder, but because ‘good people failed to call the police.’” How does something like that happen? How does somebody get stabbed in front of 38 people and nobody does anything to stop it? Again, it’s a phenomenon called “pluralistic ignorance”, and it’s a manifestation of social proof. A lot of psychologists have talked about, researched, and written about what happened in the Kitty Genovese stabbing, but essentially it comes down to an idea I’m sure everybody has thought or felt at some time: If you’ve ever driven by somebody with their car broken down on the side of the road and thought, “Oh, somebody’s going to help them,” right? That’s what pluralistic ignorance is. Every one of those 38 people saw this happening, heard this happening, and thought to themselves, “Somebody’s got to be calling the police. Somebody’s got to be doing something. Somebody else is helping, so I don’t need to help,” or “I don’t want to help,” or “I don’t want to be another phone call into the police,” or whatever. The reality is that because every single person felt that way, and thought the same thing, no one did anything and she was murdered in front of 38 bystanders, all of whom could have potentially saved her life. That’s pretty shocking, and it shows you how social proof can have a huge impact.
A similar experiment was conducted in Toronto in 1971. The researchers created situations where there was a single bystander, and then staged some kind of faux “emergency”; somebody collapsed on the ground, or something like that. In the instances where there was a single bystander, 90% of the time that person helped the person who was having the emergency. Then they planted two passive bystanders to simply sit there and watch as the “emergency” unfolded, and in that instance only 16% of people helped the person who looked like they were having an emergency. So, if one person walking down the street sees somebody collapse on the ground and writhe around, 90% of the time that person is going to help the person who’s on the ground struggling, but if you just plant two people standing there and watching, only 16% of people will then help the person who’s on the ground. Again, that’s “pluralistic ignorance” manifesting itself. It’s an example of how social proof can shape our behavior even if we’re not cognizant of it; even at a subconscious level.
The next example of social proof is something called the “Werther effect”, or as I like to call it, “Why I don’t like the evening news.” The Werther effect is this fascinating phenomenon where researchers discovered that every time a suicide is publicized in the news, there’s a massive uptick in related suicides, especially suicides that are very similar to that particular kind. I’ll quote here from Influence, describing the research behind the Werther effect: “From examining the suicide statistics of the United States between 1947 and 1968, [Phillips] found that within two months of every front page suicide story, an average of 58 more people than usual killed themselves. In a sense, each suicide story killed 58 people who otherwise would have gone on living.” That’s pretty wild; it’s pretty fascinating. They did a statistical analysis over a 20-year period where they controlled for seasonality; they controlled for age; they controlled for all these different factors; and what they basically found comes back to similar others- people who are like you. There’s this subconscious tendency that as soon as you see somebody who is like you doing something, it suddenly enters the realm of “acceptable behavior”, or behavior that’s okay for you to do. Or maybe it’s like, “Oh, well somebody just like me did this. Maybe it’s something that I should be thinking about. Maybe it’s something that I should be doing.” Sometimes that can be good; sometimes that can be bad; sometimes it can be really, really bad. It blows my mind, but every time a front page story about a suicide is published, 58 more people than otherwise would have kill themselves. 
There’s actually a related inference from the Werther effect, and I’m sure you might be thinking about it now, but I’ll read this quote from Influence and then we’ll talk about it: “Back in the 1970s our attention was brought to the phenomenon in the form of airplane hijackings, which seemed to spread like airborne viruses. In the 1980s our focus shifted to product tamperings, such as the famous case of Tylenol capsules injected with cyanide, and Gerber baby food products laced with glass. According to FBI forensics experts, each nationally publicized incident of this sort spawned an average of 30 more incidents. More recently we’ve been jolted by the specter of contagious mass murders occurring first in the workplace setting, and then, incredibly, in the schools of our nation.” I don’t think that could be more timely, or more relevant, today. Mass shootings have become something that everybody’s talking about in the United States, and it’s amazing, but when you think about it: every time we publish, blow up, and talk incessantly about these things, FBI research and statistical analysis has shown that each publicized event creates 30 copycat events. That’s mind-blowing to me, and it’s one of the reasons- and maybe we’ll talk about this in a future podcast- that I don’t read the local news and I don’t watch the evening news, because it’s filled with so much negativity, but I won’t go down that rabbit hole right now. 
So, what are the practical takeaways that we can learn about social proof, this incredibly powerful phenomenon, and how can we apply these lessons to our daily lives? Remember, “social proof” is the finding that people often use others’ behavior to decide how they should handle situations, especially when dealing with uncertainty. To quote Cialdini again: “The principle of social proof states that one important means that people use to decide what to believe, or how to act in a situation, is to look at what other people are believing or doing there. Powerful imitative effects have been found among both children and adults, and in such diverse activities as: purchase decisions, charity donations, and phobia remission. The principle of social proof can be used to stimulate a person’s compliance with a request by informing the person that many other individuals- the more the better- are, or have been, complying with it.” 
Cialdini also nails the two most important implications of social proof in this quote: “Social proof is most influential under two conditions. The first is uncertainty. When people are unsure, when the situation is ambiguous, they are more likely to attend to the actions of others and to accept those actions as correct. In ambiguous situations, for instance, the decisions of bystanders to help are much more influenced by the actions of other bystanders than when the situation is a clear-cut emergency. The second condition under which social proof is most influential is similarity. People are more inclined to follow the lead of similar others.” 
So, how do people make use of that? How do you see it manifesting itself in everyday life? Obviously there are a lot of those negative consequences. One of the smaller ways that you see it, or one of the ways that people apply it in a sales context, is through the use of testimonials, or through lines like: “50 million households can’t be wrong that they’re buying XYZ,” right? Or, when you see your friends doing something and you want to do it as well. Trends, in a lot of ways, are manifestations of social proof. But one way that you can combat some of the implications of pluralistic ignorance, which is the Kitty Genovese phenomenon that we talked about before, is by using specific call-outs. Here is what Cialdini says: “Point directly at that person and no one else: ‘You, sir, in the blue jacket, I need help. Call an ambulance.’ With that one utterance you would dispel all the uncertainties that might prevent or delay help. With that one statement you will have put the man in the blue jacket in the role of the rescuer.” So, if you’re ever in an emergency situation- you’re being robbed, or you’re choking, or you have some kind of medical situation- and there’s a group of people, single out an individual person. Point to them and ask them specifically to help you. That eliminates the pluralistic ignorance; it keeps social proof from preventing people from helping you.
Another way that you can potentially use social proof to your advantage, if you’re in a management context or something like that, is by figuring out how to arrange group conditions to leverage social proof for your benefit. You want to be able to demonstrate: “Hey, here’s how XYZ is doing it. Here’s how our competitors are doing it. Here’s how similar others are doing it,” right? Because similarity is one of the most powerful drivers of social proof. There are a lot of applications of social proof in day-to-day life- sales testimonials, all kinds of different things- so it has huge social implications, if you think about school shootings and mass suicide and all that type of stuff, but it also has a lot of implications in our everyday lives, and it’s a really, really hard bias to combat. One of the ways you can defend yourself against it is by cultivating the ability to stop and say, “Hey, why am I doing this?” If you catch yourself saying, “Well, everybody’s doing this so I should think about doing it too,” that’s a red flag, and that’s something that you should really think about: “Hey, hold on. Pump the brakes. Maybe I shouldn’t be doing that. Maybe I should think this through,” and come to a conclusion more logically rather than just being influenced by similar others and falling prey to social proof.

 

February 02, 2016 /Austin Fabel
Weapons of Influence
Weapons of Influence, Influence & Communication

The Power and Danger of a Seemingly Innocuous Commitment

January 26, 2016 by Austin Fabel in Weapons of Influence, Influence & Communication

Based on the bestselling book Influence by Robert Cialdini, this is the second part of a new six-part miniseries within "The Science of Success" called "Weapons of Influence". If you loved Cialdini's book, this will be a great refresher on the core concepts in it. And if you haven't yet read it, some of this stuff is gonna blow your mind. 

So what are the 6 weapons of influence?

  • Reciprocation

  • Consistency & Commitment

  • Social Proof

  • Liking

  • Authority

  • Scarcity

Each one of these weapons can be a powerful tool in your belt - and something to watch out for when others try to wield them against you. Alone, each of them can create crazy outcomes in our lives and in social situations, but together they can have huge impacts.

Today’s episode covers the second weapon of influence: Consistency & Commitment Bias. We'll cover:

  • The powerful application of the “foot in the door” technique

  • Why hard won commitments are the most powerful

  • The dangers of seemingly innocuous commitments

  • How commitment builds its own internal justifications

  • How you can defend yourself against falling prey to commitment bias

Thank you so much for listening!

Please SUBSCRIBE and LEAVE US A REVIEW on iTunes! (Click here for instructions how to do that!).  

EPISODE TRANSCRIPT

Today, you’re going to learn why you should always ask that stranger to guard your bag at the airport; how a simple phone call increased volunteering for the American Cancer Society by 700%; how people get slowly roped into huge commitments without realizing it; and much more.

If you missed last week’s episode about weapons of influence don’t worry, I’ll explain the series now, but you should go back and listen to it. For those of you who tuned in last week, here’s a quick refresher on the Weapons of Influence series. This is the second of a six part series based on the bestselling book, Influence, by Robert Cialdini. If you loved that book this will be a great refresher on the core concepts, and if you haven’t read it yet some of this stuff is going to blow your mind.

So, what are the six weapons of influence? Reciprocation, which we talked about last week, and I highly recommend after you listen to this, go back and listen to Reciprocation so that you can get all six of the weapons; consistency and commitment, that’s what we’re going to talk about this week; social proof, that’s next week’s episode; liking, authority, and scarcity. Each one of these weapons can be a powerful tool in your tool belt, and something to watch out for when others try to wield them against you. Alone each of them can create crazy outcomes in our lives, and in social situations, but together or combined, they can result in huge impacts.

If you remember in episode one we talked about the biological limits of the human mind. If you haven’t listened to that episode yet you should absolutely go back and check it out. In that episode we talked about the automatic click whirr response that gets triggered when a cognitive bias comes into play; how evolutionarily beneficial traits and behaviors can sometimes manifest themselves in ridiculous outcomes, like the example of a mother turkey taking care of a polecat, which happens to be its natural predator and enemy. These weapons of influence are exactly those kinds of cognitive biases. We’re really going to get into the meat of some of the most powerful cognitive biases that cause human decision making to go haywire. These weapons of influence can be used to manipulate you if you don’t know how to defend against them, and can be part of your arsenal if you learn how to harness them. As Cialdini described weapons of influence in his book, Influence: “Each principle has the ability to produce a distinct kind of automatic mindless compliance from people, that is, a willingness to say ‘yes’ without thinking first.” 

The topic today is weapon of influence number two, consistency and commitment. I will start with an overview of what consistency and commitment bias is, then we will dive into a number of ridiculous research studies that demonstrate this behavior in the real world, and lastly we will look at some of the practical implications of how you can use this in real life.

So, what is consistency and commitment tendency? Here’s how Cialdini puts it: “It is quite simply our desire to be, and to appear, consistent with what we have already done. Once we make a choice, or take a stand, we will encounter personal and interpersonal pressures to behave consistently with that commitment.” He continues later in the book, “To understand why consistency is so powerful a motive we should recognize that in most circumstances consistency is valued and adaptive.” Remember, this all comes back to the biological limits of the mind. The traits and characteristics that were super valuable from an evolutionary standpoint- that’s why he says it’s adaptive- can often go haywire when they collide with modern day society. Okay, so what? People like to be consistent. Why does that matter? Well, it matters because of that simple bias toward staying consistent with what you have said and, more importantly, with what you have done; research shows that actions commit us more strongly at a subconscious level. 

Here’s another quote from Cialdini about the importance of the commitment and consistency bias: “Psychologists have long recognized a desire in most people to be and look consistent within their words, beliefs, attitudes, and deeds. This tendency for consistency is fed from three sources. First, good personal consistency is highly valued by society. Second, aside from its effect on public image, generally consistent conduct provides a beneficial approach to daily life. Third, a consistent orientation affords a valuable shortcut through the complexity of modern existence. By being consistent with earlier decisions one reduces the need to process all the relevant information in future similar situations. Instead, one merely needs to recall the earlier decision, and to respond consistently with it. Within the realm of compliance, securing an initial commitment is the key. After making a commitment (that is, taking a stand or position), people are more willing to agree to requests that are in keeping with their prior commitment.”

Now let’s dig into the research. The first experiment that we’re going to talk about today is what I call ‘the blanket experiment’. This experiment was done in 1975. In the control scenario, they had somebody sitting outside with their stuff who simply got up and walked away, and then they staged a theft where someone would come in, steal their bag, and run off. They did this 20 separate times, and on only four occasions did somebody step in and do or say something to stop it: “Hey, what are you doing? Why are you taking that person’s bag?” whatever. Then they ran the experiment with a slight twist, and the results were dramatically different. In this version they had the same person come by, set down their bag, and walk off, except that they first asked somebody nearby to, “Watch my things”. That was the only difference. Three words: “Watch my things.” In that version, in 19 out of the 20 instances, the person who was asked became, as they say in Influence, a virtual vigilante, “running after and stopping the thief, demanding an explanation, often restraining the thief physically, or snatching the object back.” That’s pretty amazing when you think about it. Simply by committing a total stranger with a simple three-word request- “Watch my things”- they went from 4 out of 20 people stopping the thief to 19 out of 20 people stopping the thief and becoming, as they say, “virtual vigilantes”. That’s what happens when you get people to commit to something very simple. They stay locked in and want to stay consistent with their behavior. That little toehold, that little question, causes them to suddenly go chasing after a thief, which is something that could be incredibly dangerous, right? 

This next experiment is also pretty fascinating, and the results are astounding. It took place in 1980 in Bloomington, Indiana, and was conducted by a social psychologist named Steven J. Sherman. In the control group, he simply called people and asked them: “Hey, would you be willing to spend three hours volunteering for the American Cancer Society going door-to-door collecting money?” In the experimental group, they called people ahead of time and asked, “As a hypothetical, would you be willing to spend three hours volunteering for the American Cancer Society?” Not wanting to seem rude or uncharitable, people said, “Yeah, I’d be willing to do that. Yeah, hypothetically.” Then, three days later, they called those same people again and asked, “Hey, can you volunteer on such and such date, and can you actually go door-to-door and canvass for three hours for the American Cancer Society?” They saw a 700% increase in their success rate at recruiting volunteers when they did that. That’s an astounding result if you think about it. A 700% increase simply by calling three days ahead of time and saying, “Hypothetically, would you be willing to volunteer?” People said, “Yeah, of course. I love volunteering. I love helping people fight cancer. In theory, I’d volunteer,” right? That tiny little subconscious commitment resulted, days later, in a 700% increase in volunteers. It’s fascinating. 

Another experiment, which I call the ‘yard sign experiment’, was conducted in 1966 by Jonathan Freedman and Scott Fraser, and I’ll quote here from Cialdini’s Influence: “A researcher posing as a volunteer worker had gone door-to-door in a residential California neighborhood making a preposterous request of homeowners. The homeowners were asked to allow a public service billboard to be installed on their front lawns. To get an idea of the way this sign would look they were shown a photograph depicting an attractive house, the view of which was almost completely obscured by a very large, poorly lettered sign reading, ‘Drive Carefully’.” In that instance only 17% of the people said yes to this request. This is where it gets really interesting. They conducted another study. They went door-to-door, same thing, asked people to display a ridiculously oversized ‘Drive Carefully’ sign, but in this instance 76% of the people said yes. From 17% to 76%. What was the change? Two weeks before, a door-to-door canvasser had come by and asked those homeowners to display a small three-inch sign on their driveway that said, “Be a safe driver.” That tiny little commitment two weeks beforehand resulted in 76% of the people being willing to display a gaudy, ridiculous, oversized billboard in their front yard that said, “Drive Carefully,” whereas only 17% of the people who were asked to do that without a prior commitment said yes. That shows you how powerful it can be when you commit to something, even in the smallest fashion. You escalate into it, and subconsciously want to be consistent with what you’ve done, and so you get roped into it, or sucked into it- it’s a completely subconscious process- and suddenly you’ve got a giant billboard in your front yard.

Interestingly, Freedman and Fraser conducted a similar experiment where they had someone go door-to-door and get people to sign a petition about state beautification. They then came by a couple weeks later and asked again, “Would you like to put a giant ‘Drive Carefully’ sign in your yard?” and nearly half of those people said yes. So, it wasn’t quite the jump to 76%. It was from 17% to 50% or so, which is still a pretty astounding leap, almost a tripling of the compliance rate. What caused people to do that? The researchers speculated that because people now somehow viewed themselves as civic-minded citizens, since they had signed a simple petition weeks earlier about state beautification, something totally unrelated, they were now willing to put that billboard in their yard. As Cialdini says in Influence, “What the Freedman and Fraser findings tell us, then, is to be very careful about agreeing to trivial requests because that agreement can influence our self-concepts,” and that’s why this is such an insidious tendency. Whether it’s agreeing to a hypothetical- “Hey, yeah, I’d be willing to volunteer my time, in theory”- or signing a petition- “Yeah, I’m in favor of state beautification”- or putting a tiny little sign in your yard, again and again these simple, innocuous commitments can result in an escalation where you get drawn in, and sucked in, and before you know it you’re doing all kinds of stuff because you’ve built up this image in your mind that you’re subconsciously trying to stay consistent with, and that’s why it’s such a powerful cognitive bias.

So, those are a few examples of how different research studies have demonstrated this tendency- and it’s been demonstrated many more times than that- but those are three examples I thought you would find really interesting. Now, I want to talk about some of the practical implications of the consistency and commitment bias. Here’s a great quote from Cialdini that sums it up very nicely: “It appears that the commitments most effective in changing a person’s self-image and future behavior are those that are active, public, and effortful.” So, let’s dig into a couple of these practical implications. The first is the concept of the foot in the door technique, which is what the yard sign experiments demonstrated: a lot of times, if you can get just an innocuous initial concession, you can build on it and suddenly get people to agree to things that, internally, feel very consistent with their self-image, but that all started with a tiny little commitment. An example of that in negotiations is to give somebody a reputation to live up to. Here’s a quote from Influence talking about Anwar Sadat: “One of the best at it was former President of Egypt, Anwar Sadat. Before international negotiations began, Sadat would assure his bargaining opponents that they, and the citizens of their country, were widely known for their cooperativeness and fairness. With this kind of flattery he would not only create positive feelings, but he also connected his opponent’s identities to a course of action that served his goals.” Remember, public commitments are more powerful. That’s why if you put something in your yard, or you state a position publicly, it’s really hard to back down from it. It’s really hard to change course, and the research shows again and again that the more publicly committed to something you are, the more it’s engrained in your identity. Hard-won conclusions are the most valued, as Cialdini says, and he actually uses the example in Influence of fraternity hazing, right? The more you suffer and toil for a conclusion, or a piece of your identity, the more you want to stay committed to it. The more it means something to you, and the harder it is to see that blind spot in your mind, to see that bias that’s shading your vision, or your actions.

Another really important takeaway is that the most effective commitments are internally motivated, not externally motivated. There’s a fascinating experiment, one that in some ways shows how twisted psychologists can be, that I call the ‘toy robot experiment’. The researchers had 22 kids visit a psychologist, who would leave each child alone in a room with a number of different toys and watch through a one-way mirror. In the first condition, the psychologist said to the child before leaving, “It is wrong to play with the robot. If you play with the robot, I’ll be very angry and will have to do something about it.” There were five or six toys in the room, a rubber duck and a bunch of other junk toys, all of them pretty lame except for the robot, which was totally awesome, so the kids had a natural incentive to go play with the robot. In that condition, only 1 out of the 22 children played with the robot.

They ran another condition where the psychologist simply said, “It is wrong to play with the robot.” That’s it, no threat, no “I’ll be very angry,” nothing. In that condition, again, only 1 out of the 22 children initially played with the robot. But here’s where it gets really fascinating. Six weeks later they brought the kids back, put them in the same room, said nothing, and let them play with whatever they wanted. Of the kids who had been threatened with the external punishment, “I’m going to be angry and do something about it,” 77% went back and played with the robot, because the external threat no longer mattered; they weren’t as committed to the rule and didn’t feel the need to stay consistent with it. Of the kids who had been told only “It is wrong to play with the robot,” no threat, more of an internal motivation, something they internalized, only 33% played with the robot. So, fewer than half as many kids played with the robot in that scenario, and that demonstrates how much more powerful a commitment is when it’s internalized. Whether somebody is trying to get you to internalize a commitment, or you’re trying to get someone else to, it shows you that internalized commitments are, in this instance, more than twice as effective.

Another practical application is what Cialdini calls the ‘lowball technique’. I’ll read this quote from Influence: “When calling one sample of students, we immediately informed them of the 7 AM starting time. Only 24% were willing to participate.” He’s talking here about a study they wanted the students to participate in. The quote continues: “However, when calling a second sample of students we threw a lowball. We first asked if they wanted to participate in a study of thinking processes, and after they responded (56% of them positively), we mentioned the 7 AM starting time and gave them a chance to change their minds. None of them did. What’s more, in keeping with their commitment to participate, 95% of the lowballed students came to the psychology building at 7 AM as promised.” So, that’s a strategy where you get somebody to commit to something and then layer in the bad news. I’m sure we’ve all experienced that at one time or another, where someone has done that to us. It’s an example of the commitment and consistency tendency, right? When people knew off the bat that it was a 7 AM start time, only 24% were willing to participate, but when 56% committed on the front end and were psychologically anchored to that outcome, then once the bad news started rolling in they accepted it and stuck with it. So just flipping the wording around, something that seems so trivial you would never think about it, can more than double the impact of what you’re saying or doing.

One of the other fascinating takeaways Cialdini talks about, and part of why commitment is such an insidious weapon of influence, is that commitments in many cases can be self-perpetuating. As he puts it, commitments build their own legs. He likens it to a table: the commitment you agree to, or get someone else to agree to, starts out as a single leg, but then people build all these other justifications around it, subconscious justifications that have nothing to do with what they initially committed to. The yard sign example demonstrates that perfectly. Those people started to think of themselves as advocates for safe driving, or civic-minded citizens, or whatever, and all these other justifications got built up until the original justification no longer mattered and could be taken away, and people would still behave the same way. That finding shows up again and again in the research: you can literally take away the justification people had for changing their behavior or committing to a course of action, and in many cases their commitment stays just as strong, or even grows stronger, once they’ve started down that path.

So, how do you defend against the commitment and consistency tendency? Here’s how Cialdini handles it: “I listen to my stomach these days, and I have discovered a way to handle people who try to use the consistency principle on me. I just tell them exactly what they are doing. This tactic has become the perfect counterattack for me. When my stomach tells me I would be a sucker to comply with a request merely because doing so would be consistent with some prior commitment I was tricked into, I relay that message to the requester. I don’t try to deny the importance of consistency; I just point out the absurdity of foolish consistency. Whether in response the requester shrinks away guiltily or retreats in bewilderment, I am content. I have won and an exploiter has lost.” That shows us how important the consistency and commitment tendency is, how it can have huge results in your life, and how these little, simple commitments, things you would never have given a second thought, can actually change your self-image and self-perception, become little seeds planted in your mind, and almost become self-perpetuating. Especially, think about the toy robot example: if a commitment changes your identity, your self-image, or your self-perception, it can shift the future direction of your behavior even if you completely forget the original source of the commitment.

That’s it for today’s episode. 

 

January 26, 2016 /Austin Fabel
Weapons of Influence
Weapons of Influence, Influence & Communication

How To Triple the Rate of Your Success With One Simple Question

January 19, 2016 by Austin Fabel in Weapons of Influence, Influence & Communication

This week we are kicking off a new miniseries within "The Science of Success" called "Weapons of Influence". This is the first in a six-part series based on the best-selling book Influence by Robert Cialdini. If you loved that book, this will be a great refresher on the core concepts. And if you haven't yet read it, some of this stuff is gonna blow your mind. 

So what are the 6 weapons of influence?

  • Reciprocation

  • Consistency & Commitment

  • Social Proof

  • Liking

  • Authority

  • Scarcity

Each one of these weapons can be a powerful tool in your belt - and something to watch out for when others try to wield them against you. Alone, each of them can create crazy outcomes in our lives and in social situations, but together they can create huge impacts.

Today’s episode covers the first weapon of influence - Reciprocation Bias - and you'll learn:

  • How reciprocation creates unequal exchanges, and in one experiment by a factor of more than 500%

  • Why reciprocation is powerful regardless of how much someone likes you

  • How giving away flowers helped build a powerful religious movement

  • What made the “rejection and retreat” technique triple the success of an experiment

  • How to defend against reciprocation bias from negatively impacting your decisions


Thank you so much for listening!

Please SUBSCRIBE and LEAVE US A REVIEW on iTunes! (Click here for instructions how to do that!).  

EPISODE TRANSCRIPT

Today, you’re going to learn how adding one simple question can triple the rate of your success, how a church used flowers to exponentially multiply their fundraising campaign, how one free drink generated over a 500% return, and much more.

I’m super excited this week because we are kicking off a new miniseries within Science of Success called Weapons of Influence. This is the first in a six part series based on the bestselling book, Influence, by Robert Cialdini. If you loved that book, this will be a great refresher on the core concepts, and if you haven’t read it yet, some of this stuff is going to blow your mind.

So, what are the six weapons of influence? The first is reciprocation; the second, consistency and commitment; the third, social proof; the fourth, liking; the fifth, authority; and the sixth is scarcity. Each one of these weapons can be a powerful tool in your tool belt, and something to watch out for when others try to wield them against you. Alone, each of them can create crazy outcomes in our lives and in social situations, but combined they can have huge impacts, something Charlie Munger, the billionaire business partner of Warren Buffett, once described as ‘Lollapalooza Effects’. 

Remember in episode one when we talked about the biological limits of the human mind? If you haven’t listened to that episode yet, you should absolutely go back and check it out after you listen to this one. In that episode we talked about the automatic click whirr response that gets triggered when a cognitive bias comes into play. We talked about how evolutionarily beneficial traits and behaviors can sometimes manifest themselves in ridiculous outcomes, like the example of the mother turkey taking care of a polecat, which happens to be its natural predator and enemy. These weapons of influence are exactly those kinds of cognitive biases. Now, we are really going to get into the meat of some of the most powerful cognitive biases that cause human decision making to go haywire. These weapons of influence can be used to manipulate you if you don’t know how to defend against them, and can be part of your arsenal if you learn how to harness them.

Here’s how Cialdini describes the impact of these weapons in his groundbreaking book, Influence: “Each principle has the ability to produce a distinct kind of automatic, mindless compliance from people, that is, a willingness to say yes without thinking first.” Don’t forget, we like to keep our discussions grounded in the science. Each of these weapons of influence is deeply rooted and verified, again and again, by experimental psychology research. In this series I will share a number of crazy, hilarious, and sometimes sad examples of that with you.

The topic today is weapon of influence number one: reciprocation. I’ll start with an overview of what reciprocation bias is, then we’ll dive into a number of wacky research studies that demonstrate this behavior in the real world, and lastly we’ll look at some of the practical implications, how you can use this in real life. So, what is reciprocation bias? Part of the reason these biases are so powerful is that they have been built into our minds by thousands of years of evolution, and in the vast majority of cases they were incredibly evolutionarily beneficial. Reciprocation has been ingrained in humans since birth, and in our culture for millennia. Here’s how Cialdini describes it: “According to sociologists and anthropologists, one of the most widespread and basic norms of human culture is embodied in the rule of reciprocation. The rule requires that one person try to repay, in kind, what another person has provided. By obligating the recipient of an act to repayment in the future, the rule for reciprocation allows one individual to give something to another with confidence that it is not being lost. The sense of future obligation within the rule makes possible the development of various kinds of continuing relationships, transactions, and exchanges that are beneficial to society. Consequently, all members of the society are trained from childhood to abide by the rule or suffer serious social disapproval.” 

Here’s how Cialdini defines the reciprocation rule. Note that one of the terms he uses is a bit clunky: he often refers to “compliance professionals” (sometimes “compliance practitioners”), which is essentially anyone who is trying to get you to do something. Think of a salesperson, a boss, a negotiator, someone who’s trying to get you to comply with their requests. Cialdini uses it throughout the book as a blanket term for those who wield the weapons of influence. Here’s another quote from Cialdini where he lays out the definition, and some of the ground rules, of reciprocation: “One favorite and profitable tactic of certain compliance professionals is to give something before asking for a return favor. The exploitability of this tactic is due to three characteristics of the rule for reciprocation. First, the rule is extremely powerful, often overwhelming the influence of other factors that normally determine compliance with a request. Second, the rule applies even to uninvited first favors, thereby reducing our ability to decide whom we wish to owe, and putting the choice in the hands of others. Finally, the rule can spur unequal exchanges. To be rid of the uncomfortable feeling of indebtedness, an individual will often agree to a request for a substantially larger favor than the one he or she received.” In a nutshell, reciprocation bias is the tendency to reciprocate when someone does something for us, which makes perfect sense when you think about it, but the power of the bias really shows when you consider two things: one, the effect still holds even when the gift is unwanted, and even when you don’t like the person giving it; and two, the reciprocation often takes the form of a substantially larger gift than the original.

Now we’re going to dive into some of the research and see exactly how reciprocation bias impacts people in the real world, and what psychological studies have shown some of these effects to be. One of the first experiments is something I call the ‘Coke bottle experiment’. In this experiment a subject was placed in a room with another participant named Joe, who was actually working with the experimenters, and they had some task they were supposed to perform, which was really just a cover story. There was a two-minute break in the middle of the study, and Joe would leave the room. In half of the instances he would come back with nothing and they would just continue with the experiment; in the other half he would buy two Cokes, bring one back, hand it to the other person, and say, “Hey, there was a drink machine and I thought I’d grab you a drink too.” At the end of the study, Joe would ask that person to buy some raffle tickets from him. This was in the ‘70s, so he was selling them for 25 cents apiece, which doesn’t sound like a lot; he would basically say, “Hey, by the way, I’m selling these raffle tickets. I was wondering if you would be willing to buy some.” Here’s one of the most interesting parts. After the experiment, subjects answered a series of questions about Joe and his behavior, including a liking scale: “Okay, on a scale of 1 to 10, how much do you like Joe?” In the scenario where Joe brought back nothing and simply asked them to buy raffle tickets at the end, there was a pretty strong correlation between how much people liked him and how many tickets they bought. But the most fascinating thing is that in the instances where Joe brought back the Coke and gave it to the subject, the relationship between liking and compliance was completely wiped out. For those who owed Joe a favor because he had given them a drink, even though they never asked for it, it made no difference whether they liked him or not. They felt a sense of obligation to repay him, and they did. Again, this experiment took place a long time ago, so at the time a Coke cost ten cents and he was selling the raffle tickets for 25 cents, yet the average return Joe got from the people he gave a Coke to was more than 500%, implying the average recipient bought more than 50 cents’ worth of tickets in exchange for a ten-cent drink. That’s a pretty fascinating study, but the most interesting part is that even though it’s a minuscule gift, in the scenario where he gave them nothing, how many tickets they bought depended entirely on how much they liked Joe, yet as soon as he gave them a ten-cent present, that relationship was completely obliterated and all they cared about was repaying the favor.

Another fascinating example, and again this one takes place back in the ‘70s, since much of the research and many of the examples in Influence date from that era even though the book has been updated a number of times, is the Hare Krishnas. This was a religious society that isn’t very prominent anymore, but back then it experienced a huge boom, which is funny because the group had been struggling financially for a really long time. They couldn’t figure out how to raise money, and then one day they hit on the idea of giving people a flower before asking for a donation. They would go to high-traffic areas, airports, bus stations, that kind of thing, walk up, and hand people a flower, or a small book of their scriptures, or some other small gift, and they would not accept ‘no’ as an answer. They would say, “No, this is our gift to you. Please take it. Please accept it.” As soon as they implemented that strategy they went from a struggling, stagnating, washed-up religious order to massive growth. They exploded. This fundraising strategy completely revolutionized the society. 

Here’s how Cialdini describes the Hare Krishna strategy: “The unsuspecting passersby who suddenly found flowers pressed into their hands, or pinned to their jackets, were under no circumstances allowed to give them back, even if they asserted that they did not want them. ‘No, it is our gift to you,’ said the solicitor, refusing to take it back. Only after the Krishna member had thus brought the force of the reciprocation rule to bear on the situation was the target asked to provide a contribution to the society. This benefactor-before-beggar strategy was wildly successful for the Hare Krishna Society, producing large-scale economic gains and funding the ownership of temples, businesses, houses, and property in the 321 centers in the United States and abroad.” 

So, the Hare Krishna example is a great illustration of how even if you don’t want the gift, and even if it comes from somebody you don’t like or care about, the bias still gets triggered and you feel obligated to give something back. There’s a similar example in pharmaceutical research, and it also showcases the superpower of incentives, which we’ll talk about in a later podcast. We’ve mentioned Charlie Munger before; Munger, again the billionaire business partner of Warren Buffett, once said that he has been near the top of his age cohort his entire life in understanding the power of incentives, and that his entire life he has still underestimated them. Anyway, this example showcases that superpower. A study in 1998 found that 100% of the scientists who had published results supporting a certain calcium drug had received prior support from the pharmaceutical company that produced it, but only 37% of those publishing critical results had received the same kind of support. So again, incentives are incredibly powerful, and we often think, “Yeah, of course I know incentives are powerful,” but the reality is that even when you account for how powerful they are, they can be more powerful still. Even something as simple as who funds certain types of research matters, and you can see this with global warming, or tobacco, or all kinds of other areas. Often the scientists doing the research, even if not at a conscious level then at a subconscious one, come to conclusions that support whoever happens to be paying their bills, paying their paychecks. As the Upton Sinclair quote goes, it’s difficult to get a man to understand something when his salary depends on his not understanding it.

Another really simple example, and this is pretty crazy: we all know what a pain in the ass it is to fill out surveys from an insurance company, or whatever other ridiculous junk mail shows up. Most people just throw it out, right? I know I personally throw out gobs of mail every day. Well, in this experiment, which took place in 1992, an insurance company found that when they offered people a $50 reward for completing a survey, they didn’t get much traction, but when they switched to simply sending a $5 gift check along with the survey, they doubled the effectiveness of the strategy. So instead of being promised $50 for filling the survey out, when people received a $5 gift upfront, the reciprocation bias kicked in and they felt a sense of obligation: “Oh my gosh, they sent me five bucks. Yeah, I’ll take 30 seconds and fill out this survey.” It was literally one tenth of what they could have been offered, and it was twice as effective. It just shows you how powerful reciprocation can be.

Another example is those mailers you get with shipping labels printed with your own name and address. I know, for example, AAA sends me those all the time, and having read Influence I ruthlessly exploit them and just keep the stickers for myself. One charity found that their normal mailer requesting donations got about an 18% success rate, but just by including those personalized address labels they nearly doubled their success rate, to 35%. It might not seem like much, but think about that: literally just including a few address labels almost doubled the effectiveness of the campaign. Reciprocation is an incredibly powerful bias.

Now we’re going to get into what is probably one of my favorite examples of how powerful the reciprocation bias can be, what’s called the ‘zoo experiment’. The zoo example is one of my favorites because it’s so nakedly obvious that a cognitive bias is at work. This piece of research highlights something called the rejection-then-retreat technique. Cialdini and a group of researchers approached college students and asked them to volunteer to take juvenile delinquents on a day trip to the zoo. In that control condition, 83% of the students said no. I don’t really blame them; I probably would have said no myself. Next they ran the same experiment on a different set of college students, but tweaked it just a tiny bit: they added one question before asking the students to take the juvenile delinquents on the day trip. First they asked, “Would you like to volunteer two hours a week, for a minimum two-year commitment, to be a counselor for juvenile delinquents?” 100% of the people said no, but the researchers then followed up with the same request as before: “Would you be willing to take juvenile delinquents on a single day trip to the zoo?” In that instance 50% of the people said yes. That’s a tripling of the compliance rate simply by adding a question that every single person said no to at the beginning. It’s pretty wild when you think about it: they went from a 17% yes rate to a 50% yes rate without changing the actual request. All they did was add another question at the beginning, and because they conceded and backed away from that bigger position, the other person felt, “Okay, they made a concession to me, so I’ll make a concession to them.” That’s why it’s called the rejection-then-retreat technique. It’s something everyone is probably familiar with on some level, and everybody’s heard that in negotiations you should ask for more than you want, but this isn’t just hearsay or folk wisdom; it’s validated by actual research.

One of the even more interesting findings came from a very similar study, where what the researchers really wanted to understand was this: Is the tactic so nakedly obvious that it works up front, but then, as soon as people realize they’ve been taken advantage of, they lose their buy-in, stop caring, and refuse to comply with future requests? So they ran an experiment with blood donations. It’s another fascinating example of how the rejection-then-retreat technique can create longer-lasting effects, and how it’s surprisingly immune to the worry that people will refuse in the future because they feel exploited. In this experiment, one group of college students was asked to give a pint of blood as part of the annual campus blood drive. Another group was first asked to give a pint of blood every six weeks for a minimum of three years, and when they refused, the request was scaled back: “Okay, would you just give a pint of blood once?” It’s the same strategy as with the juvenile delinquents: one control group, and one group that gets an extreme request followed by a more modest one. The results were replicated; of course people were more likely to comply when the really tough request came first. But the fascinating part is that the students who actually showed up at the blood center were then asked whether they would be willing to give their phone numbers so they could be called on to donate again in the future. What the researchers were testing was this: Are the people we essentially tricked, or used these weapons of influence on, going to be bitter, to say, “Well, they tricked me into coming here,” and be less likely to give their phone number for future donations? What actually happened was that among the students who had the rejection-then-retreat technique used on them, 84% gave their phone number and said they would be willing to donate blood again. Among the control group, who were only asked, “Hey, will you donate a pint of blood?”, just 43% were willing to give their phone number and be a future donor. So even for future favors, even when somebody might know or feel they’ve been taken advantage of, the rejection-then-retreat technique proved superior, and the reciprocation bias is strong enough to carry through something like that weeks later.

So, now that we’ve looked at some of the research, what are the practical implications? How can we use this in our everyday lives? Again, as a refresher, reciprocation bias is the tendency to reciprocate when someone does something for us. It sounds really simple, but what are the takeaways from the research, and how can we apply them to everyday life? The first major lesson is that reciprocation supersedes our wants and our likes. The effect still holds even when the gift is unwanted, and even if you don’t like the person giving it to you. The Hare Krishna example, the Coke bottle experiment, and the blood donation research all point to, and demonstrate, that conclusion. That’s why it’s so powerful: the person doesn’t have to like you, and they don’t even have to want the gift. If you give it to them, it triggers an innate, subconscious obligation to reciprocate. Similarly, reciprocation can trigger unequal exchanges. A small initial favor can trigger the psychological response to do a much larger favor in return. The Coke bottle experiment is a good example, where Joe got more than a 500% return on his gift, and there are countless examples of this in real life. 

The third lesson is that this applies to concessions in a negotiation. Think about the zoo experiment and the rejection-then-retreat technique: if you make a bigger ask and then concede to the other person, they feel a deep, subconscious obligation to make a concession in return. There is a caveat here, because subsequent research has shown that if your initial ask is too big or too ridiculous, beyond a certain threshold, people will see right through it and won’t get caught in the reciprocation trap. But as the blood donation research and the zoo trip showed, the initial request can still be pretty hefty. As long as you concede and back down from it, you can double or triple your compliance rate simply by adding a more burdensome, more onerous request at the beginning.
So, how do you defend yourself against the reciprocation tendency? How do you stop somebody from exploiting you with this strategy? Cialdini says that knowledge and awareness are the best defenses, and that you should steel yourself against the automatic feeling of having to reciprocate a gift. One of the best ways he suggests combating the reciprocation bias is by reframing the gesture in your mind, from a gift into a trick. Here’s what he says in Influence: “If gifts were used not as genuine gifts, but to make a profit from you, then you might want to use them to make a profit of your own. Simply take whatever the compliance practitioner is willing to provide, thank them politely, and show them out the door. After all, the reciprocity rule asserts that if justice is to be done, exploitation attempts should be exploited.” I talked about that briefly earlier with the example of the address labels nearly doubling the effectiveness of that fundraising campaign. That’s why when you get those free labels, you should keep them for yourself and not worry about replying to the rest of the mail. Just throw it out, because exploitation attempts should be exploited. 

So, that’s reciprocation bias. It’s incredibly powerful, and as I hope this research demonstrated, it shapes and impacts our lives in a number of ways. Now that you’re aware of it, not only can you use it for good, and for your own benefit, but you can also stop people from exploiting you through the reciprocation tendency.

 

January 19, 2016 /Austin Fabel
Weapons of Influence
Weapons of Influence, Influence & Communication