In this episode we discuss information overload. How do you deal with a world where there is a constant and overwhelming stream of noise? How do you filter and decide what to pay attention to? How can you determine what’s worth your precious time and attention? What should you do with information that you disagree with? In a world full of more and more information, this interview with Dr. Thomas Hills explores the solution that will help you finally deal with information overload.
Dr. Thomas Hills is a professor of Psychology at the University of Warwick. His research involves using algorithmic approaches to understanding the human condition through language, wellbeing, memory, and decision making. He is a current fellow of the Alan Turing Institute and the Director of the Bridges-Leverhulme Doctoral Training Centre. He also co-directs Warwick's Global Research Priority in Behavioural Science and his works have been published in numerous academic journals.
What is Information Overload?
Information overload has crept into our lives and changed our identities in a way that has gone almost completely unnoticed
There’s so much information that we have to outsource the filtering process in some way - either to other people, experts, thought leaders, or algorithms
Dr. Hills’ research began from studying how we make tough choices across hugely complex fields, beginning with things like how children learn language
There’s a whole “pandora’s box” under the problem of information overload
The question we have to ask ourselves - what’s the best way to go about dealing with something like information overload?
Whether we are looking at religion or even something as simple as the food you eat - you have the same problem
How do we decide what the right thing is? How do we make the right decision on the tough areas of life?
How do we decide what the right filters are for information?
People only look for information that supports their existing beliefs and that is “incredibly dangerous.”
We only know the language / vocabulary of our past experiences - and that’s what we begin with to filter our understanding of the world
The “vocabulary” for explaining the world that we already have constrains our ability to think, see, and understand the world
People tend to have very similar reactions to similar situations - learn from the experiences of similar people
You must look for people who have done or experienced what you want to understand and learn from them - case studies and base rates
It’s essential to seek out the beliefs and ideas from those you disagree with
There is an infinite amount of information around you - your brain can’t process all of it and is forced to filter out certain experiences and events
What is an attentional bottleneck? How does it shape our understanding of reality?
What is negativity bias? How does this innate, evolutionary bias baked into our brain cause us to focus on things that are negative?
It’s really important to ask yourself - WHAT AM I BIASED ABOUT?
The more newspapers you read, the stupider you get.
If you’re not getting outside of your box, your safety zone, you’re getting dumber.
By exposing yourself to other people’s criticism you get smarter
You have to be willing to be wrong about something to get more right about it
Ask yourself - how might I be wrong? Why might I be wrong? Try to harness the wisdom of the crowd in your own head
The bleeding edge of the news is mostly noise.
Homework: Know ahead of time what it is you want out of your relationship with reality and what it takes to get there.
Thank you so much for listening!
Please SUBSCRIBE and LEAVE US A REVIEW on iTunes! (Click here for instructions on how to do that).
This week's Mindset Monday/episode is brought to you by our incredible sponsors at HoneyBook!
If you're thinking of starting a business, are a creative freelancer, or already own your own business, HoneyBook helps you stay organized with custom templates and powerful automation tools.
Over seventy-five thousand photographers, designers, event professionals, and other entrepreneurs have saved hundreds to thousands of hours a year.
It’s your business. Just better. With HoneyBook.
Right now, HoneyBook is offering our listeners 50% off your first year with promo code SUCCESS. Payment is flexible, and this promotion applies whether you pay monthly or annually.
Go to HoneyBook.com, and use promo code SUCCESS for 50% off your first year.
Get paid faster, and work smarter, with HoneyBook.com, promo code SUCCESS.
Want To Dig In More?! - Here’s The Show Notes, Links, & Research
[Profile] Warwick Department of Psychology - Thomas Hills
[Profile] The Alan Turing Institute - Thomas Hills
[Article] “The Dark Side of Information Proliferation” by Thomas Hills
[Article] Medical Xpress - “Mass proliferation of information evolving beyond our control, says new psychology research” by University of Warwick
[Article] “Forage longer for berries, study on age-related memory decline suggests” by University of Warwick (2013)
[Directory] ResearchGate - Project profile
[Article] Psychology Today Column - Statistical Life by Thomas Hills
[Article] Science Daily - “Bad news becomes hysteria in crowds, new research shows” by Robert D. Jagiello and Thomas T. Hills
[Article] AlleyDog.com - Bottleneck Theory
[Article] PNAS - “The amplification of risk in experimental diffusion chains” by Mehdi Moussaïd, Henry Brighton, and Wolfgang Gaissmaier
[Wiki Article] Observational learning
[Wiki Article] Confirmation bias
[Article] Psychology Today - “The True Odds of Shooting a Bad Guy With a Gun” by Thomas Hills
[Article] LessWrong - “Dialectical Bootstrapping” by John Nicholas
[Article] Lars P. Syll - “Why reading newspapers makes you stupid”
[Article] Phys.org - “Booty, booby and nitwit—academics reveal funniest words” by Warwick University
[Article] Aeon - “Masters of reality” by Thomas Hills
[Article] Aeon - “Does my algorithm have a mental-health problem?” by Thomas Hills
[Research Project] Propaganda for Change
[Article] New Atlas - “Extremism and fake news: The dark side of too much information” by Rich Haridy
[Directory] Google Scholar - Cited works by Thomas Hills
[Directory] Warwick Academic works directory, Thomas Hills
[Article] Warwick Newsroom - Dr Thomas Hills - Memory and Ageing
[Book] Stumbling on Happiness by Daniel Gilbert
[SoS Episode] The Shocking Counter-Intuitive Science Behind The Truth of Positive Thinking with Dr. Gabriele Oettingen
[SoS Episode] Research Reveals How You Can Create The Mindset of a Champion with Dr. Carol Dweck
[00:00:04.4] ANNOUNCER: Welcome to The Science of Success. Introducing your host, Matt Bodnar.
[0:00:11.8] MB: Welcome to the Science of Success; the number one evidence-based growth podcast on the internet with more than three million downloads and listeners in over a hundred countries.
In this episode, we discuss information overload. How do you deal with a world where there's a constant and overwhelming stream of noise? How do you filter and decide what to pay attention to? How can you determine what's worth your precious time and attention? What should you do with information that you disagree with? In a world full of more and more information, this interview with Dr. Thomas Hills explores the solution that will help you finally deal with information overload.
I’m going to tell you why you’ve been missing out on some incredibly cool stuff if you haven’t signed up for our e-mail list yet. All you have to do to sign up is to go to successpodcast.com and sign up right on the home page.
On top of tons of subscriber-only content, exclusive access and live Q&As with previous guests, monthly giveaways and much more, I also created an epic, free video course just for you. It's called How to Create Time for What Matters Most Even When You're Really Busy. E-mail subscribers have been raving about this guide.
You can get all of that and much more by going to successpodcast.com and signing up right on the home page, or by texting the word smarter to the number 44-222 on your phone. If you like what I do on Science of Success, my e-mail list is the number one way to engage with me and go deeper on what I discuss on the show, including free guides, actionable takeaways, exclusive content and much, much more.
Sign up for my e-mail list today by going to successpodcast.com and signing up right on the home page, or if you're on the go, if you're on your phone right now, it's even easier. Just text the word smarter, that's S-M-A-R-T-E-R to the number 44-222. I can't wait to show you all the exciting things you'll get when you sign up and join the e-mail list.
In our previous episode, we discussed how to beat FOMO, the fear of missing out. How do you overcome the emotional barriers and fears of missing out and saying no to things? How do you get over the awful feeling of turning down opportunities? We share simple, actionable strategies for you to say yes to yourself and for you to say yes to what's really important and actually matters in your life. We share a great strategy that you can use to make a huge difference in your life in two minutes or less, and we dig into the important concept that in a world drunk on speed, slowness is a superpower. All that and much more with our previous guest, Carl Honore.
Now, for our interview with Dr. Hills.
[0:03:15.2] MB: Today, we have another exciting guest on the show, Dr. Thomas Hills. Thomas is a Professor of Psychology at the University of Warwick. His research involves using algorithmic approaches to understanding the human condition through language, well-being, memory and decision-making. He is a current fellow of the Alan Turing Institute and the Director of the Bridges-Leverhulme Doctoral Training Centre. He also co-directs Warwick's global research priority in behavioral science and his work has been published in numerous academic journals. Thomas, welcome to the Science of Success.
[0:03:45.6] TH: Thanks for having me on the show. This is really fun; I think you and your listeners are interested in this area. Yeah.
[0:03:50.5] MB: Yeah. I mean, this is a topic that I think is so, so important. As your research demonstrates, is becoming increasingly important, because it's so dangerous and fraught and potentially problematic. I'd love to begin with this idea of information overload. That's almost a buzzword these days. We hear it all the time about how much information there is and all this new content being created, etc., but you really took a more scientific approach to looking at and thinking about this. Tell me what inspired you to dig into that research and what did you uncover.
[0:04:22.0] TH: Right, right. I think you're right. Information overload has been this buzzword for a long time. I think most people have got it wrong, right? Which is to say most people think they know what information overload is. They think, “Gosh, my phone's going off all the time and I get on the news website and there's all these things blinking on the side of the screen.” They think that's information overload, right?
Really, there's this way that information overload creeps into our lives and it changes our identities, right? It changes the way we are and the way we think about things. It does it in this almost secretive way. Part of it is the algorithms that are running underneath, the recommendation systems that are running underneath, the personalized news that's running underneath the fact that people might like to listen to your podcast, for example. They think, “I like the stuff that Matt has to say and he's going to say interesting things, and so I'm going to go back and I'm going to listen to that again.” That means that in a sense, you get to control what it is that they're listening to.
Part of it is that there's so much information that we necessarily have to filter it for ourselves, or algorithms have to filter it for us, so we let other people do the filtering. As a consequence, that changes who we are in part by what kinds of information we actually get fed. I found this partly by accident. My research looks at the way people interact with really complicated information. When I say really complicated information, I mean, things like language.
How does a little kid learn language, right? This little kid's bopping around and they don't know anything about all the sounds they're hearing around them. They're just hearing the sounds, right? You imagine a dog, right? Listening, walking around the house while people are talking, with no idea what any of these words mean, right? A little kid's doing the same thing, and so how is this kid going to pick out which words are the important words, or which sounds are the important sounds, or which concepts are the things that they should learn, out of all the thousands of possible things they could learn?
The stuff that I do is building algorithms and looking at experimental data on the way – well, one of them is of course the way kids learn language. Also looking at the way adults might choose – well, in behavioral economics, they're often gambles, right? You might imagine. How do you choose a song out of all the possible songs? Or how do you choose a pension out of all the possible pension plans? How do you choose any particular thing when there's so many different varieties? It's just like this little kid learning language, right?
What you learn, I think, or what I've learned in my research is that there are these cues out there in the environment. They're partly based on our predispositions. We have predispositions for certain kinds of information; information that's belief consistent, right? Do we already understand it in a way, or have we heard it before? Then it becomes the thing that's attractive, right? Or is it negative, right? If it's negative, then it becomes a warning sign. It's almost like a stop sign, or a police siren and these kinds of things.
In any case, all these things led me to realize, gosh, there's a whole, if you will, Pandora's box underneath this problem of information overload. It's just a matter of us organizing and understanding how it's affecting our own identities and the evolution of those identities.
[0:08:03.1] MB: Well, you bring up two really good points; one is this idea that this phenomenon has basically crept into our lives. Yet, we may not even have noticed, or really realized it. The second is this idea that there's so much information out there that it's almost necessary to filter it out in some form or fashion, whether that's relying on a thought leader, or an expert or an algorithm. There's just so much that there's no other real way to get some distillation of it.
[0:08:31.6] TH: That's right. I guess, partly the question that we have to ask ourselves is what's the best way to go about this, right? When you were a kid – many of us, anyway – we grew up in a quasi-religious, or more or less religious background. It was just one religion, right, that we got talked to about in general. It was probably one religion that we got exposed to as a child. You might ask yourself, “Okay.” Many people do, right? I mean, this is the stereotype of the kid who finally reaches their late teens, or their early 20s, and they're thinking to themselves, “Is the religion that I practiced as a child the religion that I want, right? That I want to pursue in my life? Why do I believe this thing and not this other thing?”
This is saying, well look, you got filtered information as a child, but there's all different kinds of ways to believe about the meaning of your own life, or even really simple things, like what are you going to eat in your diet, or how are you going to bring your own kids up, or what relationship are you going to have with other people? What moral values are you going to have? All of those, when you start questioning them – I mean, I think that's the point where you get exposed, or you realize there's all these other choices in the world, there's all these other kinds of information out there. How do you take, if you will, a practical, adaptive, functional, well-adjusted perspective on all these other kinds of information, right? I put it in this one context of religion, right? It applies to everything.
We get indoctrinated by the way we grow up and by the way information is exposed to us. Then we have to make a decision, or make a series of decisions, or maybe if we're I guess appropriately enlightened, we're constantly making these decisions about is this the information that I think is valuable? Is it telling me what I need to know? Should I be asking more questions or not?
[0:10:40.0] MB: I love the example of things, starting with religion, but even going something as simple as diet. There's almost an infinite amount of filters that are running in the background that have been pre-programmed, or implanted in us from whether it's our experiences, or random chance, or the people we happen to grow up around. All of these shape the way that we perceive and interact with information. Even information we decide whether or not we want to interact with.
[0:11:06.3] TH: Yeah, that's right. I mean, I guess how do we decide what the right thing is? I think that's a really tricky thing. Like the way many of us do, which is in my research is associated with people's pursuit for belief consistent information. They look for things that support what they already believe in. This is so incredibly dangerous. I mean, it's not just dangerous to other people, right? It's dangerous to yourself.
If you think that you're going to take an intelligent route through your life, let's say through your relationship, right? This is something that many of us have, right? We have a relationship with some person, right? We care about this person a lot. Now how are we going to have a good relationship with that person, right? We might think, “Oh, well. The way I grew up is the way I should have this relationship with this other person, right? Following in the footsteps of my parents.” We might think, “Oh, well that was the right way to do it, because that's the way I was exposed to it.”
Then when we go looking for evidence that that's true, we might only know, if you will, the language that's consistent with what we already experienced, right? Dad goes to work and maybe mom – maybe she doesn't go to work, or maybe she has a different job, right? I mean, I'm almost harking back to the 50s in a way. The modern world is very different, but there's still many of these predispositions, or stereotypes that we carry around with us. When we go to question them, we have to use the language that we already understand.
Imagine typing something in Google, right? It's like, how do I have a better relationship with my wife, for example? Well, you've used the word wife, right? You used the word better, right? You're already, if you will, constrained by your language. You constantly have to be looking out for these new ways to think about it. How many different ways can you ask for a good relationship, can you ask to improve your relationships with people?
[0:13:19.6] MB: That's a great point. I love the idea of how our vocabulary, and I think it applies in a literal sense to the actual words that we use, but also in a broader sense, the vocabulary of experiences and understanding and ideas that we have, fundamentally shape the way that we interact with the world and the experiences we have in the past shape and define how we even begin to approach the problems and challenges of our lives.
[0:13:45.1] TH: That's right. That's right. In many cases, it's really important. Dan Gilbert talks about this in his book, Stumbling on Happiness, right? It's like, if you wanted to have a happy life, or let's say you want to make the right decision, right, in a particular context. Let's say the context is you get an opportunity to move to some interesting place, say that you've always dreamt about when you were younger. You get the opportunity to move, but if you were to do that, you'd have to uproot your family or whatever else, right? You might think, “Well, is that a good decision or a bad decision?”
What Dan Gilbert says, and this is incredibly valuable wisdom that most of us don't use as often as we should, is that people tend to have very similar reactions to similar consequences, right? If someone else loses a child and you wanted to know what it's like to lose a child, then you ask that person, right? What's it like to lose a child? If you're moving at some point in your life, you might ask people who've already experienced that, because chances are you're going to have a very similar trajectory, in terms of your experiences.
You might imagine that if you didn't do that – let's say, after the honeymoon is over when you first move to a new place, and this always happens, right? You move to a new place, you're meeting new people and whatever else, and there's this honeymoon phase and everything's beautiful. Then you start to have these hiccups, right? There start to be these situations where it's like, “Gosh, this isn't what I thought it was going to be.”
If you didn't go out and ask a bunch of other people what it's going to be like, you might think, “Oh, well these hiccups reflect the fact that this is imperfect, that this is not the right path for me, right? That I've made a mistake. That I'm never going to recover from this, right? It's never going to get better. It was good and now it's getting bad,” right? If you ask other people who've been in these situations, what they'll tell you of course is what their experience was, and many of those things will let you know that in fact, there is a pattern, right? There's a way that people experience these different events.
My central point is that when we are trying to come up with a new vocabulary, a new way to conceptualize our experiences in different situations, asking other people is really vital, especially people that we wouldn't normally ask, right? We're really looking for information that doesn't confirm what we already believe. We're looking for new perspectives, new language, new ways to conceptualize the reality of our lives.
[0:16:16.9] MB: You bring up a bunch of really important points. The last thing you said is obviously essential, which is this idea of seeking out perspectives from people who have difference of opinion, or people who disagree with you. Even what you said earlier, which may be a little bit of a tangent, but I think is worth underscoring is this idea of if you want to understand the consequences of anything and you could also use this in a proactive sense, if you want to achieve a certain thing, go look at the people who've done it in the past and study them, whether it's a case study, or even bringing in the mental model of base rates and starting to understand, “Okay, well what does the general experience look like? Is my experience matching up to that, or what is the general roadmap of that particular activity, or achievement look like? How am I on that roadmap?” Are very useful tools.
I want to bring us all the way back and come back to this problem of information overload. One of the important themes from your research was this notion of attentional bottlenecks. We touched on some of the core ideas around that, but tell me a little bit more: exactly what is an attentional bottleneck, and why are they so dangerous in the way that we process information today?
[0:17:23.6] TH: Yeah. People have studied memory, speech, comprehension – I mean, these are really fundamental ideas in psychology, right? People have been studying them for years, in some cases, hundreds of years. What we know is that you can't process all the information that you experience. I mean, everything that's going on now – there are all kinds of sounds that are probably in the room around you, or that are coming through the speaker, or that are outside the car, if you're listening to this in your car, or whatever. Your brain doesn't process all that stuff.
What your brain processes is a subset of the information that it thinks is relevant at any given point in time. Now that's the first part, right? That's just the first step. Now later on today, you may think back to when you heard this and you may think, “Okay, well what do I remember about that?” There's going to be certain things that your mind says, “Well, Thomas said this and Matt said that. Or this other thing happened while I was listening to this podcast.” Those are things that come to mind, right? Your brain can roll them over and think about which of these is important, which of these is worth remembering.
Then later on, you'll be in a conversation with somebody, right? One of those might pop into your head and you might think, “Oh, well this is worth repeating in this particular context, because it's related to the news this person is reading, or whatever the conversation is,” right? All those things, that whole process is a circle. There's a circle from the point at which you hear the information and your brain has to decide which parts of it are important. That's a piece of the attentional bottleneck.
Later on when you're trying to remember more particular things, there's another piece of the attentional bottleneck, because you can't remember everything you heard. You won't be able to. You have to search for it in your head. Even if you had eidetic memory, where you just encoded everything perfectly, you'd still have to search for it. Your brain still has to – if you will, stumble around in your memory and say, “Okay, well there's this piece and there's this piece and there's this piece and that was really this thing.” All these things are getting filtered through this process, even to the point where you speak.
Then you speak, right? Now somebody else is at the party with you, or sitting at the table and they speak too. Now there's these other people who are sitting across the table, their attentional bottlenecks come into play now, right? Now they're hearing a bunch of different information and their brains have to decide which parts are worth paying attention to, which parts are worth remembering, which parts are worth repeating later on.
You've got the attentional bottleneck at multiple places along the way. When that happens iteratively, it changes the kinds of information that's out there to listen to. That's just trying to lay out what the attentional bottleneck is and how it has consequences for the evolution of information.
[0:20:19.4] MB: That's super important, and underscores the big challenge with dealing with all this information, which is the way our brains are physically structured. Even in any given moment, let alone when you're talking about news and Facebook feeds and all this other stuff, in any given second the experience around you is so rich, with so much context and so much information, that we physically cannot process and store and retrieve all of it effectively.
[0:20:44.7] TH: Yeah, that's exactly right. The weird thing is the Buddhists and some of the Eastern philosophers, and I think this is fabulous, right? They say your brain basically makes it easy for you by categorizing all this information before – oftentimes, before it really understands it, right? This is where you get things like stereotypes and you get, “Oh, yeah. I've heard that before,” right? Your brain just tells you this before you even think about, “Oh, yeah. This isn't interesting, because this person is so-and-so,” right? Or racism and bigotry and things like that. I don't need to pay attention to that. I already know how it works, right? That categorization, that pre-categorization further limits your ability to understand the reality you're in, because your brain is already telling you it knows it all already.
[0:21:32.2] MB: Do you have a great idea for a business, or a passion that you want to take to the next level? If so, have you ever wondered what's holding you back? If the thought of having to handle tedious administrative tasks seems overwhelming, or is stopping you from taking action, our sponsor HoneyBook is here to help you get your plans off the ground.
HoneyBook is an online business management tool that lets you control everything from client communication, to booking, contracts and invoices all in one place. If you're looking to start a business, or even if you already own a small business, HoneyBook can help you stay organized with custom templates and automation tools. Over 75,000 photographers, designers, event professionals and other entrepreneurs have saved hundreds to thousands of hours every year by using HoneyBook.
Right now, HoneyBook is offering our listeners 50% off of your first year with promo code SUCCESS. Payment is flexible and this promotion applies whether you pay monthly or annually. Go to honeybook.com and use the promo code SUCCESS for 50% off of your first year. That's honeybook.com, promo code SUCCESS.
[0:22:50.5] MB: I want to dig into some of the specific biases – you touched on a couple of them earlier – that we can fall prey to when dealing with this information overload, whether it's selecting information, all the news that we receive, Facebook, etc. I want to start with one that I think is one of the most prevalent and most dangerous: this idea of negativity bias.
[0:23:18.3] TH: Yeah, so negativity bias is something that psychologists and social researchers and even economists have seen for the longest time. There's just so much evidence for this, right? The basic idea is that if there's a bunch of different possible pieces of information you could be paying attention to, your mind ranks all these things, right? Which ones are going to be the ones that are most likely to pass through that attentional bottleneck?
One of the dimensions that it uses is how negative, or dangerous, the information is. What that means is if you're talking about something like – let's talk about nuclear power for a second. With nuclear power, there are all kinds of positive benefits. As soon as we start talking about nuclear power, you can already feel these things in the back of your head. It's like, “What about the dark side, right? The negative side of nuclear power? Aren't there these dangerous things about nuclear power?” That's the negativity bias coming in, right?
We know from countless research studies that this is much stronger than the positive side, right? There's a study with Robert Jagiello: you let people talk about nuclear power for a while and they share that information with other people, and then those people share it with other people and so on. It's called a social diffusion chain in the research literature. You have these social diffusion chains. What happens is that you can give the first people very balanced information about nuclear power, but if you let them talk about it for a while in the social diffusion chain, all the positive information just gets sucked out of the air. All that's left is the dangerous aspects, the risks, right?
Then people start to worry about it, right? Then the language around nuclear power gets more and more negative. You see this happen with people discussing antibacterial agents. You see this happen with people discussing food additives. You can see this happening in the world around you all the time, because the journalists are at the front line of this attentional filter. They know that if they talk about the worst thing that happened in the world today, they're going to have your attention, right?
There's all kinds of interesting news going on in the world, right? They know that when you go looking at the news, whatever the worst thing is, and obviously, what's worst for you is going to be worse than average for a lot of people, they know if they can just tap into that negativity bias, you're more likely to click on their news article and go, "Yeah, okay. I want to hear more about this bombing, or this explosion, or this person who was murdered by their wife, or this thing." That's the negativity bias.
[0:26:06.0] MB: I've never heard of social diffusion chain, but that's fascinating. It's almost a game of telephone, except things just keep getting more and more negative.
[0:26:15.6] TH: That's exactly right. Yeah. It is exactly a game of telephone, where things just get more negative. Yeah, that's called social risk amplification, right? It's basically this observation that when people have these telephone conversations, the conversations just get more and more negative.
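The "telephone" dynamic described above can be pictured as a small simulation. This is only a toy sketch of my own, with made-up weights and items; it is not the design or data of the Jagiello study the speakers mention. It assumes each reteller passes on most of what they heard, preferentially keeping the negative items:

```python
import random

def retell(messages, negativity_weight=3.0, keep=4):
    """One link in the chain: pass on a subset of what was heard,
    sampling with a bias toward keeping negative items."""
    pool = list(messages)
    weights = [negativity_weight if is_negative else 1.0 for _, is_negative in pool]
    kept = []
    for _ in range(min(keep, len(pool))):
        # Weighted sampling without replacement: negatives are 3x as
        # likely to be picked for retelling as positives.
        idx = random.choices(range(len(pool)), weights=weights)[0]
        kept.append(pool.pop(idx))
        weights.pop(idx)
    return kept

random.seed(0)
# Balanced starting information: four positive and four negative items.
info = [(f"pos-{i}", False) for i in range(4)] + [(f"neg-{i}", True) for i in range(4)]
for generation in range(1, 4):  # three retellings down the chain
    info = retell(info, keep=len(info) - 1)  # each reteller drops one item
    share = sum(is_negative for _, is_negative in info) / len(info)
    print(f"after retelling {generation}: {share:.0%} negative")
```

Even with a modest per-step bias, the negative share of the surviving information tends to grow with each link, which is the amplification effect the conversation describes.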
[0:26:31.4] MB: Nuclear power is a great example of that. We had a previous guest on the show, maybe a year or two ago, a guy named Dan Gardner, we’ll throw his episode in the show notes, but he talks about the exact same thing, which is this idea of we live in one of the healthiest, happiest, safest times in human history. There's not a better time to have ever been alive from all kinds of different metrics. Yet, people who spend all their time reading the news think that the world is getting more dangerous, that there's more pain and suffering and all these different things. It’s really, really fascinating.
[0:27:05.7] TH: Yeah, yeah. That's right. That's right. I'm really curious as to what's driving that, right? Because you might think the causality is that we hear more about negative things, just because telecommunication systems, or whatever else, internet and information is just better, and so there's just more negative information for us to hear about. We just happen to hear about it, because of these filters, right? It might be the case that our concern, our ability to be concerned about different things has changed in the last 100, 200 years.
We've done another research study on the history of the word risk, right? In the 1800s, risk was a word that people used in association with the loss of lives in war and combat, things like that. Whereas these days, risks are associated with all kinds of things, especially medical and health-related things: risk of dying of cancer, risk of heart disease, all of these kinds of things, right? Risk has become much more prevalent, and it's become a much more negative word.
It may be that what's really happening is our capacity to be worried about things has increased, alongside our ability to hear all this negative information, and that's actually helping us make things safer, despite the fact that we're paranoid, right? We wind up being worried about everything, but it's that worry, in a sense, that's making life safer and better at the same time. It's a weird catch-22, if that's true. I'm not sure.
[0:28:41.1] MB: It's a fascinating bias. I'm not sure what the cause or the cycle is, but it's an interesting discussion. I want to also dig into something you touched on earlier, one of the most insidious biases, which is the idea of belief consistency: how it's so easy to seek out information that confirms what you already believe, or want to be true, and to ignore, or sweep under the rug, things that might shine a different light, or conflict with what you believe.
[0:29:12.1] TH: Yeah. It goes by all kinds of names, many of which we've heard before: confirmation bias, right? Biased assimilation and motivated reasoning. Groupthink is another one, right? It's even related to things like cultural codes. Who are you going to listen to? Are you going to listen to people who are not in your in-group, people who are in the out-group? You might just discount them immediately. That's another kind of confirmation bias.
I mean, I always love this one. Confirmation bias is great, because it's the perfect criticism, right? I write things online, a little bit, about the true odds of shooting a bad guy. This is an article I wrote for Psychology Today, which is basically about what happens to a bullet when it leaves a gun, right? It's interesting statistical information, effectively all the statistics I could find about what happens to bullets when they leave guns. I get all kinds of criticism for this article from different people, mainly people who are worried about gun control, right? For some reason, they seem to feel statistics are somehow anti-gun-control, which I don't think is true, right? I think the statistics are actually really important, whichever side of the issue you're on.
Many of them will say, "Oh, well, it's just confirmation bias, right?" It's almost a very bland, abstract criticism. To some extent, it just has to be true, right? I decided that I was going to write about where bullets go when they leave guns, right? Yeah, I'm biased just by the very nature of the question, right? Then I'm biased by the kinds of statistics that are available, right? I'm biased by the language that I use to describe the victims, whoever they are.
The majority of bullets that leave guns go into the head of the person who's shooting, right? They're killing themselves, right? Most gun deaths are suicides; that's where bullets go, right? There are so many implicit biases just in that observation, right? Yeah, I think confirmation bias is a criticism that's really easy to use against other people, but I think it's actually really important for all of us to ask: what am I biased about? How am I biased about the things I believe, the things I listen to?
Nassim Taleb talks about how reading the newspaper makes you stupid. The more newspapers you read, the stupider you get. Why? Because you choose which newspapers to read, right? When you do that, you tend to choose the ones that slant the news in a way that's already consistent with your prior beliefs, right? That just makes you dumber, right? You have to go outside of your box. You have to get outside of your safety zone, if you will, in order to overcome the confirmation bias. Otherwise, you're a victim just like, well, just like me and just like everyone else.
[0:32:02.1] MB: That's such an important principle. It's easy to hear about something like confirmation bias, or belief consistency bias and think that's a problem that afflicts your opponents, or the people you disagree with intellectually about anything. When in reality, the number one place you should start with this investigation, really any investigation is with yourself, right?
[0:32:22.5] TH: Yeah, that’s right.
[0:32:23.2] MB: Asking yourself, what am I biased about? Am I really pushing myself to get outside of my own intellectual comfort zone to soak up information that I might disagree with, or I might not like, to really figure out what's actually true and what's really – what's reality really look like.
[0:32:41.0] TH: Yeah, I think that's right. I think that's right. I want to use this gun control thing again to describe that, because one of the things that was really powerful to me in writing about that, and I've written several of these pieces, also about what it would have been like if the Norwegians had had guns when Breivik came, things like that, right? One of the powerful things I found out when I wrote about those things was, in fact, how little I knew, right?
I went out of my way to try to find all the evidence I could to describe it, but there were a number of people who were very vocal and who commented on these things that I wrote. They basically said, "Look, Thomas. You're wrong, because of this reason and this reason." They weren't always nice. I wish they were nice, but they weren't always nice. They said, "You're wrong because of this reason, and you're wrong because of that reason, and you're wrong because of this reason."
I looked into it, right? Oftentimes, they were actually telling me things I didn't know, right? They were saying, "Hey, look. You've got to pay attention to this set of statistics over here," which I wouldn't say I neglected. I just didn't know about it, right? "You didn't know about this set of statistics, or you're not thinking about these cultural issues that deal with these kinds of things," right? That was incredibly valuable. In other words, in order for me to become smarter about the issue, and I won't say I'm smart about the issue, but I became smarter about the issue because I was willing to talk about it and expose myself to criticism from other people. In doing that, I was able to, if you will, disconfirm some of my own biases, because I went into it thinking, "Oh, I know what the issue is with guns and bullets and these kinds of things," right?
By exposing myself to other people's criticism, basically by making a claim and allowing other people to say, "No, Thomas. You're wrong," I was able to disconfirm many of my biases. I'm sure I'm still very biased, but by exposing myself, I was able to deal with some of these issues. I think that's one of the key ways that people can deal with their own biases, right? They actually make a statement. They actually say, "Look, this is how I think it might work. What do you think?" Then they listen to what people say when they respond back to them.
In other words, you're not going into it thinking you're right and you have to defend your flag to the death, right? You're thinking, you're going to put a flag in the ground and you're going to say, "Okay, I put this flag in the ground. I think this is how it works. Now what do you think?" Then other people can say, "Well, I think actually that's the wrong place to put the flag," which is typically what happens to me. I'm willing to put the flag in the ground, but other people say, "Well, look. That's really the wrong place to put it. Don't put it there. Put it over there, or somewhere else." Then I go, "Oh, yeah. Okay. It actually makes more sense if I don't defend that claim, because that one is wrong. I'd better defend this claim instead."
It actually helps me understand and be more resilient. It allows me to build, if you will, more defensible beliefs, more rigorous beliefs, beliefs that are better able to predict the future and that are better able to help other people understand what it is I'm trying to say.
[0:36:00.3] MB: This comes back to a fundamental question, right? It really depends on what are you trying to optimize for? As you said, by exposing yourself to other people's criticism, you get smarter. When you constantly seek out information that already confirms what you believe, you're actively getting dumber. Really, the question is do you want to optimize for feeling better and feeling you're right and feeling vindicated, or do you want to optimize for actually being smarter? Because the path to being smarter oftentimes involves getting criticized and hearing things you don't want to hear and having people beat up your ideas and tell you why you're wrong, but then you march down the path and you end up being much better off as a result of that.
[0:36:39.0] TH: Yeah, yeah. I think that is so key. There are people like Carol Dweck, a relatively well-known psychologist, who makes this nice distinction between performance and mastery, or she talks about it in terms of mindset. I'm sure I'm mixing up several different vocabularies here. The idea is that if you have a performance mindset, you just want to be right all the time, right? That basically means you're just going to wind up with a closed mouth most of the time. The best way to be right is not to say anything, right?
If you have a mastery mindset, which is to say, "I really want to understand this. This is so important. I'm willing to be wrong about it in order to get more right. I want to master this particular thing," then you're willing to make mistakes, right? Then you say, "In fact, I have to make mistakes. I have to get it wrong in order to figure out where the boundaries of my own understanding are."
[0:37:32.1] MB: Yeah, I'm a huge, huge fan of Carol Dweck; probably one of my all-time favorite research psychologists. She's a previous guest on the show as well, so we'll throw her episode into the show notes for listeners who want to check that out.
I mean, we've talked about a number of these biases. I want to dig in now, and we've honestly started down this path a little bit already, but what are some of the things that listeners can do to deal with this amount of information overload? The fact that the information we're receiving is filtered and curated in ways that might be reinforcing what we already believe, and the fact that our brains evolved in a way that makes it really difficult to figure out what's actually true.
[0:38:11.4] TH: Yeah. I think there are many different ways to deal with this, right? As members of modern culture, we're starting to accumulate these different ways. I think a first step is asking yourself, how might I be wrong? There's a good friend of mine, a researcher, Stefan [inaudible 0:38:36.6] in Berlin, and he studied this idea called dialectical bootstrapping, which is basically: look, you've got to make a decision, okay?
Let's say, when was the Battle of Waterloo, or something like that? Or what percentage of the population in the US exercises every day? It's a very simple question, right? You can just guess, right? Okay, no risk. You just make a guess. Okay, but now ask yourself, how might that guess have been wrong, right? Why might it be wrong? You might say, "Well, it might be wrong because maybe Napoleon wasn't active at that point in time, or maybe I'm overestimating the number of people who exercise because I'm thinking they're like me, or I'm not really thinking about the demographics of the US, where there are a lot more old people now than there used to be." Whatever, it doesn't matter, right? You come up with the reasons why you might be wrong, right? Then you make another guess.
It turns out that when you average those two guesses, you're much more accurate than if you took the first guess, or the second guess, alone. It's almost like you're using the wisdom of the crowd in your own head, right? You're basically trying to create multiple voices in your head that say, it could be this and it could be that, right? That's just a first step: how might you be wrong? The second step we already talked about, which is trying to get the opinions of other people. That might mean going and asking them, or it might just be being willing to make a claim in a public space, right? So that other people can say, "No, no, no. Matt, you're definitely wrong about that," that kind of thing, so that you can get smarter. Then you have to be willing to say, "Okay, they're telling me I'm wrong. Why might I be wrong? Why might they be right in this particular case?" Those are a couple of ideas.
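The averaging step in dialectical bootstrapping is simple enough to sketch. The numbers below are hypothetical illustrations, not data from the study the speakers mention; only the averaging rule itself comes from the conversation:

```python
def dialectical_estimate(first_guess: float, second_guess: float) -> float:
    """Average a first guess with a second guess made after asking:
    how might my first guess have been wrong?"""
    return (first_guess + second_guess) / 2

# Hypothetical example: estimating the year of the Battle of Waterloo (1815).
first = 1805    # initial gut answer
second = 1830   # revised guess after listing reasons the first might be too low
combined = dialectical_estimate(first, second)  # 1817.5

# The average beats the first guess whenever the second guess's error
# partially cancels the first's, for example when the two guesses
# bracket the truth, as they do here.
print(abs(combined - 1815), "vs", abs(first - 1815))  # 2.5 vs 10
```

The gain comes from the same mechanism as the wisdom of the crowd: two estimates with partly independent errors average out to something closer to the truth than either alone, on average.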
[0:40:32.9] MB: Yeah. I think both of those are great strategies. The hard part, honestly, is that it's easy to intellectually think about asking yourself, "All right, how might I be wrong?" and trying to beat your ideas up. I think the hard part is getting over the ego, getting over the resistance and the desire to push back and believe that you already know the answer, that you're already right.
[0:40:55.1] TH: Yeah. Yeah, definitely, definitely. You really want to be right, especially when you're around other people, right? I mean, the people who study this feeling of rightness, or the ego that drives us, they're very clear. In many instances, the reason we care so much about being right is because what we're really interested in is coalitions, right? If we can just show other people that we're right, or convince other people that we're right, then we gain them as allies in this war on reality, or whatever it is we're in, right? Which was probably really important, let's say, 200,000 years ago.
It's important now, but what probably really matters for your success in life right now is: are you good at your job, right? Can you figure out what it is you actually want to do with your life, right? Neither of those are situations where coalitions are really important. I mean, to a degree, you need to have good relations with people in the workplace and that kind of thing, but that is rarely about being right all the time.
[0:42:05.0] MB: Well, you bring up another really good point, which is this idea that these biases, these problems are rooted fundamentally in the evolutionary forces that shaped our brain and baked these biases in to begin with.
[0:42:17.1] TH: Yeah, yeah. Exactly. I mean, all the biases that I talk about, especially in this paper on the dark side of information proliferation, which I guess has a lot to do with what we're talking about today. All those biases: the belief-consistent one, where you're more likely to believe things that are consistent with what you already believe; the negativity bias, where your mind ranks negative things as being really important; the social bias, which is, what is everybody else doing, this hyper-social monitoring that's going on with our telephones these days, where we're just constantly monitoring everybody.
The last one I talk about there is this predictive obsession, this addiction we have to prediction. I mean, even this show is an example of that. We're really addicted to trying to figure out how we can make things better, how we can predict what things are going to be like in the future, and how we can best optimize these kinds of things. That's a good thing, right? None of us would disagree. It's good that we want to predict the future and we want to be better at things.
If you think about it just a little bit further, what you realize is that this means I constantly have to be following the news, right? The science publications and the news, because I need to be as up-to-date as I can possibly be about what causes cancer, for example, right? Or what the best way is to drink my coffee in the morning, or whatever, these kinds of things. You can see the level of predictive detail that we desire in our lives.
What that means is that we wind up living our lives in the noise, right? Think about the signal, the signal being all the scientific research that's valid, that in a sense is going to persist. That's the signal. The noise is all the little fluctuations, these latest articles that say, "Oh, you know, too many olives cause cancer," or something like that. I don't believe that one is actually out there, but something like it probably is. It's like, this causes cancer, or that causes cancer, or this causes incontinence, or whatever, those kinds of things.
That bleeding edge of the news is mostly noise. It's mostly journalists picking the things they think we're going to click on. That's the part where, typically, there isn't a long history of evidence, right? It's one person in their laboratory with some research finding, and it's really provocative. The whole reason it's interesting is because it's new, right? The whole reason the journalist spots it and goes, "Oh, yeah. This is a thing we really have to pay attention to," is because people don't believe it already, right?
Whereas the signal is mostly in the things we've known all along to be true. That felt necessity to be at the bleeding edge of prediction puts us in a place where we get battered around by this noisy news.
[0:45:31.4] MB: I couldn't agree more. One of the fundamental mental models that I try to use to govern where I spend my time and where I try to learn is to study things that either never change, or change very slowly over time. The more you can really reinforce and build a knowledge base around things that change very slowly, instead of focusing on the ephemeral, you can start to harness the power of compounding to really build a truly growing knowledge base that helps you accelerate the amount you can learn and understand about reality.
[0:46:03.1] TH: Yeah. I think that's exactly right. Yeah.
[0:46:04.7] MB: What would one piece of homework or action item be that you would give to somebody listening to this episode to start to concretely implement some of the themes and ideas that we've talked about today?
[0:46:16.1] TH: I think probably one of the most important things, which we haven't talked about, but I think is actually key to dealing with all these kinds of things, is figuring out who you are and what it is you really want out of your relationship with reality. That's about listing your goals, right? I mean, Brian Tracy talks about it in Eat That Frog. There's a substantial amount of research, very strong research, on what are called implementation intentions.
Implementation intentions are about figuring out what it is you want, right? What's the future goal going to be? You write down these goals and then you write down exactly how you're going to implement them. This is what I'm going to do to achieve this particular goal. Meaning, this is when I'm going to do it. This is where I'm going to do it. This is how I'm going to do it, right? These are the resources I'm going to need. This is how I'm going to get the resources I need.
This sounds unrelated in a way, but it's not, because if we're going to deal with this information deluge that's all around us, and there's so much of it, and in many ways it's so biased by our own predispositions and by the people it gets filtered through before it reaches us, then if we don't have a really strong direction ahead of time, we're just playing in the noise, right? We're just getting bashed around by whatever the latest tweet is, or the latest social media chat, or this funny thing, or the latest news about this and that.
If we know ahead of time what it is we want out of our relationship with reality and what steps we're going to take to get there, then we can say, "You know what? One of the things I want is to be semi up-to-date on whatever's happening in the Twittersphere. I'll give myself five minutes in the Twittersphere, right? What I really care about is learning about, let's say, behavioral economics, and understanding how I can use that to improve other people's lives."
That means I'm not going to wind up in the Twittersphere and spend the rest of the evening there, or the rest of my life there. I know what I really want and I know how I'm going to spend my time to achieve it. Occasionally, I can give myself breaks to be entertained, or to keep up to date, these kinds of things. I know what my goals are and I know how I'm going to achieve them. That prevents me from being bashed around by the noise.
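The structure of an implementation intention, as described above, can be pictured as a small record: a goal plus a concrete when, where, how, and the resources needed. The fields and the example values below are purely illustrative, not a format prescribed by the research:

```python
from dataclasses import dataclass, field

@dataclass
class ImplementationIntention:
    """A goal tied to a concrete when, where, and how, plus needed resources."""
    goal: str
    when: str
    where: str
    how: str
    resources: list = field(default_factory=list)

    def as_plan(self) -> str:
        # Render the classic "I will do X at time Y in place Z" form.
        return (f"To achieve '{self.goal}', I will {self.how} "
                f"at {self.when}, at {self.where}.")

plan = ImplementationIntention(
    goal="stay current on behavioral economics",
    when="8 p.m. on weekdays",
    where="my desk",
    how="read one journal article for 30 minutes",
    resources=["journal access", "a note-taking app"],
)
print(plan.as_plan())
```

The point of writing the plan down in this form is exactly what the conversation describes: a pre-committed direction that decides for you where the next block of time goes, instead of the latest piece of noise deciding.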
[0:48:54.1] MB: Yeah, what a great insight. That's fundamentally the same way that I think about it: figure out what you want, how you want to be in the world, and what you want to optimize for. If you really spend some time thinking about that, it will make very clear what your priorities are and where to spend your time. The reality is, in most cases, doing things like reading the news or catching up on the latest tweet is not at all where you should be focusing your time and energy.
[0:49:21.3] TH: They're not related to your goals.
[0:49:23.2] MB: Exactly. Yeah, that's a really simple way to put it.
[0:49:25.3] TH: They're distractions, temporary distractions. Not only that, they're mostly wrong in the first place. It's just not real.
[0:49:32.5] MB: Yeah. They're making you less happy and creating more negativity and fear as well.
[0:49:36.9] TH: Exactly. We could go on, right? It's like a wormhole, and it's connected to something else, and goodness gracious. Yeah.
[0:49:45.8] MB: Thomas, where can listeners who want to learn more about you, about your research, about your work, find you online?
[0:49:51.3] TH: Yeah. I'm at the University of Warwick. I think if you type my name in, and maybe type in Warwick or something like that, I'll pop up. I've written articles about a wide variety of things: whether an algorithm can have a mental health problem, the evolutionary value of shamanism as a way of reconstructing our identities and helping us make sense of really complicated issues. Most of the articles I've written, indeed even the dark side of information proliferation, are available there.
Some of them, I should say, are more popular than others. I wouldn't say the dark side of information proliferation is written for a popular audience. It's for people who are willing, if you will, to look a little deeper and go, "Okay, yeah. What is this thing called motivated reasoning, or confirmation bias, or dread risks, these kinds of things?" The whole vocabulary is there for thinking about information.
[0:50:48.4] MB: Well, Thomas, thank you so much for coming on the show and sharing all this wisdom. We'll make sure to include everything we talked about in today's show notes. It was a fascinating conversation and really got to the heart of some of the biggest issues that we're facing today.
[0:50:59.9] TH: Great. Thanks, Matt. I really appreciate it. Yeah. Thanks for having me.
[0:51:03.1] MB: Thank you so much for listening to the Science of Success. We created this show to help you, our listeners, master evidence-based growth. I love hearing from listeners. If you want to reach out, share your story, or just say hi, shoot me an e-mail. My e-mail is matt@successpodcast.com. That's M-A-T-T@successpodcast.com. I'd love to hear from you, and I read and respond to every single listener e-mail.
I'm going to give you three reasons why you should sign up for our e-mail list today by going to successpodcast.com and signing up right on the homepage. There's some incredible stuff that's only available to those on the e-mail list, so be sure to sign up, including an exclusive, curated weekly e-mail from us called Mindset Monday, which is short, simple, and filled with articles, stories, and things that we found interesting and fascinating in the world of evidence-based growth in the last week.
Next, you're going to get an exclusive chance to shape the show, including voting on guests, submitting your own personal questions that we'll ask guests on air, and much more. Lastly, you're going to get a free guide we created based on listener demand, our most popular guide, which is called How to Organize and Remember Everything. You can get it completely free, along with another surprise bonus guide, by signing up and joining the e-mail list today. Again, you can do that at successpodcast.com, sign up right at the homepage, or if you're on the go, just text the word "smarter", S-M-A-R-T-E-R, to the number 44-222.
Remember, the greatest compliment you can give us is a referral to a friend either live or online. If you enjoyed this episode, please leave us an awesome review and subscribe on iTunes because that helps boost the algorithm, that helps us move up the iTunes rankings and helps more people discover the Science of Success.
Don't forget, if you want to get all the incredible information we talked about in the show, links, transcripts, everything we discussed, and much more, be sure to check out our show notes. You can get those at successpodcast.com, just hit the show notes button right at the top.
Thanks again, and we'll see you on the next episode of the Science of Success.