How Do You Know What's True?
That's the premise behind "Disinformation" - with award-winning Evergreen host Paul Brandus. Get ready for amazing stories - war, espionage, corruption, elections, and assorted trickery showing how false information is turning our world inside out - and what we can do about it. A co-production of Evergreen and Emergent Risk International.
Nobody's Fool - Understanding Truth Bias and Disinformation
| S:3 E:2"It's a kind of information that we find particularly appealing that people looking to fool us will use, but it can lead to innocent spreading"
Host Paul Brandus explores the concept of truth bias and its implications in today's society with guests Daniel Simons and Christopher Chabris, authors of the book "Nobody's Fool: Why We Get Taken In and What We Can Do About It." They discuss how our innate trust in familiar sources can make us vulnerable to misinformation, and the importance of skepticism in evaluating information. The conversation delves into the challenges of focusing only on what is directly in front of us, potentially overlooking crucial context. Meredith Wilson, CEO of Emergent Risk International, joins the discussion to provide insights on how trust and skepticism play a crucial role in navigating the complex landscape of information consumption. The episode highlights the need for critical thinking and awareness in an age where information is constantly at our fingertips.
[00:02:06] Truth bias and deception.
[00:05:47] Calibrating trust and skepticism.
[00:09:50] Familiarity and trust on social media.
[00:12:25] Focusing on selective information.
[00:17:07] Trust in societal systems.
[00:21:29] Familiarity and trust in information.
[00:24:27] Human nature and information consumption.
Got questions, comments, ideas, or an example of disinformation you'd like us to check out? Send them to [email protected]. Subscribe wherever you get your podcasts.
Special thanks to our guests Daniel Simons and Christopher Chabris, our sound designer and editor Noah Foutz, audio engineer Nathan Corson, and executive producers Michael Dealoia and Gerardo Orlando. Thanks so much for listening.
00:05 Paul Brandus: It was one of the more popular TV shows back in the 1950s.
00:09 *Television audio*: Who Do You Trust?
00:14 Paul Brandus: The host, by the way, was a young man named Johnny Carson. It was based on a series of questions that a couple was asked, and whether they trusted what their spouse or friend was telling them. It's an interesting premise and one that holds relevance today. We get a great deal of information today from those who are closest to us. Friends, relatives, co-workers. We're inclined to believe what they share. It's healthy. It's also necessary. As the late educator Stephen Covey, author of The Seven Habits of Highly Effective People, put it, trust is the highest form of human motivation. He was right. This inclination is called our truth bias. But sometimes what others share with us, perhaps inadvertently, might not be accurate. It could be misinformation, or it could be disinformation. I'm Paul Brandus, and that's the name of this award-winning podcast series, Disinformation, a co-production of Evergreen and Emergent Risk International, or ERI, a global risk advisory firm. As usual, I'll be joined by ERI's Chief Executive Officer, Meredith Wilson, in a few minutes. The issue of truth and trust is one subject of a fascinating book. It's called Nobody's Fool: Why We Get Taken In and What We Can Do About It. I spoke with the co-authors, Daniel Simons, a psychology professor at the University of Illinois, and cognitive scientist Christopher Chabris. He's a professor with the Geisinger Research Institute in Pennsylvania. The discussion began with their explanation of so-called truth bias. The first voice here is Chris.
02:06 Christopher Chabris: So truth bias is, simply put, a default assumption that we all make that whatever we're being told or what we're seeing or information that's being conveyed to us is true, as opposed to just automatically assuming everything is false or, for that matter, not even assigning a truth value, either true or false, to information but just remaining uncertain about it. And so our habit of thinking that things are true sort of automatically is the first thing that makes us susceptible to misinformation, disinformation, other kinds of deception. And it's not a dumb thing to have a truth bias because if you didn't have a truth bias and you stopped and checked every little bit of information that was coming at you from all channels all day long and so on, you'd never get anywhere in life. People can exploit the truth bias to sort of get in information, perhaps before you've had a chance to check it or be skeptical about it or to question it deeply.
03:02 Paul Brandus: I mean, if somebody tells me, for example, if I say, well, what time is it? And they say it's two o'clock, I'm inclined to believe them. I'm not going to necessarily challenge that. At what level, though, should we begin to challenge or be skeptical about something that someone says?
03:23 Daniel Simons: I think there are a couple of times. One is when we know the consequences to be big. So you're making a big investment or you're making some decision about who you're going to vote for or things that have direct and large consequences potentially for you. Those are times when you should check a bit more. But another time you should check is when you're passing along information. So one of the things we tend to do is when we get something, say, on social media, a friend of ours on Facebook or on Instagram or someplace posts something that sounds interesting to us and sounds right to us, we tend to not think as critically about it. We tend to accept it as true. And we don't stop to ask, wait, is that really true before passing it along? And that's often how unintentionally we spread misinformation.
04:12 Paul Brandus: Are we simply too trusting?
04:15 Christopher Chabris: Well, it really depends on the situation. Like, I think overall, we're probably not too trusting. On average, we're probably trusting about the right amount. And one reason I say that is, as a species, we've kind of gotten this far, you know, with some kind of truth bias, and we're still here. So it can't be too far off. It does make us, in certain situations, too trusting. And it does make us vulnerable to people who are aware of our truth bias, whether explicitly because they've looked up the concept, or just because implicitly they've learned about it and they've learned what they can get away with and what they can't. The real trick, I think, is calibrating your amount of distrust and the amount of effort you put into checking things. You just mentioned talking to Leon Panetta. There's a great example in all the accounts of the raid that killed Osama bin Laden of being vulnerable to overconfidence and truth bias and believing what you want to believe, but beforehand actually checking it, and having a red team look into alternative scenarios. So one of the ways to fight truth bias is to generate alternatives, generate questions, estimate your uncertainty, as opposed to just believing or not believing. A lot of people there said, well, we think there's a 70% chance that he'll be there when we go, or we think there's only a 40% chance. All of those techniques are just not things we do in our everyday life. We don't stop and engage in all that stuff before we make every decision, but we should for the important ones, right?
05:47 Paul Brandus: Difficult distinction, though, when things move so quickly. You mentioned that we need to calibrate our thinking or the way we evaluate things from time to time. To me, that does not seem like an exact science. I mean, it's just very amorphous. I mean, how do you do that? What have you learned about how and when we should do that?
06:09 Daniel Simons: Well, there are times when it's really obvious that it's not worth your time. So, for example, you go to the grocery store and there's a sign that says organic apples. You're probably best off just believing that those actually are organic apples. You're not going to go out to the orchard and watch to make sure that they only used organic fertilizers for the last 10 years, right? That's just not something you can do. So there are times when it just doesn't matter. Or, for example, you go to the grocery store and it's possible that the grocery store is adding pennies to every item when you check out. So is it worth checking every item's price on the receipt against the price on the shelf? Probably not, unless you're really in financially dire straits and you have to count every penny. Probably not worth your while, because the consequences aren't that big. So there are some cases when it's obvious that you probably don't want to spend a lot of time and effort being cynical.
07:02 Paul Brandus: What's obvious is a subjective thing to each individual.
07:06 Daniel Simons: Exactly. Yeah. People may vary in this, right? Some people will want to be very careful to make sure that they don't get taken for even small amounts of money. And other people are going to be calibrated the other way and say, well, you know, if I lose a little bit, okay. The end points, the extremes, are really pretty easy. If you're investing, if you're buying a house, you want to make sure that you're not, you know, buying a bad house, right? It's a lot of money for everybody. If you're buying groceries and it's pennies, probably not worth your time. But it's the intermediate cases, where you don't quite see the consequences from the start and things can ramp up and become much more consequential, where it becomes a more challenging problem to calibrate well.
07:46 Christopher Chabris: Part of the problem, like you mentioned, is speed and distraction, right? Those two things make it hard for us to actually evaluate how important it is to get this right. How likely is it that someone might be misleading me, or have a desire to mislead me? When we're just scrolling through social media, we don't necessarily think: what are the odds that someone's trying to deceive me by posting this or by making sure it gets into my feed? There's no panacea, but slowing down helps, and so does consciously accepting the idea that it's okay to be uncertain about whether something is true or not. You don't have to have a belief about everything, right? You don't have to put a truth value on everything. You could just say, I don't know, or maybe, or whatever. And people have done studies where they generally conclude that people do want to pass on true information. People don't want to pass on false information. People don't want to be taken in by false information. I think a lot of the time, though, they just don't spend the time or effort, for valid reasons, you know, to actually think about that, or just to stay uncertain and not pass along stuff that they're not sure is true.
08:54 Paul Brandus: On that point, Chris, that's an interesting observation. It's no secret that on social media, for example, Facebook, I'm not singling them out, but Facebook and other platforms where people share things, the phenomenon is that if you get something from a friend at work or a relative, somebody you know and tend to trust, you're more likely to believe it, to place a greater value in whatever it is they are sending you, and not necessarily give it any additional thought. Tell me about that phenomenon. And maybe it's a subset of that: is that the sort of environment that scammers, for lack of a better word, can take advantage of, to exploit the inherent trust we have in those who are closest to us?
09:50 Daniel Simons: Yeah, in fact, it's what we call a hook. It's a kind of information that we find particularly appealing that people looking to fool us will use, but it can lead to innocent spreading of misinformation as well. So that's the hook we call familiarity, which makes a lot of sense. Again, in most of our daily lives, we should be more trusting of people we're really familiar with, people we've known for a long time. If they deceived us all the time, they probably wouldn't be people we hung out with anymore. So we should have good trust of people who are highly familiar to us. One of the challenges with social media is that we have a lot of friends, in scare quotes, who we might not know all that well. We might have a huge number of followers or friends on social media who pass stuff along, and we see their content all the time, but we don't necessarily know how well calibrated they are. We have this tendency, when a friend provides information, to trust it, which is a reasonable habit to have developed over time. But we tend to do that even when it's not merited. It's the whole basis of using celebrities in advertisements, right? These are people we're familiar with, people we're used to seeing all the time, and we maybe trust them more than we should. But on social media, that's amplified, because people are sharing information that they think you would like and be interested in. And we kind of take it as true more when it comes from somebody we know.
11:15 Paul Brandus: Again, my guests are Daniel Simons, a psychology professor at the University of Illinois, and cognitive scientist Christopher Chabris, a professor with the Geisinger Research Institute in Pennsylvania. Their book is called Nobody's Fool: Why We Get Taken In and What We Can Do About It. In it, they wrote about how, and again, this is a very human thing, very understandable, we tend to focus on things that are directly in front of us. That means, though, that we can often overlook additional data or context that can help convey a richer, more accurate understanding of something. If those things are peripheral, again, not directly in front of us, how can we notice them? Chris answered this one first.
12:04 Christopher Chabris: We call this problem the problem of focus. And of course, focus is a good thing, because when we focus on something, we're able to do a lot more with it. We can understand it better. We can process it more deeply. We can think more detailed thoughts about it. But when we're focusing on one thing or a set of things, other things might get no thought at all, no attention at all, might not enter our consideration at all. And, of course, marketers and salespeople and influencers will often be aware of this and use it to direct our attention towards, let's say, only their success stories. And if we pay too much attention to their success stories, we won't even think about, well, how often did they fail? What did their worst customer engagements look like? What are those people saying? Or, you know, stock picks, anything like that. It's, I think, a general problem of disinformation, by the way, that you can disinform people by telling them only true things. So if someone fact-checks a bunch of true stories about, let's say, immigrants who committed crimes, all those stories could be true. But if no one tells you any stories about crimes not committed by immigrants, or about immigrants who didn't commit crimes, or about non-immigrants who didn't commit crimes, you're missing almost all of the relevant data to evaluate the meaning of what you've been shown. And then you could be massively disinformed by completely true but unrepresentative information, because that's all you paid attention to.
13:26 Paul Brandus: by taking something just completely out of context.
13:28 Daniel Simons: Yeah, or just giving you a partial view, which, of course, is what happens in any negotiation where one side has more information, or in any sort of performance, where a magician will not tell you everything that they're going to do. And con artists, of course, present exactly what you want to see and withhold the information they don't want you to see. So that happens a lot, even in completely innocent ways. People will pass things along. The examples that you see on your social media feed are probably the things that you agree with, and you're not seeing the counterexamples, because nobody's sharing those.
13:59 Paul Brandus: You mentioned advertising a minute ago. Advertising is often crafted in a way that appeals to our desires, our expectations. It makes us want to go out and try a new product or go to some destination or something. That is a very powerful thing. We want to believe that these products are better, or that this restaurant is better than something else. Tell me about that. We want to believe the message that we're being exposed to, don't we?
14:36 Christopher Chabris: Well, often, as you mentioned, it's expectations. Advertisers and many other communicators are well aware of people's expectations. And if you receive a message that you were expecting to get, you're more likely to believe that it's true. Sometimes advertising and other communications can work well on a principle of surprise, and often those are the ones that maybe get the awards for cleverest Super Bowl ad, because something really surprising happened. But that's an award for aesthetics, not necessarily for success in sales, right? That's an artistic achievement, in a way. Even outside of advertising, let's say in science, which is our field, sadly there's fraud. There are scientists who don't really deserve the name, who actually just create fraudulent data, fraudulent papers, and so on. But they don't create fraudulent papers that claim to have discovered a new planet orbiting right next to Earth or something like that. They fabricate discoveries which are exactly the next thing that people in their field would expect to be discovered, or would expect to be shown true, or that support a theory that everyone already believes. They satisfy their audience's expectations in much the same way that advertisers, politicians, and other communicators do: by understanding what people are predicting is going to happen and then showing them, hey, that's exactly what happened. Not a new planet orbiting right next to Earth, but something more modest, yet of value to them to convince us of.
16:06 Daniel Simons: Yeah. It's just a slightly newer and fresher product, slightly better than what was there before. And that's enough to make them say, oh yeah, maybe that's right.
16:17 Paul Brandus: Let's take a short break. When we come back, I'll chat with Meredith Wilson, of Emergent Risk International.
16:24 ad read: This series on disinformation is a co-production of Evergreen Podcasts and Emergent Risk International, a global risk advisory firm. Emergent Risk International. We build intelligent solutions that find opportunities in a world of risk.
16:49 Paul Brandus: Welcome back. Let's bring in Meredith Wilson now. She's chief executive officer of Emergent Risk International. I asked her about the so-called truth bias that Daniel and Chris wrote about in their book. The thesis being that we're inclined, and this is healthy, to believe what others tell us.
17:07 Meredith Wilson: Oh, I think that's absolutely true. You know, particularly if we're just speaking about Americans, and we can talk about other countries too, but when you think about the US, you think about how many of our systems, even our economic system, our government, so much of that is based on trust. And in order for it to work, you have to trust people. And over the past six, eight decades, we got pretty good at trusting each other, which is one of the reasons that things work. And back in the day, we used to trust the media for the same reasons. The double-edged sword there is if you are constantly questioning everything, you tend not to trust people. So yeah, I think there is a truth bias, and I think that there is a real dilemma in terms of how to manage that effectively, when you want a society that trusts people, you want people to trust people, but if you become too trusting, particularly in the current environment, how much are you missing?
18:24 Paul Brandus: So skepticism is warranted, but the question is, if I ask you what time it is and you say, well, it's two o'clock, I'm inclined to believe you. I'm not going to presume that you're lying about the time. The question then is, at what point, how serious does the issue have to be before we do become skeptical about what we're being told? If you tell me the time, OK, I believe that. But farther down the line, as the issues get more serious, when do we start to become more skeptical, or when should we become more skeptical about what we're exposed to?
19:03 Meredith Wilson: That's a good question. I think, again, if you look internationally at some of the countries where corruption runs rampant, where people are far more likely to enrich themselves and their families, and also more likely to hire family members, a lot of that is due to those societies having very, very weak trust in each other. Healthy skepticism is important, and critical thinking is important. And when we talk about disinformation, what we really need to see in people is a certain level of skepticism about the information that is presented to them, and a good process for determining what is true and what is not true. If I had the exact answer to how to fix that, I'd be a very rich person right now. But there really has to be a balance, right? You have to be able to trust that the person that's coming up to the traffic light across from you is going to stop. Otherwise, you would sit there all day long. You have to be able to trust that when you ask somebody for directions, they're going to give you decent directions, or you'll never ask anybody anything. I guess the directions thing isn't as important now that we have GPS, but it's a hard question to answer.
20:35 Paul Brandus: They also talk about something called familiarity. I don't think this is exactly new, but it's a very powerful concept in terms of media and how we read and see and hear things today. And basically it is that if someone we know and trust and like, a coworker or a relative or someone like that, shares something with us, sends something to us, we are more inclined to believe it because of our views toward that person. In other words, we can drop our guard, which can make us more susceptible to false narratives. A very human instinct, very understandable, but it also makes us potentially vulnerable. How serious an issue is that? And if you think it is a serious issue, how can we overcome it?
21:29 Meredith Wilson: The concept of familiarity is definitely not new, and it's something that, as a society, we've definitely learned how to capitalize on. So whether it's advertisements and public figures and endorsements, we definitely know how to capitalize on that. And there is definitely an aspect of disinformation where that's exactly what's happening, right? If you look at sort of Information Warfare 101, involving people who are familiar to the people that you're targeting is very much a methodology that's in use today, but has been in use for a long, long time. How to overcome it is a harder question, and I think probably the beginning of that is educating people and helping them understand how that happens and how that works. But can we overcome it completely? I don't know.
22:32 Paul Brandus: You know, one issue that they mentioned here is the sort of peripheral issue. And what they mean by that is we tend to focus on things that are directly in front of us. In doing so, we can overlook, say, additional data or context that's on the periphery, which could help convey a richer, more accurate understanding of something. But again, if those things are not right in front of us, we might not notice them. So we kind of cheat ourselves out of a fuller understanding. How can we notice things if they're not in front of us?
23:11 Meredith Wilson: Well, one tool that we have in the analytic field, because this is an analytic bias too: you tend to look for confirming information, but oftentimes fail to look for disconfirming information. So when you are researching a topic, people will find those little details that very succinctly support their case, and analysts are trained to do that. But as you said, if you don't very specifically look for the information that might tell a different story, there's a very good chance you will not see it, because you're not looking for it. So having that sort of gut check, is there something out there that disconfirms my theory? Is there something out there that is counter to what I think is accurate? That is a good methodology. Whether or not people who are supporting a particular politician or a particular policy are interested in finding that information is a whole other question with a whole other answer.
24:27 Paul Brandus: Yeah, well, that's the issue. Dan and Chris say, look, people should take the time to explore more, to ask more questions. But they acknowledge, look, human nature is such that, you know, not calling people lazy, but people aren't going to take the time necessarily to do that. They're just going to eat up what's in front of them. And that's part of the problem.
24:47 Meredith Wilson: It depends on whether they're incentivized to do so, and whether it speaks to their own interests, right? But also, when you look at the way we consume information now, in this sort of scrolling, small bites of information that are just kind of running past us, it's also a question of whether they are cognitively aware that they're taking in that information and actually thinking it through. Because, you know, we're all prone to do things now like watching TV and reading on our phones at the same time. And so part of it, too, is that the focus we put on that information sometimes is not enough for us to be putting enough brain power behind whether or not it's something we should believe. Does that make sense?
25:35 Paul Brandus: Yes, it does. Thanks to Daniel Simons and Christopher Chabris; their book, highly recommended, is Nobody's Fool: Why We Get Taken In and What We Can Do About It. Our sound designer and editor, Noah Foutz. Audio engineer, Nathan Corson. Executive producers, Michael Dealoia and Gerardo Orlando. And on behalf of Meredith Wilson, I'm Paul Brandus. Thanks so much for listening.