That's the premise behind "Disinformation" - with award-winning Evergreen host Paul Brandus. Get ready for amazing stories - war, espionage, corruption, elections, and assorted trickery showing how false information is turning our world inside out - and what we can do about it. A co-production of Evergreen and Emergent Risk International.
A Whole of Society Approach - January 6th, One Year Later
| S:1 E:8
On the second anniversary of the attack on the U.S. Capitol, we look at the government’s need to recalibrate its approach to domestic terrorism. But a revamping of basic American institutions is needed as well - featuring Dr. David Gioe, Associate Professor of History at the U.S. Military Academy at West Point and Director of Studies for the Cambridge Security Initiative and co-convener of its International Security and Intelligence program.
It was the event that shocked the nation and the world.
The January 6th, 2021, attack on the U.S. Capitol was the worst attack on Washington since 1814, when British troops burned the Capitol and the White House, forcing President James Madison and First Lady Dolley Madison to flee.
But the 2021 attack was not the work of a foreign invader. It was the work of American citizens angry that the 2020 election had not gone their way. They were convinced that Donald Trump, the sitting president, had been robbed, the victim of a fraudulent election.
In fact, the attackers were, in a way, victims themselves, victims of false narratives about the election, victims of what has become the scourge of our time: disinformation.
I'm Paul Brandus, and that's the name of this series. It's called, simply, Disinformation.
And I'm Meredith Wilson, founder and CEO of Emergent Risk International, and I'll be providing analysis throughout each episode.
History's usefulness to us is maximized when we examine it through an honest and objective lens. It is honest, therefore, to say with no shred of hyperbole that what happened on January the 6th, 2021, will be remembered as one of the most shocking and disgraceful events in our country's history.
Of that day, this question must be asked, how did we get here?
I came up with two answers as to what I think went wrong …
Dr. David Gioe is a British Academy Global Professor in King's College London's Department of War Studies and Associate Professor of History at the U.S. Military Academy at West Point, where he serves as history fellow for the Army Cyber Institute.
He brings over two decades of intelligence and security practitioner experience to his academic research and teaching.
The first one is that I think the overwhelming flood of mis and disinformation pouring over the American people by both foreign and domestic actors was actually unprecedented. And as a historian, we don't use that term “unprecedented” lightly. Historians will say, well, there's a precedent for everything.
But really, the speed and the volume of the tidal wave of disinformation and misinformation (I'll just say disinformation from now on for ease) really did overwhelm us, I think. And the second and related answer was that we as a country, and particularly, we as a U.S. government, we didn't consider disinformation to be a national security issue.
So, we didn't understand the gravity of the threat. We didn't understand how it really can be existential.
The second is that this internal threat fueled, as Dr. Gioe says, by a flood of disinformation is forcing the federal government, in fact, the American people to redefine what constitutes a national security threat in the 21st century, and where it comes from.
Since the end of the Cold War, we've been so obsessed with either the war on terror or now, great power competition with Russia and China, that we sort of forgot what threats might be lurking here at home, and how might disinformation connect those threats and connect those people, and amplify those threats.
And the violence actually came home to roost, I think, on the 6th of January in Washington D.C. And so, I think this is really an urgent question.
This redefinition of what our threats are, means we need to move the threat posed by disinformation up the list, way up.
I argued that we should actually place it at the top of the list. The other threats that you mentioned (China, Russia, North Korea, on and on), I mean, they're certainly real threats and we have to not underplay them by any means.
But just going back to the 6th of January, look, it wasn't China that vandalized the Capitol. It wasn't Chinese intelligence that broke into Nancy Pelosi's office. It wasn't Russia that brought the flex cuffs to try to kidnap Vice President Mike Pence. It wasn't Al-Qaeda that plotted to kidnap Michigan Governor Gretchen Whitmer, and it wasn't Iran's Quds Force that came to America and shot up the FBI's Cincinnati field office.
How has it come to this? It's not enough to go back to the 2020 election, you have to go back much further.
The January 6th attack was what I like to call a downstream event. People were impacted, influenced, moved to act by an upstream event: namely, the dumping of lies into that stream. Like pollutants from a factory, the lies all float downstream, with prominent public officials and media personalities (not journalists, but media personalities) driving them, as a hammer drives a nail, deeper into our minds.
That this occurs today isn't surprising. Meredith Wilson, the Chief Executive Officer of Emergent Risk International, says this repetition, this never-ending cycle of disinformation, is made possible by the very technology we rely on to stay informed, or pretend to stay informed, in an objective manner.
Because of the way that the internet has evolved into a space where we have these algorithms that drive what we see in our own feeds, we have become very prone to looking for things that validate our own sort of ideas about the world.
So, if we tend to lean more liberal, we tend to look to liberal sources, we tend to want to read things that agree with our worldview, and we tend to get very angered by things that maybe don't. And I think people across the board (conservative or liberal) tend to fall into this category more and more because we've gotten very used to being served up the information that we want to see, and information that reflects what we believe should be the correct way of doing things.
And in doing so, it's become just kind of almost a vicious cycle where we are fed more information that feeds that bias, and we continue to consume those news sources. In the meantime, we miss out on really big pieces of information that might be being featured elsewhere.
Whether that is maybe less positive information about a particular political figure that we happen to like, or whether that is a policy that we don't like. The sources that we go to tend to feed us the information we want to see.
We run into problems when we only see the positive side of the issues and the people that we like, and we don't necessarily see the downside.
Which helps explain why (and I'll bet you can relate to this) if you're talking with someone with whom you disagree, whether it's online or in person, that it's difficult, almost impossible, in fact, to convince them that they're wrong about something. You are right and they're wrong. You are sure of that, and yet the other guy is convinced that he's right and it's you who's wrong.
This dynamic, all too familiar today, could have something to do with the way our brains are wired. At least that's what Professor Stephan Lewandowsky thinks. He's Chair of Cognitive Psychology at the School of Psychological Science at the University of Bristol in England.
He spent years studying how our brains are wired to receive, process, and store information. He says our brains are like computers, and that once we store information, we're more likely to believe it, and it can be difficult for that information to be dislodged.
The key notion here appears to be twofold. Number one, whenever people are encountering information, they tend to believe it by default.
Now, that makes a lot of sense, because 90% of the time when I'm interacting with the world, and I talk to people outside, they're going to tell me the truth. If I ask you what time it is, everybody will tell me the truth.
So, it's a very sensible thing for our system to be built that way, that by default, we accept everything as being true. Now, the problem with that though is what happens if something turns out to have been false?
Maybe somebody by mistake told me it was Tuesday rather than Monday when I asked them what day of the week it is. Then I have to update my memory and my mental model of the world. And the question then is, how do you do that?
Well, and that's where things become very interesting, complicated, and also, concerning when it comes to the welfare of our societies. And that is that in order to unbelieve something, people have to identify it as being false. You can't just remove things from memory.
Once you have encoded information in memory that you believe, you can't just yank it out. Memory doesn't work that way. There's no delete button.
Did you hear that? There's no delete button. So, go ahead, tell folks that there's no evidence that the 2020 election was rigged. Tell them until you're blue in the face, it probably won't work. Cogent arguments, links to reputable websites, armies of fact checkers, these efforts to delete what their brains have previously processed probably will fall upon deaf ears.
So, all of this helps explain how we got here and the scope of this clear and present danger. The question going forward as we mark the second anniversary of the Capitol attack is this, what can we do to keep it from happening again?
That battle, to my earlier analogy, is perhaps best waged upstream, where perhaps we can find a way or ways to keep the pollutants of disinformation from being dumped into the river in the first place. We'll talk about that after this short break.
This series on Disinformation is a co-production of Evergreen Podcasts and Emergent Risk International, a global risk advisory firm. At Emergent Risk International, we build intelligent solutions that find opportunities in a world of risk.
Welcome back. We were talking about how to prevent disinformation from seeping into our minds, if that's even possible. This is a huge question, a critical one, and one for which there is no single answer, but let's take a stab at it.
In my conversation with Dr. Lewandowsky, the Professor of Cognitive Psychology, he used an interesting word: “unbelieving.” It's difficult to get people to unbelieve once their brains have processed and stored what they've been exposed to.
This makes it hard to tell them that they're wrong. It's one reason for all the arguing on TV and social media. He says what does not work is hammering others. But what can work is engaging with others and their beliefs through a more subtle, layered approach, which is, again, very difficult to do.
If you start out by letting them explain their opinions, and if you can then engage with that explanation, and you can affirm part of it.
So, the key thing, if you're confronted with that situation where somebody is refusing to believe the evidence is not to try and tell them, “Oh my God, you're so wrong.” That's not a good strategy.
That doesn’t work.
That doesn't work. But what you can do is you can say, “Well, you know what? You actually got a point.” And then you say “I actually think you're coming from a good place. You have these feelings and you're expressing them, and that's all wonderful, and you're partially right. Oh, but by the way, this belief that you're holding actually turns out to be false.”
Now, if you do that sort of layered approach of letting them explain, affirming where that feeling and opinion is coming from, and then you slip in a correction, then you will find (we can show that in experiments) that people are more receptive to the new information.
That sounds pretty hard — softening others up by telling them that they're coming from a good place and then gradually, slipping in your version of the truth. Probably takes the patience of a saint. But Dr. Lewandowsky says that's what his research shows to be effective.
Dr. Gioe, on the other hand (the West Point and King's College Professor) has some other ideas, a more multifaceted and fundamental approach that starts in the classroom.
The first one is just to teach civics in schools, but to community groups as well. There are some terrifying Pew polling results suggesting that only a small number of Americans understand, for instance, that we have three branches of government, and that a small fraction of that fraction can actually name them.
It's much harder to be fooled about something if you understand what it is. Just think: if I'm an expert in diamonds and I go to a jeweler and he's trying to hustle me, well, I know a little something about diamonds. Or I take my car to the mechanic or whatever: "Oh, that's a converter problem." Well, I know something about engines, and I can tell it's not.
So, we have to know something about it first. And I think that that starts in school, but I think it's a lifetime learning thing. And actually, some public libraries are actually teaching civics again to grownups, and I think that's fantastic.
Here's some data to back up what Dr. Gioe is saying. A survey released in September 2022 by the Annenberg Public Policy Center at the University of Pennsylvania shows that about one in four adults cannot name any of the three branches of government. That same percentage, about one in four, could not name any of the basic First Amendment freedoms. The Annenberg survey shows that these numbers, in fact, are trending in the wrong direction.
It's this lack of understanding that contributed to the Capitol attack. Somehow, there was a belief that the Vice President at the time, Mike Pence, could halt the counting and certification of electoral votes.
Because if Mike Pence does the right thing, we win the election. This from the number one, or certainly one of the top, constitutional lawyers in our country: he has the absolute right to do it.
Of course, that's not true, and Dr. Gioe has another idea.
The second thing I think, is to teach critical thinking; fact versus opinion stuff. When I was a kid, in school, they would have these workbooks and you would have to put an “F” next to something that was a fact, and an “O” next to something that was an opinion.
And we don't do this in school anymore, to my knowledge. I mean, at least anecdotally, they didn't do it with my son. And so, I would do that with him. I would give him a fact: the Philadelphia Eagles are the football team in Philadelphia. My opinion is that they're the best football team in the history of the NFL.
And he has to disentangle those things and learn to use evidence to evaluate arguments. And I think we need to do more of that.
Critical thinking and the separation of fact from opinion, what a concept. These are things that obviously would make for a more informed electorate.
We must remember that opinions and facts are quite different. Dr. Gioe's claim, for example, that the Eagles are the best football team in history? He's obviously never heard of the '85 Chicago Bears. But then again, that's just my opinion.
Seriously, though, if we don't know civics, don't know the basics of our government and constitution, if we don't know how things work, we're more susceptible to disinformation, more susceptible to those who would eagerly inject lies into the public arena, more susceptible to another January 6th.
Thanks to Dr. David Gioe and Dr. Stephan Lewandowsky for their insights, our sound designer and editor, Noah Foutz; audio engineer, Nathan Corson; executive producers, Michael DeAloia, and Gerardo Orlando.
And on behalf of Meredith Wilson of Emergent Risk International, I'm Paul Brandus. Thanks so much for listening.