That's the premise behind "Disinformation" - with award-winning Evergreen host Paul Brandus. Get ready for amazing stories - war, espionage, corruption, elections, and assorted trickery showing how false information is turning our world inside out - and what we can do about it. A co-production of Evergreen and Emergent Risk International.
The immense power and reach of social media and the internet can act as accelerants for disinformation. Paul's guests include Meredith Wilson, chief executive officer of Emergent Risk International, and ERI's Brady Roberts.
Paul Brandus:
What you're about to hear is fake: this audio of House Speaker Nancy Pelosi.
Nancy Pelosi:
“I'm sorry, to do what?”
Reporter:
“Accuse a TV host of murder on Twitter.”
Nancy Pelosi:
“I'm sorry. I'm not … I don't understand. What did he do?”
Paul Brandus:
In a deliberate, malicious attempt to make it seem like she was intoxicated, her voice was slowed down, certain words were omitted, and the whole thing was made to sound like she was slurring her words. Again, totally fake, but a high-profile example of how audio and video can be manipulated.
The technology is so sophisticated today that anybody can do this. Anybody can manufacture and distribute content that is false. There's a word for this: disinformation. And that's the name of this series.
[Music playing]
In our first two episodes, we show the power and effectiveness of disinformation during the two principal geopolitical events of the 20th century: World War II and the Cold War.
We're now well into the 21st century, of course, and those events have receded into the history books. But the lessons learned from using disinformation to achieve goals, be it political, economic, or something else, remain.
To this, we can now add the immense power of the internet and social media. This combination of disinformation, the internet, and social media is potent and dangerous.
Meredith Wilson:
When it comes to individuals, particularly for the sort of over 18 voting population, this is a real challenge because it's almost too late to put the genie back in the bottle with some of these folks. Whether they're far left, far right — wherever they stand on the political spectrum, there are a lot of people that are taking in disinformation every day without realizing it.
Probably, you and I take in some disinformation every day without realizing it. There's so much out there, and it creates these narratives that just kind of go on and on and on. And so, what that does, is it kind of embeds itself in people's belief structures, making it much harder to debunk claims or to really separate those narratives anymore.
Paul Brandus:
Brady Roberts, a veteran of the Central Intelligence Agency, is the Chief Operating Officer of Emergent Risk International, a risk advisory firm that is partnering with Evergreen in the production of this series. When he says it's easy to manipulate audio or video, my question is this: “How easy?”
Well, pretty easy. It turns out technology has gotten to the point where you really only need two things: a phone or tablet, and an imagination. One app, for example, called Reface (the name speaks for itself), allows you to put anyone's face on another image. Its motto: be anyone and reface anything.
And in a matter of moments, I was able to do just that: my face in movie clips, my face in still photos. I even turned myself into a heavyweight boxer.
Another app is called Avatarify. Same thing; it only takes a moment. It's really just for fun here, but the possibilities for disinformation are endless. There are plenty of other apps out there too; the technology is agnostic. The problem is when people with nefarious intentions get ahold of them, as former President Barack Obama outlined in this speech at Stanford University in the spring of 2022.
Barack Obama:
“Indeed, one of the biggest reasons for democracy’s weakening is the profound change that's taken place in how we communicate and consume information. For more and more of us, search and social media platforms aren't just our window into the internet. They serve as our primary source of news and information.
No one tells us that the window is blurred, subject to unseen distortions and subtle manipulations. All we see is a constant feed of content where useful, factual information and happy diversions and cat videos flow alongside lies, conspiracy theories, junk science, quackery, white supremacist, racist tracts, misogynist screeds. And over time, we lose our capacity to distinguish between fact, opinion, and wholesale fiction.”
Paul Brandus:
So, fake audio, fake video, tweets, Facebook, TikTok — wherever people congregate online, that's where disinformation can be found. In other words, it's everywhere.
Making all this worse are two things. First, falsehoods spread farther, faster, and deeper than true information. That's according to a 2019 study by the Massachusetts Institute of Technology.
And the second is our own individual behavior. Here's Meredith Wilson, the Chief Executive Officer of Emergent Risk.
Meredith Wilson:
Well, so when we think about accelerants, the first thing that I think about is going back to some of the early ways (and this kind of gets back to the Russian disinformation thing) that the Russian Internet Research Agency targeted U.S. audiences. And it's the literary, verbal equivalent of throwing a bomb into the middle of a discussion.
Paul Brandus:
And again, no one tells us that the window is blurred.
Adding to this myopia is the narrow focus that comes from doing what so many of us do today: seeking out only information that reinforces our preexisting beliefs. This, too, makes us more susceptible to disinformation.
One of the problems here is that technology seems to be advancing faster than our human ability to adapt, meaning that the ability to manufacture and distribute disinformation outpaces our ability, even our willingness to make the effort, to tell fact from fiction.
Meredith Wilson:
Technology is moving so fast right now that if we as ordinary citizens don't start to pay more attention to that and start to get a better understanding of what that means, it's going to move so fast that we won't even have a chance to catch up with that and really do anything about it.
Brady Roberts:
Well, and that's the big challenge, Paul: helping people understand that there is some individual responsibility here. You want to understand how disinformation and misinformation work, how they're influencing you as a person, as a singular person, and the responsibility you have, if you're going to make a really important decision, to break out of that cycle and get a much better understanding of whatever issue you're looking at or thinking about.
And so, I think that's probably one of the most important takeaways we hope people will receive from this podcast: getting a better sense of literacy around dis- and misinformation.
And recognizing that, “Hey, it's on me.” Because that's a really hard problem for us globally to solve right now, this idea of individual responsibility. And I think the only way really to do that is to continue having these conversations over and over again.
Paul Brandus:
Individual responsibility, what a concept. But what can we as individuals do? Here's an idea, again from former President Obama, who said this at a commencement address at the University of Michigan back in 2010.
Barack Obama:
“Today's 24/7 echo chamber amplifies the most inflammatory soundbites louder and faster than ever before. And it's also, however, given us unprecedented choice. Whereas most Americans used to get their news from the same three networks over dinner or a few influential papers on Sunday morning, we now have the option to get our information from any number of blogs or websites, or cable news shows.
And this can be both a good and a bad development for democracy. But if we choose only to expose ourselves to opinions and viewpoints that are in line with our own, studies suggest that we become more polarized, more set in our ways. That will only reinforce and even deepen the political divides in this country.
But if we choose to actively seek out information that challenges our assumptions and our beliefs, perhaps, we can begin to understand where the people who disagree with us are coming from.”
Paul Brandus:
That was in 2010, and if people were polarized and set in their ways then, they certainly appear even more so today. And again, when you're only listening to the side you like, you're more susceptible to information that is slanted or downright false.
So, the idea, then, is that we should be consuming more information, and different kinds of it. In other words, if you always read or listen to material coming from the left or the right, try expanding your news diet to include a little bit of something else.
It might make your blood boil. And as Obama said, your mind might not be changed, but the practice of listening to opposing views is essential for effective citizenship, and it is essential for a healthy democracy.
It is also helpful in weeding out disinformation, because you're developing a broader perspective, a broader exposure to facts, figures, and insight that just might make us better informed, and better citizens too. But as Brady Roberts says, “We've got to work at it.”
Brady Roberts:
Well, along these lines of critical thinking, and the comments earlier about expanding your news diet, there's also expanding our relationship diets, meaning breaking down walls and barriers and not allowing those to exist.
If I feel like I sit on one side of an ideological fence, that means being really intentional about not just looking at, for example, another news site (that's very easy to do), but having tough conversations with people who disagree with me and allowing those conversations and those debates to break into my psyche and let me see things from a different perspective.
I think that's one way we ease tensions in society, but it also helps us as professionals, in our case as intelligence analysts, to see things through different lenses and break down bias. And that's not just advice for an intel analyst (although it's really good advice for an intel analyst); it's something that can help all of us.
Paul Brandus:
Let's take a quick break here, and when we come back, we'll talk about one intriguing way that just might make a difference in defeating or at least getting the upper hand on disinformation.
Voiceover:
This series on disinformation is a co-production of Evergreen Podcasts and Emergent Risk International, a global risk advisory firm. Emergent Risk International: we build intelligent solutions that find opportunities in a world of risk.
[Music Playing]
Paul Brandus:
In our prior segment, we talked about how it's up to us as individuals to be proactive about broadening our news diet, to work at being better-informed citizens, with more and broader knowledge and a willingness to think critically about the information we're exposed to. All of this can help us see through the disinformation we're often unknowingly bombarded with.
But let's shift now to a more urgent matter, a matter of life and death: dealing with disinformation during wartime. Prior to the Russian invasion of Ukraine in February 2022, the Kremlin spent weeks denying that it would attack, using a blitz of disinformation that blamed the West, notably the United States, for stirring up warfare.
But satellite imagery and communications intercepted by the U.S. told another story. What the U.S. did here was extraordinary and may represent a turning point of sorts in the way that governments deal with disinformation, at least when it comes to war and peace. The U.S. shared its intelligence analysis with the world about what the Russians were likely to do.
Jake Sullivan:
“If a Russian attack on Ukraine proceeds, it is likely to begin with aerial bombing and missile attacks that could obviously kill civilians without regard to their nationality. A subsequent ground invasion would involve the onslaught of a massive force.”
Paul Brandus:
That's National Security Advisor Jake Sullivan. The Russians continued with their disinformation and denials, even claiming at one point that they were actually pulling troops back from the Ukrainian border. This led to President Biden himself sharing the analysis. Listen over the noise of his helicopter, Marine One.
Joe Biden:
“They have not moved any of their troops out. They’ve moved more troops in, number one. Number two, we have reason to believe that they are engaged in a false flag operation to have an excuse to go in. Every indication we have is they're prepared to go into Ukraine and attack Ukraine.”
Paul Brandus:
And of course, we know what happened next.
Vladimir Putin:
“[Speaking Russian]”
Paul Brandus:
That's Vladimir Putin, of course, announcing what he called a “special military operation” in Ukraine. The U.S. and its European allies called it something else: a flat-out, cold-blooded invasion. Meredith Wilson says there's a word for the tactic used by the U.S. government in the run-up to the Russian attack: pre-bunking.
Meredith Wilson:
Some of the more recent studies have shown that pre-bunking might be a way to help people stay out of the disinformation hole. And by pre-bunking, I mean things like saying, “Hey, here's the real story,” when we know that disinformation may be coming down the pike.
This has actually been pretty effective in the Russia-Ukraine conflict in terms of the U.S. sharing information ahead of time and saying, “Here's what Russia's going to do,” so that when Russia turns around and tries to spin a story of disinformation, the real story's already out there.
Paul Brandus:
So, that's pre-bunking, getting ahead of somebody else's narrative with your own information. In a crisis, such transparency can save lives.
During the Cuban Missile Crisis of 1962, when the U.S. and the Soviet Union nearly went to nuclear war, the U.S. had intelligence from aerial reconnaissance proving that the Soviets were putting nuclear missiles in Cuba.
The Soviets used disinformation to deny it, but then President John F. Kennedy told his United Nations Ambassador, Adlai Stevenson, to share U.S. intelligence with the UN Security Council.
Adlai Stevenson:
“Which you can all examine at your leisure, shows three successive photographic enlargements of another missile base of the same type in the area of San Cristóbal.”
Paul Brandus:
This was not pre-bunking; it was debunking: using information to counter someone else's lies after the fact. Of course, this was long before the internet, long before social media, and long before just anyone could spread disinformation.
This is what makes our current era, a quarter of the way into the 21st century, so dangerous. Anybody can do it. We'll discuss this in our next episode.
[Music playing]
Thanks to Meredith Wilson and Brady Roberts of Emergent Risk International. Thanks to C-SPAN and the National Archives; our sound designer and editor, Noah Foutz; audio engineer Nathan Corson; and executive producers Michael DeAloia and Gerardo Orlando.