That's the premise behind "Disinformation" - with award-winning Evergreen host Paul Brandus. Get ready for amazing stories - war, espionage, corruption, elections, and assorted trickery showing how false information is turning our world inside out - and what we can do about it. A co-production of Evergreen and Emergent Risk International.
What lessons have social media firms learned from the 2020 election — and what are they doing as the 2022 midterm election looms? Analyst Meredith Wilson - the chief executive officer of Emergent Risk International - and guest Katie Harbath, a technology executive, chief executive officer of Anchor Change and former director of public policy at Facebook, weigh in.
“This is an apple. Some people might try to tell you that it's a banana.”
You might remember this TV ad that ran during the 2020 election here in the United States.
“They might scream banana, banana, banana, over and over and over again. They might put banana in all caps. You might even start to believe that this is a banana, but it's not. This is an apple.”
The message is true. An apple can only be an apple, and you can't change that simply by calling it a banana.
Now, if I tell you that ad was produced and aired on CNN, you might not think it very credible, depending on your view of that cable news network. And yet, it's still undeniable that an apple will always be just an apple no matter what others might say.
And yet, saying something that is obviously not true, like calling it a banana, and then repeating it over and over and over, can have an effect on some people. If repeated enough, the information may be perceived as being true.
Scientists call this phenomenon the “illusory truth effect.” It also goes by another name: disinformation.
I'm Paul Brandus. Welcome to this series. It's called simply, Disinformation.
And I'm Meredith Wilson, founder and CEO of Emergent Risk International, and I'll be providing analysis throughout each episode.
It's no surprise that politicians say things that aren't true. They've been making things up, peddling distortions and disinformation since the beginning of the republic. That's what politicians do.
For example, a month after taking office, President Biden said this to CNN's Anderson Cooper about the Coronavirus vaccine.
“Well, as you remember, but when you and I talked last, we talked about it's one thing to have the vaccine, which we didn't have when we came into office …”
Of course, that's not true. By the time Biden became president, the vaccine was already developed thanks to Operation Warp Speed, a crash effort by drug companies and the prior administration of President Donald Trump to get a vaccine to market.
The fact is, before he was even sworn in as president, Biden had been vaccinated twice.
Now, depending on where you stand on the political spectrum, Biden was either lying, or it was Joe Biden being, well, Joe Biden, a politician known for verbal gaffes and misstating basic facts. You can decide for yourself.
But what Biden did not do was repeat that false claim. He didn't beat it to death. That's important, because in our rapid 24/7 world of short-lived news cycles, that one-off comment faded away. Had he repeated it again and again, it obviously would have been deliberate and malicious, a completely different matter.
When it comes to disinformation, the importance of repetition cannot be overstated.
“This is an embarrassment to our country. We were getting ready to win this election. Frankly, we did win this election.”
The most prominent example of such repetition was the false claim that the 2020 presidential election was rigged.
Donald Trump began saying this on election night 2020, as you just heard, and he said it so often and so insistently that it resonated deeply, on a cognitive basis, with those who wanted to believe it, who needed to believe it. It helped spark the deadly attack on the US Capitol.
Two years later, the former president continues to spread this disinformation, warning of hijinks in the November 2022 midterms. He spoke recently in Michigan.
“But frankly, your vote is the only thing that can stop it. So, on November 8th, Michigan patriots have to shatter every record because they cheat like hell these people, they cheat like hell.”
As we've seen in prior episodes, one result of disinformation, if not one objective of it, is to undermine institutional trust. And that includes trust in the centuries-old institutions of our democracy.
“But I'm afraid we have never had … and I don't believe we'll ever have a fair election again. I don't believe it.”
But don't think all this started with Donald Trump. What he has said and continues to say is really only the most visible example of election-related disinformation. It frankly predates him, is often far more subtle, is also localized, and has roots in Russia.
So, in the last couple of election cycles, particularly in the presidential election cycle, and before that, in some of the election cycles in Moscow, we've seen a lot more of this hyper-local disinformation.
One of the things researchers found post-2016 was that people were far more likely to trust very local news sources. And what came out of that was that, first of all, there was already a lot of disinformation flowing through Twitter accounts and Facebook pages that looked like they were attached to, say, a WXYZ local station.
They would start off with very mundane news, and then they would gradually turn it just a little bit, and then a little bit more, and a little bit more, until there was a lot of disinformation flowing through it.
So, we saw more of that in a more coordinated manner as the 2020 elections approached, where it was actually political strategists doing this, running what are known as "pink slime" sites that were targeted to hyper-local electorates and looked like local news sites, but were basically just political propaganda.
So, we're likely to see some more of that.
So, what does all of this mean for America's next election, the November midterms? Katie Harbath works at the intersection of elections, democracy, civics and technology.
She's the Chief Executive Officer of Anchor Change, which advises clients on technology policy issues. She's also a fellow at the Bipartisan Policy Center, the Integrity Institute, and a non-resident fellow at the Atlantic Council. And she's the former Public Policy Director at Facebook.
What's the big picture in terms of the midterm election and disinformation? And I suppose the question is, what are the key lessons that social media companies learned from, say, 2020, and how are they applying them now?
So, if you look at the tech company announcements for what they're doing for the midterms versus 2020, it's pretty much the same. Which both worries me and is also good, because sometimes the midterm elections get less attention, but …
So, they're certainly doing a lot, but I think there's sort of the open question of are they doing enough to look around corners of how this activity might be adapting?
I think the other challenge that is different from 2020 to now is that dealing with mis- and disinformation has gotten harder, as you now have more candidates themselves just openly pushing disinformation.
And that really comes into tension with this country's traditions around the First Amendment, protecting political speech, the rights of candidates, and so on. That is a new challenge that I think they've had to face since 2020.
And a lot of the platforms, while they have policies to not allow things that could lead to election-related violence or suppression of the vote or things of that nature, are doing less of some of that labeling that we saw in 2020.
One of my favorites was when Donald Trump posted, “I won this by a lot,” and then underneath was a label that said, “Joe Biden is the projected winner of the presidency in 2020.”
So, it's hard to say. A lot of these things we learn a lot after the fact. I think a lot of this will depend too, on how the candidates accept or don't accept the results, what happens when there's some close elections, and things of that nature.
So, it's a bit hard to look into the crystal ball, but it's certainly going to be something to keep an eye on.
One thing to remember that's different from even just two years ago is that the number and type of social media platforms has really taken off, giving disinformation more places to spread and take root.
We are also seeing this content get pushed to other platforms, so there are more platforms in play. The big ones are actually doing quite a good job, but podcasts, live audio, and live video continue to be a big vector in this.
I think you're going to see, obviously, TikTok continue to explode and be used by more candidates. Platforms like Twitch and Discord are being used by actors to push out information. And you have messaging apps, in particular encrypted messaging apps, that are even harder to deal with as well.
So, not only do you have candidates pushing a lot of this, you have a lot of different platforms and areas where they can push their messages, and you also have other bad actors, folks who want to disrupt things, pushing into those spaces. Those are going to be some of the bigger challenges for the midterms and then going into 2024.
I'm glad you brought up the encrypted angle here. I was going to get to that in a second. But we're talking about things like Telegram, I think, and WhatsApp that have either end-to-end encryption or partial encryption and so forth.
In terms of being able to spread disinformation, whether it's a candidate or just somebody at home making things up, whether it's a manufactured video or audio or just text that's incorrect: given that you can encrypt it so no one else can see it, tell me about the impact on disinformation spreading and on not being able to thwart it.
So, with encrypted apps, you have to look elsewhere. One fold is digital literacy campaigns, which WhatsApp did a lot of when I was at Facebook, to try to pre-bunk things or help people just become more critical consumers of the news. That was important.
Some of these apps also create partnerships with different fact-checking organizations, so people can WhatsApp the things they're seeing to that fact-checking org to see if they're true or not.
But then, because the platforms can't see the content, they have to look at the behavior. For instance: is an account spamming a lot of different groups in a short period of time?
Now, that could be spammy behavior; we don't know if it's mis- or disinfo, and we don't know what their goal is, but you have to go much more off of that than off what the content actually says.
This series on Disinformation is a co-production of Evergreen Podcasts and Emergent Risk International, a global risk advisory firm. At Emergent Risk International, we build intelligent solutions that find opportunities in a world of risk.
Welcome back. The subject is the midterm election and disinformation. Our guest is Katie Harbath, the Chief Executive Officer of Anchor Change, which advises clients on technology policy issues.
You mentioned before the labeling of information. We see that on the major platforms like Twitter and so forth. Are the others doing that? That's the first question. And as a subset of that, I've read a variety of things that even question the efficacy of labeling. Does it work? Does it make people more likely to seek out other material, or do they ignore it?
I mean, tell me about the labeling of content that is not true.
So, Facebook has said they're going to use labels in a strategic and discretionary way, I think; I'd have to go look at the exact language. Which means they're not going to do it on everything, and they're going to be making that decision. And it's sort of a black box in terms of how they'll actually make that decision and where they'll apply it.
Twitter has labels too that they said they'll put on, and Twitter has said that they've seen labels be effective. But I think the jury's still out in terms of the effectiveness of labels. Some studies show they work, some don't.
Also, there are different designs of labels. Facebook has labels for misinfo where a black box goes over an image; others go underneath the content. So, I think there's a lot of experimentation still happening on the right design of these labels, how they may work, and how they work on different platforms.
So, in addition to there being more platforms than two years ago, there's the question of just how disinformation is flagged, if it's even flagged at all.
So, first, most of the platforms don't use humans to apply the labels. Most of the time, it would be machine learning classifiers trying to find the content and apply the labels. There can be humans that do it too, but I think for the most part, they try to automate a lot of those different things.
But yeah, other platforms aren't necessarily doing labels. The most famous would be some of these more "free speech" platforms, and I'm putting that in quotes, that we saw rise from the right side of the aisle: your Truth Socials, your Parlers, your Gettrs, your Gabs, stuff like that, where I would highly doubt we would see any sort of labeling.
Are candidates gravitating to those platforms, then, knowing that there aren't going to be those kinds of barriers in their way?
I don't actually know how much they've really … I mean, obviously, President Trump is using Truth Social and all of that. But most of the candidates are still using your more traditional social media more because it's still a good place to do fundraising, to get emails, stuff like that.
And it's also not necessarily the place … Many of these candidates are pushing the claim that the 2020 election was fraudulent, and many truly believe it, but it's not as if their entire campaigns are built around mis- and disinfo, and many of them will police their own speech on those platforms to try not to get kicked off.
I'm not sure how many of them are using the other platforms, but those user bases are so small that they certainly wouldn't be the only ones a lot of these candidates would use, I think.
I have also read, Katie, that efforts to stamp out either disinformation or its cousin, misinformation, have been compared to a game of whack-a-mole. Is that a reasonable description? How would you describe it?
That's exactly the description I've used quite a bit. It is very hard, on a reactive basis, to catch all of this, and it really does feel like whack-a-mole.
And so, that's why I think, too, that as we continue to think about how to approach mis- and disinformation, you're seeing some other tactics. All of the platforms are pushing what they call authoritative information: information from election officials about where, when, and how to vote, how to register to vote, about the process.
We are seeing some successes in terms of pre-bunking. This worked with the Russian invasion of Ukraine: getting out there ahead of time to warn people that they might encounter mis- and disinformation, what it might look like, et cetera.
I also think that the general populace is getting much savvier, just becoming better consumers of information online. Now, that's not to say it's not still a problem, but I think it is better than 2016, when people didn't realize this was a problem at all, and maybe wouldn't have been as skeptical as they would be today if they saw something.
How can you do pre-bunking, which is a great concept, and the Ukraine example is really a great example of it? How would you apply it, though, to, say, an election, when the result isn't going to be known? How do you get ahead of that kind of thing?
Yeah, this is some of the work that the Bipartisan Policy Center elections team does and works with election officials and others on.
I think, yes, you can't predict what the result is going to be, but you can try to instill more confidence in the process and help people get more educated about the process of voting: how the votes are counted, how they are tallied and kept secure as part of that.
I think most people, after they cast their ballot, don't think about it anymore. A lot of people got quite the education in 2020 about the electoral college and other things, and now many of these officials are worried that this is going to spread to gubernatorial, senate, and congressional races, stuff like that. So, helping people to just be more knowledgeable there, I think, is really important.
And then also, just explaining what the process is afterwards: if somebody is questioning the vote, here are the legitimate channels, when recounts get triggered, the role of the courts, and so on, that people who have legitimate grievances with the process have to go through. And I think that's just something most everyday Americans don't really pay attention to.
So, a lot of this is civics 101; making sure that people know how the electoral process works and what the legitimate lawful process is when a close election is contested.
But a better understanding of how all this works is one thing; having respect for, and trust in, the institutions that for nearly a quarter of a millennium have been the foundation of our fragile democracy is another. That trust is absolutely essential.
And as the midterms loom, I asked Katie what she worries about most.
I am nervous that we are going to come out of the midterms without a January 6th-type event, because it's not a presidential election year, and that we might then think 2020 was a bit of a fluke. That would be a very bad mistake for us to make.
Immediately coming out of the midterms, we're going to have the question of whether or not Donald Trump should be let back on the platforms. We are going to have many state legislatures, I think, introducing more bills about not just the voting process, but also, around content moderation on social media.
We have a couple of Supreme Court cases coming up whose rulings could dramatically inhibit the platforms' ability to fight mis- and disinformation and bad activity on their platforms, and we have to watch out for those.
I'd like to repeat what Katie just said: just because this is not a presidential election year, which means there will not be any counting of electoral votes, is no reason for complacency.
At some point after the midterms, we'll hear from both President Biden and former President Trump about whether they'll run in 2024. One or the other may run, perhaps both, perhaps neither. No one can say for sure.
What is possible to say, though, is that disinformation, and much that stems from it (distrust, anger, even the possibility of violence), is likely to remain.
In our next episode, only three people have ever served as both Secretary of Defense and Director of the Central Intelligence Agency. We'll talk with one of them.
Thanks to Katie Harbath of Anchor Change for her insights this week; to our sound designer and editor, Noah Foutz; audio engineer Nathan Corson; executive producers Michael DeAloia and Gerardo Orlando; and to Meredith Wilson of Emergent Risk International.