How Do You Know
What's True?
That's the premise behind "Disinformation" - with award-winning Evergreen host Paul Brandus. Get ready for amazing stories - war, espionage, corruption, elections, and assorted trickery showing how false information is turning our world inside out - and what we can do about it. A co-production of Evergreen and Emergent Risk International.
Lest There Be Any October Surprises - Fighting Disinformation Before the Ballot Box
"The problem in America isn't so much what people don't know, the problem is what people think they know that just ain't so"
In this episode of Disinformation, Paul Brandus discusses the prevalence of false campaign narratives and disinformation in elections around the world. He highlights the upcoming presidential election in the United States and the numerous elections taking place in other countries, including Taiwan. The episode delves into China's extensive use of disinformation campaigns targeting Taiwan, with the goal of sowing discontent and mistrust among the Taiwanese people, and explores the effectiveness of these efforts and their potential impact on Taiwan's relationship with the United States. The conversation then shifts to the 2024 U.S. election, the radicalization of the population, the rapid development of technology, and the lack of oversight and regulation in the information environment.
[00:03:23] Disinformation in Taiwan.
[00:04:48] China's influence in Taiwan.
[00:09:25] Misinformation in Taiwan elections.
[00:13:35] Oversight of social media platforms.
[00:18:27] AI-generated disinformation in elections.
[00:22:45] Social media and micro-targeting.
[00:26:58] Educating people about media.
Got questions, comments, or ideas, or an example of disinformation you'd like us to check out? Send them to [email protected]. Subscribe wherever you get your podcasts. Special thanks to our guest Dr. Simona Grano; Emergent Risk International CEO Meredith Wilson; sound designer and editor Noah Foutz; audio engineer Nathan Corson; and executive producers Michael DeAloia and Gerardo Orlando. Thanks so much for listening.
Where to Listen
Find us in your favorite podcast app.
00:06 clip audio: "Please raise your right hand and repeat after me ... I, Joseph Robinette Biden Jr., do solemnly swear…"
00:10 Paul Brandus: One year from now, someone will stand on the West Front of the United States Capitol, raise their right hand, and, with the guidance of Chief Justice John Roberts, utter the 35-word presidential oath of office. Dating back to George Washington nearly 235 years ago, every president has uttered those words, which are enshrined in Article 2, Section 1, Clause 8 of the United States Constitution. Of course, who that person will be is unknown at the moment. The world will be watching closely as our election plays out. But the United States is hardly the only country holding an election this year. Voters in some 40 countries, including 8 of the 10 most populous, will also go to the polls. Forgive the cliché, but it is correct to say that the stakes have never been higher. Elections, at least in countries where they are openly and vigorously contested, are generally loud and raucous affairs, often messy. And these days, one increasingly powerful dynamic is playing an ever more ubiquitous role: the use of false campaign narratives. There's another word, of course, for such narratives: disinformation. I'm Paul Brandus, and that's the name of this award-winning podcast series, Disinformation, a co-production of Evergreen and Emergent Risk International, or ERI, a global risk advisory firm. As usual, I'll be joined by ERI's Chief Executive Officer, Meredith Wilson, in a few minutes. In addition to the U.S., as I mentioned a minute ago, dozens of countries will also hold elections this year, among them India, the world's most populous country; Indonesia, the world's largest Muslim-majority country; as well as Brazil, Mexico, Pakistan, and South Africa. But the first major one, now just hours away, is in the island nation of Taiwan, a Western-style democracy and high-tech powerhouse that sits just 100 miles off the coast of China. China, as you know, has made no secret of its intention to fold Taiwan into the People's Republic, if necessary by force.
02:51 Tsai Ming-Yen (in Chinese): There are 700 related false information cases. Those who need to be investigated will also be transferred to judicial agencies with the right to investigate. False information must be…
03:01 Paul Brandus: In terms of disinformation, however, China has already been bombarding Taiwan for years. You just heard Tsai Ming-Yen, director-general of Taiwan's National Security Bureau, telling lawmakers recently of hundreds of disinformation campaigns and cases that have been uncovered. He calls disinformation, quote, evil, fake, and harmful. Beijing's relentless efforts are such that a study by the Digital Society Project, a Sweden-based research group, says Taiwan is the world's biggest target for disinformation from foreign governments. The question, though: just how effective are Beijing's efforts? I put that question to Dr. Simona Grano, a China and Taiwan specialist and senior fellow at the Asia Society.
03:53 Dr. Simona Grano: Well, it's been quite effective. I mean, China, of course, has been spreading rumors and various types of disinformation or misinformation in Taiwan for many years, including about the democratic system being a failing system, and about the DPP, the party that wants more distance from China, actually being an agent of the U.S. But what we see recently is that these kinds of attempts have taken on a dimension directed more at political elections, where China and Beijing really target DPP politicians and all of those in Taiwan who take a tough stance on China. For example, recently they started to spread rumors that Hsiao Bi-khim, the vice presidential candidate for the DPP, is actually a U.S. citizen, so she should not be allowed to run for the vice presidency. Or, for example, a very popular topic, especially after the Russian invasion of Ukraine, was to highlight that the U.S. will not stand by Taiwan's side if China invades, because it has not done so in the case of Ukraine and it retreated from Afghanistan, right? And that has made many people confused. And I think this is the real goal that China has. It's not about proving anything; it's about sowing discontent and, of course, mistrust. And we can also expect China to increase its rhetoric on the risk of war if the DPP wins the election for a third consecutive term, in order to try to sway the Taiwanese to vote for the China-friendly party. But I think to a certain extent China is also doing this to try to influence other countries, for example through establishing contacts with local politicians. I don't know if you heard of the recent scandal here in Europe, where it came out that they were basically trying to establish contact, and did for many years, with a right-wing Flemish politician in the European Parliament who was then paid to sway votes in the European Parliament so that this bloc would vote for things that are more China-friendly rather than aligning with the United States. So I think that China utilizes the same techniques in many countries, but Taiwan is at the forefront for sure.
06:03 Paul Brandus: You called it effective. Tell me how it's effective in Taiwan.
06:11 Dr. Simona Grano: I think the main example that I gave you before shows that it is effective, the one regarding the Russian invasion of Ukraine, because that is something that really touches many Taiwanese firsthand when they think that maybe one day they may have to go to war to defend their country and defend themselves. They know that on their own, Taiwan could not withstand a Chinese attack and would need the United States. So if you sow distrust in your main security guarantor, the United States, then, of course, most people, and this is the hope that Beijing has, will start to think, how can we avoid such a horrible scenario? And the answer is getting closer to China. So there are certain narratives that are more effective than others. And those, of course, that touch Taiwanese in their daily lives firsthand have more of a chance of swaying something, of swaying people toward the more Beijing-friendly option.
07:03 Paul Brandus: Well, it sounds like you're saying that American actions are, in a way, helping the Chinese sow doubt among Taiwanese citizens with regard to American reliability.
07:17 Dr. Simona Grano: No, that's not what I meant. I believe that the situations are very different, right? I won't speculate as to why; it's obvious why the Americans have not directly intervened in Ukraine but have only intervened by providing weapons, as have other NATO and European countries. Without America, the war would probably be over and would have been won by Russia. So that's not what I meant. What I meant was that China is able to take what it sees as discrepancies in very difficult scenarios and situations and try to apply them to the Taiwanese situation. Personally, I think that Taiwan is geostrategically, geopolitically, and also ideologically much more important to the United States, and that they would most probably intervene. But of course, from the perspective of China, this is a very useful narrative to use.
08:08 Paul Brandus: What are the Taiwanese authorities doing to counter this tidal wave of disinformation, to kind of blunt it?
08:20 Dr. Simona Grano: I mean, the Taiwanese have done a lot because, as I said before, they are at the forefront of these attempts, so they have a lot of experience with them. What they do, of course, is that they have established a variety of think tanks and centers. One of them, called the Taiwan FactCheck Center, works routinely and solely to find out which news is real, filter it for the Taiwanese public, and separate it from fake news, misinformation, and disinformation. What they also do, of course, is try to go back to the source of these misinformation attempts, and very often they find sources coming either from mainland China or accounts that may be run from somewhere in Southeast Asia but are quite possibly paid by someone in mainland China. But it is very difficult, you know, because this requires a lot of money, a lot of support, and, of course, a lot of personnel involved in daily fact-checking. So it is really time-consuming for the government to set up these kinds of facilities.
09:25 Paul Brandus: Thanks to Dr. Simona Grano, a China and Taiwan specialist and senior fellow at the Asia Society; she joined us from Switzerland. Meanwhile, as I mentioned, there are dozens of countries scheduled to hold elections this year. The most important is our own, now just 10 months away. In terms of mis- and disinformation, how are things shaping up? I'll speak with Meredith Wilson about that after this short break.
09:57 ad read: This series on disinformation is a co-production of Evergreen Podcasts and Emergent Risk International, a global risk advisory firm. Emergent Risk International, we build intelligent solutions that find opportunities in a world of risk.
10:22 Paul Brandus: Welcome back. More than 155 million Americans cast ballots in the 2020 election, a number that could be exceeded this November. There are, of course, hundreds of races, including local and state races, all the way up to every House seat, one-third of the Senate, and, of course, the marquee contest for the White House. As usual, we'll be bombarded with campaign messages from all sides, which may be, shall we say, less than accurate or taken out of context. Voters are left to decide for themselves what's what. This is hardly new, of course; the great satirist Will Rogers observed a century ago that, quote, the problem in America isn't so much what people don't know, the problem is what people think they know that just ain't so, unquote. What is new, however, is the technology, and the ubiquity of it, that allows false narratives to be manufactured by anyone, targeted with great accuracy, and spread at the push of a button: artificial intelligence, algorithms, social media, and all the rest. Let's bring in Meredith Wilson now; once again, she's Chief Executive Officer of Emergent Risk International. In terms of mis- and disinformation and the U.S. election, give me your outlook. What's 2024 looking like?
11:57 Meredith Wilson: Well, I wish I could be more optimistic, but I think it's going to be pretty ugly. We have a real lack of oversight in the information environment right now, which is worrying and in some ways surprising, considering all of the things that happened in the days after January 6th. There's a lot of radicalization happening in the broader US population, and it's happening in silos, so people aren't necessarily seeing it; ordinary people aren't necessarily seeing it because they're going about their day-to-day lives. And so these things develop, they fester, and then in comes an election cycle. And we're likely to see a lot of that surface in people taking even more extreme sides, even more radical narratives than we've seen in the past. We've seen several polls now from different polling groups, some of them partisan, some of them nonpartisan, but most of them in pretty solid agreement that more Americans now than at any other time in the recent past say that violence is okay in pursuit of a political outcome. And so those things are worrying. In the meantime, the disinformation and misinformation environment has gotten much, much more polluted. We have fewer controls and less oversight on some of the social media platforms that previously had really worked to build in better oversight. We also have other social media platforms, like TikTok and Gab and WhatsApp, where we can't necessarily see what's happening the way that we can on, say, a Facebook or something like that. And that's all happening almost out of the view of a lot of people. Long answer to a short question, but I think the outlook is not great for this coming election.
14:11 Paul Brandus: I want to get into the oversight angle of that in just a second, but, you know, it seems to me that there are a lot of folks who might not have much of a memory, if you will, about the past. And by that I mean that 2020, for a lot of people, is really almost ancient history. And they might say, well, the 2022 midterm election seemed to go pretty smoothly, there were not a lot of issues around that, therefore 2024 could go pretty smoothly. I think that's a misunderstanding of the difference between a midterm election and a presidential cycle, but what do you think?
15:01 Meredith Wilson: I think you're right. Midterms are different, and really, it doesn't take a rocket scientist to see that midterms are quieter. They don't have the same media attention. They don't have the big conventions that get people all stirred up. They don't have the presidential whistle stops that you have all over the country, going out to factories, stopping at airports, doing rallies, and things like that. It's an entirely different ballgame. And this one, beyond that, is different, just like 2020 was, because of the personalities involved, the very charismatic individuals who have very strong followings, very strong followers. And, for lack of a better way to say it, we have a number of politicians now who are willing to engage in a type of electoral warfare that was not common before 2016. All of those things contribute to this cauldron of problems that are just going to be bigger and more in front of us than they are when you have an off-year or midterm election and it's senators and representatives rather than presidential candidates.
16:31 Paul Brandus: You mentioned 2016. Speaking of the arc of what has changed since then: we had 2016, then 2020, and it got worse. Be more specific, if you could, about 2024. What are we actually going to see next year?
16:52 Meredith Wilson: Yeah, well, you know, this is where we need to be paying really close attention to what's happening overseas in other elections and what's happening right in front of us online. For example, this morning there was an article about the memetic warfare that one of the candidates is engaging in. And people don't necessarily see the connection between the candidate and the memes unless they're really following this, right? But there's a whole group of individuals out there putting out nasty, funny, maybe not funny, maybe really, really toxic memes that may not seem particularly impactful, but they are extremely impactful. And there's been a lot of study around that. These are just things that pop up in people's news feeds. So you see those little clues of what's to come. We saw a lot of that actually in 2016, maybe a little bit less of it in 2020. But those things contribute to the narrative. They contribute to the way these things start to play out in people's minds. You start listening to the rhetoric of the candidates: what are they saying when they're on the campaign trail? In this case, we have a major candidate who has several different indictments, and so there's some radicalized discussion around that and around whether or not a former president can even be held to account for things. If you look at the overseas elections, if you look at Argentina, for example, the new president used a lot of AI in his campaign to create things that simply were not true, that were completely fake. And as of now, we don't have any controls over that for this coming election in the US either. We don't have anything that says you're not allowed to do that. There have been attempts to do that, but nobody's actually done it. So how much of what we're going to see this time is actually completely fake? How are we going to know? There's no law that says you have to watermark something that is AI-generated. AI is going to speed up the process of how much of this disinformation comes out, where it comes out, and how often it comes out. Where we were looking in 2020 at what seemed to be a fairly sophisticated machine, now take that and amplify it by 30, 40, 100 percent. So there's a lot of new technology that's going to affect the way this election goes. There's a lot of old stuff that we've seen before that's going to be amplified even more than previously. And we have several social media platforms that no longer have any real controls on them in terms of what can and can't be shown, how the algorithms will sort through that, and how overseas state and non-state actors play into all of this. So there's a lot that's different, and it's mostly just more extreme than it was in 2020.
20:10 Paul Brandus: So, in the aggregate of everything that you're saying here, this arc from 2016 to now is a disturbing one. And something else about that I want you to comment on is the fact that all of these things appear to have become normalized, if that's the right word. A lot of folks are maybe not aware of these things; they're not top of mind; they're not paying attention. So, in other words, it's not enough of an issue for them to sit up and say, hey, this isn't right. Or it's just something very normal to them; they're not disturbed by it. Does that make sense at all?
20:52 Meredith Wilson: Yeah, and I don't think it's that people aren't disturbed by it. I think people don't feel like they have any control over it. You know, it's a little bit like watching a runaway train. When you talk to most people, they will tell you that they're pretty uncomfortable about this upcoming election. And that's average people, people who aren't necessarily involved in politics. A lot of people will tell you that they're really not comfortable with the direction things are headed right now, but they don't know what to do about it. They don't necessarily know how to address it. Most people, when it comes to technology right now, simply do not know what they don't know, because technology is moving so fast that if you aren't spending every day reading about the new technology that's coming out, you're behind. And that's 95 percent of the population, for no other reason than that they have busy lives and things they have to worry about beyond whatever technology comes out.
21:50 Paul Brandus: The issue then, one of them at least, is that things are moving so quickly, technology is developing so rapidly, that most folks simply cannot acclimate quickly enough. The advantage then goes to the folks Meredith mentioned who are creating content that can be, in her words, toxic, nasty, and impactful.
22:13 Meredith Wilson: The problem with that is that we don't realize, oftentimes until it's too late, what has been happening with technology. So, for example, in 2016, there were a handful of us who understood prior to the elections that there was a ton of disinformation circulating. A large majority of people understood something wasn't right, but they weren't sure what. And then afterwards, you had the Cambridge Analytica thing break, and all of a sudden people went, wait, so all of that information that I've been pumping into social media all these years has been used to micro-target me? But they didn't know that before the election, because nobody ever sat down and explained it. And that's because people didn't understand how social media was being used to target them from a marketing perspective, and then, in this particular case, from a political perspective. Now people sort of understand that, and you see them pulling back from what they share on social media. But what they probably don't understand is how AI is being used, and how all of that data that's been collected since the dawn of, really, the World Wide Web is being used politically to still target them. And it's far beyond their social media, right? So those are some of the things.
23:44 Paul Brandus: It seems that what you're talking about here, and I think you alluded to this a couple of minutes ago, is really a lack of guardrails, in the sense that there's no legislation. You mentioned, for example, that there's no requirement that AI-generated material be watermarked. The legislative process has not caught up with this, if there is in fact even a will to do something about it. And then, in the marketplace, are there any kind of financial incentives that would inhibit people from doing this kind of thing? I'm not really aware of anything of a substantive nature that would inhibit people from manufacturing these kinds of things.
24:29 Meredith Wilson: No, there definitely isn't. And there are definitely financial incentives to doing it. Just like we found out in previous cycles, creating fake news websites was, for a long time, far more lucrative than actually reporting the news as a journalist. You could make $40,000 a year as a journalist, or you could just pump out fake news, get clicks on ads, and make $150,000 a year. I don't know that those exact incentives still exist, because Google in particular has done a fairly decent job of adjusting its algorithms so that they don't incentivize so much fake information. But there are still lots and lots of reasons to make a YouTube channel that pumps out fake narratives. And because they're so much more sensationalist, they still get far more clicks, and people still make far more money on them. It's the same reason that news headlines tend to be slanted toward the very, very dramatic: in order for those news sites to make money, you have to have something that people want to read. And that is going to be far more interesting if it's salacious, if it's violent, if it's slightly slanderous, right? And then you get to the article, and if you read the article, you might find that the headline is actually not really what the article is about. But a lot of people only get their news from headlines these days. So I don't see a lot of financial incentive not to create disinformation right now.
26:07 Paul Brandus: The late New York Yankees baseball player Yogi Berra was known for his, you know, Berra-isms, I think they called them. And one of them was, it's getting late awfully early. Adopting that, is it too late to do anything about 2024? What can be done, if anything, at this late date to thwart these things that you've been talking about?
26:37 Meredith Wilson: Yeah, I think, you know, there are definitely people out there fighting the good fight on this. There are a lot of people pushing to get the word out about what is disinformation and what is not, about what you need to be watching and what you don't. A lot of people have jumped into this space in the last four years and are doing some really neat things, either technologically or with just good old-fashioned education, trying to get more people savvy about what they're seeing. I don't know that there's any one solution. The biggest thing, unfortunately or fortunately, still comes down to educating people. And that starts in the classroom and then works its way up all the way through the workforce, in terms of letting people know that this is what's out there. The challenge is always how to do that in a nonpartisan manner, right? How do we do that in a way that says, I'm challenging you to think critically about these things, not necessarily to think the way that I do, but to think critically and ask the right questions so that you can draw your own conclusions about the truth behind what you're seeing? And I do think there's a lot more effort at that right now. There is just a tide that we have to swim against, with the pace at which technology is developing and our inability to regulate it, because it's all happening way too fast. And we don't live in a country where we are likely to preemptively regulate things, because we want to innovate and all of that. And so this is the downside of that.
28:23 Paul Brandus: Thanks to Dr. Simona Grano of the Asia Society, and to Taiwan Plus News for sound. Our sound designer and editor, Noah Foutz. Audio engineer, Nathan Corson. Executive producers, Michael DeAloia and Gerardo Orlando. And on behalf of Meredith Wilson, I'm Paul Brandus. Thanks so much for listening.