How Do You Know
What's True?
That's the premise behind "Disinformation" - with award-winning Evergreen host Paul Brandus. Get ready for amazing stories - war, espionage, corruption, elections, and assorted trickery showing how false information is turning our world inside out - and what we can do about it. A co-production of Evergreen and Emergent Risk International.
Unmasking Disinformation: A Deep Dive into Russian Information Warfare
S:3 E:8
"These things are as endemic to Russia as vodka, the ballet, and long winters."
In this episode, host Paul Brandus discusses the pervasive issue of disinformation, particularly focusing on Russian efforts and the impact on global security. He interviews Nina Jankowicz, founder of the American Sunlight Project, who emphasizes the importance of building societal resilience and promoting information literacy to combat false narratives. The conversation delves into the role of government in addressing disinformation and the upcoming challenges in the 2024 election. Meredith Wilson from Emergent Risk International provides insights on the evolving landscape of disinformation, emphasizing the need for ethical considerations and high-quality information sources in business decision-making.
[00:01:18] Russian information warfare efforts
[00:07:21] The American Sunlight Project
[00:12:30] Media literacy education in Finland
[00:18:01] Russia weaponizing societal divisions
[00:26:15] People defending dishonesty for power
[00:33:04] Challenges of disinformation and AI
[00:40:15] Importance of quality information
Got questions, comments, ideas, or an example of disinformation you'd like us to check out? Send them to [email protected]. Subscribe wherever you get your podcasts. Special thanks to our guest Nina Jankowicz and the NATO StratCom Centre of Excellence in Latvia, our sound designer and editor Noah Foutz, audio engineer Nathan Corson, and executive producers Michael DeAloia and Gerardo Orlando. Thanks so much for listening.
Where to Listen
Find us in your favorite podcast app.

00:09 Paul Brandus: The bells toll from the ancient Spasskaya Bashnya, the Spassky Tower, at the Kremlin, which has been for more than a century the seat of power for the Soviet and, since 1991, Russian governments. Also for more than a century, those governments have run massive and increasingly sophisticated campaigns to undermine Russia's opponents through the use of false narratives. These things are as endemic to Russia as vodka, the ballet, and long winters. Such lies and deception, in Russian, can be called maskirovka, or perhaps dezinformatsiya, which sounds an awful lot like our word for these false narratives: disinformation. I'm Paul Brandus, and that's the name of this award-winning podcast series, Disinformation, a co-production of Evergreen Podcasts and Emergent Risk International, a global risk advisory firm. Whatever you call it, maskirovka, disinformation, active measures, Russia's information warfare efforts continue to gain ground. A study out in recent days from NATO, the North Atlantic Alliance, details intensive Russian efforts. Here's a portion of that report, quote: "Coordinated groups on social media continue to pose a significant threat, now compounded by the use of generative AI content. Our research highlights the mixed use of coordinated groups on social media, automated cross-referencing, and AI-generated content. We have identified 17 coordinated groups of accounts for 344 sources," unquote. Elsewhere, the report adds, and I'm quoting again here, "Large language models, LLMs, are actively used to create information threats. We have identified automated groups leveraging LLMs to generate noise and scrape websites to repost political news content," unquote. How pervasive is all this? Well, the report estimates that for the six-month period ended in May, automated comments about the alliance accounted for approximately 7% of English-language comments on Twitter, also known as X. It is an eye-popping report.
A White House source tells me that Russian active measures will certainly be discussed at next month's NATO summit here in Washington. It's the alliance's 75th anniversary, by the way. The North Atlantic Alliance tracks Russian disinformation in lots of ways. Here's one. On a cobblestone street in Riga, Latvia, once part of the Soviet Union but independent since 1991 and since 2004 a member of NATO, you'll find something called the NATO StratCom Centre of Excellence. The rationale behind it is an understanding that hostile actors, like the Russians, see information as a weapon, potentially as effective in its own way as missiles, artillery, and all the rest.
03:30 Sohnke Nitteringhaus: Hostile actors try to exploit military conflicts, election processes, and just recently a global pandemic for their strategic goals by means of communications.
03:42 Paul Brandus: Sohnke Nitteringhaus, a lieutenant colonel in the German army, is posted at the Stratcom Center.
03:49 Sohnke Nitteringhaus: They enforce battles of narratives, information laundering techniques, or robot trolling tactics. To counter those threats and to account for a shift towards hybrid means of warfare, NATO developed a capability called strategic communications, or short STRATCOM. Our role as a center of excellence is to support the further development of this capability, especially in the field of education and training.
04:18 Paul Brandus: The U.S. and its NATO allies have a mantra: train as you fight and fight as you train. Within the context of information warfare, this means that words, messages, and the ways in which they are communicated and received by both friend and foe can possibly provide an edge. Way back in the very first two episodes of season one of this series, we gave examples of how communications played a role in defeating the Nazis and Japanese in World War II, and how the Soviet Union used its own communication efforts in the never-ending battle for hearts and minds. As the NATO report details, Moscow's efforts continue, using every high-tech strategy and tool available. At its essence, the StratCom Centre's work is an attempt to thwart the Kremlin. A robust messaging strategy, as the StratCom Centre notes, can have a direct effect on the success of NATO operations and policies. Keys to this include leveraging traditional media and the Internet to engage with the public to build awareness, understanding, and support for its decisions and operations. In other disinformation news: I mentioned Twitter a minute ago, or X as it's officially called. A new study in the journal Nature notes that in the week after the January 6th, 2021 insurrection, the attack on the US Capitol, Twitter suspended some 70,000 accounts associated with the right-wing QAnon movement. The company blamed QAnon for helping to spread misinformation that it says helped fuel violence. The study, by professors from four universities, says the suspension of those accounts had, quote, "an immediate and widespread impact on the overall spread of bogus information," unquote. The authors suggest that if social media companies want to reduce misinformation, banning habitual spreaders may be more effective than trying to take down individual posts.
And they add, with the 2024 election approaching here in the U.S., the study shows that it is possible to rein in the spread of online lies if platforms have the will to do so. Repeated requests for comment from X were unsuccessful. These are good topics for my guest. She's Nina Jankowicz, founder and CEO of the American Sunlight Project. She's the author of How to Lose the Information War, about Russian use of disinformation as a geopolitical strategy. Among her other accomplishments, she was a Fulbright Fellow in Kyiv, working with the Foreign Ministry of Ukraine, and also a disinformation fellow at the Woodrow Wilson Center here in Washington. Tell me, what is the American Sunlight Project?
07:21 Nina Jankowicz: Yeah, the American Sunlight Project was born in response to the ridiculous accusations that any work to counter disinformation is censorship. It's just not true, first of all. And so what we're trying to do is increase the cost of lies that undermine our democracy. We're not in the business of telling people what's true or false, but we are looking at the deceptive information practices that are pretty much everywhere you look these days in our politics and our body politic. And we're also doing some public education, reaching out to vulnerable communities and helping them navigate today's information environment. And we're doing this in a bipartisan way. This is not something that is a partisan issue. Democracy is a democratic issue, in my opinion. And we're right now concentrating on the lead up to the 2024 election, but we don't think disinformation is going to go away after that. Elections are an inflection point, not an end point. So we hope to be in this for the long run.
08:22 Paul Brandus: Now, the book that Nina mentioned is called How to Lose the Information War, about Russian use of disinformation as a political strategy. It's really an amazing book. And I think the question to ask Nina about that is: are we losing that war?
08:42 Nina Jankowicz: Well, certainly when the book was published in 2020, I argued that we were. And I'm sad to say, four years later, so much of what I wrote in that book, which I had been working on since 2017, is still true today. The thesis of the book is that America approached the information crisis with a large degree of hubris: that we didn't learn from the early mistakes and misadventures of our allies in Central and Eastern Europe, who, of course, had been dealing with Russian disinformation for many decades before we even recognized that it was a problem. And I don't just mean Soviet disinformation and propaganda; I mean even 21st-century online disinformation. Estonia, Georgia, the Czech Republic, Poland, and Ukraine had all dealt with it, had all been battling it. And we thought, no, no, there's no way this is coming for us. One of the main theses of the book is that this is not just a technological problem. It's not one that we can play whack-a-troll to get ourselves out of. We can't just remove content. We can't just fact-check our way out of it, which, of course, makes the criticism of me and my work even funnier; I don't condone any of those strategies as the antidote to disinformation. But what I found in my research, and in talking to civil society organizations and government officials in all the countries I just mentioned, is that building societal resilience is one of the keys to responding to disinformation. And when I talk about societal resilience, it's things like information literacy, helping people navigate today's information environment. That's not telling them this source is good and this source is bad. But if you feel yourself getting emotional after consuming a piece of content, there's probably a reason why. Are you being targeted to get you to buy something? Is somebody trying to change your political behavior?
Is the platform that you're on simply trying to monetize your rage in order to keep you clicking and scrolling? All of that context is really important for today's consumers of information, and I think a lot of people lack those skills. And then beyond that, things like civics are really important, right? So much of the disinformation that we see is based on a misunderstanding of how our democratic systems work. And the book also talks about the importance of investing in public media. We spend something like $1.39 per person per year on the Corporation for Public Broadcasting in the United States. That's an old statistic, too; that was back when I published the book, and I'm sure it's gone down since then. And certainly PBS and NPR have been embattled over the past couple of years as well. But when you look at the countries that are quite resilient against disinformation, they're all investing in public media, so that people have a trustworthy source of information that's not selling to them during times of crisis.
11:30 Paul Brandus: Now, you mentioned a couple of minutes ago media literacy, civic awareness, civic education, that kind of thing. As you know, there are countries, Finland and the other Nordic countries in particular, that are well ahead of us in these areas. Tell our audience, Nina, what does media literacy even mean, for example? What's the definition of that?
11:58 Nina Jankowicz: Yeah, I think the definition varies from person to person. And again, I now prefer the term information literacy, although in the book it does say media literacy. I prefer information literacy because it's not just about media outlets anymore; it's simply about giving people the tools they need to navigate today's media and information environment. Understanding how you're being marketed to, how information is making its way to you. And you mentioned Finland; one of the most famous examples of a media literacy program there is that in kindergarten, they start their media literacy education by explaining to children what advertisements are. We all remember sitting in front of the TV on Saturday morning watching cartoons and seeing all the new commercials for toys, right? Not really understanding the difference between the programming that we were looking at and the commercials. Well, Finnish kids get that from five years old. And it builds from there: understanding the difference between news reporting and advertising, between news and opinion, and how reporting is done. And then from there, I would even expand that to understanding how today's infrastructure of the internet works, the way algorithmic targeting works, the way we're tracked across the internet. So that when I'm looking at a pair of shoes for a wedding that I'll be going to over the summer, suddenly I'm seeing ads for those shoes on every social media platform that I'm on, because I'm being tracked across the internet and they really want me to buy those shoes. And suddenly, I'm convinced that it's a good idea, right? Understanding all that context is key to being an informed digital citizen today.
13:36 Paul Brandus: It seems to me that this stuff is really a generational, maybe a multi-generational effort, and you've got to start young. I mean, for adults today who have had this steady media diet of all the things you've been talking about, it's awfully hard to get through to them. As in the case of Finland that you illustrated, starting young is crucial. But is there anything that can be done for adults today who are just bombarded by all this stuff day in and day out?
14:11 Nina Jankowicz: I actually do think there is a pretty compelling case for reaching voting-age adults with media literacy activity. There was a case in Ukraine early on, before this most recent full-scale invasion, in which different Ukrainian CSOs reached out to librarians in Ukraine and trained them in media literacy education. The librarians then went home to their libraries all over the country and trained more individuals, and the uptake of that training was very high. People were more accurately identifying true versus false news. They were able to identify outlets that were oligarch-owned versus publicly owned. All of that, to me, says that adults don't want to be misinformed, right? They want to be right. It's just about giving people, again, those tools, the heuristics they need to understand quickly how to navigate today's information environment. I also encountered a lot of programs in my research in places like Georgia and the Czech Republic that are geared toward delivering media literacy education in easy-to-consume ways. In the Czech Republic, for instance, reaching out to senior citizens who were learning how to use their iPads or new iPhones to FaceTime their grandchildren, but including a little bit of media literacy in that education as well. I call that the peas-in-the-mashed-potatoes approach. And then in Georgia, using infotainment: educating performers, so comedians, actors, musicians, about the dangers of disinformation, and then sending them to their shows in regions across Georgia, often in their hometowns, places other than the capital of Tbilisi, where they would do their set for free. They would do their normal comedy routine, but they'd include some material about disinformation, sort of like the Jon Stewart of the Republic of Georgia. That sort of approach is really interesting, and one I think can reach adults who don't want to sit and listen to Schoolhouse Rock.
16:28 Paul Brandus: Earlier in this episode, I mentioned that report from NATO on Russia's information warfare efforts, and you read, I think, the executive summary of that, if not more. It was a kind of gloomy report, just talking about how pervasive these Russian active measures are. What's your take on it?
16:56 Nina Jankowicz: Yeah, I think we've given Russia no reason to stop these active measures, right? Particularly during the Trump administration, they were all but encouraged. Of course, we've sanctioned Russia for its activities in Ukraine and some of its informational activities. But when you look at it, this is the only way that Russia really has to influence us now. And that's key to Russia's continued survival: not only to its attempt to win the war in Ukraine, but to reinsert itself onto the global stage. It wants to be at the global negotiating table. And especially with the potential of a Trump administration on the horizon, under which we saw Russia being welcomed back to the global negotiating table, I think Russia has to make that play. Its future depends on it. So I'm not surprised about that at all. But one thing I will say is that it is our weaknesses that Russia is weaponizing, whether that is fatigue about funding parts of the war in Ukraine or supporting Ukraine, or division over hot-button issues like LGBTQ rights or abortion rights or gun rights. These are all issues that Russia has amplified before. And I think a big misconception about disinformation is that it's cut-and-dried false narratives and we can just debunk our way out of it. But it's actually deeply held grievances and beliefs in society that Russia is identifying very surgically and weaponizing. And so, yeah, I'm worried about Russian disinformation, but I'm more worried about the fact that we've got folks here at home who continue to divide us, who aren't working on building something up. They're working on tearing things down and splitting Americans apart. And that, frankly, I think is anathema not only to our national security, but to the morals and values that we hold dear in this country.
18:53 Paul Brandus: Yeah, that's an interesting point, one that has come up time and again: they are simply taking advantage of what we're doing to ourselves, our homegrown divisiveness and all of that. That being said, Nina, when you look at all of that, the divisiveness here at home today, the hyper-partisanship, the deep splits on these issues, the shouting at one another, the turning of people not into opponents but into enemies, and all the rest: how much of a role are the Russians playing in this? I know they're piggybacking, as we agree, on what we're doing to ourselves, but how much are they actually doing in terms of, you know, bombarding us with messages? Can you quantify that at all?
19:42 Nina Jankowicz: Yeah, it's really difficult to quantify, in part because the Russians are ingenious in the way there is such a blurred line between the overt, public-facing initiatives, like the Internet Research Agency and Russia Today, or RT, and the covert stuff that their security services are taking care of. So while we have some vague ideas of the facts and figures there, and I actually don't know what the budget of the IRA is anymore, I think it was in the $100 million range for the US operations back in 2016, 2017, which is not that much money. I mean, it's more than what the US was spending on similar operations to counter that stuff. But still, it's really difficult to put a dollar figure on it. And then, in terms of engagement: one of the worst things that's happened over the past couple of years is that we have lost our window of transparency and oversight into the social media platforms because of the politicization of the issue of disinformation. When Elon Musk took over, down came the gate to the Twitter API, the application programming interface that allowed us to track things on there, and their transparency reporting about foreign operations was gone. Facebook still does some transparency reporting, but we only get the data from them; there's not really an opening to study that data. And CrowdTangle, the platform that used to exist for this, Facebook is shutting down just ahead of the election. TikTok is basically a black box, and it's similar for YouTube, right? Even if we were able to quantify engagement and things there, we're still missing the dollar figure. So the short version is that it's really difficult to know. And I don't want to sound the alarm bell too much in terms of the Russian impact here, because, again, so much of it is being laundered through the American information ecosystem, through the mouths of Congresspeople, through mainstream news outlets.
So I don't think we should discount any of that, particularly because that is where so many Americans are listening to these disinformation talking points.
21:58 Paul Brandus: Now, Nina, some people, as you know, probably know you only within the context of your short stint as head of the Disinformation Governance Board, which has since been disbanded. Of course, I know you're involved with assorted legal issues around that, and I wish you all the best with all of that. But the broader issue is this: what role can and should the government play in trying to tamp down false narratives, tamp down disinformation? What is the role for government, if there is one? What should it be?
22:35 Nina Jankowicz: Yeah. So first of all, let me say, I think Congress has really dropped the ball in making sure that we have oversight and transparency over the social media platforms, and in setting guardrails for what cooperation between government and platforms can look like. All of this brouhaha that has been happening over the last couple of years is because Congress has had that major dereliction of duty. So I want to start there. There are plenty of conversations to be had about this stuff. These are good questions, and I understand why Americans are concerned about them, but we need to start with those basic guardrails. And I hope we'll have them soon, perhaps in the next Congress. I have never been in favor of government deciding what is true or false online. I have never been in favor of government doing any fact-checking efforts. In fact, if the people who had criticized me had actually read my book before criticizing me, they would have found out that the Czech Republic attempted to do some fact-checking similar to what I've just described, through its Centre Against Terrorism and Hybrid Threats, back in 2017. And that was met with great consternation from the people. And I concluded, first of all, that they communicated very poorly about this effort, because that's not all they were doing; but second of all, that it wasn't a great look for government in general to be in the fact-checking business. What I do think government needs to be in the business of is putting out good information, understanding what is being said online about certain initiatives or about issues related to national security and our democratic infrastructure, and making sure that American citizens are equipped with the facts. That's not necessarily fact-checking. It's telling good stories.
It's saying, okay, we know this is really dry, how our election infrastructure works, but we want you to understand that your vote is safe. And that work had been getting done at the Department of Homeland Security. That's the sort of thing we wanted to expand on with the Disinformation Governance Board. There's a lot of talk lately about the limits of cooperation between the government and social media platforms, in terms of the government flagging certain posts or letting platforms know to be on the lookout for things, particularly coming from foreign adversaries. I do understand concerns about the government potentially using that power to flag the posts of political rivals or things like that. Personally, from the evidence that I've seen, I don't think that has happened yet. I understand why people would be worried about that happening under regimes in the future. And again, that is an area where I think Congress could step in with some regulatory and oversight mechanisms, establishing a nonpartisan body, similar to the FCC, that could review that cooperation. I think that would be really useful, so that we're not relying on the platforms themselves for these disclosures. I mean, that's like asking a student to grade their own test.
25:36 Paul Brandus: You know, without naming names or anything, there are politicians who say things that I think any reasonable person would understand are blatantly not true. We've talked about the importance, the necessity, if you will, of information quality. I don't understand why some folks are so adamant. Nina, help me understand: why are some so adamant about defending their right to be dishonest? I mean, the old adage about not being able to yell fire in a crowded theater, that still holds true, does it not?
26:15 Nina Jankowicz: Well, you'll have to ask Jeff Kosseff about that; he has a whole book about it, Liar in a Crowded Theater. And, I'm not a lawyer, but I think there are some areas where you actually are allowed to yell that. So I'm not going to comment on that. But let me say that I do think there are people who are happily defending their right to be dishonest because it is making them money or keeping them in power. That is why we are seeing certain politicians, I have to say most on the far right, but there are some on the far left as well, defending that right: because it is, again, keeping them either in positions of power or making them money. They are worried about the people who are telling the truth and drawing attention to these deceptive information practices, which have become so common in our body politic, because it is undermining their access to power, their access to capital. And I think that is one of the things we are trying to really underline at the American Sunlight Project: that this isn't just a case of being mistaken. It's very deliberate. It's very coordinated. And this campaign against the truth, and particularly the campaign against disinformation researchers, is one that is meant to keep these folks in power and to keep them making money in 2024 and beyond.
27:43 Paul Brandus: So the pursuit of power and money, hardly new, takes on a whole new meaning within the context we're talking about; rather disturbing, I think. A final question: look through your crystal ball, if you could. Where are we going to be in three to five years with all of this? Will there be regulations? Will things get better in terms of information quality? Where are we going to be in a couple of years?
28:14 Nina Jankowicz: I think a lot depends on what happens in November and how willing Congress is to start to work on bipartisan initiatives that are put forward for the good of the American people. I know "bipartisan initiative" and "Congress" are words you don't often hear in the same sentence these days. But truly, I do believe it's been a dereliction of duty that Congress hasn't addressed this problem yet. We have millions of women around the country being affected by deepfake pornography, and yet no federal statute outlawing it. We have children who are at risk because of child sexual abuse material, because of content that is encouraging them toward suicide or eating disorders, things like that. And we're dropping the ball on that, not to mention health misinformation, misinformation that is affecting our national security, and deliberate disinformation campaigns coming from foreign adversaries. This is an imperative that our government needs to address. We have seen other governments attempt to do this. The EU, of course, has its Digital Services Act. Australia and the United Kingdom both have online safety bills that are trying to keep their citizens safe online, so that they can continue to express their voices without fear of attack. And I don't just mean mean words on the Internet; I'm talking about real threats to their lives, right? Again, the US needs to step up. And if we don't, I think, again, we're going to look in the mirror in a couple of years and see something we don't like. I think many people are already seeing that. So I hope Congress comes to its senses. Very much depends on what happens in November. And so I would say to anybody wondering whether it's worth getting out and voting, even if you don't feel like there are candidates on the ballot who really speak to you: democracy is on the ballot, and the health of our information environment is on the ballot.
And so you should go and vote your values there.
30:16 Paul Brandus: All right. Important issues indeed. Let's hope for the best. Nina Jankowicz, founder of the American Sunlight Project. What a pleasure having you on. Thank you so much.
30:29 Nina Jankowicz: It was my pleasure, Paul. Thanks for having me.
30:33 Paul Brandus: Let's take a short break. Now, when we come back, a final chat with Meredith Wilson of Emergent Risk International.
30:42 ad read: This series on disinformation is a co-production of Evergreen Podcasts and Emergent Risk International, a global risk advisory firm. Emergent Risk International. We build intelligent solutions that find opportunities in a world of risk.
31:05 Paul Brandus: Welcome back. For the last three years, this program has been a co-production of Evergreen Podcasts and Emergent Risk International, a global risk advisory firm. Let's bring in my friend Meredith Wilson, ERI's founder and CEO, for some final thoughts. You know, Meredith, over the past three seasons of this series, we've traced disinformation over the last 80 years or so; quite an arc. I mean, we started with stories of World War II and the Cold War. We've heard from a former Secretary of Defense and CIA director. But that's all in the past. Today, disinformation is more ubiquitous than ever, and thanks to accelerants like artificial intelligence and social media, it's far easier to manufacture and distribute. And of course, generative AI is a big part of that. Where do things stand now? I mean, so much is happening, and it's happening very quickly.
32:08 Meredith Wilson: Yeah, absolutely, Paul. And I think that's part of the point: AI, and along with it disinformation, are moving far too fast for most people to keep up with. It's nearly impossible for people to process what is actually happening in front of them. The impact, I think, is in having this sort of uncontrolled disinformation environment that is beyond our measurement at this point. The narratives, the deepening of polarization, the conviction on each side that the other side is throwing disinformation at them, all make it really hard to even measure just how much of what's coming into our view every day is disinformation. A friend of mine, Matt Abrams, who has been on this show, pointed out just the other day that we're in this extended period of what we call VUCA, V-U-C-A: volatility, uncertainty, complexity, and ambiguity. And that, I think, very capably classifies where we all live right now in terms of global events, global weather issues, all of the different things that are calling into question things we used to believe or believed to be true. And I think when you take that and layer the disinformation on top of it, it has become this very inextricable cycle of information. So not a very positive answer, but perhaps...
33:54 Paul Brandus: A realistic one, perhaps.
33:55 Meredith Wilson: Yeah, perhaps a realistic one.
33:58 Paul Brandus: In terms of leveraging new technology, latching on to new trends, there's a frequent dynamic: the bad guys always seem to be a step ahead of the good guys. That certainly appears to be the case now. Why is this, in your view, and how might it be reversed? Is that even possible?
34:23 Meredith Wilson: I don't know that that's really possible, only because the bad guys tend to create vulnerabilities out of whatever technology comes about, right? So we certainly do try to red team technology these days in a way that we didn't back in the day, so that we can figure out how it's going to be misused. But the problem is there's always something that we don't think of. And so oftentimes we will be following behind the bad guys, catching up once we figure out how they've decided to weaponize something that's dual use.
34:56 Paul Brandus: You know, we've been focusing on this for three years. You've been on the show many times. What is your sense of the best way or ways to counter false narratives? We've spoken in the past of pre-bunking, which seems to be rather effective in certain cases. Certainly in the run-up to the war in Ukraine, it was. Beyond pre-bunking, what tactics can be used to counter all this?
35:24 Meredith Wilson: Yeah, I mean, unfortunately, I don't think that we've landed on a real consensus on this. I do think that it is not a single solution. I think that constant vigilance, a sort of evolving, iterative approach, is probably the best option we have available, because whether we're talking about the bad guys or the evolution of technology, there will always be something new on the horizon, which makes it hard sometimes to keep up with how we fight this. One day, pre-bunking might work. The next day, maybe it doesn't, because people get tired of it. It's not a one-size-fits-all solution. It's very much like the rest of cybersecurity, something that has to be fought one issue at a time. But I think the other thing, and we probably don't talk about this enough, is that it's not all technology solutions. Probably one of the biggest things we need to be doing is talking to people more, talking to people we disagree with, listening to people we disagree with, listening to what they're hearing every day, listening to why those narratives catch on for them. When you look at the way some people are prone to grabbing onto conspiracy theories and narratives, and we've explored this several times on the podcast, a lot of the reason they do is uncertainty: they don't understand something, or they fear something they don't understand. And so these narratives are a good way to explain something that they otherwise can't. So the more that we talk to each other, the more that we build bridges, the more that we are out in the community again instead of stuck behind our computers, the better shot we have at countering those narratives as individuals and as people.
37:28 Paul Brandus: Let's turn to elections quickly. We've had a lot of elections around the world, the most recent big one in India, of course. But of course, the world is also looking at the big election here in the United States, now just five months away. Based on everything that you've seen this year around the world and all of these other elections, what do you think could happen here as November nears?
37:58 Meredith Wilson: You know, I think that there's a lot, obviously. I think people are looking for things to go boom again. They're looking for those January 6th type events that may or may not happen this time. I think the biggest thing, or the scariest thing, is disinformation so successful that people don't even realize they've been a victim of it. That is truly the ultimate goal of disinformation, right? It's to get people thinking that what they're doing, or who they're voting for, or whatever the case may be, is actually their own idea, is actually predicated on their own version of events, which they believe to be true. There's a lot out there that suggests disinformation has played a role in a lot of elections that have just happened and moved on. We've discussed this before, too: is that disinformation or is it marketing? Who knows? But I think the worst thing that could happen is that people don't realize they're being manipulated, or they realize it too late, and we end up with more violence. We end up with, you know, people in office who are willful in manipulating these technologies to their advantage and to the disadvantage of democracy. Those types of things are probably the worst place we could end up.
39:30 Paul Brandus: Wilson's company advises firms on how to deal with all this. Disinformation, artificial intelligence, and the rise, the acceleration of false narratives. It's an awful lot.
39:42 Meredith Wilson: I think right now, the biggest thing is that there has to be somebody keeping up with what's happening. And it has to be not just how we're keeping up with this from a profit perspective, not just making sure our technology matches the other guys', but in fact what the ethical boundaries are that we need to be thinking about. What is the right thing to do versus the profitable thing to do? Because they're not always the same. From the perspective of education, we really have to work hard to make sure that our businesses and the people working in them are getting high-quality information in order to make business decisions every day. And that takes it out of the realm of, you know, reading your news on your Twitter feed, or your X feed, every day and assuming you're a well-informed enough citizen. That's not really where we're at today. So are your employees getting access to good news sources? Are they getting access to research? Are they getting access to information that is not just coming from any news source? Because all of those news sources, though some may be good and some less good, still report only a very small slice of information. Helping people rediscover things like journals and think tanks and really high-quality sources of information is a good place to start. But it is also on companies to make sure that they are providing good cyber hygiene training and good information hygiene training. Those two things are on the company, because they're part of what keeps it protected.
41:42 Paul Brandus: Thanks this week to Nina Jankowicz and the NATO Stratcom Center in Latvia. Our sound designer and editor, Noah Foutz. Audio engineer, Nathan Corson. Executive producers, Michael DeAloia and Gerardo Orlando. And on behalf of Meredith Wilson, I'm Paul Brandus. Thanks so much for listening.