That's the premise behind "Disinformation" - with award-winning Evergreen host Paul Brandus. Get ready for amazing stories - war, espionage, corruption, elections, and assorted trickery showing how false information is turning our world inside out - and what we can do about it. A co-production of Evergreen and Emergent Risk International.
How to stamp out—or at least dilute—the power and danger posed by disinformation? We'll go over some of the many ideas on the subject.
Featuring Clare Melford, Chief Executive Officer of the Global Disinformation Index; Megan Marrelli, Editorial Director for Meedan; and Meredith Wilson, Chief Executive Officer of Emergent Risk International.
Speakers: Paul Brandus, Meredith Wilson, Megan Marrelli, & Clare Melford
Paul Brandus:
Disinformation: powerful, pervasive, potentially destructive, and it seems to be getting worse. Why? Well, I'll give you a few reasons.
First, wondrous technology has emerged in recent years and advanced quickly, indeed, often faster than our ability to adapt.
Also, in the last 15 or 20 years, new media platforms have burst onto the scene with low barriers to entry and little rigor attached to ensuring that content is accurate and fair.
Another reason: the human mind is easily receptive to false narratives. It's just the way we're wired. In this episode, the first of several, we're going to talk about what can possibly be done about all this. What can be done to diminish this scourge of disinformation?
[Music Playing]
I'm Paul Brandus, and welcome to this series. It's called simply Disinformation.
Meredith Wilson:
And I'm Meredith Wilson, founder and CEO of Emergent Risk International, and I'll be providing analysis throughout each episode.
Paul Brandus:
A minute ago, I mentioned several things that are fueling disinformation: new technology, new platforms, the way our brains work.
There's another problem too. We're simply inundated: overwhelmed with disinformation, overwhelmed with data, overwhelmed with opinions that often masquerade as facts. This tsunami of disinformation washes over us every day.
I spoke about that with Megan Marrelli. She's the editorial director at Meedan, a technology company that builds software programs aimed at strengthening journalism, digital literacy, and the accessibility of information.
One problem is just the sheer volume of disinformation. It's everywhere, it's voluminous, and fact checkers can only do so much. That's the first problem.
And then the second problem is that, whether it's cognitive dissonance or something else, people are skeptical of what fact checkers say anyway, because they believe what they want to believe.
People tell me all the time, “Yeah, a fact checker from CNN or the New York Times, well, of course they're biased.”
How do you overcome these kinds of things: that sort of bias, and also just the sheer volume of stuff that isn't true? How do you overcome that?
Megan Marrelli:
Yeah, that's a great question. And it's a good observation too, that there are really millions and millions and millions of claims online. It would be impossible to keep up with all of the different things that users are constantly sort of touting online, from fake COVID-19 cures to false political claims. Lots of different things like that.
My response to tackling the sheer amount of stuff out there is that there's power in numbers. Fact checkers can work together, and they have worked together really effectively to divvy up the work, so to speak.
So, we've worked, for example, with five different newsrooms recently in Brazil. And all together, through those newsrooms, we collected 300,000 audience questions that then got divvied up and responded to.
Similarly, around the recent election in the Philippines, we worked with over a hundred groups doing a similar type of work.
So, that's Meedan’s response to how to tackle the sheer volume of claims out there online. Rather than having 10 different organizations all checking the same claim, we work together, pool resources, and share data with each other to make sure that we're not duplicating efforts.
[Music Playing]
Paul Brandus:
But the problem with fact checking is that it's of the “horse has left the barn” variety. A false narrative can be broadcast, printed, tweeted, shared. A fact checker can later point out that it's false, but hasn't the damage already been done? After all, the falsehood has already been consumed.
Is there a better way? Is there anything that can be done to disrupt this dynamic? We'll take a closer look at one idea after this short break.
Voiceover:
This series on disinformation is a co-production of Evergreen Podcasts and Emergent Risk International, a global risk advisory firm. Emergent Risk International: we build intelligent solutions that find opportunities in a world of risk.
Paul Brandus:
Welcome back. There are lots of ideas to counter disinformation, lots of smart people thinking, developing new technology and experimenting with possible solutions. I met one of them recently at a disinformation conference in Brussels.
Clare Melford:
My name is Clare Melford. I'm the Chief Executive of the Global Disinformation Index.
Paul Brandus:
The Global Disinformation Index, or GDI, is a London-based nonprofit. Its website defines its goal: “A world free of disinformation and its harms which undermine the trusted information ecosystems that are the foundation of democratic processes.”
A world free of disinformation: a rather lofty goal indeed, but probably an impossible one too, when you consider that disinformation has always been with us, with examples dating back to ancient times.
And despite that goal, Clare Melford, by the way, prefers not to even use the word disinformation. She prefers to call it “adversarial content.”
Clare Melford:
So, the word disinformation has been co-opted by people who used to use the word fake news, and who like to use it to refer to anything they don't like. So, the term has become quite loaded in certain parts of the world.
So, we prefer to use the term adversarial narrative conflict. Adversarial narratives are pieces of content that over time create an adversarial, conflictual relationship between the reader and the subject of the story.
Paul Brandus:
Adversarial content: that's an interesting phrase, and Melford is of the opinion that one of the real goals of people who manufacture and distribute it is to win the support of like-minded followers, and in the process, to divide one group of people against another.
Clare Melford:
And the subject is usually in one of three broad categories. It can be at-risk groups, so content that is adversarial against people of color, different sexualities, genders, religions, et cetera.
It can be science. So, content that is adversarial against science; undermining science, COVID conspiracies, anti-vax conspiracies, climate change.
Or content that is adversarial against democratic institutions; voting systems, democratically elected officials, institutions that underpin liberal democracies, the justice system, the media, et cetera.
Paul Brandus:
Melford's basic idea is to disrupt this by using a follow-the-money approach: tracking advertising dollars as they flow from company to website, often passing through an intermediary, an ad buyer whose basic job is to place the company's message in places that attract eyeballs. But is it the quantity of eyeballs or the quality that counts?
Clare Melford:
So, if you are a media agency or an ad tech company, you have multiple responsibilities. Yes, you have to get the best reach for the greatest value, but most advertisers will also have other metrics that they are trying to solve for, such as the right type of people seeing your brand, the right brand spillover, and the right context for your brand to be seen within.
So, the days of reach at the lowest possible cost, which is where the internet was a decade ago, those have long gone.
And now, most agencies, most ad tech companies are offering advertisers more control and more filters to ensure that, yes, they get the reach, but not necessarily at the lowest cost, if the cost of that is quality.
Paul Brandus:
The importance of brands and ad dollars being associated with so-called quality can be contrasted, Melford says, with certain platforms that pose reputational risks for a brand.
Clare Melford:
The events of the last few years have made it quite clear to advertisers that there is huge risk to their brands from being associated with content that is highly divisive. No brand wants to be associated with anti-vax content, for example, which has led to significantly higher rates of death and illness from COVID.
Or next to anti-Semitic content, which has led to examples like shooters going into synagogues in various places in the U.S. and elsewhere and killing people.
So, advertisers don't want to be next to that content from a brand point of view. But increasingly, and I think the events of January 6th made this extremely clear, people are realizing that highly adversarial, polarizing content has consequences for all of us: we do not want our societies to fracture into unstable, ungovernable, violent places.
Paul Brandus:
One problem with this though, is that companies sometimes have little control over where their ads actually appear.
Clare Melford:
They let the intermediary, the ad tech company that does the buying, choose where their ads end up. And the advertising technology sector is so complicated that it's actually extremely hard for anyone to know where their adverts are going to end up because ad inventory is sold and resold and resold by multiple different people in a chain, all in the instant that we as a user click on the browser. There are multiple auctions going on, which can make the chain from the publisher to the advertiser incredibly difficult to track.
Paul Brandus:
That's why disrupting the current model is so difficult. The ad market and the way it works are built for efficiency, not political expediency.
Some companies do try to take a more proactive approach here. One is Vodafone, the British telecommunications giant. It uses something called an inclusion list: a list of sites it considers acceptable for its brand to be associated with.
But as Melford says, many companies don't do that. And the opposite of the inclusion list is something called an exclusion list.
Clare Melford:
So, an exclusion list is a term that advertisers use for a list of sites they do not want their adverts to end up on. And exclusion lists are used by media agencies and anyone who manages the advertising buy of advertisers on the internet.
Dynamic just means that the list changes all the time: as we assess new sites, new countries, new narratives, sites go onto the list. So, a lot of new sites went onto the list around the invasion of Ukraine: a lot of new Russian sites, Ukrainian sites, Serbian sites.
And sometimes sites come off the list either because they're defunct and they're no longer used, or sometimes because sites change their policies.
Paul Brandus:
Of course, this isn't a black and white issue. As I mentioned, ad dollars are often scattered about, landing advertisements on countless platforms. This is where Melford tosses another phrase into the mix, something she calls narrative density.
Clare Melford:
Yes, that's a term we use to describe what proportion of the content on a website is adversarial. So, many sites will occasionally have a piece of content that is polarizing, divisive, or factually inaccurate. That can happen to any news outlet, especially if they don't have good policies and procedures.
But many news sites are set up primarily to push a particular agenda. And so, every single story on that site will be highly divisive, polarizing, and adversarial. So, narrative density is a measure of the density of those adversarial stories on a site.
Paul Brandus:
Or on a television network, where perhaps the amount of airtime could be measured on broadcast.
Clare Melford:
We don't measure broadcast. We just look at open web news sites. But you could apply a similar concept.
Paul Brandus:
Okay. And there's more to it than all this, but let's take a step back instead and ask a few questions. Who's the judge here? Who gets to determine whether a website is considered to be of sufficient quality to attract ad dollars?
I might have one opinion about a website and whether it's any good, and you might have another. So, the idea of somehow rating websites, how exactly would that work? Movies have ratings, food packaging has nutritional labels. Who would do that sort of thing for a website? I put that question to Meredith Wilson of Emergent Risk International.
Meredith Wilson:
So, I think it's going to be a bit of an uphill battle to reach people that most need to hear it. And I'm not certain if attaching labels to websites is something that can really take off without attaching government regulation to it.
So, I could see that working in some countries where maybe the population is a little bit more unified on what constitutes freedom of speech and what doesn't.
I have a hard time seeing it work in the United States, at least from a regulatory perspective, just given where our Congress is on things and their ability to agree on, well, pretty much anything.
Paul Brandus:
Just from a practical standpoint, I think people already have a sense, if they're being fair and honest with themselves, of what's reasonable in terms of being accurate and transparent on platforms and channels that we don't need to name. I think people understand what's what without labels, right?
Meredith Wilson:
To a certain degree, I think so. But I think that, honestly, the people who most need to be reached on this are probably not going to be convinced by a label. If you think about all of the initiatives that people have undertaken around fact checking and things like that, the ability for people to cognitively dismiss those things if they conflict with their own perspective is still there.
That said, there are some groups that are working towards this, and I know there's a couple right here in the United States that are doing something similar too. And so, I'm interested to see how it develops in terms of kind of broader adoption of the idea. And I certainly think that bringing the advertisers into it is an incentive that could help move that in the right direction.
Paul Brandus:
So, judging a website is all rather subjective, it seems, unless the website is so egregious, so over the line, that to any reasonable person it is not even remotely debatable. But this raises a new difficulty.
Alex Jones:
“I don't know exactly what happened. Sandy Hook's inconclusive.”
Paul Brandus:
For example, a guy named Alex Jones has used his site, it's called Infowars, to spread for years the worst kind of disinformation about all sorts of things. His most despicable claim: that the 2012 massacre at Sandy Hook Elementary School in Newtown, Connecticut, was a hoax. He has said the 26 people, 20 of them children, who were gunned down that morning are actually still alive.
This disinformation has a multiplier effect. His followers, who buy into his slime, have harassed and threatened families of the victims, adding to their incalculable pain and suffering. They've sued Jones, and various court judgments have ordered him to pay more than a billion dollars in damages. Jones has mocked this, smirking and condescending, and the legal wheels continue to turn.
But here's the point about all this, and sadly, it shows the limits of what Clare Melford is trying to do by disrupting the advertising market. You see, Jones' site doesn't really have advertisers. Sure, there are one or two small ads for some kind of off-brand toothpaste and some cannabis products.
But Jones survives by getting his followers to give him money, and they do. Try disrupting that.
[Music Playing]
So, tinkering with the advertising market, the flow of money and messages, interesting as that could potentially be, can't stop hucksters like Jones from asking fans to dig into their pockets to keep him afloat, to keep the disinformation flowing.
If you like this show and this series, I hope you'll go to the Apple or Spotify page or wherever you're listening to this and give us a review. And if you have questions, comments, or ideas for me, my email is [email protected].
Thanks to Megan Marrelli and Clare Melford; our sound designer and engineer, Noah Foutz; audio engineer Nathan Corson; and executive producers Michael DeAloia and Gerardo Orlando.
And on behalf of Meredith Wilson of Emergent Risk International, I'm Paul Brandus. Thanks so much for listening.