That's the premise behind "Disinformation" - with award-winning Evergreen host Paul Brandus. Get ready for amazing stories - war, espionage, corruption, elections, and assorted trickery showing how false information is turning our world inside out - and what we can do about it. A co-production of Evergreen and Emergent Risk International.
The National Dilemma: The Battle of Truth and Disinformation
| S:2 E:4
"What is the line where your ability to say what you want stops and your ability to make up things that harm people starts?"
In Elon Musk's recent interview with the BBC, he discussed free speech and his vision for the platform to serve as a "digital town square." However, the challenge arises when determining what constitutes misinformation in a divided society where basic facts are contested. The episode explores the concept of disinformation and the concerns surrounding its potential resurgence on Twitter under Musk's ownership as well as on other social media platforms. Join host Paul Brandus as he delves into the complex issue of disinformation in the digital age.
[00:02:08] Peddlers of disinformation return.
[00:04:52] First Amendment and social media.
[00:10:59] Tackling disinformation upstream.
[00:12:11] Public education campaigns.
[00:16:26] Extending the Pareto Principle to social media.
Got questions, comments, ideas, or an example of disinformation you'd like us to check out? Send them to [email protected]. Subscribe wherever you get your podcasts. Special thanks to our guests Meredith Wilson of Emergent Risk International and Ella Irwin for their insights. Our sound designer and editor Noah Foutz, audio engineer Nathan Corson, and executive producers Michael DeAloia and Gerardo Orlando. Thanks so much for listening.
00:00 Paul Brandus: Elon Musk, the owner of Twitter, recently sat down with the BBC for an interview. He was asked about free speech and his vision for Twitter, which he describes as a digital town square.
Elon Musk, Twitter CEO: In order for something to serve as a digital town square, it must serve people from all political persuasions, provided it's legal.
Paul Brandus: It must, he says, accommodate all points of view. That's fair. Different ideas, different points of view should be discussed openly and robustly. There is, Musk adds, no other way.
00:45 Elon Musk: Free speech is meaningless unless you allow people you don't like to say things you don't like.
00:52 Paul Brandus: Otherwise, it's irrelevant. But what if that free speech is flat-out dishonest, or regarded by many people, at least, as crazy, like QAnon types spreading conspiracy theories, for example, or hate speech? Here's what Musk said: well, you know, who's to say that something is misinformation? Who's the arbiter of that? Who is the arbiter of that? Our country is so divided today that we have trouble agreeing on even basic facts. What is true to some could be completely false to others. What is true to some could be, in the eyes of others, completely made up, perhaps maliciously so. There's a word for this: disinformation. I'm Paul Brandus, and that's the name of this series, called simply Disinformation.
01:38 Meredith Wilson: And I'm Meredith Wilson, founder
and CEO of Emergent Risk International, and I'll be providing analysis throughout each episode.
01:52 Paul Brandus: When Musk bought Twitter, there were fears that peddlers of
disinformation previously banned would return and pick up right where
they left off. And decide for yourself what it means that two weeks
into Musk's tenure, the Twitter executive responsible for trust and
safety, a man named Yoel Roth, quit. After he left, he told National
Public Radio that it was painful to watch what was going on at Twitter. But Roth also put that in perspective.
02:26 Yoel Roth: I'm heartbroken and devastated by what I see happening at
the company and what I see happening to decades of investment by
professionals into building the service into what it is. But I think we
need to have two parallel conversations. We can be upset and concerned
about what's happening at Twitter specifically, but we would be remiss
if we only had that conversation. We should think about what the
tradeoffs here are. Trying to strike a balance between three factors
that are often in enormous tension with each other. Those are safety,
speech and privacy.
03:09 Paul Brandus: Safety, speech, and privacy. Those are the big three issues. And as Roth said, they're often in conflict; favoring one can come at the expense of another, and the lines can often be blurred. Meredith Wilson is CEO of Emergent Risk International.
03:28 Meredith Wilson: When you look at disinformation, the biggest problem with regulating it is that it infringes on First Amendment rights in the US. And it's this sort of constant catch-22 that we're living in where, you know, if we wanted to regulate disinformation, we would have to figure out what that line is, right? What is the line where your ability to say what you want stops and your ability to make up things that harm people starts? Now, we have defamation laws, so there are some applications of that. But broadly, our politicians cannot agree on what we should do with disinformation.
04:18 Paul Brandus: So dealing with disinformation apparently is not quite as cut and dried as you might think. As Meredith says, where are the lines, at least the lines that we can agree on? Evidence of this was visible earlier this month, when a federal judge in Louisiana issued an injunction restricting contact between the Biden administration and social media firms. The administration has been seeking to work with social media firms to better control what the White House considers false or misleading information. But the lawsuit, filed by the Republican attorneys general of Louisiana and Missouri, alleged that the administration was pressuring social media companies into deleting or suppressing content, content that, the lawsuit said, is free speech and thus protected under the First Amendment.
05:13 Jim Hoft: We report the truth. We're committed to the truth. We correct our mistakes, and we're generally correct with what we call. And we know that the mainstream media is feeding people a bunch of garbage.
05:28 Paul Brandus: That's one of the plaintiffs in the lawsuit against the government, Jim Hoft, a St. Louis man behind a website called The Gateway Pundit. You just heard him say that he's a truth teller, though others point out that his site, which has been around for many years, is associated with disinformation about everything from COVID vaccines to the 2020 election. It also has a page dedicated to supporting those charged in the attack on the US Capitol two years ago. It calls the insurrection, quote, the events of January 6th, 2021, unquote, and says those who have been convicted and sent to prison are victims, and I'm quoting again here, of an American Gulag, unquote. Mr. Hoft declined to comment for our podcast. His platform, and his deep, unshakable belief that he is a conveyor of the truth, reflect our national and binary dilemma writ large. One side is completely convinced that it is right, that what it says is correct, and that the other side is flat-out wrong, if not maliciously so. On that grim note, this question: where do we go from here? That and more after this short break.
06:48 Ad break: This series on disinformation is a co-production of Evergreen Podcasts and Emergent Risk International, a global risk advisory firm. Emergent Risk International: we build intelligent solutions that find opportunities in a world of risk.
07:13 Paul Brandus: Welcome back. I mentioned that the Biden administration has been dealt a setback in its attempts to get social media companies to rein in what it considers disinformation. The White House press office issued a statement about this. It says, quote, The administration has promoted responsible actions to protect public health, safety and security when confronted by challenges like a deadly pandemic and foreign attacks on our elections, unquote. It felt that content on social media platforms ran counter to this goal, but the appeal of the injunction on those grounds was also shot down. The U.S. district judge in both instances, Terry Doughty, is a Trump appointee. In shooting down the appeal, he writes that the defendants, the administration, have no legal right to contact social media companies for, quote, the purpose of urging, encouraging, pressuring or inducing in any manner the removal, deletion, suppression or reduction of content containing protected free speech posted on social media platforms, unquote. Does this mean that social media companies can do whatever they want? Yes and no.
08:31 Jamil Jaffer: Well, you know, as a general matter, social media companies do have an incentive to engage in content moderation and to protect their platforms from misinformation and disinformation, because they want people to keep coming back. And the more their content is infected with misinformation and disinformation, the less likely people are to come back.
08:48 Paul Brandus: Jamil Jaffer is founder and
director of the National Security Institute. He also served as counsel
to the assistant attorney general for national security at the Justice
Department. He spoke on CBS.
09:02 Jamil Jaffer: The measures you refer to, ratings and the like. The general consensus in this country has been for 200 years that the best way to combat speech that you don't like is not to shut that speech down, but to engage in speech of your own. Taking on that sort of mantle, some of the social media companies, and remember, these are private platforms, so they can do whatever they want with their platforms, have decided that the primary methodology for them of addressing potentially problematic content is to allow the community there to weigh in on it, to provide additional context, provide links to what they believe might be correct information, and that that's a better way than the platform itself taking decisions to take down content. At the same time, clearly problematic content has been taken down historically for many years by these platforms as well. And so there's a little bit of a balancing act these platforms try to strike in dealing with what they see as problematic content.
09:54 Paul Brandus: But what's different now, a quarter of the way into the 21st century, is the impact of speed, technology, and sheer volume, the firehose of content, on this traditional dynamic. Earlier, I mentioned Yoel Roth, a former trust and safety officer at Twitter. After he left just two weeks into Musk's tenure, he was replaced by Ella Irwin. Irwin didn't last long at Twitter either. She left after seven months. Unlike Roth, however, who was not bound by any confidentiality agreement, she was unable to discuss the internal workings at Twitter or her dealings with Musk. But she was able to share her ideas on what could be done to tamp down, not eliminate, for that's impossible, but at least tamp down false narratives posed by what she calls bad actors. She thinks the best way to deal with that is not by employing armies of fact checkers downstream, but by tackling disinformation upstream, where it's produced.
10:53 Ella Irwin: We have to do things
upstream to ensure that we know who is using the technology and the
tools. And we have to be transparent about where the communication and
the information is coming from so the public can make smart decisions
about the information that they're receiving. A lot of the regulation
that you see being proposed is still focusing very much downstream on
the information that's published. And there's very little being
proposed around understanding who is using the technology and the tools
and ensuring there's appropriate due diligence and controls in place
around the scale at which they can use these tools. Right. And
that doesn't have to apply to everyone. The reality is, as you said,
most people have good intentions. Most people are not using technology
and tools at the scale and with the speed that a bad actor would be
attempting to use them. And so it does involve some friction. It does
involve some controls, many of which are not in place today, and
companies would have to implement them. But it allows us to really
understand who is trying to use these tools. Now, separately, there
have to be appropriate public education campaigns. And, you know, we
have to look at do we have appropriate criminal penalties in place for
people who actually do engage in disinformation campaigns and harm the
public? I mean, those are all really important things as well. But
going upstream and actually putting some controls in place around who
gets access to even send information to millions of people is really
important.
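What might such an upstream control look like in practice? Here is a minimal sketch, ours rather than Irwin's: a hypothetical token-bucket throttle that lets verified humans post at normal speed but adds friction for accounts posting at machine scale. Every name, field, and threshold in it is an illustrative assumption, not anything Twitter actually runs.

```python
# Hypothetical sketch of an "upstream" control: throttle accounts that
# post at machine scale before their content ever reaches the feed.
# Fields, rates, and thresholds are illustrative assumptions only.
import time
from dataclasses import dataclass, field

@dataclass
class Account:
    handle: str
    verified_human: bool              # passed identity/due-diligence checks?
    bucket: float = 10.0              # tokens available (posts allowed now)
    last_refill: float = field(default_factory=time.monotonic)

REFILL_RATE = 10 / 3600.0             # unverified: ~10 posts per hour
VERIFIED_RATE = 60 / 3600.0           # verified humans get more headroom

def allow_post(acct: Account) -> bool:
    """Token-bucket check: refill slowly over time, spend one token per post."""
    now = time.monotonic()
    rate = VERIFIED_RATE if acct.verified_human else REFILL_RATE
    acct.bucket = min(10.0, acct.bucket + (now - acct.last_refill) * rate)
    acct.last_refill = now
    if acct.bucket >= 1.0:
        acct.bucket -= 1.0
        return True
    return False                      # add friction: queue, CAPTCHA, or review

bot = Account("amplifier123", verified_human=False)
allowed = sum(allow_post(bot) for _ in range(500))
print(f"{allowed} of 500 rapid-fire posts allowed")  # ~10: burst absorbed, rest blocked
```

The point of the sketch is Irwin's distinction: most users never hit a limit like this, while an account blasting 500 posts in seconds is stopped before its content is published, rather than fact-checked afterward.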
12:36 Paul Brandus: Why don't tech companies do
more from the beginning to keep their platforms safe? Well, Irwin
explains that startups have one initial focus, just getting off the
ground and surviving. Any focus on the bad guys comes later. She also
theorizes that when those fraudsters do show up, it's more efficient to
focus only on the big fish, those with big followings who are
responsible for most of the bad behavior and not the minnows, the guy
with a small following.
13:08 Ella Irwin: I think companies, as a general rule, follow a certain pattern when it comes to fraud and abuse. You know, first, when a company is starting out, there's not a lot of fraud.
They're very focused on growth. They're focused on developing products
and features, for good reason. Right. I mean, you have to have a customer, and you have to have revenue, and you have to get to some level of success. And the reality is, that's also when you start to attract bad actors, because your product is useful and bad actors start to see how they can exploit it. I don't know many companies who at this phase suddenly decide that they need to invest heavily in sort of testing their own capabilities in a way that says, here's how they can be abused. Here's how a bad actor can find and exploit gaps that we have. Most companies will wait and take action, unfortunately, when they're now facing significant harm. Maybe it's causing revenue loss. Maybe users are complaining. Maybe it's causing reputational damage or regulatory inquiries. That's typically when a company starts to say, OK, well, we need to now start to mitigate this. And even when they start to mitigate it, they're typically not setting up a team that's going to stress test and look for gaps. They're just trying to put out the fire. And so it's interesting, because there are companies who, once they have implemented teams like that, have done an amazing job at preventing issues that could have been huge issues down the line. But they're few and far between.
14:54 Paul Brandus: This reminds me of something called the Pareto principle, or the 80-20 rule. You know what that is? It's when, say, 80 percent of your problem is caused by 20 percent of its causes. It's more efficient to focus limited resources on the small number of people who are causing the most trouble.
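To make that 80-20 arithmetic concrete, here is a minimal sketch with invented numbers: rank accounts by flagged posts and count how few of them account for 80 percent of the problem. The account names and figures are hypothetical, purely to illustrate why limited moderation resources go to the big fish first.

```python
# Illustrative Pareto check with made-up data: which accounts produce
# 80 percent of flagged posts? (Not real platform numbers.)
flagged_posts = {
    "megaphone1": 400, "megaphone2": 250, "megaphone3": 150,
    "user_a": 40, "user_b": 35, "user_c": 30, "user_d": 25,
    "user_e": 25, "user_f": 25, "user_g": 20,
}

total = sum(flagged_posts.values())          # 1000 flagged posts overall
ranked = sorted(flagged_posts.items(), key=lambda kv: kv[1], reverse=True)

covered, big_fish = 0, []
for account, count in ranked:
    big_fish.append(account)
    covered += count
    if covered >= 0.8 * total:               # stop once 80% is explained
        break

share = len(big_fish) / len(flagged_posts)
print(f"{len(big_fish)} of {len(flagged_posts)} accounts "
      f"({share:.0%}) produce {covered / total:.0%} of flagged posts")
# -> 3 of 10 accounts (30%) produce 80% of flagged posts
```

In this toy dataset, reviewing just three accounts addresses 80 percent of the flagged content, which is the whole argument for triaging the loudest bad actors rather than policing every small account equally.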
But you know how it is. In the future, some folks who could one day find their ability to spread false narratives limited might complain that this is an infringement of their First Amendment rights. That is nonsense, and a fundamental misinterpretation of what the First Amendment is. The First Amendment
shields us from government retribution if we criticize it or protest
against it. For example, you and I can yell at the top of our lungs that
Joe Biden or Donald Trump or any other official is an idiot and
there's not a thing the government can do about it. But if you
criticize or embarrass, say, your employer in public, that is not
protected speech and you can be fired. The First Amendment has
limitations. A company, be it social media or otherwise, is not
obligated to have you as a customer. And the small number of people who
are bad actors should not presume otherwise. Perhaps you've seen a
sign outside a store. It might say no shirt, no shoes, no service.
That store is saying we're not obligated to let you in or do business
with you unless you meet our standards. Perhaps that same philosophy
could be extended to social media. Thanks to Ella Irwin for her insights. Sound from the BBC, NPR, and CBS. Our sound designer and editor, Noah Foutz; audio engineer, Nathan Corson; executive producers, Michael DeAloia and Gerardo Orlando. And on behalf of Meredith Wilson,
I'm Paul Brandus. Thanks so much for listening.