Edward R. Murrow:
“This just might do nobody any good.”
This was the opening line of a speech two-thirds of a century ago by the most prominent broadcast journalist in America, perhaps the most prominent who ever lived. His name was Edward R. Murrow.
On October 15th, 1958, at a convention of radio and television executives, he took his own industry to task, warning that television, an extraordinarily powerful way of reaching millions, could, if misused, inflict grave damage upon the country.
Edward R. Murrow:
“But I am seized with an abiding fear regarding what these two instruments are doing to our society, our culture, and our heritage. Our history will be what we make it.”
At the end of Murrow's address (he spoke for 37 minutes) came his most memorable passage. As you hear it, ask yourself whether Murrow's words then could describe the internet and social media now.
Edward R. Murrow:
“This instrument can teach, it can illuminate — yes, and it can even inspire. But it can do so only to the extent that humans are determined to use it to those ends. Otherwise, it's nothing but wires and lights in a box.”
Substitute the words “internet” and “social media” for television, and Murrow's words remain true today.
But one thing is different (very different), and that is the easy access to these modern-day platforms by anyone who can easily, with no filters, make things up, manipulate audio or video, and then, with the mere push of a button, make that false, made-up content available, theoretically, to anyone around the world.
There's a word for this deliberately false content: disinformation.
I'm Paul Brandus, and that's the name of this podcast series. It's called simply, Disinformation.
The topic of disinformation is huge, ever-evolving, and touches upon every nook and cranny of our society: war and peace, the economy, politics, elections, culture, finance, religion, our belief system, everything. And as I mentioned, anyone today can manufacture and distribute it easily.
This series, a co-production of Evergreen and Emergent Risk International, a global risk advisory firm, is devoted to exploring these complex and intertwined issues.
I mentioned the internet and social media before. There's no question that these two things have acted as accelerants, allowing disinformation to be produced faster, spread faster, and do more damage faster than our ability to respond in an effective way.
So, when we think about accelerants, the first thing that I think about is going back to some of the early ways that (and this gets back to the Russian disinformation thing) the Russian Internet Research Agency targeted U.S. audiences. And it's the literary, verbal equivalent of throwing a bomb into the middle of a discussion.
Meredith Wilson is Chief Executive Officer of Emergent Risk International. Prior to this, she spent several years with the Defense Intelligence Agency and in the private sector, primarily in the oil and gas industry.
Her phrase, “throwing a bomb into the middle of a discussion,” that's what happens when you take a divisive issue, an incendiary one, and let the unleashed power, speed, and reach of the internet and social media do the rest.
A good example of this occurred back in 2014 in the Missouri town of Ferguson, after Michael Brown, a black man, was shot and killed by police officer Darren Wilson, who was white. Riots erupted the very next day, reopening the national debate on familiar divisive issues like race relations, police tactics, and the relationship between law enforcement and African Americans.
When the Ferguson riots happened, in all of the Twitter feeds about this, all the Twitter discussions, the disinformers would throw in these really, really outlandishly nasty comments about people either on the far left or on the far right, which would get people angry and responding, and push that up in the algorithms. That's kind of a classic accelerant.
Wilson mentioned something called an algorithm. When we talk about the internet and social media as accelerants, this is really important. A social media algorithm is basically software that uses your past preferences to help determine what you'll see online. Whatever your preferences are, an algorithm ensures that you'll get more of it.
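To make the idea concrete, here is a minimal, purely illustrative sketch of how preference-based ranking works. All of the posts, topics, and engagement data below are hypothetical, and real platforms weigh many more signals (watch time, social graph, recency), but the feedback loop is the same: what you engaged with before determines what you are shown next.

```python
# Minimal sketch of a preference-reinforcing feed algorithm.
# Hypothetical data; real recommender systems are far more complex.

from collections import Counter

def rank_feed(candidate_posts, engagement_history, top_n=3):
    """Rank posts by how often the user engaged with each topic before."""
    topic_weights = Counter(engagement_history)  # past clicks per topic
    # Posts on topics the user clicked most float to the top of the feed.
    scored = sorted(
        candidate_posts,
        key=lambda post: topic_weights[post["topic"]],
        reverse=True,
    )
    return scored[:top_n]

posts = [
    {"id": 1, "topic": "left-politics"},
    {"id": 2, "topic": "sports"},
    {"id": 3, "topic": "left-politics"},
    {"id": 4, "topic": "cooking"},
    {"id": 5, "topic": "right-politics"},
]

# A user whose history is mostly left-leaning political content...
history = ["left-politics", "left-politics", "sports", "left-politics"]

feed = rank_feed(posts, history)
# ...gets a feed dominated by that same topic, so the next round of
# clicks reinforces the same weights: the loop feeds itself.
```

Run repeatedly, each round of clicks strengthens the same topic weights, which is why the algorithm keeps handing you more of whatever you already prefer.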
You are constantly targeted with information that you agree with. So, if you're a left-wing sort of reader, you're going to get all of the left-wing media. If you're a right-wing reader, you're going to get all of the right-wing media, and everything that comes with it. All of the junk, all of the garbage, all of the conspiracy theories that flourish on the extremes of both of those ends.
And you may not understand that if you don't understand how the internet works and how information works today.
And that's the problem: when you only get one side (your side), it helps divide us into camps. And the dividing lines are many: Democrats versus Republicans, pro-life versus pro-choice, race, guns, immigration, on and on.
When you think about the way that media has evolved with the internet, the way that we draw attention to the things that we want people to pay attention to, is with sort of clickbait headlines.
And so, if you are a CNN or a Fox News or one of these cable news outlets looking to draw clicks, which most media companies are (that's how we make money), you get the extreme sort of headlines that you see every day.
And remember that some Americans don't go beyond reading the headlines. They read the headline, and that's how they get their news. And those exaggerated headlines push us even further in one direction or the other, to the point that it creates this divisive animosity where people start to reject the headlines on the other side.
They won't even read them, because the headlines clash so strongly with what they already believe, a sort of cognitive dissonance. And oftentimes, because they're so outlandish, people start to reject that coverage and only trust the media that agrees with them.
And so, while the prevalence of so many channels, so many platforms, can theoretically be good, more voices in a loud and boisterous democracy, what we've seen is the splintering of the media, and, as a subset of that, a splintering of us, allowing us to congregate only with those we agree with.
The divisions are now so stark, so wide, that a September 2022 survey by CBS and YouGov found that nearly half of Americans in both parties see the other side not just as an opponent, but as an enemy. This extreme tribalism is a breeding ground for disinformation. And as we saw at the Capitol on January the 6th, 2021, a breeding ground for violence.
This series on disinformation is a co-production of Evergreen Podcasts and Emergent Risk International, a global risk advisory firm. At Emergent Risk International, we build intelligent solutions that find opportunities in a world of risk.
We are an incredibly polarized and angry nation politically. In the close to 40 years that I've been involved in homeland security and law enforcement, I have never seen a country this divided.
John Cohen is the former acting Under Secretary for Intelligence and Counterterrorism Coordinator for the Department of Homeland Security. He has also worked in the Office of the Director of National Intelligence.
And during his four decades in law enforcement, he has served four presidents from both parties. He says this bitter division among Americans, unprecedented in modern times, is a gift to those who seek to weaken us.
And what's different is that we tend to view those who disagree with us, whether they're in government or our neighbors, as the enemy. And increasingly, we're seeing people who advocate violence as an acceptable way to express one's disagreement with the government or an election, or a disagreement with another's position on public policy issues.
So, it's this combination of anger and violence that not only provides the opportunity for countries like Russia, but makes the outcomes of an information operation so much more dangerous from a threat perspective.
Now, how much of this are we doing to ourselves? I mean, the Russians, Chinese, North Koreans, and others are obviously major players in this space, but we have a polarized media. We have websites that cater to one side or the other, and people tend to fall into this concept of tribalism, where you associate only with those with whom you agree.
And you made a good point, John. I mean, folks who disagree, they're not fellow citizens who simply have a different point of view to be respected. They're the enemy and they must be destroyed. And it's a very dangerous attitude that exists not just in the corner bar, but in the halls of Congress as well.
I suppose the question is, how much of this have we done to ourselves? Take Russia, China out of it. What are we doing to ourselves here?
That is the key point right there. Information operations are effective, but they're only effective when there are social fractures that can be exacerbated by the entity that's engaging in the information operation.
So, when China and Russia and North Korea and Iran engage in these activities, the first step in the playbook is to study the society and figure out the issues that can be exploited, because those are the issues that are exploitable.
And if you look at the United States, it shouldn't be a surprise to anybody that the sociopolitical narratives that often make up the content that's promoted online by Russia, Iran, North Korea, and China have to do with the government's response to COVID, the 2020 election, and issues of immigration and race, because they know, correctly, that those are the most divisive issues in the country today.
Those are the issues that are evoking a passionate and emotional response, and sometimes, even violent responses by those who have strongly held beliefs in those areas.
The Department of Homeland Security, FBI, and National Counterterrorism Center recently published an intelligence product that looked at ideologically motivated violence in the United States in 2021, and they found that the overwhelming majority of those attacks were by people who were motivated by one of those three sociopolitical narrative groups that I just described.
So, Russia's success in destabilizing our society is directly related to two factors. One, the speed with which they can deliver a message based on current events into the ecosystem. And two, that they're delivering that content into an environment that's highly polarized, highly tribal, and quite frankly, stoked with anger.
And of course, the internet and social media, with their push-button speed and global reach, are, as Meredith Wilson said earlier in this episode, accelerants.
So, algorithms, tribalism, vilifying fellow citizens as enemies, this is all a dangerous, toxic stew. Someone else who has been studying this is Cindy Otis, a former Central Intelligence Agency officer who has written a terrific, albeit disturbing, book called True or False: A CIA Analyst's Guide to Spotting Fake News.
Like Wilson, she worries about the dynamic of today, the speed and reach of the internet and social media accelerants, the likes of which we have never seen before.
In my book, True or False, I walk through a number of historical case studies, and then I also put between those case studies, the evolution of essentially, the dissemination technology.
So, from papyrus to printing press to telegraph, radio, newspaper, and then social media and the internet, to show how it really has accelerated the dissemination of false information and the ability, exactly as you said, of the average Joe Smith to take advantage of it and use it for their purposes.
That's the key point, the average Joe can take advantage of it and use it for their own purposes. I return in the end to Edward R. Murrow and his warning of so many years ago.
Edward R. Murrow:
“I began by saying that our history will be what we make it. If we go on as we are, then history will take its revenge and retribution will not limp in catching up with us.”
What he said of television then, is perhaps even more true of the instruments we hold at our disposal today.
Edward R. Murrow:
“This instrument can teach, it can illuminate — yes, and it can even inspire. But it can do so only to the extent that humans are determined to use it to those ends. Otherwise, it's nothing but wires and lights in a box.”
My thanks to Meredith Wilson of Emergent Risk International, intelligence and homeland security expert John Cohen, and former CIA analyst and now author Cindy Otis. We'll be hearing more from all of them in future episodes. Our sound designer and engineer, Noah Foutz; audio engineer, Nathan Corson; executive producers, Michael DeAloia and Gerardo Orlando.
I'm Paul Brandus, thanks so much for listening.
Transcribed by Smart Transcribers