How Do You Know What's True?
That's the premise behind "Disinformation" - with award-winning Evergreen host Paul Brandus. Get ready for amazing stories - war, espionage, corruption, elections, and assorted trickery showing how false information is turning our world inside out - and what we can do about it. A co-production of Evergreen and Emergent Risk International.
Muddying The Waters - Information Pollution and the Struggle for Clarity
Disinformation | S:3 E:1
"This information is polluting the entire ecosystem under which we operate."
In this episode of "Disinformation," Paul Brandus delves into the pressing issue of information pollution in the 21st century. With the rise of social media and artificial intelligence, discerning between truth, misinformation, and deliberate falsehoods has become increasingly challenging. We also discuss how information pollution exacerbates existing systemic risks like global conflict and climate change, making it a critical issue to address for progress and solutions in the modern world.
[00:01:26] Information pollution.
[00:06:08] Disinformation and its elements.
[00:09:30] Information pollution and swatting.
[00:15:29] Companies navigating information dangers.
[00:17:50] Cybersecurity and company vulnerabilities.
[00:22:17] Swatting and disinformation.
Got questions, comments or ideas or an example of disinformation you'd like us to check out? Send them to [email protected]. Subscribe wherever you get your podcasts. Special thanks to our guest Alan Jagolinzer, our sound designer and editor Noah Foutz, audio engineer Nathan Corson, and executive producers Michael DeAloia and Gerardo Orlando. Thanks so much for listening.
00:07 Paul Brandus: But there's so much information swirling about these days, so many voices, so many platforms, so many alleged facts and figures, it's harder than ever to discern what is factual and what is not. Is something true? Is it misinformation, something that is inaccurate but not deliberately so? Or is it deliberately false, something manufactured and distributed with malicious intent? This is something else. This is disinformation. I'm Paul Brandus, and that's the name of this podcast series, Disinformation, a co-production of Evergreen Podcasting and Emergent Risk International, a global risk advisory firm. As usual, I'll be joined by ERI's Chief Executive Officer, Meredith Wilson. Disinformation is hardly new. Examples of it date back to ancient times. But now, one quarter into the 21st century, accelerants like social media and artificial intelligence make it far more sophisticated and far more ubiquitous than ever before. Alan Jagolinzer, a business school professor at Cambridge University in England, compares this false information that we're exposed to and consuming to filthy air and water, which is why he calls it pollution, information pollution. He calls this the most urgent issue in the world. You say that the most pressing issue in 2024 is information pollution. Why?
01:51 Alan Jagolinzer: Well, I think partly because we have a lot of systemic risk already around things like global conflict. We have climate risk. We have human trafficking and human migration and issues like that. And so we have to get our heads around these systemic risks. But all of that is amplified, complicated, and accelerated by what I call information pollution. In order to make any kind of progress, to fix any kind of problem, we need to have clean information, which is what my field is all about. You can't make business decisions and investment decisions cleanly and carefully if you're being fed garbage information. And so we have an entire apparatus in my field to try to get to some sense of underlying economic truth. And when you go out into the other sphere, which would be all media, from AM radio to Facebook to Twitter, or X, there's so much noise happening there. Much of it is intentional. Much of it is unintentional. But I call that pollution. And it's incredibly difficult to get your head around even what the problem is, where the problem is, how to source it, and how to even find communities with whom I can work to solve the problems. And then also, we're moving into an election cycle, and there's a lot of lack of trust around the elections. There's been a whole literature around the decline, or what they call backsliding, of democratic processes, and threats to some of the fundamental institutions upon which we rely for security: security in banking, security from corruption, even physical security is baked into this. And that infrastructure is being eroded through intentional campaigns to dismantle it, and through lack of trust from simply misunderstanding it. We saw a lot of that through the pandemic, where people just didn't even trust doctors, because they were feeding off of bad information.
To me, as we move into a bunch of global elections, we're going to start deciding who's going to run governments, and the resources that governments bring together to tackle some of these crises, and who our allies will be, are up in the air. We don't even know at this point whether the United States will retain its position within NATO, depending on the outcome of that election, as one example. So how do you solve a regional conflict with an ally when you're not even sure you're going to retain that allyship in the next cycle? And all of that is contaminated by the lack of clean information in the environment we're operating in right now.
04:53 Paul Brandus: A lack of clean information, an interesting phrase. The question we face going forward is how to access that so-called clean information. It's not enough to merely trust something these days, Alan says; we have to learn how to verify things too. Very difficult, if not impossible. And with all of the elections the world is facing this year, including our own in November, time is short. I asked Alan if, in fact, it is too late.
05:26 Alan Jagolinzer: So I never believe it's too late, because I see people who are capable of asking the questions that I typically ask: do I trust the information? And if not, how do I validate it? So I think there are enough skeptics out there who are able to see through the fog. What we have to do, though, is create more people who understand the infrastructure around what I think is the biggest problem, which is disinformation, which by definition is intentionally misleading information, where we've got specific actors who are engaging in it. Let me go down that path, and then I'll answer your question about the business leaders and the potential solutions from that perspective. I define disinformation as having six elements, and the most important of the six are that you have a malign actor with the intention to craft a false narrative and push it through specific dissemination channels to an already selected, targeted, vulnerable audience in order to exploit them. That is inherently the key to disinformation. And this gets into conflicts of interest and incentives, and thinking in terms of incentives, because then you can also ask: what's in it for them? That's one way I try to think about how to unpack this and how to communicate to an audience who might be exploited that, in fact, they've lost their agency. I give a lot of credit to Diane Benscoter, a former cult member who spoke at our summit this summer and who is also working with some people who have been imprisoned for their participation in the January 6 event. She used this word, which I've heard before, but not in this context: agency. It's all about loss of agency. They don't really understand that they're being manipulated and that they've lost their agency in this. That's why I like to frame it in terms of: who wants an outcome? What is the outcome? What's their objective?
And so when you start talking about intentionally feeding polluted information into the ecosystem, usually the incentives are around financial gain, which is fraud, or, in this context, around political capital and power. Sometimes they're psychological incentives, a narcissism the actor has to fulfill. And in some cases it's for physical gratification: if I can gain access to a human being, or encroach on them. The point, though, is that I'd like to have these conversations with business leaders, because I don't think they fully understand the implications of the environment in which they're operating.
08:09 Paul Brandus: There is no shortage of examples in which companies have been caught off guard by disinformation. Alan tells the story of pharmaceutical giant Eli Lilly, insulin prices, and X, or as it used to be known, Twitter.
08:23 Alan Jagolinzer: Businesses are being directly targeted, and they're facing direct implications of disinformation. You see that with the Eli Lilly issue, where there was a fake tweet right after Elon Musk started offering the blue tick marks for $8, I believe. Somebody had posted under a spoof account that insulin was going to be free, and Eli Lilly's stock price plummeted.
08:45 Paul Brandus: In Eli Lilly's case, the false tweet saying that insulin would be free temporarily cut billions off the company's market cap, all because someone with a few bucks got a blue check, something that used to convey legitimacy. For the record, Eli Lilly lowered its insulin price to $35, the price ceiling millions of Americans now pay each month for each of their insulin prescriptions, a key provision of the Inflation Reduction Act, signed into law in 2022. But again, and to use Alan's broader term here, information pollution can take many forms. In the last few months, there has been a rise in one form of this pollution: so-called swatting, when someone falsely claims some kind of public emergency, something like a hostage standoff, a bomb threat, or an active shooter. The goal is to draw a police response to the location of the alleged incident: someone's house, a school, or a place of business. Such criminal hoaxes have occurred with fatal results. Even the White House has been swatted, with a 911 caller falsely claiming in January that there was a fire. Here's a portion of the emergency response to that call. For the record, President Biden was at Camp David when that call came in. But just imagine the potential for mayhem from anyone with, say, nothing more than a burner phone to cover their tracks. This is the kind of thing Alan means when he uses the phrase information pollution.
10:32 Alan Jagolinzer: And then we're seeing things like bomb threats, particularly in situations where a senior leader might take a political stance on one of the hot issues in America, maybe gun rights or abortion rights or something like that. In those contexts, we're seeing what I call direct implications. Other direct implications might be short sellers doing market manipulation, or competitors doing product perception manipulation. I know of some firms that are trying to get companies to be more aware of that. What I don't think is happening, however, where I think we could step up our awareness, is for business leaders to think about the systemic risks, even if they're not being targeted, even if there's nothing really going on around them. The temptation is to sit back and say, well, that's somebody else's problem; I'm just going to keep my head down and operate. But what I don't think they realize is that all around them, this information is polluting the entire ecosystem under which we operate. The systemic risks, in the scenarios I'm now building with senior leaders, could include: what if your workforce is being rounded up for deportation? What if we have civil conflict and, much like before the Israeli-Palestinian conflict, there are a million people clogging the main highway artery in and out of the city? Or, as in Canada, a trucker strike that locks everything down, or some other transportation disruption. So we've got things like stochastic terrorism risk. We've got forced human migration, because some new government or some group of people starts rounding people up and pushing them out, or war, or stranded assets in a conflict area, if we have a supply chain or assets deployed there. A classic example I use is BP's investment in Rosneft.
BP has had a long investment in Rosneft; I believe at one point it was roughly 20%, and they had their CEO on Rosneft's board. I can't fathom that they anticipated the Russian leader was going to launch a massive offensive into Ukraine and that there would be expectations of boycotts, and so on. I could imagine, if we have a supply chain in country X, and we could hypothetically call it India, as you noted: if there's civil turmoil there, are the employees working? Are they safe? Are some being called up for a military draft? Are your assets being nationalized by some authoritarian government?
13:18 Paul Brandus: Let's take a short break here. When we come back, I'll be joined by Meredith Wilson of Emergent Risk International.
13:27 *ad read*: This series on disinformation is a co-production of Evergreen Podcasts and Emergent Risk International, a global risk advisory firm. Emergent Risk International, we build intelligent solutions that find opportunities in a world of risk.
13:48 Paul Brandus: Welcome back. Let's bring in Meredith Wilson now. She is, of course, the Chief Executive Officer of Emergent Risk International. On this issue of information pollution, as the professor calls it, let me give you a couple of things he said, Meredith, and just react to them, if you could, please. He says, and I'm paraphrasing, that even in 2024, companies still may not be aware of the information ecosystem and its inherent dangers. What do you think?
14:25 Meredith Wilson: That's a very general statement. Like most things we talk about on here, it probably doesn't get at the breadth and complexity of companies in general and what they are concerned about. I think there are a lot of companies that are very aware of the dangers of the information environment and have thus beefed up their communications groups, their government affairs groups, their security groups. But those tend to be the larger companies, the ones that have already gotten into reputational problems because of things that happened online that were out of their control. There are certainly plenty of companies out there right now that are not necessarily thinking about it, because they're focused on their day-to-day business, and they could certainly use assistance in better understanding how to navigate that environment.
15:29 Paul Brandus: Well, there's no question that big companies, S&P 500 listed companies and so forth, tend to be more aware of these dangers than others. But mid-cap or small-cap companies perhaps might not be, or might not have the resources to devote to it. Is it possible to name maybe two or three things they can do, without too much capital expenditure, to bolster their defenses against these possible issues?
16:05 Meredith Wilson: I think there are a few things that could be done on an informal basis, simply keeping aware of the news and what's happening writ large. For example, this morning we were looking at an issue that popped up in Hong Kong, where an employee transferred $25 million after being on a call with his entire team that turned out to be a complete deepfake. Understanding where scams are going and where disinformation is going, and how that's affecting companies in general, is something people can learn simply by watching those headlines and keeping an eye on certain types of news, technology news, for example, and reading mainstream, down-the-middle news sources like NBC, CBS, and the Washington Post. But I think there is a deeper amount of work that most companies need to do to understand how this very directly affects them. Somebody probably needs to be put in charge of keeping an eye on what is happening from a news, regulatory, and information perspective, a role that many small companies don't develop until they become larger or mid-sized, simply because they don't have the resources to put into it.
17:50 Paul Brandus: The Hong Kong case was interesting, and it occurred to me that the person who transferred that $25 million did not take the time to double-check, to contact the person later and just say, I wanted to confirm that this was you. Are we at the point now where taking these extra steps, these extra precautions, is going to be mandatory to prevent this kind of thing? I know it slows down business, costs extra money, takes time. But it seems necessary to prevent this sort of thing from happening again.
18:30 Meredith Wilson: Yeah, in this case I don't know enough of the specifics to know exactly what happened there. But imagine if you thought you were on a call with ten of your colleagues: how strange it would seem to get off that call and then phone those ten colleagues, right? This is a brand new thing. And so, yes, there will be responses. On the IT side, companies like Zoom and others that facilitate video calls will be looking at how to ensure this doesn't happen through their software. And on the company side, firms will be looking at how to ensure they have mechanisms in place so that employees don't do that. Those things already exist for other scams. There used to be a lot of scams where somebody would simply send an invoice to a procurement department, and they would pay it, because that was their job. That's why we now have purchase orders, and people having to sign off on spending a certain amount of money; that's exactly why all those mechanisms exist. So, absolutely, there will be others that come out of this scenario and everything that develops from here.
19:54 Paul Brandus: Now, we mentioned big companies a minute ago. Even though they have greater awareness of these issues, and more resources to devote to combating them and so forth, things can still happen. There was a recent example, of course, involving Eli Lilly, the big pharmaceutical giant, where somebody with a fake Twitter, or X, account purporting to be Eli Lilly literally said that, going forward, the company would offer insulin for free, when in fact that was not true. The company's market cap got knocked down by several billion dollars in a matter of moments. They eventually recovered it, but just from a short-term standpoint, things can happen even to big companies, and they're pushed into this reactive stance where something happens, they're caught off guard, and then they have to play catch-up. What do you do about that? Again, we're talking about a big pharmaceutical giant here.
20:57 Meredith Wilson: Well, I think they did what they needed to do about that. Unfortunately, as far as X, formerly Twitter, is concerned, the controls that would have prevented that in the first place have now been taken off, right? There used to be a system in place so that you knew whether an account was official or not. So in the past, somebody would have looked at that and said, oh, that's not an official account. Removing those controls allowed that to happen. And unfortunately, there's a limited amount the public can do about that if these platforms are not putting those things in place. We've seen a whole bunch of that rolled back now, and we'll probably see it swing back the other way with the elections coming up. Unfortunately, it will probably take another scenario where we have to learn from a big mistake like that. But those controls were in place previously. They're just not there anymore.
21:59 Paul Brandus: As for the issue of swatting, which Professor Jagolinzer mentioned, again, that's when someone falsely claims some kind of public emergency in an attempt to trigger a police response. Like the professor, Meredith also calls this a form of disinformation.
22:17 Meredith Wilson: It is. It's a much more violent form. It's been around for at least a couple of decades and really goes back to gamer culture; this is something that started with video gamers who were competing with each other. It's also happened in several cases with kids fooling around, not realizing the implications of what they were doing. And, I want to say it was probably the mid-2000s, there were a couple of deaths associated with those response calls. It seemed to die down for a while, but now it seems to be more of a political tool. Where we've been seeing it is with high-profile political actors, both on the left and the right, who have made decisions that somebody was unhappy with.
23:16 Paul Brandus: And the White House swatting?
23:18 Meredith Wilson: Well, I think the thing to remember is that when it happens to these high-profile individuals, there are probably enough safeguards in place that they're not going to be harmed. It's the non-high-profile people: people working elections, volunteers, things like that. During the pandemic era, you had a lot of public health officials being targeted. People like that, who don't have executive-protection-type safeguards, are the ones really at risk when it comes to this kind of thing. Not so much the high-profile actors, where, yes, it's time-consuming, it takes a lot of resources, and it's not a good thing at all. But the people who really end up getting hurt are the ones who are not the high-profile victims.
24:09 Paul Brandus: Swatting is a federal and often a state crime that can be classified as either a felony or a misdemeanor. On the federal side, swatting can be considered wire fraud when someone voluntarily and intentionally uses interstate communication, a phone or computer, to commit the crime. Convictions for swatting can carry a stiff sentence: years, even life, in prison. Thanks to Dr. Alan Jagolinzer, Professor of Financial Accounting at Cambridge University Business School. Our sound designer and editor, Noah Foutz. Audio engineer, Nathan Corson. Executive producers, Michael DeAloia and Gerardo Orlando. And on behalf of Meredith Wilson, I'm Paul Brandus. Thanks so much for listening.