Intimate Conversations with America’s Change-Makers

Burn the Boats is an award-winning podcast featuring intimate conversations with change-makers from every walk of life. Host Ken Harbaugh interviews politicians, authors, activists, and others about the most important issues of our time.

Evergreen Podcasts Earns First-Ever Ambie Nomination for 'Burn The Boats'

Dr. Kate Starbird: Disinformation in the Age of Social Media

S:1 E:59

Dr. Kate Starbird is an Associate Professor at the University of Washington, where she co-founded the Center for an Informed Public, which is dedicated to resisting strategic misinformation, promoting an informed society, and strengthening democratic discourse. Her academic research focuses on how people use social media platforms during crises such as natural disasters, mass shootings, and lately, antidemocratic insurgencies.

You can follow Dr. Starbird on Twitter at @katestarbird

Ken Harbaugh:

Burn the Boats is proud to support VoteVets, the nation’s largest and most impactful progressive veterans organization. To learn more, or to join their mission, go to VoteVets.org.

Dr. Kate Starbird:

And so in some ways it's almost teaching folks to be a little more in tune with how their emotions are being manipulated when they're encountering content online, and less about trying to come up with some counter-mobilization for the positive, because I'm not sure that there's a pathway for that that makes sense.

Ken Harbaugh:

I’m Ken Harbaugh, and this is Burn the Boats, a podcast about big decisions.

My guest today is Dr. Kate Starbird. She is a co-founder of University of Washington's Center for an Informed Public, which is dedicated to resisting strategic misinformation, promoting an informed society, and strengthening democratic discourse. Her academic research focuses on how people use social media platforms during crises such as natural disasters, mass shootings, and lately, antidemocratic insurgencies. Lots to talk about today, Dr. Starbird. Welcome to Burn the Boats.

Dr. Kate Starbird:

All right. Thanks for having me on.

Ken Harbaugh:

We get a lot of politicians, and I guess operators, on this show talking about the impact of disinformation, but we've got an academician here, and I want to start by asking you about a term I think I've heard you use: the infrastructure of disinformation. It's not just all random rumor. There are highways laid down to feed this stuff into our discourse. Can you elaborate?

Dr. Kate Starbird:

I haven't used that term, but there's a lot of great studies coming out that talk about the infrastructure of disinformation. One of the things that we've been thinking about is how the social media platforms, a lot of my work looks at social media platforms and their intersection with other parts of the information ecosystem, but if we look at social media platforms, it's almost as if those networks are now wired for the spread of disinformation, and not because they designed them that way, but because they were leveraged by different kinds of campaigns that eventually laid down this infrastructure, connected websites that share content, networks of following relationships that can be repeatedly leveraged to spread this disinformation.

The algorithms in the platforms have been gamed in so many ways that now some of the algorithms themselves reflect these operations that have happened over the course of many years. So, we've got this integration of how the platforms, these digital platforms, work, and how these tactics have taken advantage of them, and together, that creates this sort of corrupted infrastructure that we can see repeatedly activating to spread false and misleading content for the political and financial gain of different sets of actors, to be honest.

Ken Harbaugh:

Can you give us a concrete example of that in action, whether it's the algorithm elevating disinformation, or things that in your research have jumped out at you?

Dr. Kate Starbird:

Oh, my gosh. So many cases. I mean, we can trace- Some of this stuff we saw in 2016 we could trace back to efforts to do follow trains in 2012, which created these networks of artificially inflated accounts and relationships between real people who are very politically motivated, and a lot of hybrid accounts, like cyborg accounts, that were part automated and part human operated, and those early operations helped to shape different kinds of following networks on Twitter. To be honest, a lot of our research is on Twitter because that's the data we can see. But we imagine similar things are now happening, like Facebook groups where folks have gamed the system in some ways and invited people into groups and grown a certain group around a certain set of ideas or objectives, and then the algorithms of the platforms recommend those groups or those following relationships to other users that might've retweeted one of them once or talked about something similar.

All of a sudden, the algorithms, which don't really know what they're doing but are identifying like-minded people, recommend more of those connections. So, what we see is the development of these very dense networks of highly politically-motivated people, some of them witting in the spread of disinformation, many of them unwitting; they don't really realize that they're part of these networks, or they don't realize that they're being manipulated within these networks, but they're very densely connected, and they can very quickly be activated to spread disinformation around a new campaign.

Ken Harbaugh:

On the surface of things, it wouldn't seem to me that the strengthening of networks or creating super dense information networks is necessarily a bad thing, but then I read your research and your peers' research, which suggests that these networks are really much more activated by incendiary disinformation than by the positive spread of truth. Is that my bias coming through, or is that how it actually works in the world?

Dr. Kate Starbird:

I think that's how people work. When you think of what causes us to take action in the world, it's because we are emotionally activated to do that, and we both have thought about the disaster context before, but you can be activated to volunteer or try to help when you see images of someone in pain or suffering a disaster event, but you can be activated by outrage, political outrage, to spread content, spread misinformation and disinformation, using some of the same kind of emotional, psychological processes. So, what we see in online spaces is that content that emotionally activates us, that's sensational and interesting, is more likely to spread because we're more likely to spread it. So, folks that want to manipulate us can take advantage of that, and then the social media systems, it gets into these cycles where they just keep giving us more and more and more of that content because that's the stuff that spreads the most, and they want to keep us activated and engaged.

Ken Harbaugh:

How accountable should we hold the platforms themselves? I mean, what you seem to be describing is a platform that is a victim of its own out-of-control algorithm, but they control the algorithm. Right?

Dr. Kate Starbird:

Yeah. I think we have to say platforms here; I think we're seeing similar kinds of things happen on all sorts of different kinds of platforms, certainly Facebook and Twitter. We've got a lot of documentation of things happening there, but it's happening on TikTok, YouTube, and certainly on the long tail of smaller platforms as well, and it's even happening on platforms that don't have algorithms, that are all just people forwarding things and don't have algorithmic recommendations, so places like WhatsApp and other places also have disinformation problems. So, we can't purely tie this to the algorithms. However, we know that there are places where the algorithms, or what gets recommended to us when we're on these platforms, certainly play a role, as do our own choices about what we engage with and what we pass along.

In terms of ‘Are the platforms responsible?’, I think there's an early point in the development of the technology where they didn't necessarily know that there were toxicities. Folks may have been telling them that they were there, but they were prioritizing different kinds of things, and they had some sort of deniability on some of that responsibility. But I think we're no longer in that place. I think the platforms know that they have problems. They're working on those problems. They're putting resources into those problems, and they also have people that are making choices that privilege financial gain or other kinds of things over taking some of the harder decisions to address some of those problems.

So, certainly, I would say that they are responsible now for how we move forward, even as a lot of the toxicities that we're seeing now are things that have developed over a long period of time. So, even before they were aware of some of these problems, some of the people that were gaining influence, that we still see being influential in this part of disinformation, had been gaining that influence by gaming these systems for a decade. So, yes, they're responsible, but even the actions they take now can't just be about ‘How do we fix this going forward?’ They have to start thinking about, ‘How do we address the problems that have been here all along, and what do they mean for this moment right now?’

Ken Harbaugh:

How much faith do you put in their efforts to tackle the problems that they have created? I guess I'm asking, is it realistic to expect a corporation with shareholders to elevate ethical and moral obligations above shareholder value?

Dr. Kate Starbird:

Well, that's the question of capitalism right here. I don't know if we can take all of that on in one place, but certainly, there isn't-

Ken Harbaugh:

But you have written about the need for a legislative framework so that it can't just be left to them.

Dr. Kate Starbird:

That's true. I think we do have to put some pressure on the platforms from different directions, and I also am very cognizant of some of the concerns around freedom of expression, and some of the concerns around, okay, we may be trying to address mis and disinformation, and how we define it on a platform right now, but how will that be used in four years or eight years or 20 years here? How will that be used in other parts of the world where they may have a different definition of what those things mean?

So, I realize it's very complex, but I think you're right that at the end of the day, no matter how many good people are working within these platforms to try to address some of these problems, there are other concerns, like their financial bottom line, that are sometimes in tension with the concerns of having healthy democratic discourse on those platforms. So, I do think in part it's government pressure, but it also may be pressure from advertisers, pressure from users, and pressure from within, from the people that are working there, maybe the next generation of labor rights, especially in the tech industry, as people who work for those companies are demanding that the platforms work differently.

Ken Harbaugh:

What have you seen most effective in this area? I mean, I've heard rumors, and maybe it's misinformation, that one of the main drivers of Facebook reforms wasn't fear of the legislative hammer. It was their own employees saying, "Enough is enough."

Dr. Kate Starbird:

I think that's one lever of potential change in this industry. It's probably not enough. You can always find another engineer, but they might not be as good as the one you're losing who cares about these things. But I do think that that is one avenue of pressure that may be productive, and maybe more productive in some ways than some others, but I think it's been a combination of things. There's no simple solution to any of this. It's usually a combination of different things, whether we look at the whole problem in general. It's not just the platforms. It's other kinds of things as well. But even just within the platforms, yeah, it's going to be employees saying that they're not going to do things in that way and demanding that the platforms do better. It'll be government regulation of certain kinds, maybe just more transparency. It's going to be journalism, continuing to highlight things, but hopefully doing it in a way that's constructive and doesn't overly sensationalize the criticism either.

So, I think there's a little bit of tension there for the platforms where every time they make a change, they get piled on for the change that they've made. So, they can feel like they can't win there, so I think some really thoughtful, critical journalism is part of that solution as well, but I don't think it's one thing, but certainly having the employees put some pressure on the platforms I think is part of what's going to make things better.

Ken Harbaugh:

One area that would defy any of the solutions we have been talking about is the rise of platforms hosted in countries that are unfriendly to democracy. Do you worry about that? You see recently the Chinese and Russian governments entering into agreements to create their own internet. For all of the attention that we pay to social media platforms hosted in liberal democracies, and for all of their misbehaviors, I think there's the specter of a totally different ecosystem emerging that would be immune to any of that pressure.

Dr. Kate Starbird:

Yeah. It'll have its own sets of issues around surveillance and different kinds of things. Instead of being surveilled by corporations for advertisements, you'll be surveilled for other kinds of purposes. Yeah, no, I think this is definitely a concern. Even as we've been talking about how we've got these few really big companies that need to do things differently, as they make changes, a lot of their users are going to other places, in part to get away from some of what they call censorship and other kinds of things on platforms like Facebook and Twitter and YouTube as well.

My sense from the research is that we do see deplatforming of individuals when the platforms are taking actions, and those individuals can move to these long-tail platforms, but they lose a lot of their audiences as they do, and even if their audiences go with them, they lose the opportunity to recruit new people into their conversations because these other platforms just don't have the same kind of scale. One of the things we're perceiving with our current information system, I don't know if it's 100% of the problem or 2% of the problem, somewhere in between that, is the scale at which a certain piece of information can reach so many people without much moderation, and that scale doesn't map to some of these long-tail platforms right now.

If another platform that's hosted in another place with a completely different set of norms and rules and values becomes that big, and becomes that big in terms of US usership or usership from other folks that are in the current democratic world, I think that creates a different set of challenges. But right now, I think the worry is more people moving into the smaller platforms and those becoming places of extremism and toxicity, and that's absolutely in our future. But it's a slightly changed situation because they'll have a harder time recruiting and a harder time reaching so many people so fast, which is a little bit, I think, of what's happening right now while Facebook and Twitter can still be leveraged for the spread of propaganda.

Ken Harbaugh:

You've used the term a couple of times, long-tail platforms. Can you just explain that?

Dr. Kate Starbird:

Yeah. Long tail, for me, I'm just talking about… We pretend that it's very democratic and you have all these different opportunities to engage, but most engagement happens around a few different accounts. Whether you're looking inside a platform, a small number of accounts get most of the retweets. Similarly, for platforms, a small number of platforms have most of the users, and then there's a ton of other platforms that have much smaller numbers of users, and we call that the long tail. It's the smaller platforms that have fewer users, but there are many and diverse platforms for different purposes in that long tail.

Ken Harbaugh:

I imagine that it is in that long tail where most of the radicalization occurs. Before I go on with a couple of questions related to that, is that assumption fair?

Dr. Kate Starbird:

I think radicalization can happen in different places. The recruiting happens on the bigger platforms, and then they may go into that long tail, into a 4chan or an 8chan or Gab or something else, into spaces that become more radical. So, yes, I think that's a pretty good way of describing the situation, although we do also see some radicalization happening in Facebook groups or other more insulated groups within a larger platform.

Ken Harbaugh:

Is it in the public interest to try to keep more people in the, I guess, whatever the opposite of the long-tail platforms are, the mainstream platforms, where they actually have the resources to intercept some of these radicalization efforts and some of the misinformation campaigns? I mean, what risks do you see associated with the flight of people to these long-tail platforms, where radicalization is so much more a part of the attraction?

Dr. Kate Starbird:

Yeah. I don't know how to answer this, because I don't think I know the answer to it. There are some great arguments for breaking up the big platforms related to fighting monopolies and more diversity of ideas, these kinds of things, and often, when I look at a lot of those proposals, I agree with the premise of ‘Yeah, it's good.’ It would probably be better to have people spread out, but then there's this little lingering piece of me like, ‘But the disinformation problem doesn't necessarily get better if we spread people out on different platforms.’ The only thing I would come back to is that because they'll be in different places, it may be harder for things to move so fast and to move so many people, and it may add a little bit of friction to that intentional spread of content across different groups, and also, recruiting may be more difficult if things are spread.

So, again, your smaller spaces, if people end up in them, they can become more radicalized in those spaces, but there may be fewer people that end up in those spaces in the more sort of compartmentalized social media environment. But I don't think we know the answer. We're learning a lot of things as we go, after the changes have already been made, and being like, "Oh, gosh. How did we get here? We made that decision, and here we are." So, I don't know if that'll be a better world or a worse world, but I'm not sure we have much control over whether or not we're going there, because I do see more and more movement, especially from folks who are invested in political manipulation and radicalization processes. They do seem to be moving into these long-tail platforms.

Ken Harbaugh:

You've talked about the power of the emotional reaction to motivate people. Anger is an incredible motivator. Fear is an incredible motivator. Is there anything on the positive end of the spectrum that can be as powerful in pulling people back to being rational actors? Is there even one example of a positive information campaign that spread as virally as these anger and fear-driven misinformation campaigns?

Dr. Kate Starbird:

That's interesting because we've seen so many positive things come out of the internet that are so similar to what we're seeing now with misinformation, disinformation, and this political outrage machine. So, if we look back, my own research, again, used to focus on social media during disaster, and we could see people use social media to draw attention to a humanitarian crisis in Haiti, help to fundraise, help to find volunteers who would come together and eventually go there to help, and I've seen some of the best of human behavior organized on social media in response to crisis events and other kinds of things as well. We've seen people come together in all sorts of interesting and unique ways to do amazing things.

The thing is, it's easy to emotionally activate someone there. Not easy, but we can see emotional activation happening for the positive. We can see emotional activation happening for these kinds of negative things, but I'm not sure we know how to counter the emotional activation that's happening for the negative things. There's not like, oh, there's a positive that's going to pull them in the other direction. It's almost like we have to encourage people to be less emotionally activated and to be more thoughtful about how they're engaging with this content that's made... It's designed to emotionally activate them, and so in some ways it's almost teaching folks to be a little more in tune with how their emotions are being manipulated when they're encountering content online, and less about trying to come up with some counter-mobilization for the positive, because I'm not sure that there's a pathway for that that makes sense.

Ken Harbaugh:

I think that's exactly what I'm getting at, because it feels like, in contrast to a campaign to generate a response around a natural disaster or something like that, the political arena is fundamentally different in that it's oppositional by nature, and most responses in that political arena are going to be opposing someone else. So, I guess you're right. The challenge to misinformation has to be to dial down the emotion, not to create an equally powerful counter-emotion.

Dr. Kate Starbird:

Indeed. Yeah, because when we're emotionally activated, whether it's to try to help in a crisis event or whether it's emotionally activated around political stuff, we're very vulnerable to spreading this information. In fact, my team's research comes into this space, initially studying rumoring and misinformation in crisis events, because we began to recognize that people who were responding online were very vulnerable to spreading misinformation about the crises, sometimes because of people who were trying to manipulate them, and often just because it was accidental misinformation, because when we're all caught up in something, we're excited, we're activated, this is a time where we shut down that part of our brain that does the verification, or we make these choices that are, “Well, it's better for me to spread this just in case it's true, or I want to spread this because I want to show people that I'm part of this thing, and it's not as important whether or not it's true because there are these other reasons that I'm sharing it.”

That happens in both of these different kinds of contexts, and we can think about how to make people more aware that when you're in that excited state, this is where you can actually do some damage to the things that you care about.

When we really look at the research on the spread of misinformation in online spaces, we've talked about how it's hard for people to determine whether or not something's true, and that's true, or we need to do a better job teaching critical thinking. Okay, this is probably true too. But it turns out that most people, when they're engaging in political discourse, don't bring their critical thinking skills to unpack and tear down the things they like. They only bring their critical thinking skills to unpack and tear down the things they don't like.

So, until we can actually motivate them to bring that skepticism and that critical thinking to the stuff that's politically aligned with them, and actually vet that stuff, it does no good to teach media literacy. So, media literacy can't just be about figuring out whether something's true or false. It has to be about teaching people when to bring in our skills for figuring out what things are true and false, and to be more likely to bring them toward content that we like, that aligns with what we believe. That's the content we need to be vetting the most, and unfortunately, a lot of us don't do that when we engage with content online.

Ken Harbaugh:

Why, in the same vein, are conspiracy theories so comforting to people in trying to make sense... Well, I won't guess as to why, but you've studied this, and conspiracy theories have an ability to propagate in ways that the mundane truth does not. Why is that?

Dr. Kate Starbird:

In some ways, conspiracy theories are sort of an outgrowth of the natural sense-making process that we go through when we're trying to make sense of anything in the world, especially a crisis event, but political discourse, a pandemic, potential climate change. We want to… These are crises that we're experiencing as societies, and we come together and we try to make sense of things. Conspiracy theories are an outgrowth of this natural sense-making process and are kind of like a corrupted version where they're not starting from, "Okay, here's the evidence, and let's try to figure out what's going on." They start with a couple of presumed theories about the world, and then they try to assemble that evidence to fit that theory.

So, they start off with ‘Things aren't what they seem. There's some powerful person or powerful people, a small set of powerful people who are trying to pull the strings of what's happening in the world, and they're trying to pull one over on us, and I'm going to take these pieces of information that I have and assemble in some way to fit that theory.’

So, that's what we see over and over again in the crisis space, where instead of trying to come to terms with what's happening, people assemble evidence to try to prove that things aren't as they seem, and that some specific entity is pulling the strings of world events.

Why are they so attractive? There are so many reasons. It's the reason we'd rather watch a thriller on television than read a book about what happened yesterday in some mundane environment. Conspiracy theories are entertaining. Effective conspiracy theories draw on previous conspiracy theories that we believed already or believe a little bit of. They also often make us feel smart, or make us feel like we have special knowledge that other people don't have. That can be one appeal of conspiracy theories. Often, they take complex things and random things in the world, and they try to put a pattern over them, and an explanation, and assign a simple cause or a villain, and it might be a complex story, but behind the complex story of a conspiracy theory there's usually a simplistic idea that a certain set of villains is causing bad things in the world. Often, the world is a lot more complicated than that, but conspiracy theories help us put patterns onto it that help us make sense of things, and even if it's not the truth, it might be more comforting to us than the truth.

Ken Harbaugh:

With that, I guess, historical appreciation for the recurrence of conspiracy theories, do you see the QAnon adherents as uniquely dangerous, or just the reiteration of conspiracy theories going back to time immemorial? Are we overreacting?

Dr. Kate Starbird:

I have no idea if we're overreacting. I thought I was overreacting in 2012 or 2013. The first time, I was doing some research and saw some conspiracy theories, and we were like, "Oh, we're probably overreacting to this. Let's not even put it in the paper. We don't really want to talk about this. Let's talk about the rumors we're seeing, and not the conspiracy theory, because it just... We don't want to draw attention to it. It seems like such a small thing." Certainly, we're at a different point now, and QAnon certainly has a lot of size.

Ken Harbaugh:

Can you summarize it? It's wildly entertaining, as disturbing as it is.

Dr. Kate Starbird:

I don't think I can summarize it. QAnon is a combination of so many different conspiracy theories at once, and it adapts, as any cult does, to new information that doesn't have... A lot of cults have, “Oh, this will be the end of days”, and then the end day is not the end day. There's another end day, and then they've got to adapt to something different because their prophecies don't come true. So, QAnon has had so much change over the course of its, what is it? Four or five years now? It's had so much change over the course of those years that it's difficult to simply summarize, but the thing that's most interesting about QAnon is this participatory nature of it, where initially, this person who called themselves Q, they were putting out these clues and then allowing these online crowds to try to make sense of the clues, and what the online crowds would do was come up with some explanation for the clue, and it was either a prediction of something or an explanation of something else.

So, they were creating the mythology. It was this participatory creation of their own mythology, and it was also like a game that they were all playing together, and they were coming together and connecting. When I looked at the data, it looked like an online gaming exercise: a lot of people all playing together, working together to come up with what the clue meant, and then, once they had a good theory for the latest clue, they would try to work together to spread it as far as they could, and so they had these different kinds of tasks that they would take part in. So, it's a massive multiplayer game, with this ideology that brings together just about every conspiracy theory we've ever heard of, all into one umbrella, and with this idea initially that there was some massive pedophile ring all over the world, mostly initially connected to US Democrat politicians, but that got adapted for other contexts, and then that Donald Trump was going to court-martial all these people, and then they would be, I think, publicly hanged or something.

I mean, I think it was some pretty dark ideology, and then they kept predicting that these different things would come true and that certain things he said were signals that this thing was happening. I mean, I can't even get my head wrapped around the whole thing, to be honest, and I'm not sure that even the adherents could, because it was constantly changing and being updated. But it doesn't work without the internet, because the internet is what brought people together at such a scale.

So, when you think about whether we're overreacting, the thing that makes us nervous about QAnon is that the scale of it is of a level that we haven't seen ever in terms of reaching that many people so fast. We can see religions develop and, over time, eventually gain a foothold and spread really far, but here we have this sort of conspiracy-based religious movement that's just expanded all over the world extremely quickly.

Ken Harbaugh:

I know you're not a historian by training, but can you think of examples of societies in the past successfully challenging these kinds of movements, and successfully depolarizing or deradicalizing massive shares of their populations, without violence, that is?

Dr. Kate Starbird:

I don't know. I can't think of something similar enough, and when I try, right, as you're asking me to, I'm thinking of things, and the ending is very dark for a lot of those.

I don't think we can reduce QAnon to a cult, but I think understanding the psychology of cults can be helpful in thinking about some adherents of QAnon, and often, they don't end well for the folks that are most deeply connected to them. I think one hope here, in an online environment, is that the connections might not be as deep as they would be for people that are together physically as part of these things, so I think there may be something different there. People always make that accusation: activism isn't the same online, it's not as strong, people don't really know each other.

Perhaps there's some hope there, in that a lot of this could, perhaps, be disrupted by other kinds of interpersonal interactions with people in their physical worlds, but I don't know. I don't think history gives us a lot of great examples for how this might go, in part because it's such a different moment in terms of the numbers of people and how they're connecting.

Ken Harbaugh:

In getting ready for this interview, I went down a few rabbit holes, but one of them got me thinking about the emergence of new technologies and how massive adoption is always disruptive to the social order, and I read about the printing press and the challenges to Catholic orthodoxy that it led to, which on one hand makes you want to believe that we've been here before. We'll get through it. Technological upheaval is cyclical, and it's just part of the evolution of human societies. On the other hand, a lot of these stories end with massive bloodshed, like the religious wars across Europe. So, I don't know what the question in that is, but do you have a prediction, I guess, of how this latest technological upheaval will end?

Dr. Kate Starbird:

I tend not to predict things, but I agree that our observations of history suggest that these kinds of disruptions can lead to bloodshed and upheaval of social organization. I'm hopeful that that's not where we're headed. I think we have the benefit of being able to look back at history. I think we have different incentives, and we're connected in different ways that could perhaps help alleviate that, even though our connections cause some of these issues as well. So, I'm hopeful that that's not exactly where we're headed, but I do think we're in a moment of disruption, and how long that moment lasts and what comes out the other side, I don't think we know.

Ken Harbaugh:

Well, I hate ending things on a bad note, so I wasn't intending to go here, but another rabbit hole I went down was reading about some of your family lineage and your-

Dr. Kate Starbird:

Oh, my goodness.

Ken Harbaugh:

If I'm not mistaken, your grandfather served on the same Olympic team with Jesse Owens at the '36 Olympics in Berlin. Is that true?

Dr. Kate Starbird:

One correction. Both of my grandfathers were on the modern pentathlon team in 1936, so they were both there.

Ken Harbaugh:

Incredible.

Dr. Kate Starbird:

Yeah.

Ken Harbaugh:

Did you ever get them to share stories of that, of marching in the US Team opening parade in front of a review stand with Adolf Hitler at the top of it?

Dr. Kate Starbird:

Yeah. So, my father's father died when I was eight years old, so I did not have a chance to talk to him much about it. My mother's father, he decided what he talked about, and when, and so we didn't ask a lot of questions, but I do remember one year he was visiting us, and we were watching the Olympics, and the coverage was going on. It must've been a commercial break, and he said, "Well, I knew a great man once," and of course, whenever he talked we would all stop. We turned off the television, and he said, "I knew a great man once, Jesse Owens," and he went on to tell the story of meeting Jesse Owens on the boat, and how that changed his whole perception of race and really altered his outlook on a lot of things. He had just the greatest amount of respect for Jesse Owens. As kids, we just remember that. My grandfather didn't tell a lot of stories, but when he did, we all listened, and that was one of the few that I remember.

Ken Harbaugh:

And that was the boat across the Atlantic, I take it?

Dr. Kate Starbird:

Yeah. They were on there for several days. They had to train, and they got to know each other and have conversations, and I think… And then once they got there, of course, it's 1936 in Berlin, and there's all sorts of interesting things happening. My grandfather wrote letters home that I own now, but I have not yet digitized. At some point in my life, I plan to digitize them and put those out into the world.

Ken Harbaugh:

Well, I hope you do.

Dr. Kate Starbird:

Yeah.

Ken Harbaugh:

That's extraordinary. I'm glad I asked. Dr. Starbird, it's been great having you on Burn the Boats. Thanks so much for coming on.

Dr. Kate Starbird:

All right. Thank you so much, and thanks for the great questions.

Ken Harbaugh:

Thanks again to Kate for joining me.

You can find her on Twitter at @katestarbird

Thanks for listening to Burn the Boats. If you have any feedback, please email the team at [email protected]. We’re always looking to improve the show.

For updates and more, follow us on Twitter at @Team_Harbaugh.

And if you enjoyed this episode, don’t forget to rate and review.

Thanks to our partner, VoteVets. Their mission is to give a voice to veterans on matters of national security, veterans’ care, and issues that affect the lives of those who have served. VoteVets is backed by more than 700,000 veterans, family members, and their supporters. To learn more, go to VoteVets.org.

Burn the Boats is a production of Evergreen Podcasts. Our producer is Declan Rohrs, and Sean Rule-Hoffman is our Audio Engineer. Special thanks to Evergreen executive producers Joan Andrews, Michael DeAloia, and David Moss.

I’m Ken Harbaugh and this is Burn the Boats, a podcast about big decisions.
