Exploring YouTube And The Spread Of Disinformation

RACHEL MARTIN, HOST:

When Congress holds hearings about the role of social media in the spread of misinformation, they call the CEOs of Twitter and Facebook, but neither has as much influence as the CEO of Google because Google owns YouTube, and YouTube's reach around the world is massive.

(SOUNDBITE OF MUSIC)

MARTIN: According to the company, which is an NPR supporter, every day people watch 1 billion hours of video on YouTube - 1 billion. Almost 80% of all Internet users around the world have a YouTube account, and content on the platform is available in 80 languages. So wrap your head around those numbers, and then think about how quickly a single piece of false information can spread.

(SOUNDBITE OF MONTAGE)

UNIDENTIFIED PERSON #1: President Trump won four more years in office last night.

At this point, there is evidence...

This is a decisive victory for Trump.

(CROSSTALK)

UNIDENTIFIED PERSON #1: ...Are trying to steal it.

MARTIN: None of that is true, and all of those videos are still accessible on YouTube.

(SOUNDBITE OF MUSIC)

MARTIN: That misinformation distorts reality, and it changes how some people see the world. Then it changes them and their relationships. And that's what we're going to look into this morning as part of NPR's ongoing exploration of mis- and disinformation.

RENEE EKWOGE: My father has always been the most gentle person that you can imagine - and loving. And you didn't see him get angry a lot.

MARTIN: This is 39-year-old Renee Ekwoge. Growing up in Georgia, she remembers her dad being full of wonder.

EKWOGE: You know, I remember getting up in the middle of the night, and it was just me and him and watching the Hale-Bopp comet, you know, when I was, like, 15 years old. And that was just, like a, you know, thing that we did - you got up in the middle of the night, and you watched a comet.

MARTIN: As she got older, her dad's wonder turned in a much darker direction.

EKWOGE: So after 9/11, he got very into the 9/11 truther movement - how, you know, the jet fuel can't melt steel beams, I think, is kind of, like, a meme at this point, right?

MARTIN: And then there were all the wild stories about government plots to poison the population. She engaged him just enough to be polite, and then she'd try to change the subject. It was irritating but manageable.

EKWOGE: And then the world started being flat.

(SOUNDBITE OF MUSIC)

EKWOGE: And he said, well, it's all just about mind control. And at that point, things started to, you know, happen more frequently.

(SOUNDBITE OF MUSIC)

MARTIN: He started sending her YouTube videos on everything from flat-Earth conspiracies to the false theory that the pandemic is a government plot.

EKWOGE: My father has sent me 50 videos since March of last year, so about one a week, that are each, you know, a half-an-hour-long or longer. And those are just the ones that he's forwarding. So I don't know how many he's watching. But I know that when he's sending it, he's getting more reception because you've got more and more people falling into this online - you know, believe everything on YouTube.

MARTIN: When Renee says her dad is getting more reception, she means that when he posts YouTube videos on other social media platforms like Facebook, he gets reactions from people who may then share them with their friends, and the lie is given new life. We're going to talk more about that in a minute. We did reach out to Renee's dad to try to talk with him for this story. He never responded. Renee says the YouTube videos that he has sent her over the past year have gotten more and more inflammatory.

EKWOGE: You know, they're just so good at hyping people up and making you angry about this little thing and repeating an opinion over and over and over again until it sounds like a fact and being very snarky and demeaning with people who don't agree.

(SOUNDBITE OF ARCHIVED RECORDING)

UNIDENTIFIED PERSON #2: They were wrong about death totals, fatality rates. They were wrong about masks. They were wrong about ventilators.

UNIDENTIFIED PERSON #3: Wrong.

UNIDENTIFIED PERSON #2: They were wrong...

EKWOGE: So I watched one video where this guy was misrepresenting the CDC statistics on comorbidities, you know?

(SOUNDBITE OF ARCHIVED RECORDING)

UNIDENTIFIED PERSON #4: But first, there's been a new twist in the COVID crisis.

EKWOGE: So, essentially, he was heavily implying that people in car accidents or with, you know, terminal cancer were being counted as COVID deaths.

(SOUNDBITE OF ARCHIVED RECORDING)

UNIDENTIFIED PERSON #4: That means of the 161,392 COVID deaths that we've been shoving in your face, only 9,210 were legitimate COVID deaths.

MARTIN: Renee did her own research and confronted her dad about the false COVID stats in the video.

EKWOGE: And he didn't want to hear that. You know, it was just very emotional. And so we had a heated exchange, and he said, Renee, shut up. He sent me, since then, three more videos, even though I have repeatedly asked him to stop. And I haven't responded to - replied to any of them because I'm just - you know, what's the point? Have another fight?

(SOUNDBITE OF ARCHIVED RECORDING)

UNIDENTIFIED PERSON #4: And does this new information that proves that COVID is far less deadly than we've been trying to get you to believe mean you shouldn't live with intense fear anymore? Absolutely not. As your trusted...

KATE STARBIRD: I've seen this guy before in different, kind of funny videos prior to COVID.

MARTIN: This is Kate Starbird. She's a co-founder and researcher at the Center for an Informed Public at the University of Washington in Seattle.

STARBIRD: And actually, a couple of the folks that I saw spreading other COVID-related misinformation in my Facebook feeds spread a lot of this guy's videos.

MARTIN: She says YouTube is the biggest proliferator of false information. And a reminder here - misinformation is false but not necessarily intentionally so; disinformation is false information designed explicitly to change people's perception of something.

STARBIRD: We've done research on disinformation around 2016 election, around the civil war in Syria, conspiracy theories of crisis events. I've got a bunch of different cases. Over and over again, YouTube is the dominant domain in those conversations. It's not Facebook. They're all pulling in content from YouTube. So what YouTube does is it creates these content resources that get mobilized on other platforms. And so it's not just a problem within YouTube; it's actually a problem for the whole information ecosystem - the fact that YouTube hosts and allows those videos to be resources that are repeatedly mobilized in these other platforms at opportunistic times to spread mis- and disinformation.

MARTIN: And Starbird says it's hard to understand the real scope of the problem.

STARBIRD: We don't have as great insight into YouTube because it's harder to see. It's one of the platforms that's hardest to collect data about. And it's very - it's almost inscrutable for us compared to some other platforms, like Twitter, where we can collect lots of data and look at things. So YouTube fits centrally into the phenomenon, but it actually - it is harder to access for our research teams.

MARTIN: Explain why that is.

STARBIRD: Yeah, so the first thing - there are very few - we call them APIs, application programming interfaces that we can use to access data on different platforms. For Twitter, any data that you can see publicly on Twitter, we can collect it as researchers and look at it. On Facebook, there are a couple of tools that we can use to access public data and look at sort of public pages. On YouTube, there are very few sort of APIs that we can use, and the ones we use, we very quickly run into rate limits, where we're only allowed to collect so much data. For that reason, we really can't see the larger-scale phenomena, like misinformation and disinformation, with much resolution. We can see, you know, pieces here and there, but we can't systematically collect large-scale data from YouTube to use in our research.

MARTIN: Why not? Who sets those rate limits?

STARBIRD: Oh, the platforms themselves set the rate limits. And Twitter has been very liberal in terms of how they allow access to their data. And for Facebook, it's much harder. There's only certain kinds of access. And for YouTube, we really feel like we're kind of navigating in the dark a little bit. The platform itself has decided not to make their data as accessible.

MARTIN: As for that YouTube video of the guy with the false stats about COVID, YouTube told us it doesn't meet the company's criteria for misinformation. Even Kate Starbird told us it's borderline because it's satirical, and comedy isn't meant to be taken as fact, even though some people do take it that way, which makes all of this more complicated.

YouTube wouldn't put forward a representative to talk with us on the air, but company spokesperson Elena Hernandez gave us a statement, saying that in January of 2019, YouTube changed its algorithms to, quote, "ensure more authoritative content is surfaced and labeled prominently in search results." The company says, from October to December of 2020, they removed 1.4 million videos from their site which they consider to be, quote, "spam, misleading information or scams." As a reference point, YouTube's own stats say more than 500 hours of content are uploaded to the site every minute.

For Renee Ekwoge, these mitigation efforts are just too little, too late, and she fears the same for others who have seen relationships with loved ones disintegrate.

EKWOGE: All the time, growing up, my story was extremely unique. I had no friends whose parents were, you know, conspiracy theorists. And now it's just - everyone has the same story. And it's heartbreaking because you just think, like, all these people that were these doting grandparents and doting parents and, you know, kind and wonderful people, and now all they can do is send you these videos that are just full of hatred and fear and ugliness.

(SOUNDBITE OF MONTAGE)

UNIDENTIFIED PERSON #5: Now linking to autism - big time autism here. Cancer, disease, arthritis - all these other things.

UNIDENTIFIED PERSON #6: You're going to kill people. You're absolutely going to kill people with these vaccines.

UNIDENTIFIED PERSON #7: And, like, sending them out into the world with these psychos everywhere, like, this is [expletive] up.

MARTIN: Renee says she hasn't seen her dad in about a year. She hasn't talked to him for at least four months, and she's not sure what the path back looks like.

Is a part of you grieving your relationship right now?

EKWOGE: Absolutely. I mean, it's - yeah, I mean, I can't even, like, think about it because it's - you know, I mean, like, what am I supposed to tell my daughter? You know, she's 5 years old, but, you know, I don't know how to tell her, like, hey, Grandmother called for your birthday, but Pops didn't, but he did send this string of angry videos for us to watch. You know, it's like, how am I supposed to reconcile that?

(SOUNDBITE OF ALEXANDRA STRELISKI'S "BERCEUSE")

MARTIN: Renee was talking with me, answering my questions, and then she shifted, and she started talking directly to her father and other people's family members who've been lost to online conspiracy theories.

EKWOGE: If you hear this, you know, like, we miss you. We all miss you.

(SOUNDBITE OF ALEXANDRA STRELISKI'S "BERCEUSE")

Transcript provided by NPR, Copyright NPR.

Rachel Martin is a host of Morning Edition, as well as NPR's morning news podcast Up First.