NOEL KING, HOST:
A doctored video posted on Facebook seemed to show House Speaker Nancy Pelosi disoriented and slurring her words. It got millions of views. It also spread across social media. That video raises a question. As we get closer to the 2020 election, how worried should we be about misinformation hijacking the political conversation? NPR media correspondent David Folkenflik is on the line. Good morning, David.
DAVID FOLKENFLIK, BYLINE: Good morning.
KING: So we have a doctored video that was widely spread anyway. Do videos like this influence what people think?
FOLKENFLIK: Sure, they do. I think that they do it on a couple of levels. The first level is that there are people who say, oh, my gosh, the House speaker, you know, we've got to be concerned about her health. Or there may be people not predisposed to be favorable to Democrats, to liberals, to the House speaker, and happy to have this to use against her. So there are people who may believe something of this to be true. And there are people for whom it's a little bit like waving, I guess, the Terrible Towel for Pittsburgh Steelers fans, like David Greene. You know, it's a way for them to cheer on their side. And so they use it as a rallying cry whether or not they believe, fundamentally, that it's true.
But don't forget. Go back to the 2016 race. Hillary Clinton had videos like this circulate when she had the flu or a cough, when she stumbled once getting into an SUV and people claimed she was near death, you know, every few days, almost as though she had died hundreds of deaths between when the campaign started and when it ended.
KING: News organizations were able to prove very quickly that the video of Pelosi had been slowed down and edited. That's why it seemed like she was slurring. So it's been debunked. Does that change anything? Or once it's out there, it's out there?
FOLKENFLIK: Bit of both. I think there are people who say, oh, maybe I should back off a bit. But it's still in your mind. It's a little bit like being told not to think about a pink elephant.
KING: Yeah.
FOLKENFLIK: They've planted the idea. Yasmin Green is with Jigsaw, a think tank that's an offshoot of Google's parent company, Alphabet. She introduced me to the idea of the liar's dividend, which is that whether or not something is true, whether or not it's fully believed, it's taking up oxygen. It's taking up mind space. So people are thinking about this issue. People are talking about this issue. Hey, you and I are, in a sense, talking about this issue.
KING: Yeah.
FOLKENFLIK: And it's an issue that has no grounding or validity whatsoever so far as we know.
KING: When the video came out, people alerted Facebook, and Facebook flagged it as fake. But it didn't take the video down. What was the company's explanation for not removing it, given that it's not real?
FOLKENFLIK: Well, there are a couple of grounds. One is that they say, you know, we want to give people sufficient information to be able to reach good, sound conclusions, to make the choices they want to in life - which sounds a little bit like journalism. Facebook did belatedly take some steps. After a few days, it flagged the video as problematic and offered users links to fact checks if they wanted to go there instead of seeing the video itself. But they also say they're not journalists.
They say, we are a social media platform. We want to connect people and allow them to talk. They are in the business of conveying content, and in that content is journalism and news and true information. At the same time, they're very nervous about exercising that kind of editorial judgment. And part of that is just the sheer scope of what they're trying to do.
Mark Zuckerberg, the founder and CEO, recently boasted of how many billions of fake accounts the company has wiped out. And new studies reported on by BuzzFeed suggest they have indeed done that - but also that there are more fake accounts on Facebook than ever. So it's an internal process in which they're eternally behind the curve, trying to catch up.
KING: And given the media landscape, it's probably not as easy as just saying Facebook should be more aggressive in taking these videos down, right?
FOLKENFLIK: Well, I think Facebook is incapable, really, of doing what it needs to do to show that it's a responsible steward of this. And there are those who say, look, there are First Amendment issues here. Do we want free speech calls being made by people in Silicon Valley, whose fundamental obligation is not to the citizen but to their shareholders, and who ultimately have a bottom-line profit motive to keep us fully engaged every minute they can?
KING: NPR media correspondent David Folkenflik. Thanks, David.
FOLKENFLIK: You bet.