Author Interview: Adapting To Social Media's Disruptions In 'The Hype Machine'

MICHEL MARTIN, HOST:

In recent years, many of us have been thinking about and, in fact, worrying about how social media is affecting our daily lives, everything from how we vote to how we think we should look. A new book from a data scientist and entrepreneur argues that that is by design, and thus, social media, like any powerful tool, has both promise and peril. And to reap the benefits of these technologies and to avoid being victimized by them, we need to better understand them.

Sinan Aral teaches at the Massachusetts Institute of Technology, MIT. He's also invested in tech companies and consulted with some of the most prominent ones in the world. And he uses this experience to describe in lay terms how social media actually works. His new book is called "The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, And Our Health -- And How We Must Adapt." And Sinan Aral is with us now to tell us more. Welcome. Thank you for joining us.

SINAN ARAL: Thanks for having me.

MARTIN: So people who follow tech probably know your work and may know your name because in 2018, you and two colleagues made headlines when you published a study that found that lies travel faster than truth online. So, first of all, why is that? And is that what made you want to write this book?

ARAL: Well, in that study, which we published in Science, we studied all the verified true and false news that spread on Twitter over 10 years. And we found that false news traveled farther, faster, deeper and more broadly than the truth in every category of information that we studied. The why is because of what we called the novelty hypothesis. So if you read the cognitive science literature, human attention is drawn to novelty. And if you read the sociology literature, we gain in status when we share novel information because it looks like we're in the know or because we have access to inside information. We showed that was true of false news as well. Why I wrote the book - I've been researching social media for 20 years. I've seen its evolution and also the techno-utopianism and dystopianism. I thought it was appropriate to have a book that asks, what can we do to really fix the social media morass we find ourselves in?

MARTIN: You're saying that social media isn't something that just we experience. It's something that actually changes us.

ARAL: Absolutely. Very large-scale experiments have shown that social media can change voting behavior, can change how we exercise, certainly how we shop, how we date. Think about this for a second. This statistic blew me away. Relationships formed by algorithms surpassed relationships formed by human introductions in 2013. When I read that, I thought to myself, how do the matches that are created by these algorithms differ from matches that would have been created in normal life if they weren't created by algorithms? What does that mean, for instance, for our genetic diversity or our evolution going forward if the matches that are being created in relationships via algorithms are different? So it has numerous effects on many different facets of society.

MARTIN: It's - talk a bit, if you would, about how it affects the way people vote. And I'm thinking about this because I'm thinking about a juror that I heard speak on, you know, one of the cable news programs after she served on a jury in which one of President Trump's associates was convicted of conduct related to, you know, Russian malign influence during the 2016 election. And this woman was a Trump supporter. She acknowledged freely that she had voted for President Trump, but she still voted to convict the particular person because she felt that the evidence warranted that. And she was asked about, you know, the whole Russian influence campaign. And she said, well, I didn't see any Russians in the voting booth with me, so my vote had nothing to do with that.

ARAL: Yeah.

MARTIN: But what you're saying is, actually, they were there.

ARAL: Large-scale experiments involving over 60 million people on Facebook have demonstrated that very simple, short messages on Facebook can dramatically change voter turnout. They did experiments in 2010 and in 2012, and they've shown that voter turnout can be affected. We have less indication that it would affect vote choice - that somebody who would definitely have been voting for Hillary Clinton would now vote for Donald Trump. But turnout and voter suppression could have played a role, not only in 2016, but also now as we approach possibly the most consequential election of our generation.

MARTIN: Again, in the peril versus promise ledger, what was disturbing is that it seemed as though many of the proposed solutions have other unintended consequences which are also negative. For example, you said that if you cheered Twitter's decision to label fake news tweets, the labels can also cause readers to distrust true news and create an...

ARAL: That's exactly right.

MARTIN: ...Implied truth effect that leads readers to believe that anything not labeled false is true.

ARAL: So when it comes to labeling fake news, it's incredibly effective because it creates in the user a critical reflection. And experimental studies have shown that when we're reflective about what we're reading, we are much less likely to believe false news and much less likely to spread it. So labeling with algorithms, human moderators and the crowd together could be very effective. But it has to scale, because if you don't label a sufficient number of things, consumers assume that the things that aren't labeled are true. That's the implied truth effect.

And so you can't just have a labeling solution. You have to have a labeling solution that scales. And we've actually done this relatively well in user ratings and reviews - ferreting out false and fraudulent ratings and reviews and policing that entire system through Amazon, other commercial entities, Reddit, Yelp and so on. And I can imagine a labeling system that does work. The point of the book is that it has to be rigorous and nuanced.

MARTIN: For people who don't work in tech, who are not members of Congress but who are just listening to our conversation and also feel that something is wrong and something needs to be different, is there something that individuals can do right now to ameliorate the negative effects and, as you said, to kind of embrace the promise and minimize the peril of these technologies?

ARAL: Yeah. So scheduling social media time in the day - however much you want - and then turning notifications off is a really good way to kind of dampen its effect in our lives. Why? Because, just as Pavlov's dogs were conditioned to salivate at a bell thinking that it's food, the notifications of likes and shares - when our phone vibrates or pings or lights up - are like the same bell. We are conditioned neurologically to want to think about that like or that share. And it comes on a variable reinforcement schedule, meaning we don't know when the next ping is going to come.

So we're constantly thinking about our phones. If we schedule and turn notifications off, we'll kind of control it in our lives. And when it comes to fake news, the concepts that we talked about earlier - being reflective, checking your emotional pulse - if, for instance, you feel like something is disgusting, salacious, anger-inducing, that could be a sign that it's fake. Check the original source. Sometimes these are sources that are masquerading as real websites with a slight change in the website name. And a lot of these fake news stories can be debunked with a few clicks. And so the 80-20 rule applies in that way to fake news. The book details many more examples of how individuals can behave to get control of social media in their lives.

MARTIN: Sinan Aral is a professor at the Massachusetts Institute of Technology, where he directs the MIT Initiative on the Digital Economy. His book, "The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, And Our Health -- And How We Must Adapt" is out now. Professor Aral, thanks so much for talking to us.

ARAL: Thanks for having me.

Transcript provided by NPR, Copyright NPR.
