Fake News, Facebook And The Truth About Misinformation

Mar 6, 2018

Brian Southwell is a researcher at RTI International and an editor of 'Misinformation and Mass Audiences,' which explores connections between fake news and social media.
Credit Courtesy of Brian Southwell

How long does it take the brain to take in information and process it as fact or fiction? Not long, according to Brian Southwell. He is a researcher at RTI International and co-editor of the book “Misinformation and Mass Audiences” (University of Texas Press, 2018), which looks at the science and psychology behind "fake news.”

'Misinformation and Mass Audiences,' co-edited by Brian Southwell, Emily Thorson, and Laura Sheble, is available now from University of Texas Press.
Credit Courtesy of Brian Southwell

Southwell interrogates why people share stories they know are not true and what responsibility social media networks should have in the distribution of misinformation.

Host Frank Stasio talks with Southwell, a professor at both Duke University and UNC-Chapel Hill, about findings that may scare people away from their Twitter feeds and give them tools to protect themselves from misinformation and disinformation.


On how quickly we process if information is fact or fake:
We are actually biased towards acceptance of information at face value … [Philosopher Spinoza] suggested that we actually take in information, accept it and then later tag it as being true or false.  And that slight step – that subtlety – actually makes all the difference ... And it suggests that there’s this complication because we actually accept information, and if we have the energy or the time or the effort, we can label that as being true or false. But it leaves the door open for the spread of misinformation.  

Senator Dianne Feinstein (D-CA) grills social media giants over distributing misinformation:
On the social responsibility of Google, Facebook, Twitter and other platforms:

They are distributors of information. They’re not in that way any different than other media outlets. At the same time, we might also worry about the absolute remedy that might be in some people’s minds, which would be a lot closer to censorship than I’d be comfortable with. And so, there’s this trade-off. I think there absolutely is a responsibility for the information that gets distributed. At the same time we want to make sure we don’t have a system that’s so sanitized and so airtight that the truth can’t find its way in.

On the problem with fake social media posts and the algorithms that keep spreading them:

This is really a fascinating arena now to try to figure out what the right remedies could be, both in terms of technological possibilities and in terms of ethics … Actually we’re part of an effort with the Rita Allen Foundation to issue a call for technological innovation and educational innovation as a remedy against the spread of misinformation. We’re working on that later this year with a big forum in Washington, D.C. I think that there are possibilities that have to do with the mechanics of how these platforms operate, and they’re thinking about this all the time too.

Facebook founder Mark Zuckerberg admits to a breach and announces his plan: