Here's How Users React to Fake News on Twitter and Facebook
Researchers watched 25 participants scroll through their Facebook or Twitter feeds while a Google Chrome extension randomly inserted fake content.

When it comes to fake news on social media, some users ignore it outright, some take it at face value, some investigate whether it is true, and some grow suspicious but choose to ignore it anyway. Platforms such as Facebook and Twitter put enormous amounts of information in front of people, and it is getting harder and harder to tell what is real and what is not.

"We wanted to understand what people do when they encounter fake news or misinformation in their feeds. Do they notice it? What do they do about it?" said senior study author Franziska Roesner, an associate professor at the University of Washington in Seattle. "There are a lot of people who are trying to be good consumers of information and they're struggling. If we can understand what these people are doing, we might be able to design tools that can help them," added Roesner in a paper accepted to the 2020 ACM CHI Conference on Human Factors in Computing Systems.

The team watched 25 participants scroll through their Facebook or Twitter feeds while a Google Chrome extension randomly added debunked content on top of some of the real posts. Participants had various reactions to encountering a fake post. Previous research on how people interact with misinformation asked participants to examine content from a researcher-created account, not from someone they chose to follow.

"That might make people automatically suspicious," said lead author Christine Geeng, a doctoral student. "We made sure that all the posts looked like they came from people that our participants followed". The researchers either installed the extension on the participant's laptop or the participant logged into their accounts on the researcher's laptop, which had the extension enabled.

The team told the participants that the extension would modify their feeds -- the researchers did not say how -- and would track their likes and shares during the study -- though, in fact, it wasn't tracking anything. The extension was removed from the participants' laptops at the end of the study. Participants could not actually like or share the fake posts. On Twitter, a "retweet" would share the real content beneath the fake post.
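The article does not describe how the extension was built, but one way such a tool could work -- purely as an illustration -- is a browser content script that draws a fabricated post over a randomly chosen real one while leaving the original element, and its retweet controls, underneath. That would also explain why a retweet shared the real content beneath the fake post. The selector, class name, and sample post in the sketch below are assumptions, not the study's actual implementation.

```ts
// Illustrative content-script sketch: overlay fabricated content on top of a
// few randomly chosen real posts. Selectors and data are hypothetical.

interface FakePost {
  author: string; // display name matching an account the participant follows
  text: string;   // a previously debunked claim
}

const FAKE_POSTS: FakePost[] = [
  { author: "Example Account", text: "Debunked claim goes here." },
];

function overlayFakePosts(fakePosts: FakePost[], probability = 0.2): void {
  // '[data-testid="tweet"]' stands in for whatever selector matches timeline posts.
  const posts = document.querySelectorAll<HTMLElement>('[data-testid="tweet"]');
  let next = 0;

  posts.forEach((post) => {
    if (next >= fakePosts.length || Math.random() > probability) return;

    // The overlay visually replaces the post, but the real post -- and its
    // retweet/share controls -- remains in the DOM underneath it.
    const overlay = document.createElement("div");
    overlay.className = "injected-fake-post";
    overlay.textContent = `${fakePosts[next].author}: ${fakePosts[next].text}`;
    overlay.style.cssText =
      "position:absolute; inset:0; background:#fff; z-index:10; padding:12px;";

    post.style.position = "relative";
    post.appendChild(overlay);
    next++;
  });
}

overlayFakePosts(FAKE_POSTS);
```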

The one time a participant did retweet content under the fake post, the researchers helped them undo it after the study was over. On Facebook, the like and share buttons didn't work at all. After the participants encountered all the fake posts -- nine for Facebook and seven for Twitter -- the researchers stopped the study and explained what was going on.
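The detail that the like and share buttons on fake Facebook posts "didn't work at all" could likewise be approximated, again only as a sketch, by swallowing any click that originates inside an injected overlay before it reaches the platform's own handlers. The class name below carries over from the hypothetical sketch above.

```ts
// Illustrative sketch: block clicks inside injected fake posts so that likes
// and shares on them never fire. The class name is a hypothetical carried
// over from the previous sketch.

document.addEventListener(
  "click",
  (event) => {
    const target = event.target as HTMLElement | null;
    // Only intercept clicks that originate inside an injected fake post.
    if (target && target.closest(".injected-fake-post")) {
      event.preventDefault();
      event.stopPropagation();
    }
  },
  true, // capture phase: intercept before the page's own click handlers run
);
```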

"Our goal was not to trick participants or to make them feel exposed. We wanted to normalize the difficulty of determining what's fake and what's not," said Geeng. While this study was small, it does provide a framework for how people react to misinformation on social media. The researchers can use this as a starting point to seek interventions to help people resist misinformation in their feeds.
