Fake Coronavirus Cures and Theories Give Social Media its Biggest Headache in Years
Misleading theories and the peddling of false remedies on social media present a life-endangering threat.

The biggest reputational risk Facebook and other social media companies had expected in 2020 was fake news surrounding the US presidential election. Be it foreign or domestic in origin, the misinformation threat seemed familiar, perhaps even manageable. The novel coronavirus, however, has opened up an entirely different problem: the life-endangering consequences of supposed cures, misleading claims, snake-oil sales pitches and conspiracy theories about the outbreak.

So far, AFP has debunked almost 200 rumors and myths about the virus, but experts say stronger action from tech companies is needed to curb misinformation and the scale at which it spreads online. "There's still a disconnect between what people think is true and what people are willing to share," Professor David Rand, a specialist in brain and cognitive sciences at the MIT Sloan School of Management, told AFP, explaining that a user's bias toward content he or she thinks will be liked or shared typically dominates decision-making online.

Part of the reason is that social media algorithms are geared to appeal to someone's habits and interests: the emphasis is on likeability, not accuracy. Changing that would require Facebook, Twitter and other such companies to alter what people see on screen. Prompts urging users to consider the accuracy of content they are spreading on social networks are needed, said Rand, co-author of a study on COVID-19 misinformation that was published earlier this month.

Deadly consequences

Using controlled tests with more than 1,600 participants, the study found that false claims were shared in part simply because people failed to think about whether the content was reliable. In a second test, when people were reminded to consider the accuracy of what they were about to share, their level of truth awareness more than doubled.

That approach -- known as "accuracy nudge intervention" -- from social media companies could limit the spread of misinformation, the report concluded. "These are the kind of things that make the concept of accuracy top of the minds of people," said Rand, noting that news feeds are instead filled by users' own content and commercial advertisements.

"There probably is a concern from social networking companies about accuracy warnings degrading the user experience, because you're exposing users to content that they didn't want to see. But I hope by talking about this more we'll get them to take this seriously and try it." What is undoubted is that misinformation about the novel coronavirus has been deadly. Although US, French and other scientists are working to expedite effective treatments, false reports have appeared in numerous countries.

In Iran, a fake remedy of ingesting methanol has reportedly led to 300 deaths, and left many more sick. Dr. Jason McKnight, assistant clinical professor in the Department of Primary Care and Population Health at Texas A&M University, said the sharing of false information has an impact beyond the immediate risk of the virus itself.

"I have seen posts related to 'treatments' that are not proven, techniques to prevent exposure and infection that are either not proven and/or filled with a lot of misleading information, and instruction for individuals to stock up on supplies and food," he said. McKnight highlighted two types of danger posed by inaccurate information on the virus: that it "could incite fear or panic," and "the potential for individuals to do harmful things in hope of 'curing the illness' or 'preventing' the illness."

'Immediate positive impact'

Facebook took a hammering over Russia's interference in the 2016 US election. Having been accused on Capitol Hill of ignoring the allegations, Facebook conceded the following year that up to 10 million Americans had seen advertisements purchased by a shadowy Russian agency. As evidence mounted about how Russia had used Facebook to sow division, company CEO Mark Zuckerberg apologized.

Facebook has placed authoritative coronavirus information at the top of news feeds and intensified its efforts to remove harmful content, including through the use of third-party fact checkers. Zuckerberg also said earlier this month that a public health crisis is an easier arena than politics to set policies and to take a harder line on questionable content.

AFP and other media companies, including Reuters and the Associated Press, work with Facebook's fact checking program, under which content rated false is downgraded in news feeds so that fewer people see it. If someone tries to share such a post, he or she is presented with an article explaining why the information is not accurate.

However, a Facebook spokeswoman declined to comment on the potential for adding accuracy prompts to its platform. A Twitter spokesman, in a statement to AFP, also did not address whether the company might consider using prompts. "Our goal has been to make certain everyone on our service has access to credible, authoritative health information," he said. "We've shifted our focus and priorities, working extensively with organizations like the WHO, ministries of health in a number of countries, and a breadth of public health officials."

The COVID-19 misinformation study mirrored past tests for political fake news, notably in that reminders about accuracy would be a simple way to improve choices about what people share. "Accuracy nudges are straightforward for social media platforms to implement on top of the other approaches they are currently employing, and could have an immediate positive impact on stemming the tide of misinformation about the COVID-19 outbreak," the authors concluded.
