Users Might Accept Corrections to Fake News, if Facebook Could be Bothered

April 28, 2022

Facebook (aka Meta) has had a bumpy road of late, but perhaps a hypothetical tweak to the news feed could provide a path forward for the Zuckbook. We learn from Newswise that a study recently published in the Journal of Politics suggests that “Corrections on Facebook News Feed Reduces Misinformation.” The paper was co-authored by George Washington University’s Ethan Porter and Ohio State University’s Thomas J. Wood and funded in part by civil society non-profit Avaaz. It contradicts previous research that suggested such an approach could backfire. The article from George Washington University explains:

“Social media users were tested on their accuracy in recognizing misinformation through exposure to corrections on a simulated news feed that was made to look like Facebook’s news feed. However, just like in the real world, people in the experiment were free to ignore the information in the feed that corrected false stories also posted on the news feed. Even when given the freedom to choose what to read in the experiment, users’ accuracy improved when fact-checks were included with false stories. The study’s findings contradict previous research that suggests displaying corrections on social media was ineffective or could even backfire by increasing inaccuracy. Instead, even when users are not compelled to read fact-checks in a simulation of Facebook’s news feed, the new study found they nonetheless became more factually accurate despite exposure to misinformation. This finding was consistent for both liberal and conservative users with only some variation depending on the topic of the misinformation.”

Alongside a control group of subjects who viewed a simulated Facebook feed with no corrections, researchers ran two variants of the experiment. In the first, they placed corrections above the original false stories (all of which had appeared on the real Facebook at some point). In the second, the fake news was blurred out beneath the corrections. Subjects in both versions were asked to judge the stories' veracity on a scale of 1 to 5. See the write-up for more on the study's methodology. One caveat: the researchers acknowledge that potential influences from friends, family, and other connections were outside the scope of the study.

If Facebook adopted a similar procedure on its actual news feed, perhaps it could curb the spread of fake news. But does it really want to? We suppose it must weigh its priorities: reputation and legislative hassles vs. profits. Hmm.

Cynthia Murrell, April 28, 2022
