Research? Sure. Accurate? Yeah, Sort Of

October 19, 2021

Facebook is currently under scrutiny unlike any it has seen since the 2018 Cambridge Analytica scandal. Ironically, much of the criticism cites research produced by the company itself. The Verge discusses “Why These Facebook Research Scandals Are Different.” Reporter Casey Newton describes a series of Wall Street Journal stories on Facebook, collectively known as The Facebook Files. We learn:

“The stories detail an opaque, separate system of government for elite users known as XCheck; provide evidence that Instagram can be harmful to a significant percentage of teenage girls; and reveal that entire political parties have changed their policies in response to changes in the News Feed algorithm. The stories also uncovered massive inequality in how Facebook moderates content in foreign countries compared to the investment it has made in the United States. The stories have galvanized public attention, and members of Congress have announced a probe. And scrutiny is growing as reporters at other outlets contribute material of their own. For instance: MIT Technology Review found that despite Facebook’s significant investment in security, by October 2019, Eastern European troll farms reached 140 million people a month with propaganda — and 75 percent of those users saw it not because they followed a page but because Facebook’s recommendation engine served it to them. ProPublica investigated Facebook Marketplace and found thousands of fake accounts participating in a wide variety of scams. The New York Times revealed that Facebook has sought to improve its reputation in part by pumping pro-Facebook stories into the News Feed, an effort known as ‘Project Amplify.’”

Yes, Facebook is doing everything it can to convince people it is a force for good despite the negative press. This includes implementing “Project Amplify” on its own platform to persuade users that its reputation is better than whatever they may have heard elsewhere. Pay no attention to the man behind the curtain. We learn the company may also stop producing in-house research that reveals its own harmful nature. Not surprising, though Newton argues Facebook should do more research, not less; transparency, he says, would help build trust. Somehow we doubt the company will take that advice.

A legacy of the Cambridge Analytica affair is the idea that social media algorithms, perhaps Facebook’s especially, are reshaping society. And not in a good way. It remains unclear how, and to what extent, each social media company works to curtail false and harmful content. Is Facebook finally facing a reckoning, and will it eventually extend to social media in general? See the article for more discussion.

Cynthia Murrell, October 19, 2021
