Social Media Engagement, Manipulation, and Bad Information

October 1, 2021

Filippo Menczer and his fellow researchers have investigated the interactions between users and social media platforms. Menczer shares some of the results at Harvard’s Nieman Lab in “How ‘Engagement’ Makes You Vulnerable to Manipulation and Misinformation on Social Media.” Social media algorithms rely on the “wisdom of the crowds” to determine what users see. That instinct helped our ancestors avoid danger: when faced with a fleeing throng, they ran first and asked questions later. However, there are several reasons the approach breaks down online. Menczer writes:

“The wisdom of the crowds fails because it is built on the false assumption that the crowd is made up of diverse, independent sources. There may be several reasons this is not the case.

“First, because of people’s tendency to associate with similar people, their online neighborhoods are not very diverse. The ease with which a social media user can unfriend those with whom they disagree pushes people into homogeneous communities, often referred to as echo chambers.

“Second, because many people’s friends are friends of each other, they influence each other. A famous experiment demonstrated that knowing what music your friends like affects your own stated preferences. Your social desire to conform distorts your independent judgment.

“Third, popularity signals can be gamed. Over the years, search engines have developed sophisticated techniques to counter so-called ‘link farms’ and other schemes to manipulate search algorithms. Social media platforms, on the other hand, are just beginning to learn about their own vulnerabilities. People aiming to manipulate the information market have created fake accounts, like trolls and social bots, and organized fake networks. They have flooded the network to create the appearance that a conspiracy theory or a political candidate is popular, tricking both platform algorithms and people’s cognitive biases at once. They have even altered the structure of social networks to create illusions about majority opinions.”
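To make that third point concrete, consider a minimal Python sketch of how an engagement-ranked feed can be gamed. Every name and number below is an invented assumption for illustration; this is not any platform’s actual ranking algorithm.

    # Illustrative sketch: a naive, engagement-ranked feed cannot distinguish
    # organic popularity from coordinated fake-account activity. All names and
    # numbers are invented for demonstration purposes.

    from dataclasses import dataclass

    @dataclass
    class Post:
        title: str
        organic_engagements: int  # likes/shares from genuine, independent users
        bot_engagements: int = 0  # likes/shares from a coordinated fake network

        def engagement_score(self) -> int:
            # The ranker sees only one aggregate popularity signal.
            return self.organic_engagements + self.bot_engagements

    posts = [
        Post("Careful local reporting", organic_engagements=120),
        Post("Fringe conspiracy theory", organic_engagements=15),
    ]

    # A small bot farm "floods the network": 50 fake accounts, 3 engagements each.
    posts[1].bot_engagements = 50 * 3

    # Ranked purely on apparent popularity, the gamed post now comes out on top,
    # misleading the algorithm and popularity-watching users at the same time.
    for post in sorted(posts, key=Post.engagement_score, reverse=True):
        print(f"{post.engagement_score():4d}  {post.title}")

In this toy example the conspiracy theory ends up with a score of 165 against the real story’s 120, even though only 15 genuine users engaged with it. That is the “fake wisdom of the crowds” in miniature.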

See the link-packed article for more findings and details on the researchers’ approach, including their news literacy game, Fakey (follow the link to play for yourself). The write-up concludes with a recommendation: tech companies are currently playing whack-a-mole against bad information, but they might make better progress by slowing down the spread of information on their platforms instead. As for users, we recommend vigilance. Do not be taken in by the fake wisdom of the crowds.
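What might such slowing down look like? One hypothetical form of friction is a reshare cooldown, sketched below in Python. The policy and the 60-second figure are our assumptions for illustration, not anything the article or the platforms specify.

    # Hypothetical sketch of the "slow things down" idea: rather than deleting
    # content after the fact, throttle how often any one account can reshare.
    # The cooldown value and the policy itself are illustrative assumptions.

    from collections import defaultdict

    RESHARE_COOLDOWN_SECONDS = 60.0  # assumed value; a platform would tune this

    last_reshare_time = defaultdict(lambda: float("-inf"))

    def try_reshare(user_id: str, now: float) -> bool:
        # Allow the reshare only if the user's cooldown has elapsed. The rule
        # never judges content; it simply adds friction to rapid cascades.
        if now - last_reshare_time[user_id] < RESHARE_COOLDOWN_SECONDS:
            return False
        last_reshare_time[user_id] = now
        return True

    # A bot trying to flood the network is limited to one reshare per minute.
    print(try_reshare("bot_1", now=0.0))   # True
    print(try_reshare("bot_1", now=5.0))   # False: still inside the cooldown
    print(try_reshare("bot_1", now=65.0))  # True: cooldown has elapsed

The appeal of friction like this is that it degrades coordinated flooding far more than ordinary use, without requiring anyone to rule on what counts as misinformation.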

Cynthia Murrell, October 1, 2021
