Social Media Helps Trolls Roll
July 9, 2020
Even social-media researcher Jeanna Matthews has to be vigilant to keep from being fooled, we learn from her article in Fast Company, “Bots and Trolls Control a Shocking Amount of Online Conversation.” Armies of hackers maliciously swaying public opinion through social media have only grown larger, and their methods more sophisticated, since they started making news in 2016. These bad actors game the algorithms that decide which posts to circulate widely, choices based largely on which ones draw the most reactions (“likes,” “votes,” sad/laughing/angry faces, and so on). It has been shown, however, that lies spread faster than truths. Any middle-school girl could have told us that. Matthews writes:
“But who is doing this ‘voting’? Often it’s an army of accounts, called bots, that do not correspond to real people. In fact, they’re controlled by hackers, often on the other side of the world. For example, researchers have reported that more than half of the Twitter accounts discussing COVID-19 are bots. As a social media researcher, I’ve seen thousands of accounts with the same profile picture ‘like’ posts in unison. I’ve seen accounts post hundreds of times per day, far more than a human being could. I’ve seen an account claiming to be an ‘All-American patriotic army wife’ from Florida post obsessively about immigrants in English, but whose account history showed it used to post in Ukrainian. Fake accounts like this are called ‘sock puppets’—suggesting a hidden hand speaking through another identity. In many cases, this deception can easily be revealed with a look at the account history. But in some cases, there is a big investment in making sock puppet accounts seem real.”
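The tells Matthews lists (implausible posting volume, many accounts sharing one profile picture, an abrupt language switch in the account history) amount to simple heuristics anyone could apply. A minimal sketch of such a filter follows; the field names, thresholds, and data shape are purely illustrative assumptions, not any platform’s real API or Matthews’s actual method.

```python
# Hypothetical sketch of sock-puppet signals drawn from the quoted passage.
# All field names and thresholds are illustrative assumptions.
from collections import Counter

def flag_suspicious(accounts, max_daily_posts=200, shared_pic_limit=10):
    """Return IDs of accounts matching any simple sock-puppet signal."""
    # How many accounts share each profile-picture hash.
    pic_counts = Counter(a["profile_pic_hash"] for a in accounts)
    flagged = set()
    for a in accounts:
        # Signal 1: posting far more per day than a human plausibly could.
        if a["posts_per_day"] > max_daily_posts:
            flagged.add(a["id"])
        # Signal 2: many accounts reusing the same profile picture.
        if pic_counts[a["profile_pic_hash"]] > shared_pic_limit:
            flagged.add(a["id"])
        # Signal 3: the account's posting language changed over its history.
        if len(set(a["post_languages"])) > 1:
            flagged.add(a["id"])
    return flagged
```

Real detection systems are far more elaborate, but even this crude sketch shows why Matthews says the deception “can easily be revealed with a look at the account history.”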
One example is the much-followed Jenna Abrams Twitter account that turned out to be run by Russian trolls. These imposters have their favorite subjects—Covid-19 and Black Lives Matter are two examples—but their goals go beyond the issues. They work to divide and conquer: sowing mistrust, pitting us against each other, and building a society in which objective truth no longer matters. Social media platforms, which (sadly) profit from the spread of misinformation, have been slow to act against these manipulators. They often brandish the freedom-of-speech argument to defend their inaction.
Matthews suggests some ways to protect ourselves from being swayed by these deceivers. We can use social media sparingly, and when we do visit, be more deliberate—navigate to particular pages instead of just consuming the default feed. We can also pressure platforms to delete accounts showing sure signs of automation, provide more control over what crosses our feeds, and offer more transparency about how choices are made and who is placing ads. Some may want to contact legislators to demand regulation. Finally, we must take it all with a grain of salt. We know the trolls are out there, and we know how active they are. Do not fall for their tricks.
Cynthia Murrell, July 9, 2020