Google: A Ray of Light?
November 5, 2019
Google’s algorithms may not be so bad after all; it seems that humans are the problem yet again. Wired discusses a recent study from Penn State in its article, “Maybe It’s Not YouTube’s Algorithm That Radicalizes People.” Ideologically extreme YouTube channels have certainly been growing by leaps and bounds. Many reporters have pointed to the site’s recommendation engine as the culprit, saying its suggestions, often served up via auto-play, guide viewers further and further down radicalization rabbit holes. However, political scientists Kevin Munger and Joseph Phillips could find no evidence to support that view. Reporter Paris Martineau writes:
“Instead, the paper suggests that radicalization on YouTube stems from the same factors that persuade people to change their minds in real life—injecting new information—but at scale. The authors say the quantity and popularity of alternative (mostly right-wing) political media on YouTube is driven by both supply and demand. The supply has grown because YouTube appeals to right-wing content creators, with its low barrier to entry, easy way to make money, and reliance on video, which is easier to create and more impactful than text.”
The write-up describes the researchers’ approach:
“They looked at 50 YouTube channels that researcher Rebecca Lewis identified in a 2018 paper as the ‘Alternative Influence Network.’ Munger and Phillips reviewed the metadata for close to a million YouTube videos posted by those channels and mainstream news organizations between January 2008 and October 2018. The researchers also analyzed trends in search rankings for the videos, using YouTube’s API to obtain snapshots of how they were recommended to viewers at different points over the last decade. Munger and Phillips divided Lewis’s Alternative Influence Network into five groups—from ‘Liberals’ to ‘Alt-right’—based on their degree of radicalization. … Munger and Phillips found that every part of the Alternative Influence Network rose in viewership between 2013 and 2016. Since 2017, they say, global hourly viewership of these channels ‘consistently eclipsed’ that of the top three US cable networks combined.”
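For readers curious about the mechanics, channel-level metadata of this kind is available through YouTube’s public Data API. The following is a minimal Python sketch, not the researchers’ actual pipeline; the API key, the channel ID, and the google-api-python-client library are all assumptions for illustration:

```python
# Minimal sketch: list video metadata for one channel via the YouTube Data API v3.
# Assumptions: a valid API key, the google-api-python-client package installed,
# and a channel ID of interest. Illustrative only; not the study's actual code.
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"   # assumption: supplied by the reader
CHANNEL_ID = "UC..."       # assumption: placeholder channel ID

youtube = build("youtube", "v3", developerKey=API_KEY)

# Every channel exposes an "uploads" playlist containing its public videos.
channel = youtube.channels().list(part="contentDetails", id=CHANNEL_ID).execute()
uploads_id = channel["items"][0]["contentDetails"]["relatedPlaylists"]["uploads"]

# Page through the uploads playlist, collecting publish date and title per video.
videos, token = [], None
while True:
    page = youtube.playlistItems().list(
        part="snippet", playlistId=uploads_id, maxResults=50, pageToken=token
    ).execute()
    for item in page["items"]:
        snippet = item["snippet"]
        videos.append((snippet["publishedAt"], snippet["title"]))
    token = page.get("nextPageToken")
    if token is None:
        break

print(f"Fetched metadata for {len(videos)} videos")
```

Capturing recommendation snapshots over time, as the researchers did, would require additional endpoints and repeated collection, which is beyond this sketch.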
The Penn State team also cites researcher Manoel Ribeiro, who insists his rigorous analysis of the subject, published in July, has been frequently misinterpreted to support the bad-algorithm narrative. Why would mainstream media want to shift focus to the algorithm? Because, Munger and Phillips say, that explanation points to a clear policy solution, wishful thinking though it might be. The messiness of human motivations is not so easily dealt with.
Both Lewis and Ribeiro praised the Penn State study, indicating it represents a shift in this field of research. Munger and Phillips note that, based on the sheer volume of likes and comments these channels garner, their audiences are building communities—a crucial factor in the process of radicalization. Pointing fingers at YouTube’s recommendation algorithm is a misleading distraction.
Cynthia Murrell, November 5, 2019