Facebook: Fooled by Ranking?
April 1, 2022
I sure hope the information in “A Facebook Bug Led to Increased Views of Harmful Content Over Six Months” is an April Fool’s joke. The subtitle is interesting too: “The social network touts downranking as a way to thwart problematic content, but what happens when that system breaks?”
The write up explains:
Instead of suppressing posts from repeat misinformation offenders that were reviewed by the company’s network of outside fact-checkers, the News Feed was instead giving the posts distribution, spiking views by as much as 30 percent globally.
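Facebook has not published the ranking code in question, so any illustration is guesswork. Still, a minimal sketch shows how short the distance is between demoting content and amplifying it. The names below (rank_score, DEMOTION_FACTOR, is_flagged) are hypothetical, chosen for illustration, not Meta’s actual interface:

```python
# Purely hypothetical sketch; Facebook has not published its ranking code.
# All names here are illustrative assumptions, not Meta's actual API.

DEMOTION_FACTOR = 0.5  # intended: flagged posts get half the normal distribution

def rank_score(base_score: float, is_flagged: bool) -> float:
    """Intended behavior: suppress posts flagged by fact-checkers."""
    if is_flagged:
        return base_score * DEMOTION_FACTOR  # demote
    return base_score

def buggy_rank_score(base_score: float, is_flagged: bool) -> float:
    """One inverted operator turns suppression into amplification."""
    if is_flagged:
        return base_score / DEMOTION_FACTOR  # bug: doubles distribution instead of halving it
    return base_score
```

One flipped operator and the “suppress” path becomes a “boost” path. A sudden spike in views of flagged content is exactly the kind of symptom such a silent inversion would produce.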
Now let’s think about time. The article reports:
In 2018, CEO Mark Zuckerberg explained that downranking fights the impulse people have to inherently engage with “more sensationalist and provocative” content. “Our research suggests that no matter where we draw the lines for what is allowed, as a piece of content gets close to that line, people will engage with it more on average — even when they tell us afterwards they don’t like the content,” he wrote in a Facebook post at the time.
Why did this happen?
The answer may be that assumptions about the functionality of online systems must be verified by those who know the mechanisms used. Then the functions must be checked on a periodic basis. The practice of slipstreaming changes may introduce malfunctions which no one catches, because no one is rewarded for slowing down the operation.
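Here is a minimal sketch of the kind of periodic check that paragraph argues for: an invariant test asserting that flagged content never outranks its unflagged counterpart, run on every release rather than once at launch. It reuses the hypothetical rank_score names from the sketch above, and again this is an assumption about good practice, not a known Meta procedure:

```python
# Hypothetical invariant check; rank_score and DEMOTION_FACTOR reuse the
# illustrative names from the earlier sketch, not any known Meta interface.

DEMOTION_FACTOR = 0.5

def rank_score(base_score: float, is_flagged: bool) -> float:
    return base_score * DEMOTION_FACTOR if is_flagged else base_score

def test_flagged_content_is_demoted() -> None:
    base = 100.0
    demoted = rank_score(base, is_flagged=True)
    normal = rank_score(base, is_flagged=False)
    assert demoted < normal, (
        "Downranking invariant violated: flagged content is not suppressed"
    )

# Slipstreamed changes only get caught if this runs with every release.
test_flagged_content_is_demoted()
```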
Based on my work for assorted reports and monographs, there are several other causes of a disconnect between what a high technology outfit says and what its systems actually do. Let me highlight what I call the Big Three:
- Explaining what a system might do is different from delivering the reality of the system. Management wants to believe that the code works, and not many people want to be the person who says, “Yeah, but is this what the system is actually doing?” Institutional momentum can crush that kind of candor.
- The dependencies within complex software systems are not understood, particularly by recently hired outside experts, new hires, or — heaven help us — interns who are told to do X without meaningful checks, reviews, and fixes.
- An organization’s implicit policies keep feedback contained so the revenue continues to flow. Who gets promoted for screwing up ad sales? As a result, news releases, public statements, and sworn testimony operate in an adjacent but separate conceptual space from the mechanisms that run the live systems.
It has been my experience that when major problems are pointed out, reactions range from “What do you mean?” to a chuckled comment, “That’s just the way software works.”
What intrigues me is the larger question: “Is the revelation that Facebook’s smart software does not work as the company believed the baseline for the company’s systems?” On the other hand, the information could be an ill-considered April Fool’s joke.
My hunch is that the article is not humor. Much of Facebook’s and Silicon Valley’s behavior does not tickle my funny bone. My prediction is that some US regulators, and possibly Margrethe Vestager, will take this information under advisement.
Stephen E Arnold, April 1, 2022