Regulators Move, Just Slowly, Toward Facebook

October 14, 2021

Finally, after 17 years, a dim light flickers on. Vox Recode reports, “It’s Getting Harder for People to Believe that Facebook Is a Net Good for Society.” Though experts have been sounding the alarm for years, Facebook has insisted its ability to bring folks together far outweighs any damage perpetrated by its platforms. Now, though, more people are challenging that defense. Writer Shirin Ghaffary tells us:

“A new series of reports from the Wall Street Journal, “The Facebook Files,” provides damning evidence that Facebook has studied and long known that its products cause measurable, real-world harm — including on teenagers’ mental health — and then stifled that research while denying and downplaying that harm to the public. The revelations, which only strengthen the case that a growing chorus of lawmakers and regulators have been making for breaking up Facebook or otherwise severely limiting its power as a social media giant, could represent a turning point for the company. Already, the Journal’s reporting has prompted consequences for Facebook: A bipartisan Senate committee is investigating [Facebook-owned] Instagram’s impact on teenagers, and a group of legislators led by Sen. Ed Markey (D-MA) is calling for Facebook to halt all development of its Instagram for Kids product for children under 13, which BuzzFeed News first revealed the company was developing in March.”

Ghaffary reminds us the wheels of government turn slowly and, often, to little effect. The investigations are in early stages and may not lead to any real changes or meaningful consequences. At least some politicians are more willing to question Facebook about the harm it causes, as some did at recent Congressional hearings. Unfortunately, Facebook is inclined to withhold damaging information even at the request of elected officials. We learn:

“When Rep. Rodgers and other Republicans followed up with Facebook and asked about the company’s internal research on the effects of its products on mental health, the company did not share the Instagram research results, according to Bloomberg, nor did it share them with Sen. Ed Markey when his office also asked Facebook to provide any internal research on the matter in April, according to letters provided by Markey’s office to Recode.”

But wait, there’s more. The Journal’s reporting also reveals the company’s VIP program, through which certain celebrities and politicians can break its rules (such as they are). It also shows that, in 2018, Facebook modified its algorithm to encourage the sharing of angrier content. Anything to generate traffic and revenues, whatever the consequences, it seems.

Cynthia Murrell, October 14, 2021

Facebook and Synthetic Data

October 13, 2021

What’s Facebook thinking about its data future?

A partial answer may be that the company is doing some contingency planning. When regulators figure out how to trim Facebook’s data hoovering, the company may have less primary data to mine, refine, and leverage.

The solution?

Synthetic data. The jargon refers to annotated data generated by computer simulations. Run the model. Fiddle with the thresholds. Get good enough data.
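Here is a minimal sketch of the idea in Python, with invented field names and thresholds: a toy simulator emits user-session records, the annotation comes from the simulation’s own ground truth, and a plausibility filter decides which rows count as good enough. This is an illustration, not Facebook’s pipeline.

```python
# Toy synthetic data generator: the simulation both produces and labels the data.
# Every parameter, field name, and threshold below is an invented example.
import random

def simulate_session(engagement_bias=0.5):
    """Run one toy user-session model and return an annotated record."""
    clicks = max(0, int(random.gauss(5 + 10 * engagement_bias, 3)))
    dwell_seconds = max(1.0, random.expovariate(1 / (30 + 60 * engagement_bias)))
    return {
        "clicks": clicks,
        "dwell_seconds": round(dwell_seconds, 1),
        # The label comes straight from the simulator -- no human annotator required.
        "label_engaged": clicks >= 8,
    }

def generate(n, engagement_bias=0.5, min_dwell=5.0):
    """Run the model, fiddle with the thresholds, keep the plausible rows."""
    rows = [simulate_session(engagement_bias) for _ in range(n)]
    return [r for r in rows if r["dwell_seconds"] >= min_dwell]

if __name__ == "__main__":
    sample = generate(1000)
    print(len(sample), sample[:2])
```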

How does one get a signal about Facebook’s interest in synthetic data?

According to Venture Beat, Facebook, the responsible social media company, acquired AI.Reverie.

Was this a straightforward deal? Sure, just via a Facebook entity called Dolores Acquisition Sub, Inc. If the name sounds familiar, the social media leader may have borrowed it from the television series “Westworld.”

The write up states:

AI.Reverie — which competed with startups like Tonic, Delphix, Mostly AI, Hazy, Gretel.ai, and Cvedia, among others — has a long history of military and defense contracts. In 2019, the company announced a strategic alliance with Booz Allen Hamilton with the introduction of Modzy at Nvidia’s GTC DC conference. Through Modzy — a platform for managing and deploying AI models — AI.Reverie launched a weapons detection model that ostensibly could spot ammunition, explosives, artillery, firearms, missiles, and blades from “multiple perspectives.”

Booz Allen may be kicking itself. Perhaps the wizards at the consulting firm should have purchased AI.Reverie. But Facebook aced out the century-old other people’s business outfit. (Note: I used to labor in the BAH vineyards, and I feel sorry for the individuals who were not enthusiastic about acquiring AI.Reverie. Where did that bonus go?)

Several observations are warranted:

  1. Synthetic data is the ideal dating partner for Snorkel-type machine learning systems (see the sketch after this list)
  2. Some researchers believe that real data is better than synthetic data, but that is a spat like the one between those who love Windows and those who love macOS
  3. The uptake of “good enough” data for smart statistical systems which aim for 60 percent or better “accuracy” appears to be a mini-trend.
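On the first point, here is a rough sketch of what a Snorkel-type setup looks like: several noisy labeling functions vote on the synthetic records produced above, and the votes become training labels. A real Snorkel label model estimates each function’s accuracy instead of taking a bare majority vote; the rules below are invented for illustration.

```python
# Toy weak supervision in the Snorkel spirit: noisy rules vote, abstaining allowed.
# A production label model would weight the rules by estimated accuracy.
ABSTAIN, NEGATIVE, POSITIVE = -1, 0, 1

def lf_many_clicks(row):       # heuristic: heavy clicking suggests engagement
    return POSITIVE if row["clicks"] >= 8 else ABSTAIN

def lf_short_dwell(row):       # heuristic: very short sessions are not engaged
    return NEGATIVE if row["dwell_seconds"] < 10 else ABSTAIN

def lf_simulator_label(row):   # the synthetic generator's own annotation
    return POSITIVE if row.get("label_engaged") else NEGATIVE

LABELING_FUNCTIONS = [lf_many_clicks, lf_short_dwell, lf_simulator_label]

def weak_label(row):
    """Majority vote over the labeling functions that did not abstain."""
    votes = [lf(row) for lf in LABELING_FUNCTIONS if lf(row) != ABSTAIN]
    if not votes:
        return ABSTAIN
    positives = sum(v == POSITIVE for v in votes)
    return POSITIVE if positives >= len(votes) / 2 else NEGATIVE
```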

Worth watching?

Stephen E Arnold, October 13, 2021

Facebook and the UK: About to Get Exciting

October 13, 2021

Remember Cambridge Analytica? I think that some in the UK do. There’s been some suspicion that the Brexit thing may have been shaded by some Cambridge Analytica magic, and that may ignite the modern equivalent of the Protestant-Catholic excitement of the 16th century. Not religion this time. Social media, Facebook style.

The increasingly commercial BBC or Beeb published “Facebook Whistleblower to Appear before UK Parliament.” The write up states:

Frances Haugen, the Facebook whistleblower who accuses the technology giant of putting profit ahead of safety, will give evidence to the UK Parliament later this month. Ms Haugen will appear before the Online Safety Bill committee on 25 October. It is examining a law to impose obligations on social-media companies to protect users, especially children.

Kids are a big deal, but I think the Brexit thing will make some snorting sounds as well.

The write up states:

Damian Collins, who chairs the committee reviewing the draft legislation, said Ms Haugen’s information to date had “strengthened the case for an independent regulator with the power to audit and inspect the big tech companies”.

Will Facebook’s PR ace get a chance to explain Facebook? What about the Zuck?

Interesting, because Ms. Haugen may be asked to do some sharing with EU regulators and the concerned officials in Australia, Canada, and New Zealand too.

Stephen E Arnold, October 13, 2021

Facebook Engineering: Big Is Tricky

October 12, 2021

The unthinkable happened on October 4, 2021, when Facebook went offline. Despite all the bad press Facebook has recently gotten, the social media network remains an important communication and business tool. The Facebook Engineering blog explains what happened with the shutdown in the post: “More Details About The October 4 Outage.” The outage originated in the system that manages Facebook’s global backbone network capacity.

The backbone connects all of Facebook’s data centers through thousands of miles of fiber optic cable. The post runs down how the backbone essentially works:

“When you open one of our apps and load up your feed or messages, the app’s request for data travels from your device to the nearest facility, which then communicates directly over our backbone network to a larger data center. That’s where the information needed by your app gets retrieved and processed, and sent back over the network to your phone.

The data traffic between all these computing facilities is managed by routers, which figure out where to send all the incoming and outgoing data. And in the extensive day-to-day work of maintaining this infrastructure, our engineers often need to take part of the backbone offline for maintenance — perhaps repairing a fiber line, adding more capacity, or updating the software on the router itself.”

A routine maintenance job issued a command to assess the global backbone’s capacity. Unfortunately, the command contained a bug the audit system did not catch, and it severed the connections between Facebook’s data centers and the Internet. A second problem made things worse: the DNS servers remained operational but became unreachable. Facebook could not reach its data centers through the normal means, and the loss of DNS broke the internal tools engineers normally use to repair such problems.
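Facebook’s engineering write up describes a health check of roughly this shape: each DNS edge node withdraws its route advertisement when it can no longer see the data centers. Below is a hypothetical sketch of that pattern; the probe addresses are invented and the announce/withdraw hooks are placeholders, not real routing-daemon commands.

```python
# Sketch of a self-protective DNS edge node (illustrative, not Facebook's code).
import socket

DATA_CENTER_PROBES = [("10.10.0.1", 443), ("10.20.0.1", 443)]  # made-up internal endpoints

def backbone_reachable(probes, timeout=2.0):
    """Return True if at least one internal data-center endpoint answers."""
    for host, port in probes:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            continue
    return False

def reconcile(announce_route, withdraw_route):
    """Keep advertising the anycast DNS route only while the backbone is visible."""
    if backbone_reachable(DATA_CENTER_PROBES):
        announce_route()
    else:
        # Sensible for one sick node; catastrophic when every node loses the
        # backbone at once, because then every node withdraws and the whole
        # domain disappears from the Internet.
        withdraw_route()
```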

Facebook engineers had to physically visit the backbone facility, which is protected by high levels of security. The facility is hard to enter, and the systems are purposely designed to be difficult to modify. It took a while, but Facebook diagnosed and resolved the problems. Baby Boomers were overjoyed to resume posting photos of their grandchildren, and anti-vaxxers could read their misinformation feeds.

Perhaps this Facebook incident and the interesting Twitch data breach illustrate that big is tricky? Too big to fail becomes too big to keep working in a reliable way.

Whitney Grace, October 12, 2021

The Economist Zucks Facebook

October 8, 2021

Time to unfollow or defriend or just erase the Zuck?

“Facebook Is Nearing a Reputational Point of No Return” illustrates the turning of the capitalistic worm. The newspaper, which sure looks like a magazine to me, states:

Facebook is nearing a reputational point of no return.

Then this wowza:

If rational argument alone is no longer enough to get Facebook out of its hole, the company should look hard at its public face. Mark Zuckerberg, Facebook’s all-powerful founder, made a reasoned statement after this week’s wave of anger. He was ignored or ridiculed and increasingly looks like a liability.

Facebook has been chugging along since 2004. Finger-pointing, legal action, and the Winklevossing have been stirred into Cambridge Analytica, apologies, and savories like WhatsApp as a new Dark Web.

Suddenly the Zuck is a liability.

Seventeen years and counting. Insight takes time to arrive.

Stephen E Arnold, October 8, 2021

An Onion Story: The Facebook Oversight Board Checks Out Rule Following

October 7, 2021

I read “Facebook’s Oversight Board to Review Xcheck System Following Investigation of Internal System That Exempted Certain Users.” Is this a story created for the satirical information service the Onion, MAD Magazine, or the late, lamented Harvard Lampoon?

I noted this passage:

For certain elite users, Facebook’s rules don’t seem to apply.

I think this means that there is one set of rules for one group of users and another set of rules for another group of users. In short, the method replicates the tidy structure of a medieval hierarchy; to wit:

[Image: a medieval hierarchy, from the church and king down through nobles and barons to freemen and serfs]

The “church” would probably represent the Zuck and fellow technical elite plus a handful of fellow travelers. The king is up for grabs now that the lean-in expert has leaned out. The nobles and barons are those who get a special set of rules. The freemen can buy ads. The serfs? Well, peasants are okay for clicks but not much else.

Now the oversight board, which is supposed to be doing the overseeing, will begin the process of overseeing what appears to be a discriminatory system.

Obviously the oversight board is either in the class of freemen or serfs. I wonder if this Onionesque management method is a variant of the mushroom approach; that is, keep the oversight board and users in the dark and feed them organic matter rich in indole, skatole, hydrogen sulfide, and mercaptans?

Facebook, that Empyrean spring of excellence in ethics, management, and business processes. My hunch is that not even outfits like the Onion can match this joke. Maybe Franz (Happy) Kafka could?

Stephen E Arnold, October 7, 2021

Facebook: Why Change?

October 6, 2021

I read “Facebook Can’t Be Saved.” The main point struck me as:

Facebook has experienced years of intense scrutiny over the exact issues that are being discussed in the wake of Haugen’s revelations, and has only succeeded in making its inherent problems worse. During the hearing, Haugen compared fixing Facebook’s issues to mandating that cars come with seat belts. But maybe Facebook doesn’t need a seat belt. Maybe it just needs to stop being given more chances.  

This is an interesting analogy. I would ask this question, “Why should Facebook change?” The company has loyal users, lobbyists, and friends in high places. The available consequences are fines and enduring hearings and legal proceedings.

After watching the testimony by the whistleblower, my hunch is that Facebook will evolve. But the deep machine is chugging along.

Stephen E Arnold, October 6, 2021

Facebook Doing Its Thing with Weaponized Content?

October 1, 2021

I read “Facebook Forced Troll Farm Content on Over 40% of All Americans Each Month.” Straight away, I have problems with “all.” The reality is that “all” Americans includes those who don’t use Facebook, Instagram, or WhatsApp. Hence, I am not sure how accurate the story itself is.

Let’s take a look at a couple of snippets, shall we?

Here’s one that caught my attention:

When the report was published in 2019, troll farms were reaching 100 million Americans and 360 million people worldwide every week. In any given month, Facebook was showing troll farm posts to 140 million Americans. Most of the users never followed any of the pages. Rather, Facebook’s content-recommendation algorithms had forced the content on over 100 million Americans weekly. “A big majority of their ability to reach our users comes from the structure of our platform and our ranking algorithms rather than user choice,” the report said. The troll farms appeared to single out users in the US. While globally more people saw the content by raw numbers—360 million every week by Facebook’s own accounting—troll farms were reaching over 40 percent of all Americans.

Yeah, lots of numbers, not much context, and the source of the data appears to be Facebook. Maybe on the money, maybe a bent penny? If we assume that the passage is “sort of correct”, Facebook has added to its track record for content moderation.
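One quick back-of-the-envelope check, assuming the headline figure divides the 140 million monthly number by a total US population of roughly 330 million (the write up does not state the denominator):

```python
# Rough check of the "over 40 percent of all Americans" claim.
# The denominator is an assumption: total US population, not US Facebook users.
us_population = 330_000_000
monthly_reach = 140_000_000
print(f"{monthly_reach / us_population:.0%}")  # prints 42%
```

The arithmetic works out, but it also shows why the word “all” matters: switch the denominator from the whole population to actual US Facebook users and the percentage climbs.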

Here’s another snippet I circled in red:

Allen believed the problem could be fixed relatively easily by incorporating “Graph Authority,” a way to rank users and pages similar to Google’s PageRank, into the News Feed algorithm. “Adding even just some easy features like Graph Authority and pulling the dial back from pure engagement-based features would likely pay off a ton in both the integrity space and… likely in engagement as well,” he wrote. Allen [a former data scientist at Facebook,] left Facebook shortly after writing the document, MIT Technology Review reports, in part because the company “effectively ignored” his research, a source said.

Disgruntled employee? Fancy dancing with confidential information? A couple of verification items?
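For anyone who wants the quoted “Graph Authority” analogy made concrete, here is a tiny PageRank-style sketch using networkx. The follower graph is invented and this is not Facebook’s ranking code; the point is only that authority flows from follows users actually chose, which is exactly what a troll farm page lacks.

```python
# Toy "graph authority": rank pages by the follows they have actually earned.
# The graph below is invented for illustration.
import networkx as nx

follows = [                      # edge = follower -> page that follower chose
    ("reader_1", "news_page"),
    ("reader_2", "news_page"),
    ("reader_3", "news_page"),
    ("reader_2", "hobby_page"),
]
graph = nx.DiGraph(follows)
graph.add_node("troll_farm_page")   # nobody follows it; its reach came from ranking

authority = nx.pagerank(graph, alpha=0.85)
for node, score in sorted(authority.items(), key=lambda kv: -kv[1]):
    print(f"{node:16} {score:.3f}")  # news_page on top, troll_farm_page at the floor
```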

Net net: On the surface, Facebook continues to do what its senior management prioritizes. Without informed oversight, what’s the downside for Facebook? Answer: At this time, none.

Stephen E Arnold, October 1, 2021

Facebook Brings People Together: A Different Spin

September 29, 2021

I read “Lawmakers Ask Zuckerberg to Drop ‘Instagram for Kids’ After Report Says App Made Kids Suicidal.” The write up reports on more concern and hand-wringing about the impact of social media. Finally, an anonymous but brave Facebook whistleblower has awakened the somnambulant US elected officials from their summer siesta. Here’s a quote from the write up:

“Children and teens are uniquely vulnerable populations online, and these findings paint a clear and devastating picture of Instagram as an app that poses significant threats to young people’s wellbeing,” the lawmakers said.

Facebook was founded in 2004. Let’s see: that works out to about eight days in the timescape of US elected officials, doesn’t it? Why rush?

Stephen E Arnold, September 29, 2021

Yay, A Facebook Friday

September 24, 2021

Three slightly intriguing factoids about the Zuckbook.

The first is a characterization of Facebook’s and the supreme leader’s time spirit:

“Shame, addiction, and dishonesty.”

Well, that’s a poster message for some innovator in the decorative arts. The original could be offered on Facebook Messenger and the cash transaction handled at night in a fast-food joint’s parking lot. What could go wrong? And the source of this information? The work of the UX Collective, included in a write up titled “Zuckerberg’s Zeitgeist: A Culture of Shame, Addiction, and Dishonesty.” What’s left out of the write up? How many UX Collective professionals have Facebook accounts? And what’s the method of remediation? A better interface. Okay. Deep.

The second is from “Facebook’s Incoming Chief Technology Officer Once Said People Being Cyberbullied to Suicide or Killed in Terror Attacks Organized on the Site Was a Price Worth Paying to Connect People.” The headline alleges that the new Facebook chief technology officer or C3PO robot emitted this statement. Another memorable phrase from the C3PO Facebooker is allegedly:

Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people.

Snappy? Yep.

And, finally, today (September 24, 2021), the estimable Salesforce luminary Marc Benioff maybe said:

In regards to Facebook, they are not held accountable.

The write up “Tech Billionaire: Facebook Is What’s Wrong with America” contains an even more T-shirt-able slogan. I live in fear of Google’s duplication-savvy smart software, but I want to be clear:

Facebook is what’s wrong with America

I like this statement whether from the humanoid running Salesforce or a thumb-typing PR expert with a degree in art history and a minor in business communications. Winner.

Net net: Facebook seems to be a font of news and inspiration. And, please, remember the fix: user interface changes. Yes.

Stephen E Arnold, September 24, 2021
