Facebook: A Buckeye for Good Luck or Bad Zuck?
November 17, 2021
Facebook is an excellent example of what happens when a high school science club “we know a lot” mentality runs a company. The company has been quite successful. Its advertising is thriving. Its ageing user base shows no reluctance to check out pictures of their grandchildren. Enterprising vendors have found Facebook classifieds an ideal way to sell a wide range of products.
The Ohio Attorney General, however, does not see these benefits as material. The state’s announcement is titled “Attorney General Yost Sues Facebook for Securities Fraud after Misleading Disclosures, Allegations of Harm to Children.” The write up states:
Zuckerberg and other company officials, the lawsuit maintains, knew that they were making false statements regarding the safety, security and privacy of its platforms. Facebook admitted in those internal documents that “We are not actually doing what we say we do publicly.”
Was there harm? The write up says:
In roughly a month, those revelations caused a devaluation in Facebook’s stock of $54.08 per share, causing OPERS and other Facebook investors to lose more than $100 billion. Yost’s lawsuit not only seeks to recover that lost value but also demands that Facebook make significant reforms to ensure it does not mislead the public about its internal practices.
The case will take time to resolve. Several observations:
- Other “funds” may find an idea or two in the Ohio matter. Science club wizards are not comfortable when the non-science club people pile on and try to take their lunch money and their golden apple. Maybe more AG actions?
- The focus on money is more compelling than harming Facebook users. Money is the catnip for some, including OPERS-type outfits.
- Quoting the Zuck may signal that luck is running out for the pioneer of many interactions.
Net net: Worth monitoring this matter. It may come to nothing. On the other hand, it may come to some settlement, just smaller than $100 billion. Jail time? Interesting question.
Stephen E Arnold, November 17, 2021
Facebook: Whom Do We Trust?
November 15, 2021
Despite common sense, people continue to trust Facebook for news. Informed people realize that Facebook is not a reliable news source. Social media and communication experts are warning the US government about the dangers. Roll Call delves into the details in “Facebook Can’t Be Trusted to Stop Spread of Extremism, Experts Tell Senate.”
Social media and communication experts testified before the Senate Homeland Security and Governmental Affairs Committee. They explained that social media platforms, especially Facebook, are incapable of containing the spread of violent, extremist content. Facebook’s personalization algorithms recommend extremist content, and consuming it becomes an addiction:
“Karen Kornbluh, who directs the Digital Innovation and Democracy Initiative at the German Marshall Fund of the United States, described how social media algorithms can quickly lure a user from innocuous political content to instructional videos for forming a militia. ‘This is a national security vulnerability,’ Kornbluh said, giving a small number of content producers the ability to exploit social media algorithms to gain enormous reach. ‘Social media goes well beyond providing users tools to connect organically with others,’ she testified. ‘It pulls users into rabbit holes and empowers small numbers of extremist recruiters to engineer algorithmic radicalization.’”
Social media platform companies prioritize profit over societal well-being because extremist and polarizing content keeps users’ eyeballs glued to their screens. Experts want the US government to force social media platforms to share their algorithms so they can be studied. Facebook and its brethren argue that their algorithms are proprietary technology and that exposing them would pose business risks.
Social media companies know more about their users than the public knows about the companies. The experts argue that social media platforms should be more transparent so users can understand how their views are distorted.
Whitney Grace, November 15, 2021
Survey Says: Facebook Is a Problem
November 11, 2021
I believe everything I read on the Internet. I also have great confidence in surveys conducted by estimable news organizations. A double whammy for me was the CNN study refined by SSRS Research. You can read the big logo version at this link.
The survey reports that Facebook is a problem. Okay, who knew?
Here’s a snippet about the survey:
About one-third of the public — including 44% of Republicans and 27% of Democrats — say both that Facebook is making American society worse and that Facebook itself is more at fault than its users.
Delightful.
Stephen E Arnold, November 11, 2021
Reining in Big Tech: Facebook May Be Free to Roam
November 10, 2021
I noted that Google was, in effect, found not guilty of tracking Safari users. You can read the UK court decision here. Sure, Google has to pay a $3 billion fine for abusing its control over search, but there are appeals in the future. Google can afford appeals and can allow turnover in the EU staff to dull the knife that slices at Googzilla.
I found “Why the Rest of the World Shrugged at the Facebook Papers” more interesting. The main point of the write up is, “Meh, Facebook.” Here’s a passage I noted:
Even many in civil society have barely registered the leaks.
Facebook may be able to do some hand waving, issue apologetic statements, and keep on its current path. Advertisers and stakeholders are likely to find the report cited above reassuring.
Interesting.
Stephen E Arnold, November 10, 2021
Meta: A Stroke of Genius or a Dropout Idea from a Dropout
November 10, 2021
I read an article called “Thoughts on Facebook Meta.” The main idea of the essay surprised me. Here’s the passage which caught my attention:
I think the metaverse will be massive not so much because gaming and VR will be big, but because gaming and VR will be the only avenue to thrive for the bottom 80% of people on the planet.
I also circled in red this passage:
Anyway, this is a smart move by Face-meta. It allows Zuckerberg to dodge the scrutiny bullets and become a quixotic futurist, and at the same time build the reality substrate for 80% of the planet.
Net net: The Zuck does it again. He likes old-school barbeque sauce, not New Coke. The question is, “What will government regulators like?”
Stephen E Arnold, November 10, 2021
Facebook: Who Sees Disturbing Content?
November 4, 2021
Time, now an online service owned by a Salesforce founder and his wife, published “Why Some People See More Disturbing Content on Facebook Than Others, According to Leaked Documents.”
The user categories exposed to more troubling Facebook content are, according to Facebook’s researchers:
vulnerable communities, including Black, elderly and low-income users, are among the groups most harmed by the prevalence of disturbing content on the site. At the time of the 2019 integrity report, Facebook’s researchers were still defining what constituted disturbing.
Interesting.
Stephen E Arnold, November 4, 2021
Facebook under the Meta Umbrella May Be a Teddy Bear
November 2, 2021
Facebook (oops, Meta) appears to be changing now that it is under the Meta umbrella. “Facebook Will Let Kazakhstan Government Directly Flag Content the Country Deems Harmful” reports:
Facebook owner Meta Platforms has granted the Kazakh government access to its content reporting system, after the Central Asian nation threatened to block the social network for millions of local users.
Will Kazakhstan be a pace-setter like China and Russia when it comes to country-specific censorship? If Facebook (oops, Meta) finds that young people prefer TikTok and other non-Zuck properties, Facebook (oops, Meta) will have to trade off its long-cherished policies for deals that generate revenue.
Money is the pressure point which caused Facebook (oops, Meta) to indicate that it has a kinder, gentler side. What other countries will want to embrace the warm and fuzzy social media giant’s alleged new approach?
Stephen E Arnold, November 2, 2021
The Zuck Strikes Back
November 2, 2021
Well, when Facebook strikes back, it probably won’t use words. A few threshold modifications, a handful of key words (index terms), and some filter tweaking, and the target will be in for an exciting time. Try explaining why your Facebook page is replete with links to Drug X and other sporty concepts. Yeah, wow.
“Mark Zuckerberg Angrily Insists Facebook Is the Real Victim Here” includes some interesting observations:
At the top of his company’s third quarter earnings call, the Facebook CEO broadly railed against the 17 news organizations working together to report on a massive trove of leaked internal documents dubbed the Facebook Papers.
Okay, victim.
What could Facebook, Instagram, and WhatsApp do to make life difficult for bylined journalists digging through the company’s confidential-no-more content?
My DarkCyber research team offered some ideas at lunch today. I just listened and jotted notes on a napkin. Here we go:
- Populate a journalist’s Facebook page with content related to human trafficking, child sex crime, contraband, etc.
- Inject images which are typically banned from online distribution into a journalist’s Instagram content. What, no Instagram account? Just use Facebook data to locate a relative or friend and put the imagery on one or more of those individuals’ Instagram accounts. That would have some knock-on consequences.
- Recycle WhatsApp messages from interesting WhatsApp groups to a journalist’s WhatsApp posts; for example, controlled substances, forbidden videos on Dark Web repositories, or some of those sites offering fraudulent Covid vaccination cards, false identification papers, or Fullz (stolen financial data).
Facebook has some fascinating data, and it can be repurposed. I assume the journalists spending time with the company’s documents are aware of what hypothetically Facebook could do if Mr. Zuckerberg gets really angry and becomes – what’s the word – how about vindictive?
How will investigators get access to these hypothetical poisoned data? Maybe one of the specialized services which index social media content?
Stephen E Arnold, November 2, 2021
A Great Idea: New Coke
November 1, 2021
I don’t think too much about companies changing their names. The reason is that brand shifts are a response to legal or financial woes. I may have to start paying more attention if I read analyses like “From Facebook to Meta: The Most Notable Company Rebrands.” Wow.
The article identifies name changes which emphasize the underlying desire to create distance between one name and a new, free-floating moniker. The goal is no baggage and a lift to the beleaguered executives’ MBA-inspired strategic insights.
USA Today mentions Tronc. That is a name that flows trippingly on the tongue. The newspaper with color pictures points out that Andersen Consulting morphed into Accenture, while its former sibling Arthur Andersen demonstrated that CPAs can make quite poor business decisions about how to report a client’s financial condition. Think Enron. Do you remember Jeffrey Skilling, who holds a Harvard MBA and was a real, live Baker Scholar? Impressive. He was able to explain bookkeeping to Andersen. Good job! The must-read newspaper mentioned a cigarette outfit which became Altria. Think processed cheese, not nicotine delivery.
But the write up is about Facebook, which is now “meta.” I think “meta” is a subtle move. No one will know the difference, just like Coca Cola’s push of New Coke. Brilliant.
Stephen E Arnold, November 1, 2021
Facebook: A Fascinating Assertion
October 28, 2021
A Facebook professional named Monika Bickert, who is the “head” of global policy management, is quoted as offering some insight into the Zuckbook’s approach to content. This information comes from “Facebook Exec Pushes Back on Whistleblower Claims,” published by US News & World Report, which I did not know was still in business.
Monika Bickert, Facebook’s head of global policy management, says the social media giant does not prioritize engagement and user growth over safety.
Everyone is entitled to his or her opinion.
The write up states:
Facebook has pushed back on Haugen’s claims but hasn’t pointed to any factual errors in her testimony or in a series of reports that outlined massive shortcomings at the social network, identified by its own internal research.
The write up is an interview with the “head” of global policy management, and I found her summary of her background interesting; for example, the article quotes her as saying:
We do not and we have not prioritized engagement over safety. I’ve been at this company for more than nine years. I’m a mother. I also was a criminal prosecutor and worked on child safety for more than 10 years. And I can tell you I wouldn’t be at this company if we weren’t prioritizing safety.
The implication is that a former criminal prosecutor would know what algorithms are up to 24×7. I am not sure I am 100 percent confident in this “head’s” ability to address message amplification, the interaction of user inputs and content outputs, or the unexpected signals smart software makes available to other platform components.
How do I know this?
- Google knee-jerked and dumped staff who were poking around the behavior of the vaunted Snorkel method.
- Twitter said, in effect, “Hey, we don’t know why certain messages are amplified. Mystery, right? Let’s grab a latte and do some thinking.”
- The interesting “drift” manifests itself when Bayesian-centric systems like the venerable Autonomy neuro-linguistic programming black box chug away. Retrain or get some fascinating outputs. (A minimal sketch of this drift appears after the list.)
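For readers who have not watched this happen, here is a minimal sketch of the drift the last bullet describes, using scikit-learn’s stock MultinomialNB as a stand-in for any Bayesian-centric engine. It is emphatically not Facebook’s, Twitter’s, or Autonomy’s actual code, and the tiny data set and labels are invented for illustration.

```python
# Hypothetical illustration of "drift" in a Bayesian text classifier.
# Not any vendor's real pipeline; the data and labels are invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Training snapshot: the only vocabulary the model will ever "know".
train_texts = [
    "free crypto giveaway click now",        # labeled spam
    "win a prize claim the reward today",    # labeled spam
    "meeting moved to three pm",             # labeled ok
    "quarterly report attached for review",  # labeled ok
]
train_labels = ["spam", "spam", "ok", "ok"]

vectorizer = CountVectorizer()
model = MultinomialNB().fit(vectorizer.fit_transform(train_texts), train_labels)

# Later traffic: the spam wording has shifted. Words the model never saw are
# silently dropped, so scoring leans on class priors and a few stray tokens.
new_texts = [
    "exclusive NFT drop, verify wallet to mint",  # new-style spam
    "lunch at noon works for me",                 # ordinary message
]
for text, probs in zip(new_texts, model.predict_proba(vectorizer.transform(new_texts))):
    print(text, "->", dict(zip(model.classes_, probs.round(2))))
```

Without periodic retraining on freshly labeled traffic, the engine keeps scoring new content against an old vocabulary and old priors, which is where the “fascinating outputs” in the bullet above come from.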
Your mileage may vary, but in lawyer speak, Facebook is nothing but a bunch of great folks producing outstanding products.
Believe that?
I don’t.
Stephen E Arnold, October 28, 2021