Facebook: Getting Softer, More Lovable?
May 9, 2022
Is the Zuckbook going soft? Sure, the company allegedly dorked around with Facebook pages in Australia. Sure, a former employee revealed the high school science club thought framework. Sure, the Zuck is getting heat for his semi-exciting vision of ZuckZoom and ZuckGraphics.
The article with the clicky title “Meta’s Challenge to OpenAI—Give Away a Massive Language Model. At 175 Billion Parameters, It’s As Powerful As OpenAI’s GPT-3, and It’s Open to All Researchers” shows that El Zucko is into freebies. The idea is that Zuck’s smart software is not going to allow the Google to dominate in this super-hyped sector. Think of it as the battle of the high school science clubs.
In the ZuckVerse, anyone who sells gets special treatment: Meta will charge a commission of about 48 percent.
Selling in Horizon Worlds will be limited to a few creators located in the US and Canada who must be at least eighteen years old. That roughly 48 percent commission is a huge chunk of a creator’s profit, even if the item is an NFT:
“Meta spokesperson Sinead Purcell confirmed the figure to The Post, adding that Horizon Worlds will eventually become available on hardware made by other companies. In those cases, Meta will keep charging its 25% Horizon Worlds fee but the other companies will set their own store transaction fees. Vivek Sharma, Meta’s vice president of Horizon, told The Verge that the commission is ‘a pretty competitive rate in the market.’”
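For readers wondering how a 25% Horizon Worlds fee becomes “about 48 percent,” here is a back-of-the-envelope sketch. It assumes the widely reported structure: a 30% hardware store fee (the rate associated with Meta’s own Quest Store) comes out first, and the 25% Horizon Worlds fee applies to what remains. The numbers are illustrative, not Meta’s published pricing code.

```python
# Back-of-the-envelope math for the "about 48 percent" figure.
# Assumption: a 30% hardware store fee is deducted first, then the
# 25% Horizon Worlds fee is applied to the remainder.

sale_price = 100.00                              # hypothetical item price
store_fee = 0.30 * sale_price                    # hardware platform's cut
horizon_fee = 0.25 * (sale_price - store_fee)    # Meta's Horizon Worlds cut
creator_take = sale_price - store_fee - horizon_fee

effective_rate = (store_fee + horizon_fee) / sale_price
print(f"Creator keeps ${creator_take:.2f} of ${sale_price:.2f}")
print(f"Effective commission: {effective_rate:.1%}")   # 47.5%
```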
Zuckerberg criticized Google and Apple for taking 30% commission fees from digital creators. He claims that when the Metaverse adds a revenue share, the commission rate will be less than 30%.
Zuckerberg claims he wants to support creators and help them make a living wage, but his statements are probably hot air. Talk is cheap, especially for tech giants. Zuckerberg wants to recoup the lost ad revenue through NFTs.
See. Kinder. Gentler. Maybe a Zuckbork?
Stephen E Arnold, May 9, 2022
Facebook and Litigation: A Magnet for Legal Eagles
May 6, 2022
Facebook, now called Meta, is doing everything it can to maintain relevance with kids and attract advertisers. A large portion of Facebook’s net profits comes from advertising fees. Meta has not been entirely clear with its customers: CNN Business explains in the story “Facebook Advertisers Can Pursue Class Action Over Ad Rates” that the company misled advertisers about the accuracy of its “potential reach” tool.
US District Judge James Donato in San Francisco ruled that the millions of people and businesses that paid for ads on Facebook and its subsidiary Instagram can sue as a group. Facebook’s fiasco started in pre-pandemic days:
“The lawsuit began in 2018, as DZ Reserve and other advertisers accused Facebook of inflating its advertising reach, by increasing the number of potential viewers by as much as 400%, and charging artificially high premiums for ad placements. They also said senior Facebook executives knew for years that the company’s “potential reach” metric was inflated by duplicate and fake accounts, yet did nothing about it and took steps to cover it up.”
Knowingly deceiving customers is a common business tactic among executives. They do not want to disappoint their investors, lose face, or lose money. It is such a standard business tactic that many bigwigs get away with it, but some are caught so red-handed that the color alone would enrage a bull (along with their customers). Facebook argued that a class action lawsuit was not possible because the litigants were too diverse: they range from large corporations to individuals with home businesses. Facebook claimed it would not know how to calculate damages across such a varied group.
Judge Donato said it made more sense for Facebook’s irate customers to sue as a group, because “ ‘no reasonable person’ would sue Meta individually to recover at most a $32 price premium.”
Ticketmaster faced a similar scandal when it charged buyers absurd fees for tickets; the fees went directly into the pockets of the executives. Ticketmaster’s class action lawsuit resulted in plaintiffs receiving $3-4 Ticketmaster gift certificates for every ticket they bought. The gift certificates could not be combined and had expiration dates.
Big businesses should be held accountable for their actions, but the payoff is not always that great for the individual.
Whitney Grace, May 6, 2022
Meta (Formerly Zuckbook) Chases Another Digital Ghost
May 5, 2022
High school science club thinking is alive and well at Meta (formerly Zuckbook). Here’s a flashback to the Information Industry Association meeting in Boston in 1981. A wizard of sorts (Marvin Weinberger maybe?) pointed out that artificial intelligence was just around the corner. The conference was not far from an MIT building, so his optimism may have had some vibes from the pre-Epstein era at that institution.
No one said anything. There were just chuckles.
Flash forward to 2022: Synthetic data, handwaving, unexplainable outputs, Teslas which get confused, YouTube ad placement, etc. The era of AI has arrived in its close-enough-for-horseshoes glory.
“Meta AI Is Building AI That Processes Language Like the Human Brain” explains:
Meta AI announced a long-term research initiative to understand how the human brain processes language. In collaboration with neuroimaging center Neurospin (CEA) and INRIA, Meta AI is comparing how AI language models and the brain respond to the same spoken or written sentences.
Significant advancements based on “valuable insights” will allow the Zuckbook to offer services that process language like the humanoid brain.
And the progress? Well, MIT is not involved. Human brains at that institution apparently misunderstood Jeffrey Epstein. The Zuckbook will not make that mistake one hopes.
Neurospin? Niftier than plain old AI? Absolutely.
Stephen E Arnold, May 5, 2022
Gizmodo: The Facebook Papers, Void Filling, and Governance
May 2, 2022
If you need more evidence about the fine thought processes at Facebook, navigate to “We’re Publishing the Facebook Papers. Here’s What They Say About the Ranking Algorithms That Control Your News Feed.” Tucked into the story is a link to where the once-confidential documents are posted. In the event you just want to go directly to the list, here it is: https://bit.ly/3vWqLKD.
I reacted to the expansion of the Gizmodo Facebook papers with a chuckle. I noted this statement in the cited article:
Today, as part of a rolling effort to make the Facebook Papers available publicly, Gizmodo is releasing a second batch of documents—37 files in all.
I noted the phrase “rolling effort.”
In my OSINT lecture at the National Cyber Crime Conference, I mentioned that information once reserved for “underground” sites was making its way to mainstream Web sites. Major news organizations have dabbled in document “dumps.” The Pentagon Papers and the Snowden PowerPoints are examples some remember. An Australian “journalist” captured headlines, lived in an embassy, and faces a trip to the US because of document dumps.
Is Gizmodo moving from gadget reviews into the somewhat uncertain seas of digital information once viewed as proprietary, company confidential, or even trade secrets?
I don’t know if the professionals at Gizmodo are chasing clicks, thinking about emulating bigly media outfits, or doing what seems right and just.
I find the Facebook papers amusing. The legal eagles may have a different reaction. Remember: I said I found the embrace of this interesting content amusing. From my point of view, gadget reviews are more interesting, if less amusing.
Stephen E Arnold, May 2, 2022
Users Might Accept Corrections to Fake News, if Facebook Could Be Bothered
April 28, 2022
Facebook (aka Meta) has had a bumpy road of late, but perhaps a hypothetical tweak to the news feed could provide a path forward for the Zuckbook. We learn from Newswise that a study recently published in the Journal of Politics suggests that “Corrections on Facebook News Feed Reduces Misinformation.” The paper was co-authored by George Washington University’s Ethan Porter and Ohio State University’s Thomas J. Wood and funded in part by civil society non-profit Avaaz. It contradicts previous research that suggested such an approach could backfire. The article from George Washington University explains:
“Social media users were tested on their accuracy in recognizing misinformation through exposure to corrections on a simulated news feed that was made to look like Facebook’s news feed. However, just like in the real world, people in the experiment were free to ignore the information in the feed that corrected false stories also posted on the news feed. Even when given the freedom to choose what to read in the experiment, users’ accuracy improved when fact-checks were included with false stories. The study’s findings contradict previous research that suggests displaying corrections on social media was ineffective or could even backfire by increasing inaccuracy. Instead, even when users are not compelled to read fact-checks in a simulation of Facebook’s news feed, the new study found they nonetheless became more factually accurate despite exposure to misinformation. This finding was consistent for both liberal and conservative users with only some variation depending on the topic of the misinformation.”
Alongside a control group of subjects who viewed a simulated Facebook feed with no corrections, researchers ran two variants of the experiment. In the first, they placed corrections above the original false stories (all of which had appeared on the real Facebook at some point). In the second, the fake news was blurred out beneath the corrections. Subjects in both versions were asked to judge the stories’ veracity on a scale of 1 – 5. See the write-up for more on the study’s methodology. One caveat—researchers acknowledge potential influences from friends, family, and other connections were outside the scope of the study.
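To make the comparison concrete, here is a minimal sketch of the kind of condition-level tabulation such a design implies: average veracity ratings of false stories, by condition, on the study’s 1-5 scale. The numbers and variable names below are invented placeholders, not the authors’ data or code.

```python
# Hypothetical tabulation of mean veracity ratings (1 = false, 5 = true)
# given to false stories under each condition. All numbers are invented
# placeholders for illustration; they are not the study's data.

from statistics import mean

ratings = {
    "control (no corrections)":      [4, 3, 4, 5, 3],
    "correction above false story":  [2, 3, 2, 2, 3],
    "correction with story blurred": [2, 1, 2, 2, 2],
}

for condition, scores in ratings.items():
    print(f"{condition}: mean rating of false stories = {mean(scores):.2f}")

# Lower mean ratings for false stories in the correction conditions would
# indicate improved accuracy, which is the pattern the study reports.
```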
If Facebook adopted a similar procedure on its actual news feed, perhaps it could curb the spread of fake news. But does it really want to? We suppose it must weigh its priorities—reputation and legislative hassles vs profits. Hmm.
Cynthia Murrell, April 28, 2022
Zuckerberg and Management: The Eye of What?
April 12, 2022
I am not familiar with Consequence.net. (I know. I am a lazy phat, phaux, phrench bulldog.) Plus I assume that everything I read on the Internet is actual factual. (One of my helpers clued me into that phrase. I am so grateful for young person speak.)
I spotted this article: “Mark Zuckerberg Says Meta Employees Lovingly Refer to Him as The Eye of Sauron.” The hook was the word “lovingly.” The article reported that the Zuck said on a very energetic, somewhat orthogonal podcast:
“Some of the folks I work with at the company — they say this lovingly — but I think that they sometimes refer to my attention as the Eye of Sauron. You have this unending amount of energy to go work on something, and if you point that at any given team, you will just burn them.”
My recollection of the eye in question is that the Lord of the Rings crowd is recycling the long Wikipedia article about looking at someone and causing no end of grief. Mr. Zuck cause grief? Not possible. In Harrod’s Creek, a “Zuck up” means a sensitive, ethical action. A “Zuck eye,” therefore, suggests the look of love, understanding, and compassion. I have seen those eyes in printed motion picture posters; for example, the film “Evil Eye” released in the Time of Covid.
The article points out:
Without delving too deeply into fantasy lore, it is canonically nefarious, and bad things happen when it notices you. Zuckerberg’s computer nerd demeanor doesn’t quite scream “Dark Lord” to us, but we don’t deny that Meta employees would compare his semi-autocratic mode of operation to that of the Eye.
Interesting management method.
Stephen E Arnold, April 12, 2022
Facebook Defines Excellence: Also Participated?
April 5, 2022
Slick AI and content moderation functions are not all they are cracked up to be, sometimes with devastating results. SFGate provides one distressing example in, “‘Kill More’: Facebook Fails to Detect Hate Against Rohingya.” Rights group Global Witness recently put Facebook’s hate speech algorithms to the test. The AI failed spectacularly. The hate-filled ads submitted by the group were never posted, of course, though all eight received Facebook’s seal of approval. However, ads with similar language targeting Myanmar’s Rohingya Muslim minority have made it onto the platform in the past. Those posts were found to have contributed to a vicious campaign of genocide against the group. Associated Press reporters Victoria Milko and Barbara Ortutay write:
“The army conducted what it called a clearance campaign in western Myanmar’s Rakhine state in 2017 after an attack by a Rohingya insurgent group. More than 700,000 Rohingya fled into neighboring Bangladesh and security forces were accused of mass rapes, killings and torching thousands of homes. … On Feb. 1 of last year, Myanmar’s military forcibly took control of the country, jailing democratically elected government officials. Rohingya refugees have condemned the military takeover and said it makes them more afraid to return to Myanmar. Experts say such ads have continued to appear and that despite its promises to do better and assurances that it has taken its role in the genocide seriously, Facebook still fails even the simplest of tests — ensuring that paid ads that run on its site do not contain hate speech calling for the killing of Rohingya Muslims.”
The language in these ads is not subtle—any hate-detection algorithm that understands Burmese should have flagged it. Yet Meta (now Facebook’s “parent” company) swears it is doing its best to contain the problem. In a recent statement sent to the AP, a company rep claims:
“We’ve built a dedicated team of Burmese speakers, banned the Tatmadaw, disrupted networks manipulating public debate and taken action on harmful misinformation to help keep people safe. We’ve also invested in Burmese-language technology to reduce the prevalence of violating content.”
Despite such assurances, Facebook has a history of failing to allocate enough resources to block propaganda with disastrous consequences for foreign populations. Perhaps taking more responsibility for their product’s impact in the world is too dull a topic for Zuck and company. They would much prefer to focus on the Metaverse, their latest shiny object, though that path is also fraught with collateral damage. Is Meta too big for anyone to hold it accountable?
Cynthia Murrell, April 5, 2022
Facebook: Fooled by Ranking?
April 1, 2022
I sure hope the information in “A Facebook Bug Led to Increased Views of Harmful Content Over Six Months” is an April Fools’ gag. The subtitle is interesting too: “The social network touts downranking as a way to thwart problematic content, but what happens when that system breaks?”
The write up explains:
Instead of suppressing posts from repeat misinformation offenders that were reviewed by the company’s network of outside fact-checkers, the News Feed was instead giving the posts distribution, spiking views by as much as 30 percent globally.
Now let’s think about time. The article reports:
In 2018, CEO Mark Zuckerberg explained that downranking fights the impulse people have to inherently engage with “more sensationalist and provocative” content. “Our research suggests that no matter where we draw the lines for what is allowed, as a piece of content gets close to that line, people will engage with it more on average — even when they tell us afterwards they don’t like the content,” he wrote in a Facebook post at the time.
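To make the mechanics concrete, here is a minimal sketch of how a downranking step might sit in a feed-scoring pipeline: posts flagged by fact-checkers have their ranking score multiplied by a penalty so they appear lower in the feed. The function names, the flag field, and the penalty factor are hypothetical illustrations, not Facebook’s actual code.

```python
# Hypothetical downranking sketch: flagged posts keep only a fraction of
# their base score, so they sort lower in the feed. Names and numbers are
# invented for illustration only.

DOWNRANK_PENALTY = 0.3  # flagged posts keep 30% of their base score

def rank_feed(posts):
    """Sort posts by engagement score, penalizing fact-checked ones."""
    def effective_score(post):
        score = post["engagement_score"]
        if post.get("flagged_by_fact_checkers"):
            score *= DOWNRANK_PENALTY
        return score
    return sorted(posts, key=effective_score, reverse=True)

# The reported bug amounts to the penalty silently not being applied, so
# flagged posts ranked as if the multiplier were 1.0.
feed = rank_feed([
    {"id": 1, "engagement_score": 90, "flagged_by_fact_checkers": True},
    {"id": 2, "engagement_score": 40, "flagged_by_fact_checkers": False},
])
print([post["id"] for post in feed])  # with the penalty applied: [2, 1]
```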
Why did this happen?
The answer may be that assumptions about the functionality of online systems must be verified by those who know the mechanisms used. Then the functions must be checked on a periodic basis. The practice of slipstreaming changes may introduce malfunctions, which no one catches because no one is rewarded for slowing down the operation.
Based on my work for assorted reports and monographs, there are several other causes of a disconnect between what a high technology outfit says it does and what its systems actually do. Let me highlight what I call the Big Three:
- Explaining something that might be is different from delivering the reality of the system. Management wants to believe that code works, and not too many people want to be the person who asks, “Is this what the system is actually doing?” Institutional momentum can crush certain types of behavior.
- The dependencies within complex software systems are not understood, particularly by recently hired outside experts, new hires, or — heaven help us — interns who are told to do X without meaningful checks, reviews, and fixes.
- An organization’s implicit policies keep feedback contained so the revenue continues to flow. Who gets promoted for screwing up ad sales? As a result, news releases, public statements, and sworn testimony operate in an adjacent but separate conceptual space from the mechanisms that generate live systems.
It has been my experience that when major problems are pointed out, reactions range from “What do you mean?” to a chuckled comment, “That’s just the way software works.”
What intrigues me is the larger question: Is the revelation that Facebook’s smart software does not work as the company believed it did now the baseline for the company’s systems? On the other hand, the information could be an ill-considered April Fools’ joke.
My hunch is that the article is not humor. Much of Facebook’s and Silicon Valley’s behavior does not tickle my funny bone. My prediction is that some US regulators and possibly Margrethe Vestager will take this information under advisement.
Stephen E Arnold, April 1, 2022
TikTok: Search and Advertising
March 29, 2022
If life were not tricky enough for Amazon, Facebook, and Google, excitement is racing down the information highway. I read “TikTok Search Ads Tool Is Being Tested Out.” I learned:
This week, the famous short video application began beta testing for TikTok search ads in search results, allowing marketers to reach the audience utilizing the keywords they use.
Yep, a test, complete with sponsored listings at the top of the search result page.
Will this have an impact on most adults over the age of 65? My answer: “Not right away, but down the road, oh, baby, yes.”
Let’s think about the Big Boys:
- Amazon gets many clicks from its product search. The Google once dominated this function, but the Bezos bulldozer has been grinding away.
- Facebook, or as I like to call it, the “Zuckbook.” The combined social empire of Facebook, Instagram, and WhatsApp has quite a bit of product information. Don’t you follow Soph Mosca’s fashion snaps on Instagram? Will TikTok search offer a better experience with search, ads, and those nifty videos? Yep.
- And Google. Now the GOOG faces competition for product search ads from the China-linked TikTok. How will the company respond? Publish a book on managing a diverse workforce or put out a news release about quantum supremacy?
The write up explains that the ads, the search angle, and the experience are in beta. Will TikTok sell ads? Okay, let me think. Wow. Tough question. My answer: “Does President Xi take an interest in the Internet?”
The write up includes a link to a Twitter post which shows the beta format. You can view it at this link.
I want to point out that TikTok is a useful source of open source intelligence, captures information of interest to those who want to pinpoint susceptible individuals, and generates high value data about users interested in a specific type of content and the creators of that content.
Now TikTok will be on the agenda of meetings at three of the world’s most loved companies. Yep, Amazon, Facebook, and Google. Who loves these outfits the most? Advertisers!
Stephen E Arnold, March 29, 2022
Zuck Pestered by Legal Flies in Canberra
March 18, 2022
My most interesting experience in Canberra was the flies. They knew I was giving a lecture at the International Chiefs of Police Conference. I met a number of dedicated and effective law enforcement professionals. But I remember the flies. These critters besieged me when I walked from the conference hotel to a small market. I even bought a hat with a mesh curtain, but the flies were persistent.
Meta, Facebook, whatever, is learning that there are dedicated and effective public servants in Australia. The Zuck is discovering what I conceptualize as lawyers with the stick-to-itiveness of those Canberra flies.
“Australian Watchdog Sues Facebook-Owner Meta over Scam Advertisements” — from a trusted source no less — explains that the Australian competition watchdog is taking action over Zuck’s alleged advertising methods. What are these? Nothing new: allegations of questionable conduct and inappropriate use of images. (Note: You may have to cough up personal data or pay to view the source article from those trustworthy folks.)
I am not sure how Meta’s leadership team is leaning in to this most recent challenge. One downstream consequence is that countries allied with Australia are likely to monitor the legal action. If the watchdog prevails, other countries may pursue similar actions.
Flies. Annoying.
Stephen E Arnold, March 18, 2022