Facebook: Be Proud
December 31, 2021
Gee, who could have predicted it? MacRumors reports, “Facebook Crowned ‘Worst Company of 2021’ by New Survey.” Each December Yahoo Finance picks a Company of the Year based on factors like achievements and market performance, and stalwart Microsoft has come out on top for 2021. The less coveted worst company designation is determined by a survey of Yahoo Finance’s audience. Writer Sami Fathi reports:
“According to the survey, which polled over 1,000 individuals, Facebook received 50% more votes for the spot compared to Alibaba, a Chinese e-commerce platform. Those surveyed have a ‘litany of grievances’ towards Facebook, including but not limited to concerns over censorship, reports about Instagram’s impact on mental health, and privacy. While the survey results are grim and not in the company’s favor, 30% of the participants responded positively to whether Facebook could ‘redeem itself.’ … Facebook has long been embroiled in public concerns over the privacy of its users. Facebook has notably fought with Apple over changes in iOS and iPadOS that make it harder for companies to track users across other apps and websites. Facebook has claimed the new change, App Tracking Transparency, would hurt small businesses that rely on advertising to attract new customers. Mark Zuckerberg has gone as far as to say that Facebook’s lackluster growth in the last quarter of the year was partly to blame on ATT (App Tracking Transparency).”
For his part, Apple CEO Tim Cook flings criticism right back at the Zuck. Which is worse—giving up a little revenue or fomenting polarization and violence? I suppose it depends on one’s perspective. The company formerly known as Facebook is hoping its Meta rebrand will distract everyone from its woes, but going by this survey that has yet to happen.
Cynthia Murrell, December 31, 2021
Facebook: Making Friends in the USAF
December 31, 2021
This is a short post sparked by this Financial Times article: “Facebook to Build Metaverse with Start-Up That Had US Military Contracts.”
The main idea is that Facebook bought a company. The firm, Reverie, will work with the Meta Facebook thing’s Reality Labs. The bonus move is that Reverie’s military contract work was terminated when the Meta Facebook thing bought the firm. The venerable and generally respectable Financial Times pointed out that the Meta Facebook thing would “not be involved with any future defense or military AI development.”
Okay. My hunch is that a Meta Facebook thing employee whose child seeks to enter the Air Force Academy may find that some of those involved in the selection process remember this “not be involved with any future defense or military AI development.”
Who likes this type of business decision? Maybe the Chinese and Russian military leadership? But that’s just a thought from the wilds of rural Kentucky. The Meta Facebook thing knows what’s best for itself and, of course, the US government.
Stephen E Arnold, December 31, 2021
Reputation Repair Via Content Moderation? Possibly a Long Shot
December 30, 2021
Meta (formerly known as Facebook) is launching another shot in the AI war. CNet reports, “Facebook Parent Meta Uses AI to Tackle New Types of Harmful Content.” The new tool is intended to flag posts containing misinformation and those promoting violence. It also seems designed to offset recent criticism of the company, especially charges it is not doing enough to catch fake COVID-19 news.
As Meta moves forward with its grand plans for the metaverse, it is worth noting the company predicts this tech will also work on complex virtual reality content. Eventually. Writer Queenie Wong tells us:
“Generally, AI systems learn new tasks from examples, but the process of gathering and labeling a massive amount of data typically takes months. Using technology Meta calls Few-Shot Learner, the new AI system needs only a small amount of training data so it can adjust to combat new types of harmful content within weeks instead of months. The social network, for example, has rules against posting harmful COVID-19 vaccine misinformation, including false claims that the vaccine alters DNA. But users sometimes phrase their remarks as a question like ‘Vaccine or DNA changer?’ or even use code words to try to evade detection. The new technology, Meta says, will help the company catch content it might miss. … Meta said it tested the new system and it was able to identify offensive content that conventional AI systems might not catch. After rolling out the new system on Facebook and its photo-service Instagram, the percentage of views of harmful content users saw decreased, Meta said. Few-Shot Learner works in more than 100 languages.”
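The “few-shot” idea itself is not magic. Strip away the branding and it amounts to comparing a new post against a small handful of labeled examples instead of retraining on a mountain of data. Here is a minimal, hypothetical sketch of that general approach using nearest-neighbor matching over TF-IDF vectors. To be clear: Meta’s Few-Shot Learner is a large multilingual model, and none of the example posts, labels, or code below comes from it.

```python
# A minimal sketch of the general "few-shot" idea described above: label new
# posts by comparing them to a handful of labeled examples instead of training
# on a large corpus. Illustrative only; all example texts and labels are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
import numpy as np

few_shot_examples = [
    ("The vaccine rewrites your DNA, share before it is deleted", "violates_policy"),
    ("Vaccine or DNA changer? Asking for a friend", "violates_policy"),
    ("Got my second dose today, arm is a little sore", "benign"),
    ("Clinic hours for vaccination are 9 to 5 on weekdays", "benign"),
]

texts = [text for text, _ in few_shot_examples]
labels = [label for _, label in few_shot_examples]

vectorizer = TfidfVectorizer().fit(texts)
example_vectors = vectorizer.transform(texts)

def classify(post: str) -> str:
    """Label a new post with the label of its most similar few-shot example."""
    post_vector = vectorizer.transform([post])
    similarities = cosine_similarity(post_vector, example_vectors)[0]
    return labels[int(np.argmax(similarities))]

print(classify("Is the vaccine really a DNA changer?"))          # likely "violates_policy"
print(classify("Booked my vaccination appointment for Friday"))  # likely "benign"
```

The point of the toy is the workflow: add a few labeled examples and the classifier’s behavior shifts within minutes rather than months. Whether that reliably catches “Vaccine or DNA changer?” at Facebook scale is another matter.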
Yep, another monopoly type outfit doing the better, faster, cheaper thing while positioning the move as a boon for users. Will Few-Shot help Meta salvage its reputation?
Cynthia Murrell, December 30, 2021
Facebook Innovates: Beating Heart Emojis
December 29, 2021
I could not resist citing this write up: “WhatsApp Working on Animated Heart Emojis for Android, iOS: Report.” What’s the big news for 2022 from the most loved, oops sorry, worst company in the United States? Here’s the answer according to Gadgets360:
WhatsApp is reportedly planning to add animation to all the heart emojis of various colors for Android and iOS. This could be linked to the message reaction feature that the platform is said to be working on. The feature has been already added to WhatsApp Web/ Desktop via a stable update.
The beating hearts are chock full of meaning. The pulsing image files provide notification information. The compelling news story added:
WhatsApp is rumored to allow users to react to a specific message in a chat with specific emojis. There is also a reaction info tab to show who reacted to a message. Message reactions are reported to be rolled out to individual chat threads and group chat threads.
Definitely impressive. What use will bad actors, who already employ WhatsApp for interesting purposes, find for pulsing hearts or other quivering emojis?
Stephen E Arnold, December 29, 2021
Meta Mark Gets an F from the British Medical Journal
December 20, 2021
I don’t know anything about Covid, medical data, or Facebook. I do recognize a failing “mark” when I see one. I noted “Researcher Blows the Whistle on Data Integrity Issues…” [Note: the editor has trimmed certain stop words because trigger warning software is a fascinating part of life these days.]
The Harvard dropout who has garnered a few dollars via a “friend”, “like”, and “social online” service is unlikely to be personally affected by the big red F.
The write up states:
We are aware that The BMJ is not the only high quality information provider to have been affected by the incompetence of Meta’s fact checking regime. To give one other example, we would highlight the treatment by Instagram (also owned by Meta) of Cochrane, the international provider of high quality systematic reviews of the medical evidence.[3] Rather than investing a proportion of Meta’s substantial profits to help ensure the accuracy of medical information shared through social media, you have apparently delegated responsibility to people incompetent in carrying out this crucial task. Fact checking has been a staple of good journalism for decades. What has happened in this instance should be of concern to anyone who values and relies on sources such as The BMJ. We hope you will act swiftly: specifically to correct the error relating to The BMJ’s article and to review the processes that led to the error; and generally to reconsider your investment in and approach to fact checking overall.
I was disappointed to see the letter’s close; that is, “best wishes.” A more British expression could have been “Excuse me.” But excusing a stupid “mark” is impolite.
Stephen E Arnold, December 20, 2021
Meta Shows Its True Face
December 17, 2021
I have no idea if a Meta executive offered the statement reported in “Top Meta Exec Blames Users for Spreading Misinformation.” We live in the Stone Age of Deep Fakes, redefining words like unlimited, and getting rid of bad grades. Anything is, therefore, possible.
Here’s the statement I noted:
“The individual humans are the ones who choose to believe or not believe a thing; they’re the ones that choose to share or not to share a thing.”
There you go. We built a platform. We provided algorithms to boost engagement. We shape the content flows. And users are responsible for whatever.
Amazing. I really want to work at “two face” or — sorry — Meta so I can sit in meetings and validate this viewpoint myself. Bizarro world, whatever, I want to be the only 77 year old in Kentucky to hear Meta’s wisdoming first hand. Zoom is okay with me, but I am wise to Zoom hacks which fake who is really in the meeting.
Stephen E Arnold, December 17, 2021
If One Thinks One Is Caesar, Is That Person Caesar? Thumbs Up or Thumbs Down
December 7, 2021
I read a story which may or may not be spot on. Nevertheless, I found it amusing, and if true, not so funny. The story is “Facebook Refuses to Recognize Biden’s FTC As Legitimate.” I am not sure if the original version of JP Morgan would have made this statement. Maybe he did?
Here’s a statement from the article which I circled in Facebook blue:
The FTC didn’t “plausibly establish” that the company “maintained a monopoly through unlawful, anticompetitive conduct.” It asked the court to dismiss the complaint with prejudice. In the court filing, Facebook also once again argued that Khan should recuse herself, saying that her not doing so will “taint all of the agency’s litigation choices in the event the case proceeds.”
I think Julius Caesar, before he had a bad day, allegedly said:
If you must break the law, do it to seize power: in all other cases observe it.
My thought is, “Enough of this pretending to be powerful.” Let’s make the US a real 21st century banana republic. Is there a T shirt which says, “Tech Rules” on the back and “I am Julius” on the front? There may be a market for one or two.
Stephen E Arnold, December 7, 2021
Facebook and Smoothing Data
November 26, 2021
I like this headline: “The Thousands of Vulnerable People Harmed by Facebook and Instagram Are Lost in Meta’s Average User Data.” Here’s a passage I noticed:
consider a world in which Instagram has a rich-get-richer and poor-get-poorer effect on the well-being of users. A majority, those already doing well to begin with, find Instagram provides social affirmation and helps them stay connected to friends. A minority, those who are struggling with depression and loneliness, see these posts and wind up feeling worse. If you average them together in a study, you might not see much of a change over time.
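The statistical point is easy to demonstrate with made-up numbers. In the sketch below, nine thousand hypothetical users improve slightly while one thousand decline sharply; pooled together, the average change looks like roughly nothing. The group sizes and effect sizes are invented purely to illustrate the argument quoted above.

```python
# A small, made-up simulation of the "averages hide the margins" point above:
# most users tick up slightly, a small minority drops sharply, and the pooled
# mean change looks like nothing happened. All numbers are invented.
import numpy as np

rng = np.random.default_rng(0)

majority = rng.normal(loc=+0.2, scale=1.0, size=9_000)   # slight improvement in well-being
minority = rng.normal(loc=-2.0, scale=1.0, size=1_000)   # sharp decline for a vulnerable minority

everyone = np.concatenate([majority, minority])

print(f"average change, all users:     {everyone.mean():+.2f}")  # roughly 0
print(f"average change, majority only: {majority.mean():+.2f}")  # roughly +0.2
print(f"average change, minority only: {minority.mean():+.2f}")  # roughly -2.0
```

A pooled mean near zero, a minority down two points: that is the whole trick.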
The write up points out:
The tendency to ignore harm on the margins isn’t unique to mental health or even the consequences of social media. Allowing the bulk of experience to obscure the fate of smaller groups is a common mistake, and I’d argue that these are often the people society should be most concerned about. It can also be a pernicious tactic. Tobacco companies and scientists alike once argued that premature death among some smokers was not a serious concern because most people who have smoked a cigarette do not die of lung cancer.
I like the word “pernicious.” But the keeper is “cancer.” The idea is, it seems to me, that Facebook — sorry, Meta — is “cancer.” Cancer is a term for diseases in which abnormal cells divide without control and can invade nearby tissues. Cancer evokes a particularly sonorous word too: malignancy. Indeed, the bound phrase, when applied to one’s great aunt, is particularly memorable; for example, “Auntie has a malignant tumor.”
Is Facebook — sorry, Meta — smoothing numbers the way the local baker applies icing to a so-so cake laced with trendy substances like cannabutter and cannaoil? My hunch is that dumping outliers, curve fitting, and subsetting data are handy little tools.
What’s the harm?
Stephen E Arnold, November 26, 2021
Enough of the Meta, Meta, Meta Stuff, Please
November 19, 2021
Facebook recently decided to rename itself Meta in an attempt to appeal to younger demographics. The move has not been met with positive results, but Facebook is hoping a new move with its ads will be. Market Beat explains the change in “Facebook Parent Meta To Remove Sensitive Ad Categories.” Meta wants Facebook to remove its targeted ad options for health, ethnicity, political affiliation, religion, and sexual orientation.
The current ad system tracks user activity and sends users targeted ads on the platform based on topics that interest them. Meta will definitely lose some profit:
“Meta Platforms Inc. said in a blog post Tuesday that the decision was ‘not easy and we know this change may negatively impact some businesses and organizations.’ Shares of the company closed at $335.37 Tuesday, down almost 1%. ‘Some of our advertising partners have expressed concerns about these targeting options going away because of their ability to help generate positive societal change, while others understand the decision to remove them,’ wrote Graham Mudd, vice president of marketing and ads. ‘Like many of our decisions, this was not a simple choice and required a balance of competing interests where there was advocacy in both directions.’”
Facebook’s ads have generated negative publicity in the past. Facebook ads notoriously influenced the 2016 US presidential election and spread misinformation about vaccinations. Meta Platforms does not want Facebook’s targeting system to discriminate against any individuals.
Why is Meta being more responsible than Facebook? Was a name change the only thing required to overhaul a poor ad system? Meta might be rushing to fix problems now, but the company is already facing lots of lawsuits for its past indiscretions.
Whitney Grace, November 19, 2021
Metazuck Portal Go Is Yet Another Innovation for Kiddies
November 18, 2021
Facebook designed a new triangle-shaped Portal Go for kids. The selling gimmick is that children can easily use it for video calling friends and family. There is one big drawback, Buzzfeed says in the article “Facebook’s New Portal Go Is Nearly Perfect, Except It’s Horrible For Kids”: it is yet another way for children to access unlimited binge content through Facebook.
Video binging is something kids have been doing since TV and movies were invented. The only problem with today’s binge videos is that they never stop. YouTube, Vimeo, and Facebook stream an endless supply of worthless content. The new Portal Go has a Watch feature that lets video callers watch the same thing simultaneously. Binge watching can be disabled when the Portal Go is switched to Household Mode; during video calls, however, the Watch feature is still available. Facebook does plan to add a disable option when it updates the Portal Go’s software, but:
“This is a perfect encapsulation of a recurring Facebook problem. The company creates a compelling product, something to like even if you don’t “like” Facebook (let me count the reasons why: 1, 2, 3…never mind). And then it tries to cram some other dreadful Facebook product down your throat along with it. Force-feeding Portal Go users a dogshit Watch experience is just another version of putting the Reels button in the center of Instagram, where the button to post a photo used to be, or prompting you to repost your Instagram Stories to Facebook Stories.”
Facebook does create recurring nightmares. Facebook’s new Portal Go needs a moral compass, like every new piece of technology.
Whitney Grace, November 18, 2021