Reputation Repair Via Content Moderation? Possibly a Long Shot

December 30, 2021

Meta (formerly known as Facebook) is firing another shot in the AI war. CNET reports, “Facebook Parent Meta Uses AI to Tackle New Types of Harmful Content.” The new tool is intended to flag posts containing misinformation and those promoting violence. It also seems designed to offset recent criticism of the company, especially charges that it is not doing enough to catch fake COVID-19 news.

As Meta moves forward with its grand plans for the metaverse, it is worth noting the company predicts this tech will also work on complex virtual reality content. Eventually. Writer Queenie Wong tells us:

“Generally, AI systems learn new tasks from examples, but the process of gathering and labeling a massive amount of data typically takes months. Using technology Meta calls Few-Shot Learner, the new AI system needs only a small amount of training data so it can adjust to combat new types of harmful content within weeks instead of months. The social network, for example, has rules against posting harmful COVID-19 vaccine misinformation, including false claims that the vaccine alters DNA. But users sometimes phrase their remarks as a question like ‘Vaccine or DNA changer?’ or even use code words to try to evade detection. The new technology, Meta says, will help the company catch content it might miss. … Meta said it tested the new system and it was able to identify offensive content that conventional AI systems might not catch. After rolling out the new system on Facebook and its photo-service Instagram, the percentage of views of harmful content users saw decreased, Meta said. Few-Shot Learner works in more than 100 languages.”
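Meta has not published the Few-Shot Learner model itself, but the general idea the quote describes can be sketched in a few lines: classify a new post by comparing it to a handful of labeled example phrases instead of retraining on months of labeled data. Everything below — the labels, the example phrases, and the bag-of-words stand-in for an encoder — is invented for illustration; the real system uses learned multilingual embeddings.

```python
# Toy sketch of the few-shot idea: label new posts by similarity to a
# handful of labeled examples. The labels, phrases, and bag-of-words
# "encoder" are all invented for illustration; Meta's actual Few-Shot
# Learner uses learned multilingual embeddings, which are not public.
import math
from collections import Counter

def embed(text):
    # Stand-in for a learned text encoder: a bag-of-words count vector.
    return Counter(text.lower().replace("?", "").split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# The "few shots": a dozen labeled phrases instead of months of labeled data.
few_shot_examples = {
    "policy_violation": ["vaccine alters your dna", "vaccine or dna changer"],
    "benign": ["i got my vaccine appointment today", "dna testing kits on sale"],
}

def classify(post):
    # Nearest labeled example wins.
    scored = [
        (cosine(embed(post), embed(example)), label)
        for label, examples in few_shot_examples.items()
        for example in examples
    ]
    return max(scored)[1]

print(classify("vaccine or dna changer?"))  # flags the phrased-as-a-question evasion
```

A production system would swap the bag-of-words stand-in for a learned text encoder; the nearest-example decision rule is the part that lets a dozen labeled phrases, rather than millions, start catching a new evasion pattern within weeks.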

Yep, another monopoly-type outfit doing the better, faster, cheaper thing while positioning the move as a boon for users. Will Few-Shot Learner help Meta salvage its reputation?

Cynthia Murrell, December 30, 2021

Facebook Innovates: Beating Heart Emojis

December 29, 2021

I could not resist citing this write up: “WhatsApp Working on Animated Heart Emojis for Android, iOS: Report.” What’s the big news for 2022 from the most loved, oops sorry, worst company in the United States? Here’s the answer according to Gadgets360:

WhatsApp is reportedly planning to add animation to all the heart emojis of various colors for Android and iOS. This could be linked to the message reaction feature that the platform is said to be working on. The feature has been already added to WhatsApp Web/ Desktop via a stable update.

The beating hearts are chock full of meaning. The pulsing image files provide notification information. The compelling news story added:

WhatsApp is rumored to allow users to react to a specific message in a chat with specific emojis. There is also a reaction info tab to show who reacted to a message. Message reactions are reported to be rolled out to individual chat threads and group chat threads.

Definitely impressive. What uses will bad actors, already fond of WhatsApp for interesting purposes, find for pulsing hearts or other quivering emojis?

Stephen E Arnold, December 29, 2021

Meta Mark Gets an F from the British Medical Journal

December 20, 2021

I don’t know anything about Covid, medical data, or Facebook. I do recognize a failing “mark” when I see one. I noted “Researcher Blows the Whistle on Data Integrity Issues…” [Note: the editor has trimmed certain stop words because trigger warning software is a fascinating part of life these days.]

The Harvard dropout who has garnered a few dollars via a “friend”, “like”, and “social online” service is unlikely to be personally affected by the big red F.

The write up states:

We are aware that The BMJ is not the only high quality information provider to have been affected by the incompetence of Meta’s fact checking regime. To give one other example, we would highlight the treatment by Instagram (also owned by Meta) of Cochrane, the international provider of high quality systematic reviews of the medical evidence.[3] Rather than investing a proportion of Meta’s substantial profits to help ensure the accuracy of medical information shared through social media, you have apparently delegated responsibility to people incompetent in carrying out this crucial task. Fact checking has been a staple of good journalism for decades. What has happened in this instance should be of concern to anyone who values and relies on sources such as The BMJ. We hope you will act swiftly: specifically to correct the error relating to The BMJ’s article and to review the processes that led to the error; and generally to reconsider your investment in and approach to fact checking overall.

I was disappointed to see the letter’s close; that is, “best wishes.” A more British expression could have been “Excuse me.” But excusing a stupid “mark” is impolite.

Stephen E Arnold, December 20, 2021

Meta Shows Its True Face

December 17, 2021

I have no idea if a Meta executive offered the statement reported in “Top Meta Exec Blames Users for Spreading Misinformation.” We live in the Stone Age of Deep Fakes, redefining words like unlimited, and getting rid of bad grades. Anything is, therefore, possible.

Here’s the statement I noted:

“The individual humans are the ones who choose to believe or not believe a thing; they’re the ones that choose to share or not to share a thing.”

There you go. We built a platform. We provided algorithms to boost engagement. We shape the content flows. And users are responsible for whatever.

Amazing. I really want to work at “two face” or — sorry — Meta so I can sit in meetings and validate this viewpoint myself. Bizarro world, whatever, I want to be the only 77-year-old in Kentucky to hear Meta’s wisdoming first hand. Zoom is okay with me, but I am wise to Zoom hacks which fake who is really in the meeting.

Stephen E Arnold, December 17, 2021

If One Thinks One Is Caesar, Is That Person Caesar? Thumbs Up or Thumbs Down

December 7, 2021

I read a story which may or may not be spot on. Nevertheless, I found it amusing, and if true, not so funny. The story is “Facebook Refuses to Recognize Biden’s FTC As Legitimate.” I am not sure if the original version of JP Morgan would have made this statement. Maybe he did?

Here’s a statement from the article which I circled in Facebook blue:

The FTC didn’t “plausibly establish” that the company “maintained a monopoly through unlawful, anticompetitive conduct.” It asked the court to dismiss the complaint with prejudice. In the court filing, Facebook also once again argued that Khan should recuse herself, saying that her not doing so will “taint all of the agency’s litigation choices in the event the case proceeds.”

I think Julius Caesar, before he had a bad day, allegedly said:

If you must break the law, do it to seize power: in all other cases observe it.

My thought is, “Enough of this pretending to be powerful.” Let’s make the US a real 21st-century banana republic. Is there a T-shirt which says “Tech Rules” on the back and “I am Julius” on the front? There may be a market for one or two.

Stephen E Arnold, December 7, 2021

Facebook and Smoothing Data

November 26, 2021

I like this headline: “The Thousands of Vulnerable People Harmed by Facebook and Instagram Are Lost in Meta’s Average User Data.” Here’s a passage I noticed:

consider a world in which Instagram has a rich-get-richer and poor-get-poorer effect on the well-being of users. A majority, those already doing well to begin with, find Instagram provides social affirmation and helps them stay connected to friends. A minority, those who are struggling with depression and loneliness, see these posts and wind up feeling worse. If you average them together in a study, you might not see much of a change over time.

The write up points out:

The tendency to ignore harm on the margins isn’t unique to mental health or even the consequences of social media. Allowing the bulk of experience to obscure the fate of smaller groups is a common mistake, and I’d argue that these are often the people society should be most concerned about. It can also be a pernicious tactic. Tobacco companies and scientists alike once argued that premature death among some smokers was not a serious concern because most people who have smoked a cigarette do not die of lung cancer.

I like the word “pernicious.” But the keeper is “cancer.” The idea is, it seems to me, that Facebook (sorry, Meta) is “cancer.” Cancer is a term for diseases in which abnormal cells divide without control and can invade nearby tissues. Cancer evokes a particularly sonorous word too: malignancy. Indeed, the bound phrase when applied to one’s great aunt is particularly memorable; for example, Auntie has a malignant tumor.

Is Facebook (sorry, Meta) smoothing numbers the way the local baker applies icing to a so-so cake laced with trendy substances like cannabutter and cannaoil? My hunch is that dumping outliers, curve fitting, and subsetting data are handy little tools.
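The “average user” problem the quoted passage describes is easy to demonstrate with invented numbers: a majority that improves slightly can completely mask a minority that is harmed badly.

```python
# Invented numbers illustrating the quoted rich-get-richer scenario:
# 90 users get a little better, 10 vulnerable users get much worse,
# and the population-wide mean shows almost nothing.
majority = [+0.5] * 90   # small well-being gain for thriving users
minority = [-4.0] * 10   # large well-being loss for struggling users

population = majority + minority
mean_change = sum(population) / len(population)
minority_change = sum(minority) / len(minority)

print(mean_change)      # 0.05 -- looks like "no meaningful change"
print(minority_change)  # -4.0 -- the harm the average erases
```

Any study that reports only the population mean would conclude Instagram has essentially no effect, while one user in ten is substantially worse off.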

What’s the harm?

Stephen E Arnold, November 26, 2021

Enough of the Meta, Meta, Meta Stuff, Please

November 19, 2021

Facebook recently renamed itself Meta in an attempt to appeal to younger demographics. The move has not produced positive results, but Facebook hopes a change to its ads will. Market Beat explains the change in “Facebook Parent Meta To Remove Sensitive Ad Categories.” Meta wants Facebook to remove its targeted ad options for health, ethnicity, political affiliation, religion, and sexual orientation.

The current ad system tracks user activity and sends users targeted ads on the platform based on topics that interest them. Meta will certainly lose ad revenue:

“Meta Platforms Inc. said in a blog post Tuesday that the decision was ‘not easy and we know this change may negatively impact some businesses and organizations.’ Shares of the company closed at $335.37 Tuesday, down almost 1%. ‘Some of our advertising partners have expressed concerns about these targeting options going away because of their ability to help generate positive societal change, while others understand the decision to remove them,’ wrote Graham Mudd, vice president of marketing and ads. ‘Like many of our decisions, this was not a simple choice and required a balance of competing interests where there was advocacy in both directions.’”

Facebook’s ads have generated negative publicity in the past: they notoriously influenced the 2016 US presidential election and spread misinformation about vaccinations. Meta Platforms does not want Facebook’s targeting system to enable discrimination against individuals.

Why is Meta being more responsible than Facebook was? Was a name change the only thing required to overhaul a poor ad system? Meta might be rushing to fix problems now, but the company is already facing lawsuits for its past indiscretions.

Whitney Grace, November 19, 2021

Metazuck Portal Go Is Yet Another Innovation for Kiddies

November 18, 2021

Facebook designed a new triangle-shaped Portal Go for kids. The selling gimmick is that children can easily use it to video call friends and family. There is one big drawback, BuzzFeed says in the article “Facebook’s New Portal Go Is Nearly Perfect, Except It’s Horrible For Kids”: it is yet another way for children to access unlimited binge content through Facebook.

Video binging is a problem kids have had since TV and movies were invented; the difference today is that the videos never stop. YouTube, Vimeo, and Facebook stream an endless supply of worthless content. The new Portal Go has a Watch feature that lets video callers simultaneously watch the same thing. Binge watching can be disabled by switching the Portal Go to Household Mode; during video calls, however, the Watch feature remains available. Facebook does plan to add a disable option when it updates the Portal Go’s software, but:

“This is a perfect encapsulation of a recurring Facebook problem. The company creates a compelling product, something to like even if you don’t “like” Facebook (let me count the reasons why: 1, 2, 3 … never mind). And then it tries to cram some other dreadful Facebook product down your throat along with it. Force-feeding Portal Go users a dogshit Watch experience is just another version of putting the Reels button in the center of Instagram, where the button to post a photo used to be, or prompting you to repost your Instagram Stories to Facebook Stories.”

Facebook does create recurring nightmares. Facebook’s new Portal Go needs a moral compass, like every new piece of technology.

Whitney Grace, November 18, 2021

Facebook: A Buckeye for Good Luck or Bad Zuck?

November 17, 2021

Facebook is an excellent example of what happens when a high school science club’s “we know a lot” mentality runs a company. The company has been quite successful. Its advertising is thriving. Its aging user base shows no reluctance to check out pictures of their grandchildren. Enterprising vendors have found Facebook classifieds an ideal way to sell a wide range of products.

The Ohio Attorney General, however, does not see these benefits as material. “Attorney General Yost Sues Facebook for Securities Fraud after Misleading Disclosures, Allegations of Harm to Children.” The write up states:

Zuckerberg and other company officials, the lawsuit maintains, knew that they were making false statements regarding the safety, security and privacy of its platforms. Facebook admitted in those internal documents that “We are not actually doing what we say we do publicly.”

Was there harm? The write up says:

In roughly a month, those revelations caused a devaluation in Facebook’s stock of $54.08 per share, causing OPERS and other Facebook investors to lose more than $100 billion. Yost’s lawsuit not only seeks to recover that lost value but also demands that Facebook make significant reforms to ensure it does not mislead the public about its internal practices.
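The arithmetic behind the “more than $100 billion” figure is easy to sanity-check. The per-share drop comes from the quoted passage; the share count below is my rough approximation (Facebook had on the order of 2.8 billion shares outstanding in late 2021) and is not stated in the article.

```python
# Back-of-the-envelope check of the ">$100 billion" investor loss. The
# per-share drop comes from the write up; the share count is my rough
# approximation and is not stated there.
drop_per_share = 54.08       # from the quoted passage
shares_outstanding = 2.8e9   # approximate Facebook shares, late 2021

total_loss = drop_per_share * shares_outstanding
print(f"${total_loss / 1e9:.0f} billion")  # on the order of $150 billion
```

So the suit’s “more than $100 billion” claim is, if anything, conservative under that share-count assumption.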

The case will take time to resolve. Several observations:

  1. Other “funds” may find an idea or two in the Ohio matter. Science club wizards are not comfortable when the non-science club people pile on and try to take their lunch money and their golden apple. Maybe more AG actions?
  2. The focus on money is more compelling than harming Facebook users. Money is the catnip for some, including OPERS-type outfits.
  3. Quoting the Zuck may signal that luck is running out for the pioneer of many interactions.

Net net: Worth monitoring this matter. It may come to nothing. On the other hand it may come to some settlement, just smaller than $100 billion. Jail time? Interesting question.

Stephen E Arnold, November 17, 2021

Facebook: Whom Do We Trust?

November 15, 2021

Despite common sense, people continue to trust Facebook for news. Informed people realize that Facebook is not a reliable news source. Social media and communication experts are warning the US government about the dangers. Roll Call delves into the details in, “Facebook Can’t Be Trusted To Stop Spread Of Extremism, Experts Tell Senate.”

Social media and communication experts testified at the Senate Homeland Security and Governmental Affairs Committee. They explained to the US Senate that social media platforms, especially Facebook, are incapable of containing the spread of violent, extremist content. Facebook personalization algorithms recommend extremist content and it becomes an addiction:

“Karen Kornbluh, who directs the Digital Innovation and Democracy Initiative at the German Marshall Fund of the United States, described how social media algorithms can quickly lure a user from innocuous political content to instructional videos for forming a militia. ‘This is a national security vulnerability,’ Kornbluh said, giving a small number of content producers the ability to exploit social media algorithms to gain enormous reach. ‘Social media goes well beyond providing users tools to connect organically with others,’ she testified. ‘It pulls users into rabbit holes and empowers small numbers of extremist recruiters to engineer algorithmic radicalization.’”

Social media platform companies prioritize profit over societal well-being because extremist and polarizing content keeps users’ eyeballs glued to their screens. Experts want the US government to force social media platforms to share their algorithms so they can be studied. Facebook and its brethren argue their algorithms are proprietary technology whose disclosure would pose business risks.

Social media companies know more about their users than the public knows about the companies. The experts argue that the platforms should be more transparent so users can understand how their views are being distorted.

Whitney Grace, November 15, 2021
