Facebook and Smoothing Data

November 26, 2021

I like this headline: “The Thousands of Vulnerable People Harmed by Facebook and Instagram Are Lost in Meta’s Average User Data.” Here’s a passage I noticed:

consider a world in which Instagram has a rich-get-richer and poor-get-poorer effect on the well-being of users. A majority, those already doing well to begin with, find Instagram provides social affirmation and helps them stay connected to friends. A minority, those who are struggling with depression and loneliness, see these posts and wind up feeling worse. If you average them together in a study, you might not see much of a change over time.
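The masking effect the passage describes is simple arithmetic: two subgroups moving in opposite directions can average out to nothing. A toy sketch (all numbers invented for illustration, not from any Facebook study):

```python
# Toy illustration of how an overall average can hide harm to a subgroup.
# All numbers are invented for demonstration; none come from real data.

majority = [0.5] * 80   # 80 users whose well-being score rises slightly
minority = [-2.0] * 20  # 20 users whose well-being score drops sharply

overall = sum(majority + minority) / 100
print(overall)                         # 0.0 -- "not much of a change over time"
print(sum(minority) / len(minority))   # -2.0 -- the harmed subgroup, invisible above
```

The study-level mean reports no effect even though one user in five is substantially worse off.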

The write up points out:

The tendency to ignore harm on the margins isn’t unique to mental health or even the consequences of social media. Allowing the bulk of experience to obscure the fate of smaller groups is a common mistake, and I’d argue that these are often the people society should be most concerned about. It can also be a pernicious tactic. Tobacco companies and scientists alike once argued that premature death among some smokers was not a serious concern because most people who have smoked a cigarette do not die of lung cancer.

I like the word “pernicious.” But the keeper is “cancer.” The idea is, it seems to me, that Facebook — sorry, Meta — is “cancer.” Cancer is a term for diseases in which abnormal cells divide without control and can invade nearby tissues. Cancer evokes a particularly sonorous word too: malignancy. Indeed, the bound phrase when applied to one’s great aunt is particularly memorable; for example, “Auntie has a malignant tumor.”

Is Facebook — sorry, Meta — smoothing numbers the way the local baker applies icing to a so-so cake laced with trendy substances like cannabutter and cannaoil? My hunch is that dumping outliers, curve fitting, and subsetting data are handy little tools.
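“Dumping outliers” in particular is a handy tool indeed. A hypothetical sketch (invented numbers, not drawn from any real study) of how trimming the low end of a distribution flips the story:

```python
# Invented well-being changes: 80 users slightly up, 20 users sharply down.
scores = [0.5] * 80 + [-2.0] * 20

raw_mean = sum(scores) / len(scores)        # 0.0 -- "no measurable effect"

trimmed = sorted(scores)[20:]               # "dump the outliers" at the bottom
trimmed_mean = sum(trimmed) / len(trimmed)  # 0.5 -- now the product looks helpful
print(raw_mean, trimmed_mean)
```

Drop the 20 lowest scores as “outliers” and the average climbs from zero to positive; the people who were struggling have been defined out of the data.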

What’s the harm?

Stephen E Arnold, November 26, 2021

Enough of the Meta, Meta, Meta Stuff, Please

November 19, 2021

Facebook recently decided to rename itself Meta in an attempt to appeal to younger demographics. The move has not been met with positive results, but Facebook hopes a new move with its ads will be. Market Beat explains the change in “Facebook Parent Meta To Remove Sensitive Ad Categories.” Meta wants Facebook to remove its targeted ad options for health, ethnicity, political affiliation, religion, and sexual orientation.

The current ad system tracks user activity and sends users targeted ads on the platform based on topics that interest them. Meta will definitely lose revenue:

“Meta Platforms Inc. said in a blog post Tuesday that the decision was ‘not easy and we know this change may negatively impact some businesses and organizations.’ Shares of the company closed at $335.37 Tuesday, down almost 1%. ‘Some of our advertising partners have expressed concerns about these targeting options going away because of their ability to help generate positive societal change, while others understand the decision to remove them,’ wrote Graham Mudd, vice president of marketing and ads. ‘Like many of our decisions, this was not a simple choice and required a balance of competing interests where there was advocacy in both directions.’”

Facebook’s ads have generated negative publicity in the past. Facebook ads notoriously influenced the 2016 US presidential election and spread misinformation about vaccinations. Meta Platforms says it does not want Facebook’s targeting system to discriminate against any individuals.

Why is Meta being more responsible than Facebook? Was a name change the only thing required to overhaul a poor ad system? Meta might be rushing to fix problems now, but the company is already facing lots of lawsuits for its past indiscretions.

Whitney Grace, November 19, 2021

Metazuck Portal Go Is Yet Another Innovation for Kiddies

November 18, 2021

Facebook designed a new triangle-shaped Portal Go for kids. The selling gimmick is that children can easily use it for video calling friends and family. Buzzfeed identifies one big drawback in the article “Facebook’s New Portal Go Is Nearly Perfect, Except It’s Horrible For Kids”: the device is yet another way for children to access unlimited binge content through Facebook.

Kids have binged on video since TV and movies were invented. The problem with today’s binge videos is that they never stop. YouTube, Vimeo, and Facebook stream an endless supply of worthless content. The new Portal Go has a Watch feature that lets video callers simultaneously watch the same thing. Binge watching can be disabled when the Portal Go is switched to Household Mode; during video calls, however, the Watch feature is still available. Facebook does plan to add a disable option in a future software update, but:

“This is a perfect encapsulation of a recurring Facebook problem. The company creates a compelling product, something to like even if you don’t “like” Facebook (let me count the reasons why: 1, 2, 3… never mind). And then it tries to cram some other dreadful Facebook product down your throat along with it. Force-feeding Portal Go users a dogshit Watch experience is just another version of putting the Reels button in the center of Instagram, where the button to post a photo used to be, or prompting you to repost your Instagram Stories to Facebook Stories.”

Facebook does create recurring nightmares. Facebook’s new Portal Go needs a moral compass, like every new piece of technology.

Whitney Grace, November 18, 2021

Facebook: A Buckeye for Good Luck or Bad Zuck?

November 17, 2021

Facebook is an excellent example of what happens when a high school science club “we know a lot” mentality runs a company. The company has been quite successful. Its advertising is thriving. Its ageing user base shows no reluctance to check out pictures of their grandchildren. Enterprising vendors have found Facebook classifieds an ideal way to sell a wide range of products.

The Ohio Attorney General, however, does not see these benefits as material. “Attorney General Yost Sues Facebook for Securities Fraud after Misleading Disclosures, Allegations of Harm to Children.” The write up states:

Zuckerberg and other company officials, the lawsuit maintains, knew that they were making false statements regarding the safety, security and privacy of its platforms. Facebook admitted in those internal documents that “We are not actually doing what we say we do publicly.”

Was there harm? The write up says:

In roughly a month, those revelations caused a devaluation in Facebook’s stock of $54.08 per share, causing OPERS and other Facebook investors to lose more than $100 billion. Yost’s lawsuit not only seeks to recover that lost value but also demands that Facebook make significant reforms to ensure it does not mislead the public about its internal practices.

The case will take time to resolve. Several observations:

  1. Other “funds” may find an idea or two in the Ohio matter. Science club wizards are not comfortable when the non-science club people pile on and try to take their lunch money and their golden apple. Maybe more AG actions?
  2. The focus on money is more compelling than harming Facebook users. Money is the catnip for some, including OPERS-type outfits.
  3. Quoting the Zuck may signal that luck is running out for the pioneer of many interactions.

Net net: Worth monitoring this matter. It may come to nothing. On the other hand it may come to some settlement, just smaller than $100 billion. Jail time? Interesting question.

Stephen E Arnold, November 16, 2021

Facebook: Whom Do We Trust?

November 15, 2021

Despite common sense, people continue to trust Facebook for news. Informed people realize that Facebook is not a reliable news source. Social media and communication experts are warning the US government about the dangers. Roll Call delves into the details in, “Facebook Can’t Be Trusted To Stop Spread Of Extremism, Experts Tell Senate.”

Social media and communication experts testified at the Senate Homeland Security and Governmental Affairs Committee. They explained to the US Senate that social media platforms, especially Facebook, are incapable of containing the spread of violent, extremist content. Facebook’s personalization algorithms recommend extremist content, and it becomes addictive:

“Karen Kornbluh, who directs the Digital Innovation and Democracy Initiative at the German Marshall Fund of the United States, described how social media algorithms can quickly lure a user from innocuous political content to instructional videos for forming a militia. ‘This is a national security vulnerability,’ Kornbluh said, giving a small number of content producers the ability to exploit social media algorithms to gain enormous reach. ‘Social media goes well beyond providing users tools to connect organically with others,’ she testified. ‘It pulls users into rabbit holes and empowers small numbers of extremist recruiters to engineer algorithmic radicalization.’”

Social media platform companies prioritize profit over societal well-being because extremist and polarizing content keeps users’ eyeballs glued to their screens. Experts want the US government to force social media platforms to share their algorithms so they can be studied. Facebook and its brethren argue their algorithms are proprietary technology and that disclosure poses business risks.

Social media companies know more about their users than the public knows about the companies. The experts argue that social media platforms should be more transparent so users can understand how their views are distorted.

Whitney Grace, November 15, 2021

Survey Says: Facebook Is a Problem

November 11, 2021

I believe everything I read on the Internet. I also have great confidence in surveys conducted by estimable news organizations. A double whammy for me was the CNN study conducted by SSRS. You can read the big logo version at this link.

The survey reports that Facebook is a problem. Okay, who knew?

Here’s a snippet about the survey:

About one-third of the public — including 44% of Republicans and 27% of Democrats — say both that Facebook is making American society worse and that Facebook itself is more at fault than its users.

Delightful.

Stephen E Arnold, November 11, 2021

Reining in Big Tech: Facebook May Be Free to Roam

November 10, 2021

I noted that Google was, in effect, not guilty of tracking Safari users. You can read the UK court decision here. Sure, Google has to pay a $3 billion fine for abusing its control over search, but there are appeals in the future. Google can afford appeals and can allow turnover in the EU staff to dull the knife that slices at Googzilla.

I found “Why the Rest of the World Shrugged at the Facebook Papers” more interesting. The main point of the write up is, “Meh, Facebook.” Here’s a passage I noted:

Even many in civil society have barely registered the leaks.

Facebook may be able to do some hand waving, issue apologetic statements, and keep on its current path. Advertisers and stakeholders are likely to find the report cited above reassuring.

Interesting.

Stephen E Arnold, November 10, 2021

Meta: A Stroke of Genius or a Dropout Idea from a Dropout

November 10, 2021

I read an article called “Thoughts on Facebook Meta.” The main idea of the essay surprised me. Here’s the passage which caught my attention:

I think the metaverse will be massive not so much because gaming and VR will be big, but because gaming and VR will be the only avenue to thrive for the bottom 80% of people on the planet.

I also circled in red this passage:

Anyway, this is a smart move by Face-meta. It allows Zuckerberg to dodge the scrutiny bullets and become a quixotic futurist, and at the same time build the reality substrate for 80% of the planet.

Net net: The Zuck does it again. He likes old-school barbeque sauce, not New Coke. The question is, “What will government regulators like?”

Stephen E Arnold, November 10, 2021

Facebook: Who Sees Disturbing Content?

November 4, 2021

Time, now an online publication owned by Salesforce founder Marc Benioff, published “Why Some People See More Disturbing Content on Facebook Than Others, According to Leaked Documents.”

The user categories exposed to more troubling Facebook content are, according to Facebook’s researchers:

vulnerable communities, including Black, elderly and low-income users, are among the groups most harmed by the prevalence of disturbing content on the site. At the time of the 2019 integrity report, Facebook’s researchers were still defining what constituted disturbing.

Interesting.

Stephen E Arnold, November 4, 2021

Facebook under the Meta Umbrella May Be a Teddy Bear

November 2, 2021

Facebook (oops, Meta) appears to be changing now that it is under the Meta umbrella. “Facebook Will Let Kazakhstan Government Directly Flag Content the Country Deems Harmful” reports:

Facebook owner Meta Platforms has granted the Kazakh government access to its content reporting system, after the Central Asian nation threatened to block the social network for millions of local users.

Will Kazakhstan be a pace-setter like China and Russia when it comes to country-specific censorship? If Facebook (oops, Meta) finds that young people prefer TikTok and other non-Zuck properties, Facebook (oops, Meta) will have to trade off its long-cherished policies for deals that generate revenue.

Money is the pressure point which caused Facebook (oops, Meta) to indicate that it has a kinder, gentler side. What other countries will want to embrace the warm and fuzzy social media giant’s alleged new approach?

Stephen E Arnold, November 2, 2021
