Facebook: Not Happy

September 20, 2021

“What the Wall Street Journal Got Wrong” is interesting, and you may want to read it. My synopsis is: “We’re doing good.”

I noted this passage from the firm’s top PR dog:

Facebook understands the significant responsibility that comes with operating a global platform. We take it seriously, and we don’t shy away from scrutiny and criticism. But we fundamentally reject this mischaracterization of our work and impugning of the company’s motives.

I like this statement. It’s bold. It ignores the criticism. It sidesteps tricky issues like human trafficking. Very nice.

What makes me happy is the commitment to excellence. I do wonder where the Zuck is in this brutal rejoinder to leaked company info. Is he “leaning in”? Is he leaning out? Practicing a do-si-do?

Stephen E Arnold, September 20, 2021

Facebook: A Pattern That Matches the Zuck

September 20, 2021

The laws of the United States (and most countries) are applied equally to everyone, unless you are rich and powerful. Facebook certainly follows this rule of thumb, according to The Guardian article, “Facebook: Some High-Profile Users ‘Allowed To Break Platform’s Rules.’” Facebook has two sets of rules: one for high-profile users and one for everyone else. The Wall Street Journal investigated Facebook’s special list.

The profiles of rich and powerful people, such as politicians, journalists, and celebrities, are placed on a special list that exempts them from Facebook’s rules. The official terms for these shortlisted people are “newsworthy”, “influential or popular”, or “PR risky”. The special list is called the XCheck or “CrossCheck” system. Supposedly, if these exempt people do post rule-breaking content, it is subject to review, but that review never happens. There are over 5.8 million people on the XCheck system, and the list continues to grow:

The WSJ investigation details the process known as “whitelisting”, where some high-profile accounts are not subject to enforcement at all. An internal review in 2019 stated that whitelists “pose numerous legal, compliance, and legitimacy risks for the company and harm to our community”. The review found favouritism to those users to be both widespread and “not publicly defensible”.

Facebook said that the information The Wall Street Journal dug up was outdated and that the reporting glosses over the fact that the social media platform is actively working on these issues. Facebook is redesigning CrossCheck to improve the system.

Facebook is spouting nothing but cheap talk. Facebook and other social media platforms will allow rich, famous, and powerful people to do whatever they want on their platforms. It is hard to see why Facebook and other social media platforms allow this, unless money is involved.

Whitney Grace, September 20, 2021

Facebook and Social Media: How a Digital Country Perceives Its Reality

September 17, 2021

I read “Instagram Chief Faces Backlash after Awkward Comparison between Cars and Social Media Safety.” This informed senior manager at Facebook seems to have missed a book on many reading lists. The book is one I have mentioned a number of times in the last 12 years since I have been capturing items of interest to me and putting my personal “abstracts” online.

Jacques Ellul is definitely not going to get a job working on the script for the next Star Wars film. He won’t be doing a script for a Super Bowl commercial. Most definitely Dr. Ellul will not be founding a church called “New Technology’s Church of Baloney.”

Dr. Ellul died in 1994, and it is not clear if he knew about online or the Internet. He jabbered at the University of Bordeaux, wrote a number of books about technology, and inspired enough people to set up the International Jacques Ellul Society.

One of his books was The Technological Bluff, or in French Le bluff technologique.

The article that sparked my thoughts about Dr. Ellul contains this statement:

“We know that more people die than would otherwise because of car accidents, but by and large, cars create way more value in the world than they destroy,” Mosseri said Wednesday on the Recode Media podcast. “And I think social media is similar.”

Dr. Ellul might have raised a question or two about Instagram’s position. Both cars and social media are technologies; both have had unintended consequences. On one hand, the auto created some exciting social changes which can be observed when sitting in traffic: eating in the car, road rage, dead animals on the side of the road, etc. On the other hand, social media is sparking upticks in the personal destruction of young people and perceptual mismatches between what their own bodies look like and what an “influencer” looks like wearing clothing from Buffbunny.

Several observations:

  • Facebook is influential, at least sufficiently noteworthy for China to take steps to trim the sails of the motor yacht Zucky
  • Facebook’s pattern of shaping reality via its public pronouncements, testimony before legislative groups, and appearances on podcasts generates content that seems at odds with a growing body of evidence that Facebook facts are flexible
  • Social media as shaped by the Facebook service, Instagram, and the quite interesting WhatsApp service is perhaps the most powerful information engine created. (I say this fully aware of Google’s influence and Amazon’s control of certain data channels.) Facebook is a digital Major Gérald, just with its own Légion étrangère.

Net net: Regulation time and fines that amount to more than a few hours’ revenue for the firm. Also reading Le bluff technologique and writing an essay called “How technology deconstructs social fabrics.” Blue book, handwritten, and three outside references from peer-reviewed journals about human behavior. Due on Monday, please.

Stephen E Arnold, September 17, 2021

Facebook: Continuous Reality Distortion

September 14, 2021

Facebook CEO Mark Zuckerberg stated in 2019 that WhatsApp was designed around a “privacy-focused vision” for communication. WhatsApp supposedly offers end-to-end encryption. ProPublica shares that this is not the whole story in “How Facebook Undermines Privacy Protections For Its 2 Billion WhatsApp Users.” Essentially, the majority of WhatsApp messages are private, but messages users flag are sifted through by WhatsApp workers.

These employees monitor the flagged messages for child pornography, terroristic plots, spam, and more. This type of monitoring appears contrary to WhatsApp’s mission, but Carl Woog, the director of communications, did not regard this as content monitoring and saw it as preventing abuse.

WhatsApp reviewers sign NDAs and, if asked, say they work for Accenture. They review over 600 violation tickets a day, leaving less than a minute for each one, then decide whether to ban the account, put the user on “watch,” or do nothing. ProPublica describes the reviewers’ task:

“WhatsApp moderators must make subjective, sensitive and subtle judgments, interviews and documents examined by ProPublica show. They examine a wide range of categories, including “Spam Report,” “Civic Bad Actor” (political hate speech and disinformation), “Terrorism Global Credible Threat,” “CEI” (child exploitative imagery) and “CP” (child pornography). Another set of categories addresses the messaging and conduct of millions of small and large businesses that use WhatsApp to chat with customers and sell their wares. These queues have such titles as “business impersonation prevalence,” “commerce policy probable violators” and “business verification.””

Unlike Facebook’s other platforms, the core Facebook service and Instagram, WhatsApp does not release statistics about what data it collects, citing that it is an encrypted service. Facebook also needs WhatsApp to generate a profit, because the company spent $22 billion on it in 2014. WhatsApp does share data with Facebook, despite its dedication to privacy. Facebook has also faced fines for violating user privacy. WhatsApp data has been used to collect information on criminals, and governments want backdoors to access and trace messages. The stated reason is user safety, but governments can take observation too far.

Whitney Grace, September 14, 2021

Facebook: A Curious Weakness and a Microsoft Strength

September 7, 2021

I read “The Irony of Facebook’s VR Collaboration Debacle” authored by a wizard whom I associate with IBM. I am not sure why the author’s observations trigger images of Big Blue, mainframes, and the blazing history of Watson.

The angle in this essay is:

Collaboration is a social process where people get together to collectively solve problems. But Facebook sucks at social. A more accurate descriptor is that Facebook is a gossip platform at scale, which has done considerable harm to several countries and put them at considerable existential risk.

Yikes. “Sucks.” “Gossip platform.” And “harm to several countries.”

The write up zips into Zoom-land which Facebook allegedly wants to reimagine as a virtual reality metaverse.

Where is the analysis of “Facebook sucks” heading? Here’s a clue:

Facebook’s Horizon Workrooms is not collaboration. Microsoft Teams would be a better solution for information sharing because you’d see Zuckerberg, not an avatar that looks nothing like him.

I think I have it. The write up is a rah-rah for Teams. I was hoping that the conclusion would point to IBM video services.

Nope, it’s Microsoft, a company which, I presume, does not suck, is not a gossip platform, and has not done harm to several countries?

Stephen E Arnold, September 7, 2021

Facebook: Controlling Behavior Underscores Facebook Insecurity

August 30, 2021

Misinformation was running rampant long before the pandemic hit its stride. No one knows if the misinformation wave that currently plagues the United States and the world has hit its peak. Experts, like social media researcher Laura Edelson, are investigating how misinformation spreads, but social media platforms do not like it, says Vox Recode in “‘People Do Not Trust That Facebook Is Healthy Ecosystem.’” Edelson works at the NYU Ad Observatory and focuses her current research on Facebook’s role in spreading misinformation.

She believes that misinformation encourages COVID anti-vaxxers and is eroding democracy. Unfortunately, Facebook decided to block Edelson and her colleagues’ Facebook accounts. They used those accounts to study political advertisements and misinformation. Facebook stated that the Ad Observatory violated users’ privacy through its Ad Observer tool. Edelson replied that only volunteers download the tool.

Lawmakers, free speech advocates, and the FTC condemned Facebook. Edelson states that Facebook wants to bury her research because it exposes the company’s part in spreading misinformation. On Facebook, users share misinformation more than any other content, and the company refuses to disclose who pays for political ads. It appears that Facebook does not like Edelson’s research and wants to stop it because it hurts the bottom line.

Facebook, of course, denies the allegation, but the Vox write up points to larger questions:

“But Facebook’s effective shutdown of the Ad Observatory raises larger questions about whether the company is trying to limit outside interrogation of the company’s business practices in the name of protecting its users’ privacy. At the same time, the social media network has good reason to be worried about privacy as it faces intense regulatory scrutiny for past missteps that led to it having to pay the largest penalty ever imposed by the Federal Trade Commission.”

Edelson states that Facebook is an unhealthy misinformation ecosystem. Facebook and other misinformation platforms could be doing irreparable damage to society. Because this is a current problem, Facebook should be working with Edelson and other researchers who want to combat the misinformation plague.

Facebook and other companies, however, are more concerned about losing control and revenue. The good news is … Wait. There isn’t any for those researching the world’s most stabilizing and refreshing social media system.

Whitney Grace, August 30, 2021

Remember Who May Have Wanted to License Pegasus?

August 20, 2021

Cyber intelligence firm NSO, makers of Pegasus spyware, knows no bounds when it comes to enabling government clients to spy on citizens. Apparently, however, it draws the line at helping Facebook spy on its users. At his Daring Fireball blog, computer scientist John Gruber reports that “Facebook Wanted NSO Spyware to Monitor iOS Users.” We learn that NSO CEO Shalev Hulio has made a legal declaration stating he was approached in 2017 by Facebook reps looking to purchase certain Pegasus capabilities. Gruber quotes Motherboard’s Joseph Cox, who wrote:

“At the time, Facebook was in the early stages of deploying a VPN product called Onavo Protect, which, unbeknownst to some users, analyzed the web traffic of users who downloaded it to see what other apps they were using. According to the court documents, it seems the Facebook representatives were not interested in buying parts of Pegasus as a hacking tool to remotely break into phones, but more as a way to more effectively monitor phones of users who had already installed Onavo. ‘The Facebook representatives stated that Facebook was concerned that its method for gathering user data through Onavo Protect was less effective on Apple devices than on Android devices,’ the court filing reads. ‘The Facebook representatives also stated that Facebook wanted to use purported capabilities of Pegasus to monitor users on Apple devices and were willing to pay for the ability to monitor Onavo Protect users.’”

We are glad to learn NSO has boundaries of any sort. And score one for Apple security. As for Facebook, Gruber asserts this news supports his oft-stated assertion that Facebook is a criminal operation. He bluntly concludes:

“Facebook’s stated intention for this software was to use it for mass surveillance of its own honest users. That is profoundly [messed] up — sociopathic.”

Perhaps.

Cynthia Murrell, August 20, 2021

Facebook Keeps E2EE Goodness Flowing

August 18, 2021

Facebook is a wonderful outfit. One possible example is the helpful company’s end-to-end encryption for Facebook Messenger. “Facebook Messenger now have End-to-End Encryption for Voice and Video Calls” reports:

The social media giant said that end-to-end encryption for group voice and video calls will soon be a part of Messenger. Encryption is already available in Messenger as Secret Conversation. But Secret Conversation makes many features disable and only can be done with individuals. Facebook is going to change it in the coming weeks. Users will be able to control who can reach your chat lists, who will stay in the requests folder, and who can’t message you at all. In the blog, Facebook also talked about that Instagram is also likely to get end-to-end encryption and one-to-one conversations.

Should Facebook be subject to special oversight?

Stephen E Arnold, August 18, 2021

Facebook: A Force for Good. Now What Does Good Mean?

August 17, 2021

I read Preston Byrne’s essay about the Taliban’s use of WhatsApp. You can find that very good write up at this link. Mr. Byrne asks an important question: Did America just lose Afghanistan because of WhatsApp?

I also read “WhatsApp Can’t Ban the Taliban Because It Can’t Read Their Texts.” The main point of the write up is that Facebook’s encrypted message system makes blocking users really difficult, almost impossible.

I noted this statement:

the Taliban used Facebook-owned chat app WhatsApp to spread its message and gain favor among local citizens…

Seems obvious, right? Free service. Widely available. Encrypted. Why the heck not?

Here’s a statement in the Vice write up which caught my attention:

The company spokesperson said that WhatsApp complies with U.S. sanctions law, so if it encounters any sanctioned people or organizations using the app, it will take action, including banning the accounts. This obviously depends on identifying who uses WhatsApp, without having access to any of the messages sent through the platform, given that the app uses end-to-end encryption. This would explain why WhatsApp hasn’t taken action against some account spreading the Taliban’s message in Afghanistan.

Let me ask a pointed question: Is it time to shut down Facebook, WhatsApp, and Instagram? Failing that, why not use existing laws to bring a measure of control over access, message content, and service availability?

Purposeful action is needed. If Facebook cannot figure out what to do to contain and blunt the corrosive effects of the “free” service, outsource the task to an entity which will make an effort. That approach seems to be what is looming for the NSO Group. Perhaps purposeful action is motivating Apple to try and control the less salubrious uses of the iPhone ecosystem?

Dancing around the Facebook earnings report is fine entertainment. Is it time to add some supervision to the largely unregulated, uncontrolled frat boy bash? One can serve a treat like Bore Palaw too.

Stephen E Arnold, August 17, 2021

Facebook, Booze, Youngsters, and Australia: Shaken and Stirred

August 6, 2021

Quite a mixologist’s concoction: Facebook, booze, young people, and the Australian government. The country seems to be uncomfortable with some of Facebook’s alleged practices. I would assume that some Australian citizens who hold shares in the social media outfit are pleased as punch with the company’s financial results.

Others are not amused. “Facebook to Limit Ads Children See After Revelations Australian Alcohol Companies Can Reach Teens” reports:

Facebook will impose more control on the types of ads that children as young as 13 are exposed to on Instagram and other platforms, as new research finds Australian alcohol companies are not restricting their social media content from reaching younger users.

How many companies targeted the youngsters down under? The write up asserts:

The paper examined the use of social media age-restriction controls by 195 leading alcohol brands on Instagram and Facebook, and found large numbers were not shielding their content from children. The 195 brands were owned by nine companies, and the research identified 153 Facebook accounts, including 84 based in Australia, and 151 Instagram accounts, of which 77 were Australian-based. The authors found 28% of the Instagram accounts and 5% of Facebook accounts had not activated age-restriction controls.

I did spot a quote attributed to one of the experts doing the research about Facebook, Booze, Youngsters, and Australia; to wit:

it was clear that companies were not complying with the code. “The alcohol industry has demonstrated that it is unable to effectively control its own marketing…

Shocking, that, about self-regulation. Has anyone alerted the US financial sector?

Stephen E Arnold, August 6, 2021
