Facebook Lets Group Admins Designate Experts. Okay!

August 2, 2021

Facebook once again enlists the aid of humans to impede the spread of misinformation, only this time it has found a way to avoid paying anyone for the service. Tech Times reports, “Facebook Adds Feature to Let Admin in Groups Chose ‘Experts’ to Curb Misinformation.” The move also has the handy benefit of shifting responsibility for bad info away from the company. We wonder—what happened to that smart Facebook software? The article does not say. Citing an article from Business Insider, writer Alec G. does tell us:

“The people who run the communities on Facebook now have the authority to promote individuals within its group to gain the title of ‘expert.’ Then, the individuals dubbed as experts can be the voices of which the public can then base their questions and concerns. This is to prevent misinformation plaguing online communities for a while now.”

But will leaving the designation of “expert” up to admins make the problem worse instead of better? The write-up continues:

“The social platform now empowers specific individuals inside groups who are devoted to solely spreading misinformation-related topics. The ‘Stop the Steal’ group, for example, was created in November 2020 with over 365,000 members. They were convinced that the election for the presidency was a fraud. If Facebook didn’t remove the group two days later, it would continue to have negative effects. Facebook explained that the organization talked about ‘the delegitimization of the election process,’ and called for violence, as reported by the BBC. Even before that, other groups within Facebook promoted violence and calls to action that would harm the civility of the governments.”

Very true. We are reminded of the company’s outsourced Oversight Board created in 2018, a similar shift-the-blame approach that has not worked out so well. Facebook’s continued efforts to transfer responsibility for bad content to others fail to shield it from blame. They also do little to solve the problem and may even make it worse. Perhaps it is time for a different (real) solution.

Cynthia Murrell, August 2, 2021

Facebook and NSO Group: An Odd Couple or Squabbling Neighbors?

July 28, 2021

Late in 2019, The Adware Guru published “Facebook Sues NSO Group Spyware Maker Due to Exploitation of WhatsApp Vulnerability.” That write up stated:

The cause of [Facebook’s] lawsuit was WhatsApp’s zero-day vulnerability, which Facebook claims was sold to the NSO Group, and then the company helped use the problem to attack human rights defenders, journalists, political dissidents, diplomats, and governmental officials. According to court documents, more than 1,400 people in Bahrain, the United Arab Emirates, and Mexico suffered a total of 11 days from attacks. Facebook has already sent WhatsApp special messages to everyone affected.

In April 2020, Technadu published “The NSO Group Is Accusing Facebook of Having Tried to License Their Spyware.” That write up stated:

The ‘NSO Group’ is now turning the tables, claiming that they rejected Facebook’s proposal to license Pegasus because they only did it for governments and not private companies. In addition to that, they describe Facebook’s accusations as baseless and even accuse the social media company of failing to prepare the legal paperwork properly, which resulted in legislative procedure problems. NSO says Facebook didn’t have powerful methods to spy on iOS devices in the same way that they did with Android, and they felt like Pegasus could solve this problem for them. Facebook, on the other side, completely dismissed these statements by saying that these allegations had the sole purpose of distracting the court from the real facts.

Technadu added:

even if Facebook wasn’t trying to add Pegasus in Onavo for iOS, they are giving the NSO Group something to hold on to and make allegations that are at least seemingly realistic. At the very least, this development will complicate the legal process by much now.

Jump to the present. The Guardian’s story “Officials Who Are US Allies Among Targets of NSO Malware, Says WhatsApp Chief” reported on July 24, 2021:

Cathcart said that he saw parallels between the attack against WhatsApp users in 2019 – which is now the subject of a lawsuit brought by WhatsApp against NSO – and reports about a massive data leak that are at the centre of the Pegasus project… When WhatsApp says it believes its users were “targeted”, it means the company has evidence that an NSO server attempted to install malware on a user’s device.

The Guardian story includes this statement from the PR savvy NSO Group:

An NSO spokesperson said: “We are doing our best to help creating a safer world. Does Mr Cathcart have other alternatives that enable law enforcement and intelligence agencies to legally detect and prevent malicious acts of pedophiles, terrorists and criminals using end-to-end encryption platforms? If so, we would be happy to hear.”

Are Facebook’s statements credible? Is NSO Group’s version believable? Are these two behaving like the characters in Neil Simon’s “Odd Couple” or like the characters in the 1981 film “Neighbors”? Does each firm have something the other needs?

Stephen E Arnold, July 28, 2021

Does Facebook Kill?

July 22, 2021

I found it interesting that the US government suggested that Facebook information kills. You can refresh your knowledge of this assertion in “Biden: COVID Misinformation on Platforms Like Facebook Is ‘Killing People’”. The statement is an attention grabber. Facebook responded, according to Neowin in “Facebook Refutes Biden’s Blame That It’s “Killing People” with COVID Fake News”:

Facebook clearly took issue with these statements and a company spokesperson responded by saying, “We will not be distracted by accusations which aren’t supported by the facts”.

The US government asserts one thing; Facebook another. Which is the correct interpretation of Facebook: An instrument of death or a really great helper of humanity?

The US is a country, and it has legal tools at its disposal. Facebook is a commercial enterprise operating in the US with a single person controlling what the company does.

Facebook wants to use the laws of the country to advantage itself; for example, Facebook is not too keen on Lina Khan. The company filed a legal document to keep that person from getting involved in matters related to Facebook’s commercial behaviors.

I find the situation amusing. Facebook’s assertions are not going to get a like from me. The US government, on the other hand, acts on behalf of a country. When countries take action — as China did with regard to Jack Ma — consequences can be significant.

The phrase “Facebook kills” is meme-able. That may be a persistent problem for the Zuck and the Zuckers in my opinion.

Stephen E Arnold, July 22, 2021

Zuckin and Duckin: Socialmania at Facebook

July 19, 2021

I read “Zuck Is a Lightweight, and 4 More Things We Learned about Facebook from ‘An Ugly Truth’.” My initial response was, “No Mashable professionals will be invited to the social Zuckerberg’s Hawaii compound.” Bummer. I had a few other thoughts as well, but, first, here are a couple of snippets from what one might characterize as a review of the new book by Sheera Frenkel and Cecilia Kang. I assume any publicity is good publicity.

Here’s a passage I circled in Facebook social blue:

Frenkel and Kang’s careful reporting shows a company whose leadership is institutionally ill-equipped to handle the Frankenstein’s monster they built.

Snappy. To the point.

Another? Of course, gentle reader:

Zuckerberg designed the platform for mindless scrolling: “I kind of want to be the new MTV,” he told friends.

Insightful, but TikTok, which may have some links to the sensitive Chinese power plant, aced out the F’Book.

And how about this?

[The Zuck] was explicitly dismissive of what she said. Indeed, the book provides examples where Sandberg was afraid of getting fired, or being labeled as politically biased, and didn’t even try to push back…

Okay, and one more:

Employees are fighting the good fight.

Will I buy the book? Nah, this review is close enough. What do I think will happen to Facebook? In the short term, not much. The company is big and generating big payoffs in power and cash. Longer term? The wind down will continue. Google, for example, is dealing with stuck disc brakes on its super car. Facebook may be popping in and out of view in that outstanding vehicle’s rear view mirrors. One doesn’t change an outfit with many years of momentum.

Are the book’s revelations on the money? Probably reasonably accurate, but disenchantment can lead to some interesting shaping of nonfiction writing. And the Mashable review? Don’t buy a new Hawaiian-themed cabana outfit yet. What about Facebook’s management method? Why change? It worked in high school. It worked when testifying before Congress. It worked until a couple of reporters shifted into interview mode, and reporters are unlikely to rack up likes on Facebook.

Stephen E Arnold, July 19, 2021

Facebook Has Channeled Tacit Software, Just without the Software

July 14, 2021

I would wager a free copy of my book CyberOSINT that anyone reading this blog post remembers Tacit Software, founded in the late 1990s. The company wrote a script which determined which employee in an organization was “consulted” most frequently. I recall enhancements which “indexed” content to make it easier for a user to identify content which may have been overlooked. But the killer feature was allowing a person with appropriate access to identify individuals with particular expertise. Oracle, the number one in databases, purchased Tacit Software and integrated the function into Oracle Beehive. If you want to read marketing collateral about Beehive, navigate to this link. Oh, good luck with pinpointing the information about Tacit. If you dig a bit, you will come across information which suggests that the IBM Clever method was stumbled upon and implemented about the same time that Backrub went online. Small community in Silicon Valley? Yes, it is.

So what?

I thought about this 1997 innovation in Silicon Valley when I read “Facebook’s Groups to Highlight Experts.” With billions of users, I wonder why it took Facebook years to figure out that it could identify individuals who “knew” something. Progress never stops in me-too land, of course. Is Facebook using its estimable smart software to identify those in the know?

The article reports:

There are more than 70 million administrators and moderators running active groups, Facebook says. When asked how they’re vetting the qualifications of designated experts, a Facebook spokesperson said it’s “all up to the discretion of the admin to designate experts who they believe are knowledgeable on certain topics.”

I think this means that humans identify experts. What if the human doing the identifying does not know anything about the “expertise” within another Facebooker?

Yeah, maybe give Oracle Beehive a jingle. Just a thought.

Stephen E Arnold, July 14, 2021

Facebook and Milestones for the Zuck

July 2, 2021

I read “Facebook Reaches $1 Trillion Valuation Faster Than Any Other Company.” The write up asserts that the highly esteemed Facebook (WhatsApp and Instagram) has outperformed Apple, the Google, the spaceship outfits, and Microsoft. The article reports:

No stranger to breaking records, Facebook has just achieved another: the social media giant’s market cap has exceeded $1 trillion for the first time, reaching the milestone faster than any other company in history.

What has caused this surge in “valuation”? The answer is revealed in “Judge Dismisses Gov’t Antitrust Lawsuits against Facebook.” This separate write up, from a news service oh so close to Microsoft’s headquarters, states:

A federal judge on Monday dismissed antitrust lawsuits brought against Facebook by the Federal Trade Commission and a coalition of state attorneys general, dealing a significant blow to attempts by regulators to rein in tech giants. U.S. District Judge James Boasberg ruled Monday that the lawsuits were “legally insufficient” and didn’t provide enough evidence to prove that Facebook was a monopoly.

Concentration is good, it seems. Social media start-ups, have you thought about issuing T-shirts with the Zuck’s smiling face printed on them? You can update your Facebook page and do some Instagram posts. Don’t overlook WhatsApp as a way to spread the good news. About the T-shirts, I mean.

Stephen E Arnold, July 2, 2021

Facebook Reputation Glitch: Just a Misunderstanding or a Warning Signal?

June 1, 2021

I read “Facebook Battles Reputation Crisis in the Middle East.” Interesting but narrow. Why? The “crisis” extends beyond the Middle East. Still, the write up identifies a serious issue and a critical weak spot for the Zuck machine.

Here’s a passage I noted:

Facebook is grappling with a reputation crisis in the Middle East, with plummeting approval rates and advertising sales in Arab countries, according to leaked documents obtained by NBC News. The shift corresponds with the widespread belief by pro-Palestinian and free speech activists that the social media company has been disproportionately silencing Palestinian voices on its apps – which include Facebook, Instagram and WhatsApp – during this month’s Israel-Hamas conflict.

As US monopolies wrestle with the challenges of unfettered content flowing from equally free-spirited users, Facebook cannot be friends with everyone. Don’t hit that like button automatically.

Facebook, however, is a monopolistic type of operation, in effect a digital nation. The collision of the datasphere and the real world is going to test the precepts of Zuckland.

Are there real world consequences from this reputation glitch? Yes, and among them is the very significant risk that the ripples will spread. Can one contain ripples in a pond or a wave in a digital ocean? Maybe.

Stephen E Arnold, June 1, 2021

WhatsApp Slightly Less Onerous on Privacy Policy

May 28, 2021

WhatsApp, which Facebook acquired in 2014, issued a new privacy policy that gives its parent company control over user data. This is a problem for those who value their privacy. Originally, users had to accept the policy by May 15 or be booted off the service. After facing backlash, however, the company has decided on a slightly less heavy-handed approach, at least in India. The Android Police reports, “WhatsApp Will Progressively Kill Features Until Users Accept New Privacy Policy.” Writer Prahsam Parikh reveals:

“The Press Trust of India reported that the Facebook-owned messaging service won’t delete accounts of those individuals who do not accept the new privacy policy on May 15. However, the same source also confirms that users will be sent reminders about accepting over the next ‘several weeks.’ And in a statement given to Android Central, WhatsApp has confirmed that while it won’t terminate accounts immediately, users who don’t accept the new terms will have only ‘limited account functionality’ available to them until they do. In the short term, that means losing access to your chat list, but you will still be able to see and respond to notifications as well as answer voice and video calls. However, after a few weeks of that, WhatsApp will then switch off all incoming notifications and calls for your account, effectively rendering it useless. The decision not to fully enforce the deadline seems to be in reaction to the stern stance that the Ministry of Electronics and Information Technology (MEITY) in India took against the company. Earlier this year, the ministry filed a counter-affidavit in the high court to prevent WhatsApp from going ahead with the privacy policy update.”

Wow, Facebook really wants that data. We think Facebook will have to relax its “new” rules in order to prevent Signal, Telegram, and Threema from capturing disaffected WhatsApp users.

Cynthia Murrell, May 28, 2021

Whistleblower Discusses Fake Account Infestation at Facebook

May 25, 2021

While working at Facebook, Sophie Zhang followed her conscience where her managers and peers failed to go. Naturally, this initiative eventually got her fired. AP News shares the data scientist’s perspective in, “Insider Q&A: Sophie Zhang, Facebook Whistleblower.” Reporter Barbara Ortutay introduces the interview:

“Sophie Zhang worked as a Facebook data scientist for nearly three years before she was fired in the fall of 2020. On her final day, she posted a 7,800-word memo to the company’s internal forum. … In the memo, first published by Buzzfeed, she outlined evidence that governments in countries like Azerbaijan and Honduras were using fake accounts to influence the public. Elsewhere, such as India and Ecuador, Zhang found coordinated activity intended to manipulate public opinion, although it wasn’t clear who was behind it. Facebook, she said, didn’t take her findings seriously. Zhang’s experience led her to a stark conclusion: ‘I have blood on my hands.’ Facebook has not disputed the facts of Zhang’s story but has sought to diminish the importance of her findings.”

If you have not yet seen excerpts from the eye-opening memo or read the full story, we suggest checking out the Buzzfeed and/or Guardian links Ortutay supplies above. In the AP interview, Zhang adds some details. For example, she was apparently fired because the work she did to protect citizens around the world was interfering with her official, low-level duties. She blamed herself, however, for not doing more because she was the only one seeking out and taking down these fake accounts. No one around her seemed to give a hoot unless an outside agency contacted Facebook about a specific page. She states:

“I talked about it internally … but people couldn’t agree on whose job it was to deal with it. I was trying desperately to find anyone who cared. I talked with my manager and their manager. I talked to the threat intelligence team. I talked with many integrity teams. It took almost a year for anything to happen.”

It was actually remarkable how many fake accounts Zhang was able to eliminate on her own, but one employee could only do so much. Especially without the support of higher ups. Though Facebook pays lip service to the issue, Zhang insists they would be doing more if they were really prioritizing the problem. We note this exchange:

“Q: Facebook says it’s taking down many inauthentic accounts and has sought to dismiss your story.

A: So this is a very typical Facebook response, by which I mean that they are not actually answering the question. Suppose your spouse asks you, ‘Did you clean up the dishes yesterday?’ And you respond by saying, ‘I always prioritize cleaning the dishes. I make sure to clean the dishes. I do not want there to be dirty dishes.’ It’s an answer that may make sense, but it does not actually answer that question.”

Indeed.

Cynthia Murrell, May 25, 2021

Facebook Tracking: Why Secrets Are Important to Some Digital Players

May 12, 2021

I read a headline which I assume was crafted to shock; to wit: “Analytics Suggest 96% of Users Leave App Tracking Disabled in iOS 14.5.” The headline did not surprise me, nor did the fact that four out of 100 in the sample said, “Sure, follow, listen, and watch me 24×7.” The write up states:

According to the latest data from analytics firm Flurry, just 4% of iPhone users in the U.S. have actively chosen to opt into app tracking after updating their device to iOS 14.5. The data is based on a sampling of 2.5 million daily mobile active users.

The article points out:

Facebook, a vociferous opponent of ATT [app tracking tech], has already started attempting to convince users that they must enable tracking in iOS 14.5 if they want to help keep Facebook and Instagram “free of charge.” That sentiment would seem to go against the social network’s earlier claim that ATT will have a “manageable” impact on its business and could even benefit Facebook in the long term.

Several observations:

  • Secrets work. Making certain behaviors “known” undermines a number of capabilities; for example, revenue, trust, and data collection.
  • iPhone users appear to be interested in keeping some of their mobile centric behaviors within their span of control. (What about iPhone users in China and Russia? Alas, the write up did not include those data.)
  • Processing items of data across time and within the monitored datasphere may make it difficult for some entities to perform in the manner they did prior to the introduction of ATT.

Net net: Flowing information erodes certain beliefs, social constructs, and processes. Once weakened by bits, these beliefs, constructs, and processes may not be reconstructable. The Apple ATT may have unforeseen consequences.

Stephen E Arnold, May 12, 2021
