Facebook: A Force for Good. Now What Does Good Mean?
August 17, 2021
I read Preston Byrne’s essay about the Taliban’s use of WhatsApp. You can find that very good write up at this link. Mr. Byrne asks an important question: Did America just lose Afghanistan because of WhatsApp?
I also read “WhatsApp Can’t Ban the Taliban Because It Can’t Read Their Texts.” The main point of the write up is that Facebook’s encrypted message system makes blocking users really difficult, almost impossible.
I noted this statement:
the Taliban used Facebook-owned chat app WhatsApp to spread its message and gain favor among local citizens…
Seems obvious, right? Free service. Widely available. Encrypted. Why the heck not?
Here’s a statement in the Vice write up which caught my attention:
The company spokesperson said that WhatsApp complies with U.S. sanctions law, so if it encounters any sanctioned people or organizations using the app, it will take action, including banning the accounts. This obviously depends on identifying who uses WhatsApp, without having access to any of the messages sent through the platform, given that the app uses end-to-end encryption. This would explain why WhatsApp hasn’t taken action against some account spreading the Taliban’s message in Afghanistan.
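The quoted point is worth unpacking: with end-to-end encryption, the operator relays only ciphertext and never holds the plaintext, so any enforcement has to lean on account metadata rather than message content. Here is a minimal sketch of that idea in Python using the PyNaCl library. It illustrates end-to-end encryption in general, not WhatsApp’s actual protocol (which is built on the Signal protocol):

```python
# A minimal sketch of end-to-end encryption (not WhatsApp's actual protocol).
# Requires the PyNaCl library: pip install pynacl
from nacl.public import PrivateKey, Box

# Each user generates a key pair; only public keys ever reach the server.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts directly to Bob's public key on her own device.
alice_box = Box(alice_key, bob_key.public_key)
ciphertext = alice_box.encrypt(b"meet at the market at noon")

# The relay (the "server") sees only this opaque blob -- no plaintext --
# so moderation has to rely on metadata such as account identifiers.
print(ciphertext.hex()[:60], "...")

# Only Bob, holding his private key, can recover the message.
bob_box = Box(bob_key, alice_key.public_key)
print(bob_box.decrypt(ciphertext))
```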
Let me ask a pointed question: Is it time to shut down Facebook, WhatsApp, and Instagram? Failing that, why not use existing laws to bring a measure of control over access, message content, and service availability?
Purposeful action is needed. If Facebook cannot figure out how to contain and blunt the corrosive effects of the “free” service, outsource the task to an entity which will make an effort. That approach seems to be what is looming for the NSO Group. Perhaps purposeful action is motivating Apple to try to control the less salubrious uses of the iPhone ecosystem?
Dancing around the Facebook earnings report is fine entertainment. Is it time to add some supervision to the largely unregulated, uncontrolled frat boy bash? One can serve a treat like Bore Palaw too.
Stephen E Arnold, August 17, 2021
Facebook, Booze, Youngsters, and Australia: Shaken and Stirred
August 6, 2021
Quite a mixologist’s concoction: Facebook, booze, young people, and the Australian government. The country seems to be uncomfortable with some of Facebook’s alleged practices. I would assume that some Australian citizens who hold shares in the social media outfit are pleased as punch with the company’s financial results.
Others are not amused. “Facebook to Limit Ads Children See After Revelations Australian Alcohol Companies Can Reach Teens” reports:
Facebook will impose more control on the types of ads that children as young as 13 are exposed to on Instagram and other platforms, as new research finds Australian alcohol companies are not restricting their social media content from reaching younger users.
How many companies targeted the youngsters down under? The write up asserts:
The paper examined the use of social media age-restriction controls by 195 leading alcohol brands on Instagram and Facebook, and found large numbers were not shielding their content from children. The 195 brands were owned by nine companies, and the research identified 153 Facebook accounts, including 84 based in Australia, and 151 Instagram accounts, of which 77 were Australian-based. The authors found 28% of the Instagram accounts and 5% of Facebook accounts had not activated age-restriction controls.
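For a rough sense of scale, the percentages quoted above translate into absolute numbers roughly as follows (approximate, since the study reports rounded figures):

```python
# Rough arithmetic from the figures quoted above: 151 Instagram accounts and
# 153 Facebook accounts run by alcohol brands, with 28% and 5% respectively
# lacking age-restriction controls.
instagram_accounts, facebook_accounts = 151, 153
unrestricted_instagram = round(0.28 * instagram_accounts)  # ~42 accounts
unrestricted_facebook = round(0.05 * facebook_accounts)    # ~8 accounts
print(unrestricted_instagram, unrestricted_facebook)
```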
I did spot a quote attributed to one of the experts doing the research about Facebook, Booze, Youngsters, and Australia; to wit:
it was clear that companies were not complying with the code. “The alcohol industry has demonstrated that it is unable to effectively control its own marketing…
Shocking, that bit about self regulation. Has anyone alerted the US financial sector?
Stephen E Arnold, August 6, 2021
Facebook Lets Group Admins Designate Experts. Okay!
August 2, 2021
Facebook once again enlists the aid of humans to impede the spread of misinformation, only this time it has found a way to avoid paying anyone for the service. Tech Times reports, “Facebook Adds Feature to Let Admin in Groups Chose ‘Experts’ to Curb Misinformation.” The move also has the handy benefit of shifting responsibility for bad info away from the company. We wonder—what happened to that smart Facebook software? The article does not say. Citing an article from Business Insider, writer Alec G. does tell us:
“The people who run the communities on Facebook now have the authority to promote individuals within its group to gain the title of ‘expert.’ Then, the individuals dubbed as experts can be the voices of which the public can then base their questions and concerns. This is to prevent misinformation plaguing online communities for a while now.”
But will leaving the designation of “expert” up to admins make the problem worse instead of better? The write-up continues:
“The social platform now empowers specific individuals inside groups who are devoted to solely spreading misinformation-related topics. The ‘Stop the Steal’ group, for example, was created in November 2020 with over 365,000 members. They were convinced that the election for the presidency was a fraud. If Facebook didn’t remove the group two days later, it would continue to have negative effects. Facebook explained that the organization talked about ‘the delegitimization of the election process,’ and called for violence, as reported by the BBC. Even before that, other groups within Facebook promoted violence and calls to action that would harm the civility of the governments.”
Very true. We are reminded of the company’s outsourced Oversight Board created in 2018, a similar shift-the-blame approach that has not worked out so well. Facebook’s continued efforts to transfer responsibility for bad content to others fail to shield it from blame. They also do little to solve the problem and may even make it worse. Perhaps it is time for a different (real) solution.
Cynthia Murrell, August 2, 2021
Facebook and NSO Group: An Odd Couple or Squabbling Neighbors?
July 28, 2021
Late in 2019, The Adware Guru published “Facebook Sues NSO Group Spyware Maker Due to Exploitation of WhatsApp Vulnerability.” That write up stated:
The cause of [Facebook’s] lawsuit was WhatsApp’s zero-day vulnerability, which Facebook claims was sold to the NSO Group, and then the company helped use the problem to attack human rights defenders, journalists, political dissidents, diplomats, and governmental officials. According to court documents, more than 1,400 people in Bahrain, the United Arab Emirates, and Mexico suffered a total of 11 days from attacks. Facebook has already sent WhatsApp special messages to everyone affected.
In April 2020, Technadu published “The NSO Group Is Accusing Facebook of Having Tried to License Their Spyware.” That write up stated:
The ‘NSO Group’ is now turning the tables, claiming that they rejected Facebook’s proposal to license Pegasus because they only did it for governments and not private companies. In addition to that, they describe Facebook’s accusations as baseless and even accuse the social media company of failing to prepare the legal paperwork properly, which resulted in legislative procedure problems. NSO says Facebook didn’t have powerful methods to spy on iOS devices in the same way that they did with Android, and they felt like Pegasus could solve this problem for them. Facebook, on the other side, completely dismissed these statements by saying that these allegations had the sole purpose of distracting the court from the real facts.
Technadu added:
even if Facebook wasn’t trying to add Pegasus in Onavo for iOS, they are giving the NSO Group something to hold on to and make allegations that are at least seemingly realistic. At the very least, this development will complicate the legal process by much now.
Jump to the present. The Guardian’s story “Officials Who Are US Allies Among Targets of NSO Malware, Says WhatsApp Chief” reported on July 24, 2021:
Cathcart said that he saw parallels between the attack against WhatsApp users in 2019 – which is now the subject of a lawsuit brought by WhatsApp against NSO – and reports about a massive data leak that are at the centre of the Pegasus project… When WhatsApp says it believes its users were “targeted”, it means the company has evidence that an NSO server attempted to install malware on a user’s device.
The Guardian story includes this statement from the PR savvy NSO Group:
An NSO spokesperson said: “We are doing our best to help creating a safer world. Does Mr Cathcart have other alternatives that enable law enforcement and intelligence agencies to legally detect and prevent malicious acts of pedophiles, terrorists and criminals using end-to-end encryption platforms? If so, we would be happy to hear.”
Are Facebook’s statements credible? Is NSO Group’s version believable? Are these two behaving like the characters in Neil Simon’s “The Odd Couple” or like the characters in the 1981 film “Neighbors”? Does each firm have something the other needs?
Stephen E Arnold, July 28, 2021
Does Facebook Kill?
July 22, 2021
I found it interesting that the US government suggested that Facebook information kills. You can refresh your knowledge of this assertion in “Biden: COVID Misinformation on Platforms Like Facebook Is ‘Killing People’”. The statement is an attention grabber. Facebook responded, according to Neowin in “Facebook Refutes Biden’s Blame That It’s “Killing People” with COVID Fake News”:
Facebook clearly took issue with these statements and a company spokesperson responded by saying, “We will not be distracted by accusations which aren’t supported by the facts”.
The US government asserts one thing; Facebook another. Which is the correct interpretation of Facebook: An instrument of death or a really great helper of humanity?
The US is a country, and it has legal tools at its disposal. Facebook is a commercial enterprise operating in the US with a single person controlling what the company does.
Facebook wants to use the laws of the country to advantage itself; for example, Facebook is not too keen on Lina Khan. The company filed a legal document to keep that person from getting involved in matters related to Facebook’s commercial behaviors.
I find the situation amusing. Facebook’s assertions are not going to get a like from me. The US government, on the other hand, speaks for a country. When countries take action, as China did with regard to Jack Ma, consequences can be significant.
The phrase “Facebook kills” is meme-able. That may be a persistent problem for the Zuck and the Zuckers in my opinion.
Stephen E Arnold, July 22, 2021
Zuckin and Duckin: Socialmania at Facebook
July 19, 2021
I read “Zuck Is a Lightweight, and 4 More Things We Learned about Facebook from ‘An Ugly Truth’.” My initial response was, “No Mashable professionals will be invited to the social Zuckerberg’s Hawaii compound.” Bummer. I had a few other thoughts as well, but, first, here are a couple of snippets from what it is possible to characterize as a review of a new book by Sheera Frenkel and Cecilia Kang. I assume any publicity is good publicity.
Here’s one I circled in Facebook social blue:
Frenkel and Kang’s careful reporting shows a company whose leadership is institutionally ill-equipped to handle the Frankenstein’s monster they built.
Snappy. To the point.
Another? Of course, gentle reader:
Zuckerberg designed the platform for mindless scrolling: “I kind of want to be the new MTV,” he told friends.
Insightful, but TikTok, which may have some links to the sensitive Chinese power plant, aced out the F’Book.
And how about this?
[The Zuck] was explicitly dismissive of what she said. Indeed, the book provides examples where Sandberg was afraid of getting fired, or being labeled as politically biased, and didn’t even try to push back…
Okay, and one more:
Employees are fighting the good fight.
Will I buy the book? Nah, this review is close enough. What do I think will happen to Facebook? In the short term, not much. The company is big and generating big payoffs in power and cash. Longer term? The wind down will continue. Google, for example, is dealing with stuck disc brakes on its super car. Facebook may be popping in and out of view in that outstanding vehicle’s rear view mirrors. One doesn’t change an outfit with many years of momentum.
Are the book’s revelations on the money? Probably reasonably accurate, but disenchantment can lead to some interesting shaping of non-fiction writing. And the Mashable review? Don’t buy a new Hawaiian themed cabana outfit yet. What about Facebook’s management method? Why change? It worked in high school. It worked when testifying before Congress. It worked until a couple of reporters shifted into interview mode, and reporters are unlikely to rack up likes on Facebook.
Stephen E Arnold, July 19, 2021
Facebook Has Channeled Tacit Software, Just without the Software
July 14, 2021
I would wager a free copy of my book CyberOSINT that anyone reading this blog post remembers Tacit Software, founded in the late 1990s. The company wrote a script which determined what employee in an organization was “consulted” most frequently. I recall enhancements which “indexed” content to make it easier for a user to identify content which may have been overlooked. But the killer feature was allowing a person with appropriate access to identify individuals with particular expertise. Oracle, the number one in databases, purchased Tacit Software and integrated the function into Oracle Beehive. If you want to read marketing collateral about Beehive, navigate to this link. Oh, good luck with pinpointing the information about Tacit. If you dig a bit, you will come across information which suggests that the IBM Clever method was stumbled upon and implemented about the same time that Backrub went online. Small community in Silicon Valley? Yes, it is.
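For readers who never saw Tacit in action, the core idea can be sketched in a few lines: mine a log of who asks whom about what, and rank the people who get asked most often. This toy example is my own guess at the concept; the names and message log are hypothetical, and it is not Tacit Software’s or Oracle Beehive’s actual code:

```python
# A toy illustration of expertise location by consultation frequency.
# Hypothetical data; not Tacit Software's or Oracle Beehive's actual method.
from collections import Counter

# Hypothetical message log: (sender, recipient, topic) tuples.
messages = [
    ("lee", "garcia", "oracle tuning"),
    ("chen", "garcia", "oracle tuning"),
    ("lee", "smith", "expense reports"),
    ("kim", "garcia", "oracle tuning"),
]

def rank_experts(log, topic):
    """Rank recipients by how often they are consulted about a topic."""
    consultations = Counter(rcpt for _, rcpt, t in log if t == topic)
    return consultations.most_common()

print(rank_experts(messages, "oracle tuning"))  # [('garcia', 3)]
```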
So what?
I thought about this 1997 innovation in Silicon Valley when I read “Facebook’s Groups to Highlight Experts.” With billions of users, I wonder why it took Facebook years to figure out that it could identify individuals who “knew” something. Progress never stops in me-too land, of course. Is Facebook using its estimable smart software to identify those in the know?
The article reports:
There are more than 70 million administrators and moderators running active groups, Facebook says. When asked how they’re vetting the qualifications of designated experts, a Facebook spokesperson said it’s “all up to the discretion of the admin to designate experts who they believe are knowledgeable on certain topics.”
I think this means that humans identify experts. What if the human doing the identifying does not know anything about the “expertise” within another Facebooker?
Yeah, maybe give Oracle Beehive a jingle. Just a thought.
Stephen E Arnold, July 14, 2021
Facebook and Milestones for the Zuck
July 2, 2021
I read “Facebook Reaches $1 Trillion Valuation Faster Than Any Other Company.” The write up says that the highly esteemed Facebook (plus WhatsApp and Instagram) has outperformed Apple, the Google, the spaceship outfits, and Microsoft. The article reports:
No stranger to breaking records, Facebook has just achieved another: the social media giant’s market cap has exceeded $1 trillion for the first time, reaching the milestone faster than any other company in history.
What has caused this surge in “valuation”? The answer is revealed in “Judge Dismisses Gov’t Antitrust Lawsuits against Facebook.” This separate write up, from a news service oh so close to Microsoft’s headquarters, states:
A federal judge on Monday dismissed antitrust lawsuits brought against Facebook by the Federal Trade Commission and a coalition of state attorneys general, dealing a significant blow to attempts by regulators to rein in tech giants. U.S. District Judge James Boasberg ruled Monday that the lawsuits were “legally insufficient” and didn’t provide enough evidence to prove that Facebook was a monopoly.
Concentration is good, it seems. For social media start-ups, have you thought about issuing T-shirts with the Zuck’s smiling face printed on them? You can update your Facebook page and do some Instagram posts. Don’t overlook WhatsApp as a way to spread the good news. About the T-shirts, I mean.
Stephen E Arnold, July 2, 2021
Facebook Reputation Glitch: Just a Misunderstanding or a Warning Signal?
June 1, 2021
I read “Facebook Battles Reputation Crisis in the Middle East.” Interesting but narrow. Why? The write up identifies a serious issue, yet the “crisis” extends beyond the Middle East. Its focus may be narrow, but it pinpoints a critical weak spot for the Zuck machine.
Here’s a passage I noted:
Facebook is grappling with a reputation crisis in the Middle East, with plummeting approval rates and advertising sales in Arab countries, according to leaked documents obtained by NBC News. The shift corresponds with the widespread belief by pro-Palestinian and free speech activists that the social media company has been disproportionately silencing Palestinian voices on its apps – which include Facebook, Instagram and WhatsApp – during this month’s Israel-Hamas conflict.
As US monopolies wrestle with the challenges of unfettered content flowing from equally free-spirited users, Facebook cannot be friends with everyone. Don’t hit that like button automatically.
Facebook, however, is a monopolistic type of operation, one which amounts to a digital nation. The collision of the datasphere and the real world is going to test the precepts of Zuckland.
Are there real world consequences from this reputation glitch? Yes, and among them is the very significant risk that the ripples will spread. Can one contain ripples in a pond or the wave in a digital ocean? Maybe.
Stephen E Arnold, June 1, 2021
WhatsApp Slightly Less Onerous on Privacy Policy
May 28, 2021
WhatsApp, which Facebook acquired in 2014, issued a new privacy policy that gives its parent company control over user data. This is a problem for those who value their privacy. Originally, users had to accept the policy by May 15 or be booted off the service. After facing backlash, however, the company has decided on a slightly less heavy-handed approach, at least in India. The Android Police reports, “WhatsApp Will Progressively Kill Features Until Users Accept New Privacy Policy.” Writer Prahsam Parikh reveals:
“The Press Trust of India reported that the Facebook-owned messaging service won’t delete accounts of those individuals who do not accept the new privacy policy on May 15. However, the same source also confirms that users will be sent reminders about accepting over the next ‘several weeks.’ And in a statement given to Android Central, WhatsApp has confirmed that while it won’t terminate accounts immediately, users who don’t accept the new terms will have only ‘limited account functionality’ available to them until they do. In the short term, that means losing access to your chat list, but you will still be able to see and respond to notifications as well as answer voice and video calls. However, after a few weeks of that, WhatsApp will then switch off all incoming notifications and calls for your account, effectively rendering it useless. The decision not to fully enforce the deadline seems to be in reaction to the stern stance that the Ministry of Electronics and Information Technology (MEITY) in India took against the company. Earlier this year, the ministry filed a counter-affidavit in the high court to prevent WhatsApp from going ahead with the privacy policy update.”
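Boiled down, the staged limitation described in that passage amounts to a feature gate keyed to how long a user has held out. The sketch below is my paraphrase of the reported stages, not WhatsApp’s actual enforcement logic, and the time thresholds are placeholders:

```python
# A paraphrase of the staged limitation reported above; the stages and the
# four-week threshold are placeholders, not WhatsApp's actual logic.
def allowed_features(weeks_since_refusal: int) -> set:
    """Return the features left to a user who has not accepted the policy."""
    if weeks_since_refusal == 0:
        return {"chat_list", "messaging", "notifications", "incoming_calls"}
    if weeks_since_refusal < 4:  # "limited account functionality" phase
        return {"notifications", "incoming_calls"}
    return set()  # notifications and calls switched off: effectively unusable

for weeks in (0, 2, 6):
    print(weeks, sorted(allowed_features(weeks)))
```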
Wow, Facebook really wants that data. We think Facebook will have to relax its “new” rules in order to prevent Signal, Telegram, and Threema from capturing disaffected WhatsApp users.
Cynthia Murrell, May 28, 2021