Avaaz Facebook Report: Another Road Map for Bad Actors?
October 14, 2020
DarkCyber is intrigued by research reports which try to alert the public to an issue. Often the reports provide a road map for bad actors who are looking for a new method or angle to practice their dark arts. “Facebook’s Algorithm: A Major Threat to Public Health” may be a recent example of doing right going wrong.
Avaaz is, according to the organization’s Web site:
a global web movement to bring people-powered politics to decision-making everywhere.
A 33-page study, published in late August 2020, is available without charge at this link. The publication covers health misinformation through the lens of Facebook’s apparently flawed content curation mechanisms.
For bad actors (real or would be), the document explains:
- The relationship between Web pages with “disinformation” and Facebook sites
- The amplification function of content generators
- Utility of a message output in multiple languages
- The role of backlinks
- A list of gaps in Facebook’s content curation method
Interesting report and one which may help some individuals operate more effectively. Facebook’s content curation has some flaws. The company flagged a photograph of onions as salacious. No, really.
Stephen E Arnold, October 14, 2020
Facebook: Merging Apps Before the Call to Break It Up
October 7, 2020
DarkCyber noted “Facebook Is Merging Messenger into Instagram.” The write up explains:
…Facebook is starting to unify its messaging platforms across apps. They will start including more of Messenger’s features into Instagram’s direct messaging chat platform. It will also add the ability to send messages across the two apps.
DarkCyber believes that unified messaging may have some downstream consequences. On one hand, certain government requests for data may be easier for Facebook to fulfill from a single, unified messaging system. On the other hand, breaking up the company could become more difficult.
Stephen E Arnold, October 7, 2020
Facebook Is Nothing If Not Charming
October 5, 2020
Facebook spies on its users by collecting personal information such as hobbies, birthdays, relationships, and vacation spots. Facebook users voluntarily share this information publicly and/or privately, and the company sells it to advertisers. Facebook also spies on its competitors, but it does so in a more sophisticated way, says the BBC article “Facebook Security App Used To ‘Spy’ On Competitors.”
Facebook apparently used its cross-platform Onavo VPN to collect information on its competitors knowingly and in violation of privacy rules. The Commons Committee discussed the incident in a report that runs more than one hundred pages. Here is the gist of the report:
“The Digital, Culture, Media and Sport Committee wrote that through the use of Onavo, which was billed as a way to give users an extra layer of security, Facebook could ‘collect app usage data from its customers to assess not only how many people had downloaded apps, but how often they used them.’”
The report added:
“This knowledge helped them to decide which companies were performing well and therefore gave them invaluable data on possible competitors. They could then acquire those companies, or shut down those they judged to be a threat.”
Even more alarming are the details about ways Facebook could shut down services it provides to its competition. Twitter’s video-sharing app Vine is an example of how Facebook destroyed a competitor. Twitter wanted Vine users to find friends via their Facebook accounts, but Zuckerberg nixed that idea. Vine shut down in 2016.
Facebook does something equally nefarious with a white list of approved apps that are allowed to use Facebook user data. Among the 5,000 approved apps are Netflix, Airbnb, and Lyft. These app companies supposedly spend $250,000 on Facebook advertising to keep their coveted position.
Zuckerberg wrote in an email:
“I think we leak info to developers, but I just can’t think of any instances where that data has leaked from developer to developer and caused a real issue for us.”
Then there was the Cambridge Analytica scandal, in which voter information was collected through a personality quiz. Data from users and their friends was harvested and used to profile 82 million Americans, and that information was sold to Cambridge Analytica. The United Kingdom fined Facebook 500,000 pounds, and the company apologized.
It is not the first time Facebook has taken and sold user information. We wonder how its competitors spy on users and sell their data.
Whitney Grace, October 5, 2020
Facebook WhatsApp: Disappearing Media. Really? Gone for Good?
September 28, 2020
Facebook is endlessly entertaining. On one tentacle, the octopus company seeks to lessen the competitive threats from next generation social media like TikTok-type videos. On another tentacle, Facebook suggests that those in the European Union can do without Facebook. DarkCyber thinks of this as “the take my ball and go home” tactic. Ten-year-olds with minimal physical coordination but a stock of high-end athletic equipment have been known to trot out this argumentative chestnut. Another tentacle semi-cooperates with government officials. Another tentacle balances on a knife edge in order to keep employees happy with the wonderful social environment within the helpful company’s digital walled garden. There are other tentacles too, but I want to focus your attention on “WhatsApp Expiring Media Feature Details Tipped via New Beta Version.” Keep in mind that “beta” does not mean something a thumbtyper will be able to use.
The write up explains:
WhatsApp 2.20.201.6 beta for Android has been released with further references for a new feature called “Expiring Media.” The feature, as its name suggests, would make media files such as images, videos, and GIFs sent to the recipient’s phone over WhatsApp disappear once they are viewed.
Interesting. Just one question:
If media are disappeared for users, are those data deleted from the Facebook servers?
One hopes not; otherwise, some investigations will be slowed or halted.
Stephen E Arnold, September 28, 2020
Facebook: Fine Thinking
September 26, 2020
I read “Former Facebook Manager: We Took a Page from Big Tobacco’s Playbook.” The main idea is that a former Facebook professional revealed how the gears meshed within the Facebook distributed intelligence machine. For me, the allegedly truthful revelations add some color to my understanding of what I call high school science club thinking.
The write up quotes the testimony of Facebook’s former director of monetization (good title, that), a certain Tim Kendall, as saying:
“We sought to mine as much attention as humanly possible… We took a page from Big Tobacco’s playbook, working to make our offering addictive at the outset.”
What’s interesting is the way in which Ars Technica approached the story. The article lets Mr. Kendall’s own words and some facts about Facebook’s fine manager-employee relations beef up the write up.
Facebook continues to capture the attention of the savvy US elected officials. The social media company opened for business in 2004. That works out to more than 15 years ago. Now, after controversies with alleged “co-founders,” the pompous Etonian, and interactions with the clear-minded European Union officials, Facebook is getting scrutinized by the US government.
What if Mr. Kendall is making Facebook look different like a reflection in a fun house mirror? What if Facebook is a happy, happy place? What if Facebook has contributed to social comity?
What if Facebook is the best thing one can say about 2020?
Stephen E Arnold, September 26, 2020
Facebook and Digital Partitioning
September 18, 2020
I am no expert on managing the Gen X, Gen Y, and millennial wizards creating must-have services for thumbtypers. The services, like the young wizards, puzzle me. I don’t worry about it, but Facebook’s Mark Zuckerberg does, and he tries to remediate what seems to be a management Sudoku.
“Facebook Issues New Rules on Internal Employee Communication” explains new principles “to guide debates and conversations within Workplace,” Facebook’s social network for employees. The article points out that Google moderates its internal message boards.
I live in rural Kentucky, but it seems to me that “principles” and humans who are digital content guards are an interesting development. The approach is even more interesting because Facebook has expressed a keen desire to facilitate social interactions.
I noted this passage in the CNBC write up:
The company will also be more specific about which parts of Workplace can be used to discuss social and political issues. This change will be so that employees do not have to confront social issues during their day-to-day work. Facebook’s new principles also ask that employees communicate with professionalism and continue to debate about the company’s work but do so in a respectful manner.
How does partitioning work in day-to-day communication? In computer speak, a partition is a chunk of a storage device; that data space is a separate logical volume. In a house, a partition divides one space into smaller spaces; for example, a big house in San Jose may have a “safe room.” The idea is that a person can enter the separate area and prevent an intruder from harming the individual. In the case of the storage device, a person or software system acts as the decision maker: the partition is created, and the “user” gains access to the storage under certain conditions, but the user does not decide. The user just receives rights and lives with those rights.
The safe house is a different kettle of intentions. The safe room is entered by an individual who feels threatened or who wants to escape a Zoom call. The user locks the door and prevents others from getting into the safe room.
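The storage-partition half of the analogy can be sketched in a few lines of code. This is a minimal illustration, not any real Facebook or operating system API: an administrator (the “god mode” actor) creates the partition and assigns the rights up front, and the user can only operate within those rights, never change them. All class and variable names here are invented for the example.

```python
class Partition:
    """A toy partition: the admin fixes the rights at creation time."""

    def __init__(self, label: str, writable: bool):
        self.label = label
        self.writable = writable  # decided by the admin, not the user
        self.entries = []

    def write(self, item: str) -> None:
        # The user lives with the rights granted at creation time.
        if not self.writable:
            raise PermissionError(f"{self.label} is read-only for users")
        self.entries.append(item)


# The admin ("god mode") partitions the space and sets the rights.
work_topics = Partition("work-topics", writable=True)
social_issues = Partition("social-issues", writable=False)

work_topics.write("project status update")  # allowed by the admin's rule
try:
    social_issues.write("political post")   # denied by the admin's rule
except PermissionError as err:
    print(err)  # prints: social-issues is read-only for users
```

The point of the sketch is the asymmetry: nothing in the `write` path lets the user flip `writable`; only whoever constructed the partition had that power.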
What’s the Facebook partition? Who decides? These will be interesting questions to answer as Facebook pushes forward with what I call “imposed adulting.” The partitioning of Workplace is an interesting step by a company which has been less than proactive in making certain types of decisions about social behavior within the Facebook datasphere.
A related question is, “How does partitioning work out in a social setting?” I napped through lectures about historical partitioning efforts. I vaguely recall one of my history professors (Dr. Philip Crane) expounding about the partitioning of Berlin after the second world war. My recollection is very fuzzy, but the impression I can dredge up from the murky depths of my memory is that it was not party time and pink balloons.
Net net: Partitioning a storage device is a god mode function. Partitioning in a social space is a less tidy logical operation. And when the computer partitioning meets the social partition? Excitement for sure.
Stephen E Arnold, September 18, 2020
Another Interesting 2020 Moment: User Reaction to Facebook
September 16, 2020
After years of letting online services operate like independent countries, a handful of the faithful have taken action. If the information in “Kim Kardashian to Freeze Facebook, Instagram Accounts in #StopHateForProfit Effort” is accurate, luminaries are using their orbital power to cause change. Elected officials lack the hands-on experience with digital power houses that high-profile cultural icons have. The write up reports:
Kim Kardashian West announced that she will join two dozen celebrities in temporarily freezing their Instagram and Facebook accounts on Wednesday because the platforms “continue to allow the spreading of hate, propaganda and misinformation — created by groups to sow division and split America apart.”
What is interesting is that governments have shown less initiative than made-for-social-media stars. DarkCyber is intrigued that individuals closely associated with Facebook usage are demonstrating a behavior that appears to be “adulting.”
Unregulated, cowboy operations are spinning money. Corporate actions have motivated celebrities to organize and behave — for at least a day — in a way that calls attention to behaviors these individuals find reprehensible.
In 2020, a year of surprises and challenges, Kim Kardashian West-type individuals appear to be manifesting more moxie, purpose, and understanding than most elected officials, Silicon Valley go getters, and churn-centric Wall Street professionals.
DarkCyber is indeed surprised.
Facebook’s management is likely to greet the user response and 24-hour pushback with a bold, “Meh.”
Stephen E Arnold, September 16, 2020
Facebook: Luck of the Irish
September 16, 2020
“Facebook Takes Legal Action after Irish Regulator Threatens to Clamp Down on Transatlantic Data Transfer” illustrates that the company is consistent. The write up reports:
Facebook … launched legal action against Ireland’s data regulator, in an attempt to halt a preliminary order that could stop the company from transferring user data from the European Union to the U.S. The social media giant has applied to seek judicial review of the approach used by Ireland’s Data Protection Commission on the grounds it was premature for the IDPC to have reached a preliminary conclusion at this stage.
On the surface, it appears that Facebook wants to rely on the legal system, not the luck of the Irish, in its effort to sidestep certain constraints on its business. Is this action out of step with Facebook’s socially responsible policies? No. Facebook is acting in a consistent manner. Facebook’s tag line, according to one person on the DarkCyber research team, is “socially responsible.” Another team member understood that colleague to have used the word “reprehensible.”
Another perplexing issue which DarkCyber cannot resolve.
Stephen E Arnold, September 16, 2020
Facebook Management: The High School Science Club Method Reveals Insights
September 3, 2020
An online publication called The Daily Beast published “Facebook’s Internal Black Lives Matter Debate Got So Bad Zuckerberg Had to Step In.” How accurate is the write up? DarkCyber does not know. It is not clear what the point of the “real news” story is.
The write up seems to suggest that there is dissension within Facebook over what employees can say on the Facebook internal communication system. The write up makes clear that Mr. Zuckerberg, the Caesar of social media, involved himself in the online dust up. Plus, the article describes actions that are just peculiar; for example, this quote:
“[L]et me be absolutely clear about our stance as a company: systemic racism is real. It disadvantages and endangers people of color in America and around the world,” Zuckerberg posted. Zuckerberg added that while it was “valuable for employees to be able to disagree with the company and each other,” he encouraged Facebook staffers to do so “respectfully, with empathy and understanding towards each other.”
What’s the dividing point between an opinion and a statement which is out of bounds? Does Mark Zuckerberg referee these in bounds and out of bounds events?
Several observations:
- Facebook may be able to deal with pesky regulators in Europe and remind the government of Australia that the company has its own views of news, but managing a large company is a different category of problem. Dissension within an organization may not be a positive when regulators are keeping their eyes peeled for witnesses
- Employees within Facebook are manifesting behaviors associated with views and reactions to those views on the Facebook system itself; Facebook is a microcosm of the corrosive effect of instant, unchecked messaging. Will these messages be constrained by humans or smart software or both?
- Mr. Zuckerberg himself is offering a path forward that seems to suggest that a certain homogeneity of thought amongst employees is desirable; that is, disagree within boundaries. But what are the boundaries? Is it possible to define what crosses a shades-of-gray line?
Net net: The high school science club management method which has gained favor among a number of Silicon Valley-centric companies is being pushed and pulled in interesting ways. What happens if the fabric of governance is torn and emergency fixes are necessary? Expulsion, loss of market momentum, de facto control of discourse, or insider threats in the form of sabotage, leaks, and unionization? That puts a different spin on social, does it not?
Stephen E Arnold, September 3, 2020
Facebook: High School Science Club Management in Action
September 3, 2020
The online information service Mashable published a headline which tells the story. And the story is a doozy if accurate: “Mark Zuckerberg Blames Facebook Contractors for Kenosha Militia Fiasco.” The article states:
When it comes to mistakenly allowing a militia’s event page to remain on Facebook, even after concerned users reported it at least 455 times, Mark Zuckerberg wants you to know that the buck stops with his contractors.
The essence of the high school science club management method is to infuse entitlement and arrogance with a pinch of denial. The write up notes:
According to Zuckerberg, the reason Facebook chose to tacitly approve an event page that, by his own admission, violated the site’s own rules, is because the non-Facebook employees tasked with enforcing his company’s Byzantine policies didn’t understand them well enough.
The HSSC approach to management may be institutionalized in some Silicon Valley type outfits. That’s super, right? The elite science club is never wrong; for example, “It is not our fault that the stink bomb triggered smoke alarms and two students were hurt rushing from the building.”
Stephen E Arnold, September 3, 2020