Facebook Security: Fodder for Testimony?

April 9, 2021

Who knows if this is true? “533 Million Facebook Users’ Phone Numbers Leaked on Hacker Forum.” The write up states:

The mobile phone numbers and other personal information for approximately 533 million Facebook users worldwide has been leaked on a popular hacker forum for free. The stolen data first surfaced on a hacking community in June 2020 when a member began selling the Facebook data to other members.

If true, the revelation is a nice complement to a series of outstanding achievements by the centralized, big tech, really smart managers at super important companies. Examples include:

  • Twitter’s senior manager spoofing elected officials
  • Microsoft’s Exchange Server misstep while Windows Defender was on the job, sort of
  • Amazon’s brilliant Twitter campaign about workers’ inexplicable need to take breaks
  • Google’s staunch defense of employees who grouse with assurances of continued employment.

Now Mr. Zuckerberg’s digital nation and its outstanding security.

How did this happen? The write up asserts:

According to Alon Gal, CTO of cybercrime intelligence firm Hudson Rock, it is believed that threat actors exploited in 2019 a now-patched vulnerability in Facebook’s “Add Friend” feature that allowed them to gain access to member’s phone numbers.

I envision Mr. Zuckerberg answering this question under oath in an upcoming Congressional hearing:

Senator X: Mr. Zuckerberg, what the heck happened? I have a teenage granddaughter. Are you protecting her?

Mr. Zuckerberg: Senator, thank you for that question. At Facebook, we take every possible precaution to guard our users’ identity. I will look into this matter and provide a report written by an Amazon PR person whom we just hired, and assign the former head of Microsoft security, also a new hire, to investigate this matter. Early reports suggest that the 1,000 criminals attacking Microsoft were supplemented with an additional 2,000 bad actors to breach our highly secure system.

Plus, the loss of data affected a mere 533 million users. Trivial. It is old news too.

Stephen E Arnold, April 9, 2021

Facebook and Microsoft: Communing with the Spirit of Security

April 7, 2021

Two apparently unrelated actions by bad actors. Two paragons of user security. Two. Count ‘em.

The first incident is summarized in “Huge Facebook Leak That Contains Information about 500 Million People Came from Abuse of Contacts Tool, Company Says.” The main point is that flawed software and bad actors were responsible. But 500 million. Where is Alex Stamos when Facebook needs guru-grade security to zoom into a challenge?

The second incident is explained in “Half a Billion LinkedIn Users Have Scraped Data Sold Online.” Microsoft, the creator of the super useful Defender security system, owns LinkedIn. (How is that migration to Azure coming along?) Microsoft has been a very minor character in the great works of 2021. These are, of course, The Taming of SolarWinds and The Rape of Exchange Server.

Now, what’s my point? I think when one adds 500 million and 500 million, the result is a lot of people. Assume 25 percent overlap. Well, that’s still a lot of people’s information which has taken wing.
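
The back-of-the-envelope arithmetic, assuming (as the hypothetical above does) that 25 percent of the combined records describe the same people:

```python
# Back-of-the-envelope count of exposed people. The 25 percent overlap is
# an assumption from the text, not a reported figure.
facebook_records = 500_000_000
linkedin_records = 500_000_000

total_records = facebook_records + linkedin_records   # 1,000,000,000
duplicates = int(total_records * 0.25)                # assumed overlap
unique_people = total_records - duplicates

print(f"{unique_people:,}")  # 750,000,000
```

Even with a generous overlap assumption, three quarters of a billion people is, indeed, a lot of people.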

Indifference? Carelessness? Cluelessness? A lack of governance? I would suggest that a combination of charming personal characteristics makes those responsible individuals one can trust with sensitive information.

Yep, trust and credibility. Important.

Stephen E Arnold, April 7, 2021

Facebook: The Polarization Position

March 17, 2021

I find Silicon Valley “real” news amusing. I like the publications themselves; for example, Buzzfeed. I like the stories themselves; for example, “Polarization Is Good For America, Actually, Says Facebook Executive.”

How much of the Google method has diffused into Facebook? From my point of view, a magnetic influence exists. The cited article points out:

Facebook has created a “playbook” to help its employees rebut criticism that the company’s products fuel political polarization and social division.

The idea is that employees comprise a team. The team runs plays in order to score. The playbook also directs and informs team members on their roles.

“Trapped Priors As a Basic Problem of Rationality” explains how feedback loops lead to a reinforcement of ideas, data, and rationality otherwise not noticed.
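
The feedback loop can be sketched in a few lines. Assume a toy model in which the platform shows a topic in proportion to past clicks, and the user clicks whatever dominates the feed. The numbers are invented; the point is the runaway reinforcement.

```python
# Toy feedback loop: exposure follows past clicks, clicks follow exposure.
# All numbers are invented for illustration.
clicks = {"topic_a": 6, "topic_b": 4}  # slight initial lean toward topic_a

for _ in range(20):
    total = sum(clicks.values())
    share_a = clicks["topic_a"] / total  # exposure tracks click share
    # The user clicks whatever dominates the feed, amplifying the lean.
    if share_a >= 0.5:
        clicks["topic_a"] += 1
    else:
        clicks["topic_b"] += 1

print(clicks)  # {'topic_a': 26, 'topic_b': 4}
```

Once the lean exists, it only strengthens: the trapped prior in miniature.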

Buzzfeed references this Facebook research document:

In the [Facebook] paper, titled “What We Know About Polarization,” Cox and Raychodhury [Facebook experts] call polarization “an albatross public narrative for the company.” “The implicit argument is that Facebook is contributing to a social problem of driving societies into contexts where they can’t trust each other, can’t share common ground, can’t have conversation about issues, and can’t share a common view on reality,” they write, adding that “the media narrative in this case is generally not supported by the research.” While denying that Facebook meaningfully contributes to polarization, Pablo Barberá, a research scientist at the company, also suggested political polarization could be a good thing during Thursday’s presentation. “If we look back at history, a lot of the major social movements and major transformations, for example, the extension of civil rights or voting rights in this country have been the result of increasing polarization,” he told employees.

Polarization, it seems, has value, and a game plan makes a particular business method explicit. The fact that the trappings of research are required to justify the game plan is interesting. But those trapped priors are going to channel Facebook’s behavior into easy-to-follow grooves.

Scrutiny, legal action, and “more of the same” will allow potholes to form. Some will be deep. Others will be no big deal.

Stephen E Arnold, March 17, 2021

Facebook WhatsApp, No Code Ecommerce, and Google: What Could Go Wrong?

March 5, 2021

The Dark Web continues to capture the attention of some individuals. The little secret few pursue is that much of the Dark Web action has shifted to encrypted messaging applications. Even Signal gets coverage in pot boiler novels. Why? Encrypted messaging apps are quite robust convenience stores. Why go to Ikea when one can scoot into a lightweight, mobile app and do “business”?

How hard is it to set up a store, make its products like malware or other questionable items available in WhatsApp, and start gathering customers? Not hard at all. In fact, there is a no code wrapper available. With a few mouse clicks, a handful of images, and a product or service to sell, one can be in business. The developer, an outfit called Wati, provides exactly what the enterprising marketer requires. None of that Tor stuff. None of the Amazon police chasing down knock off products from the world’s most prolific manufacturers. New territory, so what could go wrong?

If you are interested in using WhatsApp as an ecommerce vehicle, you can point your browser to this Google Workspace Marketplace. You will need both a Google account and a WhatsApp account. Then you can use “a simple and powerful Google Sheet add-on to launch an online store from Google Sheets and take orders on WhatsApp.” How much does this service cost? The developer asserts, “It’s free forever.” There is even a video explaining what one does to become a WhatsApp merchant.

Are there legitimate uses for this Google Sheets add-on? Sure. Will bad actors give this type of service a whirl? Sure. Will Google police the service? Sure. Will Facebook provide oversight? Sure. That’s a lot of sures. Why not be optimistic? For me, the Wati wrapper is a flashing yellow light that a challenge to law enforcement is moving from the Dark Web to apps which are equally opaque. Progress? Nope.

Stephen E Arnold, March 5, 2021

Facebook Found Lax in Enforcement of Own Privacy Rules. Surprised?

March 4, 2021

Facebook is refining its filtering AI for app data after investigators at New York’s Department of Financial Services found the company was receiving sensitive information it should not have received. The Jakarta Post reports, “Facebook Blocks Medical Data Shared by Apps.” Facebook regularly accepts app-user information and feeds it to an analysis tool that helps developers improve their apps. It never really wanted responsibility for safeguarding medical and other sensitive data, but did little to block it until now. The write-up quotes state financial services superintendent Linda Lacewell:

“Facebook instructed app developers and websites not to share medical, financial, and other sensitive personal consumer data but took no steps to police this rule. By continuing to do business with app developers that broke the rule, Facebook put itself in a position to profit from sensitive data that it was never supposed to receive in the first place.”

Facebook is now stepping up its efforts to block sensitive information from reaching its databases. We learn:

“Facebook created a list of terms blocked by its systems and has been refining artificial intelligence to more adaptively filter sensitive data not welcomed in the analytics tool, according to the report. The block list contains more than 70,000 terms, including diseases, bodily functions, medical conditions, and real-world locations such as mental health centers, the report said.”
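
A minimal sketch of what term-based filtering like the block list described above could look like. Facebook’s actual system is not public; the terms and matching logic here are invented for illustration.

```python
# Hypothetical sketch of a block-list filter for incoming analytics events.
# The real list reportedly holds 70,000+ terms; these three are examples.
import re
from typing import Optional

BLOCK_LIST = {"diabetes", "chemotherapy", "mental health center"}

def strip_sensitive(event_text: str) -> Optional[str]:
    """Return the event text unchanged, or None if it contains a blocked term."""
    lowered = event_text.lower()
    for term in BLOCK_LIST:
        if re.search(r"\b" + re.escape(term) + r"\b", lowered):
            return None  # drop the event before it reaches analytics
    return event_text

print(strip_sensitive("user viewed pricing page"))        # passes through
print(strip_sensitive("user searched for chemotherapy"))  # blocked: None
```

A static list is the blunt instrument; the “refining artificial intelligence” in the quote suggests the real system tries to catch paraphrases a fixed list would miss.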

A spokesperson says the company is also “doing more to educate advertisers on how to set-up and use our business tools.” We shall see whether these efforts will be enough to satisfy investigators next time around.

Cynthia Murrell, March 4, 2021

Facebook: The Great Victory

February 25, 2021

“Facebook Says It Will Pay News Industry $1 Billion over 3 Years” makes clear the magnitude of Facebook’s “victory” over a mere nation state. The “real” news report reveals:

Facebook announced Wednesday it plans to invest $1 billion to “support the news industry” over the next three years and admits it “erred on the side of over-enforcement” by banning news links in Australia.

The admission does nothing to diminish the greatness of Facebook and its decision to unfriend or non-like Australia. A member of the Five Eyes, Australia did not reference Facebook’s alleged “bully boy” behavior. The country’s government was delighted to modify its laws in order to accommodate the digital nation state’s wishes.

Beyond Search’s art unit created the “new” flag for the mere nation state of Australia. Here it is:

[Image: the Beyond Search “new” Australian flag]

An Australian official revealed:

The Morrison Government’s world-leading news media bargaining code has just passed the Parliament. This is a significant milestone.

Beyond Search has learned that changes to the school curricula, including replacing existing non-Facebook flags, have begun immediately.

Facebook’s diplomatic skill, its management team’s acumen, and the incredible personal warmth of Mr. Zuckerberg (affectionately known as the Zuck) appear to have forced a mere nation state to reverse course.

Australia is no longer “unfriended” by the digital power house.

Stephen E Arnold, February 25, 2021

Quote to Note: Facebook and Its True Colors

February 24, 2021

I find some “real” newspapers interesting because some of their stories have a socio-political agenda, and every once in a while a story is just made up. (Yes, even the New York Times has experienced this type of journalistic lapse.)

In the estimable New York Post, “Facebook Faces Boycott Campaign after Banning News in Australia” included an interesting statement allegedly made by British member of Parliament Julian Knight. Here’s the quote I noted:

Facebook always claimed it is a platform. It very much looks like it is now making quite substantial editorial and even political decisions. It is arrogant, particularly during a pandemic, to basically turn off the taps to a great deal of news. It is not being a good global citizen.

Facebook operates as if it were a country. Perhaps it will move beyond cyber force in Australia? Does Mr. Zuckerberg own a yacht? The boat could be outfitted with special equipment. On the other hand, Mr. Zuckerberg will find a way to make peace with a country which he obviously perceives as annoying, if not irrelevant to the proud dataspace of Facebook.

Stephen E Arnold, February 24, 2021

Facebook Demonstrates It Is More Powerful Than a Single Country

February 23, 2021

I read “Facebook to Reverse News Ban on Australian Sites, Government to Make Amendments to Media Bargaining Code.” It’s official. Google paid up. Facebook stood up and flexed its biceps. The Australian government swatted at the flies in Canberra, gurgled a Fosters, and rolled over. The write up states:

Facebook will walk back its block on Australian users sharing news on its site after the government agreed to make amendments to the proposed media bargaining laws that would force major tech giants to pay news outlets for their content.

The after party will rationalize what happened. But from rural Kentucky, it certainly seems as if Facebook is now able to operate as a nation state. Facebook can impose its will upon a government. Facebook can do what it darn well pleases, thank you very much.

The write-up has a great quote attributed to Josh Frydenberg, the Australian government treasurer:

Facebook is now going to engage good faith negotiations with the commercial players.

Are there historical parallels? Sure, how about Caesar and the river thing?

Turning point and benchmark.

Stephen E Arnold, February 23, 2021

Facebook Algorithms: Pernicious, Careless, Indifferent, or No Big Deal?

February 4, 2021

What is good for the social media platform is not necessarily good for its users. Or society. The Startup examines the “Facebook AI Algorithm: One of the Most Destructive Technologies Ever Invented.” Facebook’s AI is marketed as a way to give users more of what they want to see, and, to a point, it does. We suspect most users would like to avoid misinformation, but if it will keep eyeballs on the platform, Facebook serves up fake news alongside (or instead of) reputable content. Its algorithms are designed to serve its interests, not ours. Considering Facebook has become the primary source of news in the U.S., this feature (not a bug) is now a real problem for society. Writer David Meerman Scott observes:

“The Facebook Artificial Intelligence-powered algorithm is designed to suck users into the content that interests them the most. The technology is tuned to serve up more and more of what you click on, be that yoga, camping, Manchester United, or K-pop. That sounds great, right? However, the Facebook algorithm also leads tens of millions of its 2.7 billion global users into an abyss of misinformation, a quagmire of lies, and a quicksand of conspiracy theories.”

As we have seen, such conspiracy theories can lead to dire real-world consequences. All because Facebook (and other social media platforms) lead users down personalized rabbit holes for increased ad revenue. Sites respond to criticism by banning some content, but the efforts are proving to be inadequate. Scott suggests the only real solution is to adjust the algorithms themselves to avoid displaying misinformation in the first place. Since this will mean losing money, though, Facebook is unlikely to do so without being forced to by regulators, advertisers, or its employees.

The Next Web looks at how these algorithms work in, “Here’s How AI Determines What You See on the Facebook News Feed.” Reporter Thomas Macaulay writes:

“The ranking system first collects candidate posts for each user, including those shared by their friends, Groups, or Pages since their last login. It then gives each post a score based on a variety of factors, such as who shared the content and how it matches with what the user generally interacts with. Next, a lightweight model narrows the pool of candidates down to a shortlist. This allows more powerful neural networks to give each remaining post a score that determines the order in which they’re placed. Finally, the system adds contextual features like diversity rules to ensure that the News Feed has a variety of content. The entire process is complete in the time it takes to open the Facebook app.”
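
The stages in that quote can be sketched in miniature. The scoring functions and the diversity rule below are invented stand-ins; Facebook’s real models and features are not public.

```python
# Toy version of the multi-stage feed-ranking pipeline described above:
# candidate scoring, a lightweight shortlist pass, a heavier re-rank,
# then a contextual diversity rule.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    kind: str         # e.g. "photo", "link", "video"
    raw_score: float  # stand-in for engagement-prediction features

def cheap_score(post):
    return post.raw_score        # lightweight candidate-narrowing model

def heavy_score(post):
    return post.raw_score * 1.1  # heavier neural ranker (same toy signal)

def rank_feed(candidates, shortlist_size=3):
    # Stage 1: lightweight model narrows the pool to a shortlist.
    shortlist = sorted(candidates, key=cheap_score, reverse=True)[:shortlist_size]
    # Stage 2: heavier model orders the shortlist.
    ranked = sorted(shortlist, key=heavy_score, reverse=True)
    # Stage 3: diversity rule, e.g. avoid two posts of the same kind in a row.
    feed, leftovers = [], list(ranked)
    while leftovers:
        prev_kind = feed[-1].kind if feed else None
        pick = next((p for p in leftovers if p.kind != prev_kind), leftovers[0])
        leftovers.remove(pick)
        feed.append(pick)
    return feed

posts = [Post("ann", "photo", 0.9), Post("bob", "photo", 0.8),
         Post("cat", "link", 0.7), Post("dan", "video", 0.2)]
print([p.author for p in rank_feed(posts)])  # ['ann', 'cat', 'bob']
```

Note how the diversity pass demotes the second photo even though it scored higher: the same mechanism, tuned differently, decides what two billion people see first.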

Given recent events, it is crucial that Facebook and other platforms modify their AI as soon as possible. What will it take?

Cynthia Murrell, February 4, 2021
