Algorithm Tuning: Zeros and Ones Plus Human Judgment

October 23, 2020

This is the Korg OT-120 Orchestral Tuner. You can buy it on Amazon for $53. It is a chromatic tuner with an eight-octave detection range that supports band and orchestra instruments. Physics tunes pianos, organs, and other instruments. Science!

[image: the Korg OT-120 Orchestral Tuner]

This is the traditional piano tuner’s kit.

[image: a traditional piano tuner's kit]

You will need ears, judgment, and patience. Richard Feynman once wrote a letter to a piano tuner. The interesting point in Dr. Feynman’s note was that the non-zero stiffness of piano strings affects tuning: the overtones of a real string run slightly sharp of exact integer multiples. The implication? A piano tuner may have to factor in how the human ear hears those harmonics.

The Korg does hertz; the piano tuner does squishy human, wetware, and subjective things.
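To make the divergence concrete, here is the standard stretched-tuning arithmetic as a minimal sketch in Python. The inharmonicity coefficient B below is illustrative, not a measured value; real tuners effectively measure it string by string.

import math

def partial_frequency(f0: float, n: int, B: float) -> float:
    """Frequency of the nth partial of a stiff string with
    fundamental f0 and inharmonicity coefficient B."""
    return n * f0 * math.sqrt(1 + B * n * n)

f0 = 261.63  # middle C, in hertz
B = 0.0004   # illustrative inharmonicity coefficient (assumption)

print(partial_frequency(f0, 2, B))  # ~523.68 Hz: the string's actual second partial
print(2 * f0)                       # 523.26 Hz: the "ideal" octave the gadget expects

The Korg targets the 523.26; the ear wants the octave matched to the slightly sharp 523.68. That gap is where the wetware earns its fee.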

I thought about the boundary between algorithms and judgment in terms of piano tuning as I read “Facebook Manipulated the News You See to Appease Republicans, Insiders Say”, published by Mother Jones, an information service not happy with the notes generated by the Facebook really big organ. The main idea is that human judgment adjusted zeros, ones, and numerical recipes to obtain desirable results.

The write up reports:

In late 2017, Zuckerberg told his engineers and data scientists to design algorithmic “ranking changes” that would dial down the temperature.

Piano tuners fool around to deliver the “sound” judged “right” for the venue, the score, and the musician. Facebook seems to be grabbing the old-fashioned tuner’s kit, not the nifty zeros and ones gizmos.
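Facebook’s ranking code is not public, so any rendering is a guess, but the “dial” metaphor maps onto a familiar pattern: a scoring function with a human-set weight. A hypothetical sketch follows; every name, signal, and number in it is invented for illustration.

# Hypothetical feed-ranking sketch. This is not Facebook's code; it only
# shows how one human-chosen knob re-tunes an algorithm's output without
# changing the zeros and ones underneath.

def rank_score(engagement: float, publisher_trust: float,
               outrage_signal: float, temperature_dial: float) -> float:
    # temperature_dial is the human judgment: 1.0 leaves the machine
    # alone; values below 1.0 "dial down the temperature" by discounting
    # high-outrage content.
    return engagement + publisher_trust - (1.0 - temperature_dial) * outrage_signal

# The same post under two different human settings:
print(rank_score(8.0, 2.0, 6.0, temperature_dial=1.0))  # 10.0
print(rank_score(8.0, 2.0, 6.0, temperature_dial=0.5))  # 7.0

Change one constant and the same post rises or sinks. The tuner’s hammer, in other words, not the tuning fork.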

The article adds:

The code was tweaked, and executives were given a new presentation showing less impact on these conservative sites and more harm to progressive-leaning publishers.

What happened?

We learn:

for more than two years, the news diets of Facebook audiences have been spiked with hyper conservative content—content that would have reached far fewer people had the company not deliberately tweaked the dials to keep it coming, even as it throttled independent journalism. For the former employee, the episode was emblematic of the false equivalencies and anti-democratic impulses that have characterized Facebook’s actions in the age of Trump, and it became “one of the many reasons I left Facebook.”

The specific impact on Mother Jones was, according to the article:

Average traffic from Facebook to our content decreased 37 percent between the six months prior to the change and the six months after.

Human judgment about tool use reveals that information flows once sorted slowly by numerous gatekeepers can now be adjusted far more efficiently. The ones and zeros, however, resolve to what a human decides. With a big information lever like Facebook, the effort required for a change may be slight, but the impact significant. The problem is not ones and zeros; the problem is human judgment, intent, and understanding of context. Get it wrong and people’s teeth are set on edge. Unpleasant. Some maestros throw tantrums and seek another tuner.

Stephen E Arnold, October 23, 2020

Amusing, That Facebook: Born to Curate

October 22, 2020

Facebook loves it when users share news, photos, and opinions, unless they speak ill of the social media platform. Vice explains how Facebook limits free speech in: “Facebook Just Forced Its Most Powerful Critics Offline.”

Facebook does not like the Real Facebook Oversight Board, a group founded in September 2020 after the social media company failed to get its own oversight board running in time for the US presidential election. Displeased, Facebook used its legal clout to force the group offline: it wrote to the group’s ISP demanding the Web site be removed, and it succeeded.

What is the Real Facebook Oversight Board?

“The group is made up of dozens of prominent academics, activists, lawyers, and journalists whose goal is to hold Facebook accountable in the run-up to the election next month. Facebook’s own Oversight Board, which was announced 13 months ago, will not meet for the first time until later this month, and won’t consider any issues related to the election.”

Facebook complained the Real Facebook Oversight Board was involved in phishing scams. Usually when a request to remove a Web site reaches an ISP, there is a dispute resolution process that takes months, and ultimately a court order must be obtained to terminate the site. Facebook has had another Web site owned by the Real Facebook Oversight Board removed in the past.

Facebook denied responsibility stating the Real Facebook Oversight Board’s Web site was taken offline because it contained the word “facebook” and violated copyright. Email documentation from Facebook proves otherwise. The company is shaping reality in order to protect its public image and troll its critics. Is Facebook’s editorial process veering away from bright, white lines?

Whitney Grace, October 22, 2020

Journalists Do More Than Report: The Covid Determination

October 17, 2020

One of the DarkCyber research team alerted me to “Facebook Greatest Source of Covid-19 Disinformation, Journalists Say.” That’s the factoid, according to the “real” journalists at a British newspaper.

The main point of the write up may be an interesting way to send this message, “Hey, we are not to blame for erroneous Rona info.” I hear the message.

The write up states:

The majority of journalists covering the pandemic say Facebook is the biggest spreader of disinformation, outstripping elected officials who are also a top source, according to an international survey of journalism and Covid-19.

The survey prompted another Guardian article in August 2020.

Let’s assume Facebook and the other social media high pressure data hoses are responsible for bad, weaponized, or just incorrect Rona info. Furthermore, let’s accept these assertions:

Journalism is one of the worst affected industries during the pandemic as hundreds of jobs have been lost and outlets closed in Australia alone. Ninety per cent of journalists surveyed said their media company had implemented austerity measures including job losses, salary cuts and outlet closures.

The impression the write up creates in the malleable Play-doh of my mind is that journalists are no longer reporting the news. “Real” journalists are making the news, and it is about time!

The sample probably reflects the respondents’ reactions to the questions on the survey, which remain unknown to me. The survey itself may have been structured as a dark pattern. What better way to explain that bad things are happening to “real” journalists?

What’s interesting is that “real” journalists know that Facebook and other social media systems are bad.

One question: How long has it taken “real” journalists to figure out the harsh realities of digital streams of users unfettered by internal or external constraints?

Maybe the news is “It is too late.” Maybe the working hypothesis is “better late than never”?

Stephen E Arnold, October 17, 2020

Facebook: Interesting Data If Accurate

October 16, 2020

DarkCyber spotted a factoid of interest to law enforcement professionals in “Facebook Responsible for 94% of 69 Million Child Sex Abuse Images Reported by US Tech Firms.”

The article notes:

Facebook has previously announced plans to fully encrypt communications in its Messenger app, as well as its Instagram Direct service – on top of WhatsApp, which is already encrypted – meaning no one apart from the sender and recipient can read or modify messages.

Now, what about Facebook’s content curation procedures? End-to-end encryption of ad-supported private messaging services appears to benefit bad actors.
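For reference, the property “no one apart from the sender and recipient can read” is what makes platform-side curation of such messages hard. A minimal sketch using the PyNaCl library; the key names and message are illustrative, and WhatsApp actually uses the Signal protocol, not raw NaCl boxes.

# Minimal end-to-end encryption sketch with PyNaCl (pip install pynacl).
# Illustrative only: the relay in the middle sees ciphertext, so it cannot
# read, modify, or curate the message.

from nacl.public import PrivateKey, Box

sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts to the recipient's public key.
sending_box = Box(sender_key, recipient_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# The platform relays ciphertext; only the recipient's private key opens it.
receiving_box = Box(recipient_key, sender_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'meet at noon'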

Stephen E Arnold, October 16, 2020

Avaaz Facebook Report: Another Road Map for Bad Actors?

October 14, 2020

DarkCyber is intrigued by research reports which try to alert the public to an issue. Often the reports provide a road map for bad actors who are looking for a new method or angle to practice their dark arts. “Facebook’s Algorithm: A Major Threat to Public Health” may be a recent example of doing right going wrong.

Avaaz is, according to the organization’s Web site:

a global web movement to bring people-powered politics to decision-making everywhere.

A 33-page study, published in late August 2020, is available without charge at this link. The publication covers health misinformation through the lens of Facebook’s apparently flawed content curation mechanisms.

For bad actors (real or would-be), the document explains:

  • The relationship between Web pages with “disinformation” and Facebook sites
  • The amplification function of content generators
  • Utility of a message output in multiple languages
  • The role of backlinks
  • A list of “gaps” in Facebook’s content curation method.

Interesting report and one which may help some individuals operate more effectively. Facebook’s content curation has some flaws. The company flagged a photograph of onions as salacious. No, really.

Stephen E Arnold, October 14, 2020

Facebook: Merging Apps Before the Call to Break It Up

October 7, 2020

DarkCyber noted “Facebook Is Merging Messenger into Instagram.” The write up explains:

…Facebook is starting to unify its messaging platforms across apps. They will start including more of Messenger’s features into Instagram’s direct messaging chat platform. It will also add the ability to send messages across the two apps.

DarkCyber believes that unified messaging may have some downstream consequences. On one hand, certain government requests for data may yield more useful results if Facebook provides the requested information from a single, unified system. On the other hand, breaking up the company could become more difficult.

Stephen E Arnold, October 7, 2020

Facebook Is Nothing If Not Charming

October 5, 2020

Facebook spies on its users by collecting their personal information: hobbies, birthdays, relationships, and vacation spots. Facebook users voluntarily share this information publicly and/or privately, and the company sells that information to advertisers. Facebook also spies on its competitors, but it does so in a more sophisticated way, says the BBC article “Facebook Security App Used To ‘Spy’ On Competitors.”

Facebook apparently used its Onavo VPN app to collect information on its competitors knowingly and, it seems, in violation of privacy laws. The Commons committee discussed the incident in a report that runs more than one hundred pages. Here is the gist of the report:

“The Digital, Culture, Media and Sport Committee wrote that through the use of Onavo, which was billed as a way to give users an extra layer of security, Facebook could ‘collect app usage data from its customers to assess not only how many people had downloaded apps, but how often they used them’.”

The report added:

“This knowledge helped them to decide which companies were performing well and therefore gave them invaluable data on possible competitors. They could then acquire those companies, or shut down those they judged to be a threat.”

Even more alarming are the details about the ways Facebook could shut down services it provided to its competitors. Twitter’s video-sharing app Vine is an example of how Facebook destroyed a competitor. Twitter wanted Vine users to find friends via their Facebook accounts, but Zuckerberg nixed that idea. Vine shut down in 2016.

Facebook does something equally nefarious with a whitelist of approved apps that are allowed to use Facebook user data. Among the 5,000 approved apps are Netflix, Airbnb, and Lyft. These app companies supposedly spend $250,000 on Facebook advertising to keep their coveted positions.

Zuckerberg wrote in an email:

“I think we leak info to developers, but I just can’t think of any instances where that data has leaked from developer to developer and caused a real issue for us.”

Then there was the Cambridge Analytica scandal, in which voter information was collected through a personality quiz. The data of users and their friends were harvested to profile 82 million Americans, and that information was sold to Cambridge Analytica. The United Kingdom fined Facebook £500,000, and the company apologized.

It was not the first time Facebook leaked or sold user information, and it will probably not be the last. We wonder how Facebook’s competitors spy on users and sell their data.

Whitney Grace, October 5, 2020

Facebook WhatsApp: Disappearing Media. Really? Gone for Good?

September 28, 2020

Facebook is endlessly entertaining. On one tentacle, the octopus company seeks to lessen the competitive threats from next generation social media like TikTok-type videos. On another tentacle, Facebook suggests that those in the European Union can do without Facebook. DarkCyber thinks of this as the “take my ball and go home” tactic. Ten-year-olds with minimal physical coordination but a stock of high-end athletic equipment have been known to trot out this argumentative chestnut. Another tentacle semi-cooperates with government officials. Yet another tentacle balances on a knife edge in order to keep employees happy with the wonderful social environment within the helpful company’s digital walled garden. There are other tentacles too, but I want to focus your attention on “WhatsApp Expiring Media Feature Details Tipped via New Beta Version.” Keep in mind that “beta” does not mean something a thumbtyper will be able to use.

The write up explains:

WhatsApp 2.20.201.6 beta for Android has been released with further references for a new feature called “Expiring Media.” The feature, as its name suggests, would make media files such as images, videos, and GIFs sent to the recipient’s phone over WhatsApp disappear once they are viewed.

Interesting. Just one question:

If media are disappeared for users, are those data deleted from the Facebook servers?

One hopes not; otherwise, some investigations will be slowed or halted.
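The question turns on where the deletion happens. A toy sketch of the distinction follows; the class and method names are invented for illustration and reflect nothing about WhatsApp’s actual implementation.

# Hypothetical sketch: "expiring" media can mean the client's copy is
# removed while a server-side copy persists. All names are invented.

class MessageStore:
    def __init__(self):
        self.client_view = {}   # what the user sees on the phone
        self.server_copy = {}   # what sits in the data center

    def send(self, msg_id: str, media: bytes):
        self.client_view[msg_id] = media
        self.server_copy[msg_id] = media

    def expire_on_view(self, msg_id: str):
        # Client-side expiry: the user's copy vanishes...
        self.client_view.pop(msg_id, None)
        # ...but nothing here touches self.server_copy.

store = MessageStore()
store.send("m1", b"<photo bytes>")
store.expire_on_view("m1")
print("m1" in store.client_view)  # False: gone for the user
print("m1" in store.server_copy)  # True: still on the server

Whether Facebook’s servers behave like that second dictionary is exactly what the write up does not say.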

Stephen E Arnold, September 28, 2020

Facebook: Fine Thinking

September 26, 2020

I read “Former Facebook Manager: We Took a Page from Big Tobacco’s Playbook.” The main idea is that a former Facebook professional revealed how the gears meshed within the Facebook distributed intelligence machine. For me, the allegedly truthful revelations add some color to my understanding of what I call high school science club thinking.

The write up quotes the actual testimony of Facebook’s former director of monetization (good title, that), a certain Tim Kendall, as saying:

“We sought to mine as much attention as humanly possible… We took a page from Big Tobacco’s playbook, working to make our offering addictive at the outset.”

What’s interesting is the way in which Ars Technica approached the story. The article lets Mr. Kendall’s own words and some facts about Facebook’s fine manager-employee relations beef up the write up.

Facebook continues to capture the attention of savvy US elected officials. The social media company opened for business in 2004; that works out to more than 15 years ago. Now, after controversies with alleged “co-founders”, the pompous Etonian, and interactions with clear-minded European Union officials, Facebook is getting scrutinized by the US government.

What if Mr. Kendall is making Facebook look different, like a reflection in a fun house mirror? What if Facebook is a happy, happy place? What if Facebook has contributed to social comity?

What if Facebook is the best thing one can say about 2020?

Stephen E Arnold, September 26, 2020

Facebook and Digital Partitioning

September 18, 2020

I am no expert on managing the Gen X, Gen Y, and millennial wizards creating must-have services for thumbtypers. The services, like the young wizards, puzzle me. I don’t worry about it, but Facebook’s Mark Zuckerberg worries, and he tries to remediate what seems to be a management Sudoku.

“Facebook Issues New Rules on Internal Employee Communication” explains new principles “to guide debates and conversations” within Workplace, Facebook’s social network for employees. The article points out that Google moderates its internal message boards.

I live in rural Kentucky, but it seems to me that “principles” and humans who are digital content guards are an interesting development. The approach is even more interesting because Facebook has expressed a keen desire to facilitate social interactions.

I noted this passage in the CNBC write up:

The company will also be more specific about which parts of Workplace can be used to discuss social and political issues. This change will be so that employees do not have to confront social issues during their day-to-day work. Facebook’s new principles also ask that employees communicate with professionalism and continue to debate about the company’s work but do so in a respectful manner.

How does partitioning work in day-to-day communication? In computer speak, a partition is a chunk of a storage device; that data space is a separate logical volume. In a house, a partition divides one space into smaller spaces; for example, a big house in San Jose may have a “safe room.” The idea is that a person can enter the separate area and prevent an intruder from harming the individual. In the case of the storage device, a person or software system operates as the decision maker: the partition is created, and the “user” gains access to the storage under certain conditions, but the user does not decide. The user just gets rights and lives with those rights.
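The asymmetry is easy to state in code. A toy sketch with invented names: the administrator creates partitions and assigns rights, and the user object exposes no method for granting itself anything.

# Toy sketch (all names invented): in storage-style partitioning, only the
# administrator creates partitions and hands out rights; the user can only
# live with the rights already granted.

class Partition:
    def __init__(self, name: str):
        self.name = name
        self.readers = set()

class Administrator:
    def create_partition(self, name: str) -> Partition:
        return Partition(name)            # god mode: the admin decides the layout

    def grant_read(self, partition: Partition, user_name: str):
        partition.readers.add(user_name)  # god mode: the admin decides the rights

class User:
    def __init__(self, name: str):
        self.name = name

    def read(self, partition: Partition) -> bool:
        # No grant_myself_access method exists on this class.
        return self.name in partition.readers

admin = Administrator()
social = admin.create_partition("social-issues")
admin.grant_read(social, "alice")
print(User("alice").read(social))  # True
print(User("bob").read(social))    # False, and Bob cannot change that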

The safe room is a different kettle of intentions. The safe room is entered by an individual who feels threatened or who wants to escape a Zoom call. The user locks the door and prevents others from getting into the safe room.

What’s the Facebook partition? Who decides? These will be interesting questions to answer as Facebook pushes forward with what I call “imposed adulting.” The partitioning of Workplace is an interesting step by a company which has been less than proactive in making certain types of decisions about social behavior within the Facebook datasphere.

A related question is, “How does partitioning work out in a social setting?” I napped through lectures about historical partitioning efforts. I vaguely recall one of my history professors (Dr. Philip Crane) expounding about the partitioning of Berlin after the second world war. My recollection is very fuzzy, but the impression I can dredge up from the murky depths of my memory is that it was not party time and pink balloons.

Net net: Partitioning a storage device is a god mode function. Partitioning in a social space is a less tidy logical operation. And when the computer partitioning meets the social partition? Excitement for sure.

Stephen E Arnold, September 18, 2020
