The Facebook Management Play: Not Much to Change
November 22, 2018
I read two articles this morning. I came away with the thought that Facebook is not eager to change.
The first article is “As Problems Pile Up, Mark Zuckerberg Stands His Ground in Exclusive CNN Business Interview.” The main idea appears to be:
Zuckerberg resisted growing calls for changes to Facebook’s C-suite, reiterated Facebook’s potential as a force for good, and pushed back at some of the unrelenting critical coverage of his company after a year of negative headlines about fake news, election meddling and privacy concerns.
The second article is “The Punctured Myth of Sheryl Sandberg.” Yep, the lean in thinker and doer. The main idea struck me as:
Sandberg played a central role in nearly every misdeed at Facebook that’s described in the Times piece. Singularly focused on the company’s stock price and its advertising-based business model, she worked to minimize data abuse and election interference.
So what?
Three observations:
- Facebook is not likely to change without some outside encouragement
- Ethical behavior appears to be a dynamic concept. Expedient behavior may be a suitable synonym.
- A company founded on getting info about potential dates has morphed into an organization capable of taking down carefully constructed social assemblies.
Change may be difficult. Habit, momentum, and money can be barriers. We may have a digital turkey to monitor.
Stephen E Arnold, November 22, 2018
Facial Recognition and Image Recognition: Nervous Yet?
November 18, 2018
I read “A New Arms Race: How the U.S. Military Is Spending Millions to Fight Fake Images.” The write up contained an interesting observation from an academic wizard:
“The nightmare situation is a video of Trump saying I’ve launched nuclear weapons against North Korea and before anybody figures out that it’s fake, we’re off to the races with a global nuclear meltdown.” — Hany Farid, a computer science professor at Dartmouth College
Nothing like a shocking statement to generate fear.
But there is a more interesting image recognition observation. “Facebook Patent Uses Your Family Photos For Targeted Advertising” reports that the social media sparkler has an invention that will
attempt to identify the people within your photo to try and guess how many people are in your family, and what your relationships are with them. So for example if it detects that you are a parent in a household with young children, then it might display ads that are more suited for such family units. [US20180332140]
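To make the patent’s idea concrete, here is a minimal, hypothetical sketch of that kind of inference. The data model, age threshold, and ad categories below are my own assumptions for illustration; nothing here comes from the patent claims or from any actual Facebook code.

```python
# Hypothetical sketch of household inference from detected faces.
# All thresholds, labels, and ad categories are invented for illustration.
from dataclasses import dataclass
from typing import List


@dataclass
class DetectedFace:
    estimated_age: int  # age guessed by an image-recognition model (assumed)


def guess_household_profile(faces: List[DetectedFace]) -> str:
    """Guess a coarse household type from faces detected in uploaded photos."""
    adults = [f for f in faces if f.estimated_age >= 18]
    children = [f for f in faces if f.estimated_age < 18]
    if adults and children:
        return "family_with_young_children"
    if len(adults) >= 2:
        return "multi_adult_household"
    return "single_adult"


def pick_ad_category(profile: str) -> str:
    """Map the guessed household type to an ad category (names invented)."""
    return {
        "family_with_young_children": "toys_and_minivans",
        "multi_adult_household": "streaming_bundles",
        "single_adult": "travel_deals",
    }[profile]


if __name__ == "__main__":
    photo = [DetectedFace(38), DetectedFace(36), DetectedFace(6)]
    profile = guess_household_profile(photo)
    print(profile, "->", pick_ad_category(profile))
```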
While considering the implications of pinpointing family members and linking the deduced and explicit data, consider that one’s fingerprint can be duplicated. The dupe allows a touch ID to be spoofed. You can get the details in “AI Used To Create Synthetic Fingerprints, Fools Biometric Scanners.”
For a law enforcement and intelligence angle on image recognition, watch for DarkCyber on November 27, 2018. The video will be available on the Beyond Search blog splash page at this link.
Stephen E Arnold, November 18, 2018
High School Science Club Management: The Facebook Method
November 16, 2018
I am not much of a Facebooker. We use a script to pump out the titles of the items we post in the Beyond Search blog. I try to ignore Facebook, but – I must admit – that has been tough the last few days. The New York Times finally jabbed its remaining investigative skills into the juicy, fat cables of Facebookland. My takeaway from the long newspaper story which has many atwitter is that HSSCM is alive and well. HSSCM means to me “high school science club management.”
What sparks me to write this fine morning in rural Kentucky is an essay by the chief lean inner at this link. To read this essay, I have been informed I have to log in. I did not. I assume I saw the full Monty, but who knows? In practice it doesn’t matter because the drift of the write up is:
What? Who knew?
Yeah, sounds about right. Who put “Great Balls of Fire” on the Woodruff High School PA system at 7:45 am in 1958? Those of us in the WHS Science Club said:
What? Who knew?
Here in frosty Harrod’s Creek, the stories from Facebookland reveal the basic workings of HSSCM: Say what’s necessary to make the annoying Mr. McDonald (our WHS principal) go away.
We were the Science Club. We are the future. We knew better.
Sophomoric explanations work fine when one is 15. Transported to a publicly traded company, they make me weary.
Time for a change. Lean into that.
Stephen E Arnold, November 16, 2018
Facebook: Darned Busy
November 12, 2018
I read “Zuckerberg Rebuffs Request to Appear before UK Parliament.” Now this is an AP story, and I don’t want to get into a tussle with the organization’s legal eagles by quoting an AP content gem. Therefore, you will have to navigate to the original (AP is into original) story and read the words yourself. For me, I noted the alleged factoid that Facebook’s poobah is not going to chat about fake news with European officials. Nevertheless, Facebook appears to be interested in expanding its operations in Ireland and figuring out how to make money without creating situations which require European officials to demonstrate their appetite for information about what looks and acts like a country. Perhaps Facebook should summon European officials to the sparkling streets of San Francisco or do lunch in New York? Like that?
Stephen E Arnold, November 9, 2018
Facebook Sidesteps Balloons and Goes for Satellites
October 23, 2018
How can Facebook elevate its trust factor? Forget gliders. Facebook is now into satellites.
The Verge shares how Facebook wants to bring the Internet to the world in “Facebook Is Developing An Internet Satellite After Shutting Down Drone Project.”
According to an application filed with the FCC, Facebook has registered a company called PointView Tech LLC and is working on a satellite called Athena. Athena will ideally be used to provide Internet broadband services to unserved and underserved areas around the world. Athena is the Greek goddess of wisdom. It is a fitting name, because the satellite Athena will spread wisdom and information.
Facebook had planned to use drones to bring Internet access through the Aquila project, but that effort was shut down. The Aquila team will instead focus on designing onboard software systems to guide the satellite.
“Now, it sounds like Facebook will continue to try and develop its own hardware, just a different variety this time. According to a September 2017 report on broadband development, more than half of Earth is still not online, and that the only way to do so would be to use low Earth orbit satellites that sit in space about 100 to 1,250 miles above the surface. There’s already a booming industry around satellite internet, with key players like SpaceX investing heavily in the space to become the new internet service providers of an untapped market. SpaceX launched its first satellites back in February.”
The article also points out that Facebook has ulterior motives in bringing the Internet to underserved and unserved areas. Facebook heads to space.
Whitney Grace, October 23, 2018
The War Room Fallacy: Facebook Embraces Its Confidence in Itself
October 19, 2018
I read “Facebook Opens A War Room To Fight Election Interference and Bad PR.” The idea is that a team can solve the problem of humans manipulating Facebook to change opinions, alter elections, or cause some physical or mental reaction.
In some companies, the “war room” lingo is replaced with “clear the decks” or “a SWAT team”. The idea is that a group of insiders can solve a problem. The assumption is, of course, that the insiders are able to resolve a crisis. In many cases, the crisis has been created by those insiders.
In my work career, I have found myself involved in various teams assembled to deal with a problem. One of my former colleagues, a former TV news anchor, believed that he could solve any problem—yes, any problem—by forming a team to swarm, analyze, and resolve the issue.
I found this belief a little crazy. Consulting firms routinely sell this process to clients. After all, who would pay seven figures for a group of MBAs and “really smart people” to fix a tough problem unless the insiders were desperate? When an insider task force calls for help, the problem is a big one, and only the confidence of the consulting firm can save the day.
So whether the strike force is composed of insiders, just outside experts given the power to solve the problem, or some hybrid group, the assumption is the same: “We can do it.”
In some cases, the special team can solve a problem, particularly if it is narrow and the team has the expertise to deal with the issue. It is unlikely that a group of MBAs could deal with the nuclear waste generated by the Fukushima disaster, for example. I would assume that the power company has legions of strike forces at work. How are they doing? Well, check it out from a location well away from the radioactive facility.
The write up explains the Facebook approach:
20 employees — software engineers, threat intelligence and security, data scientists, researchers, lawyers and policy experts — keep their eyes glued to smaller screens. All are coiled tightly and waiting to spring into action when something untoward is spotted on the network.
Yeah, that sounds workable. Smart software cannot reliably identify and act upon weaponized information. Now 20 humans will be able to spot weaponized information and take action. Facebook has tens of millions of users posting content, and 20 people will be able to deal with the content flow? Yeah, that sounds like something a highly confident, somewhat unrealistic individual would cook up.
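To put the scale problem in numbers, here is a back-of-the-envelope sketch. Every figure except the 20-person head count is an assumption supplied for illustration; Facebook has not published the volumes its war room is expected to review.

```python
# Back-of-the-envelope arithmetic for the "20 people in a war room" claim.
# All figures below except the reviewer count are assumptions for illustration.

daily_posts = 500_000_000    # assumed posts per day across the platform
flagged_share = 0.001        # assume 0.1% of posts get flagged for review
reviewers = 20               # the war room head count cited in the article
shift_hours = 8              # assume an eight-hour monitoring shift

flagged_per_day = daily_posts * flagged_share
per_reviewer_per_hour = flagged_per_day / reviewers / shift_hours

print(f"{flagged_per_day:,.0f} flagged posts/day")
print(f"{per_reviewer_per_hour:,.0f} posts per reviewer per hour")
# ~500,000 flagged posts/day -> roughly 3,125 posts per reviewer per hour,
# i.e. about one post every 1.2 seconds. That is the scale problem in a nutshell.
```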
The article points out:
Facebook’s war room is the nerve center where the company will wage a potentially never-ending battle against disinformation and election interference.
Yeah. Disinformation. No problem.
I highlighted this statement:
The war room is a step in the right direction. It’s infinitely more important that Facebook is making strides in the battle against election interference… but stepping up its efforts — and giving the world a behind-the-scenes look — is a much-needed PR win.
Was it Mao who observed, “The longest journey begins with a single step”?
Yeah, as long as it is in the right direction. I noted the word “infinite.” That’s the scope of the Facebook problem and its strike force in the war room.
Infinite. No problem. 20 people.
Stephen E Arnold, October 19, 2018
Facebook Follies: Consistency, Completeness, and Credibility
October 17, 2018
Years ago, we set up Beyond Search so that posts were distributed to Facebook. The Beyond Search goose assumes that Facebook tracks what it can from our office in rural Kentucky. But Facebook is clogging our Overflight system with factoids and “real” news about a proud company anchored in a Harvard dorm.
For example, I learned today that Facebook said that it would not collect data via its Portal video calling service. Recode, a podcast company, reported that Facebook will collect data from this service and use it to target ads. One day, no use of data; a few days later, use of data. A misunderstanding or an alternative definition of consistency? The Beyond Search goose is deeply skeptical about the information flowing from Facebook. But the humans on the team love Facebook and can easily see that yes and no are exactly the same.
We also noted a report in the estimable Wall Street Journal. Apparently some advertisers misunderstood the completeness of Facebook’s reports about the number of people who watched the social media giant’s video ads. Some advertisers doubt that Facebook revealed necessary information about the efficacy of the system. With errors, the accuracy and completeness of the Facebook data are questioned. Log files can be baffling, and their data can be misinterpreted. Skeptics might suggest that click data are suggestive, not definitive. When it comes to delivering data about online traffic, complete is complete. Unless it is not. “Facebook Lured Advertisers by Inflating Ad Watch Times Up to 900 Percent: Lawsuit” asserts:
A group of small advertisers suing the Menlo Park social media titan alleged in the filing that Facebook “induced” advertisers to buy video ads on its platform because advertisers believed Facebook users were watching video ads for longer than they actually were. That “unethical, unscrupulous” behavior by Facebook constituted fraud because it was “likely to deceive” advertisers, the filing alleged.
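The inflation reportedly traced back to an averaging choice: dividing total watch time by only the views lasting three seconds or more, rather than by all views. Here is a minimal sketch of that arithmetic with invented numbers; it illustrates the effect, not Facebook’s actual metric code.

```python
# Sketch of the averaging error as it was reported: dividing total watch time
# by only the views that lasted three seconds or more, instead of by all views,
# pushes the "average duration viewed" figure upward.
# The sample view durations below are invented for illustration.

view_durations_sec = [1, 1, 2, 2, 2, 30, 45, 60]   # hypothetical views of one ad

total_watch_time = sum(view_durations_sec)
all_views = len(view_durations_sec)
long_views = [d for d in view_durations_sec if d >= 3]

honest_average = total_watch_time / all_views           # divide by every view
inflated_average = total_watch_time / len(long_views)   # divide by 3s+ views only

print(f"honest average:   {honest_average:.1f} seconds")    # ~17.9 seconds
print(f"inflated average: {inflated_average:.1f} seconds")  # ~47.7 seconds
print(f"inflation: {inflated_average / honest_average:.1f}x")
```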
Finally, Axios reported that Facebook is delivering traffic from mobile phones to its publishing “partners.” That makes sense because online access is on its way to being the only way some people will get information, communicate with fellow humans, and output tracking data. Good news. But the Axios report suggests that “Facebook traffic to publishers is down.” Some traffic up, some down. Due to the credibility which clings to Facebook data like lint to black socks on a winter’s morn, it seems as if Facebook is chugging along. Chug, chug goes the credibility engine.
Net net: Facebook manifests itself as an outfit which behaves in a consistent manner, outputs complete information when asked, and maintains a posture which evokes credibility.
The Beyond Search goose believes this.
Stephen E Arnold, October 17, 2018
Facebook: A Rhetorical Punching Bag for Real Journalists
October 17, 2018
I got a kick out of “Facebook’s ‘Spam Purge’ Is Silencing Genuine Debate, Political Page Creators Say.” Years ago I had a teacher named George Harris, I believe. His favorite ploy was to craft “Have you stopped beating your wife?” questions. Nifty game. My response to him was to shift the assumption up a level and direct the question to his inner psychological processes; for example, “That’s interesting. Why do you ask?” He did not like my refusal to play his game. I think he longed for a car battery, alligator clips, a bucket of water, and some rope. Fascinating idea, but he was a teacher and the methods of interrogators were beyond his reach.
The Guardian story reminded me of good old George. The psychological motivation is not difficult to discern. Facebook is an online information system which makes money by selling ads. Unlike the good old world of “real” journalism, Facebook apologizes and continues on its merry way.
The Facebook money machine had humble beginnings in a dorm. The idea was to get information about individuals who might—a conditional idea—want to meet up in the student union and actually talk. From this noble idea has emerged a company which makes some ad-starved newspapers green with envy.
The Guardian’s response is to point out that Facebook does not do a good job of balancing information for its users. Of course, when Facebook makes a decision, that decision is going to annoy some of the two billion Facebookers. Even better is that if Facebook does nothing, the company has abrogated its moral responsibility.
News flash: This is a company invented in a dorm and has not outgrown its original DNA.
I learned in the write up:
As a private entity, Facebook can enforce its terms however it sees fit, says the ACLU attorney Vera Eidelman. But this can have serious free speech consequences, especially if the social network is selectively enforcing its terms based on the content of the pages. “Drawing the line between ‘real’ and ‘inauthentic’ views is a difficult enterprise that could put everything from important political parody to genuine but outlandish views on the chopping block,” says Eidelman. “It could also chill individuals who only feel safe speaking out anonymously or pseudonymously.”
I can hear the snorts of laughter in my mind’s reconstruction of several real British newspaper professionals talking about the spike on which Facebook finds itself impaled.
Jolly good I say.
The write up invokes pathos, annoyance, and shock. These are useful rhetorical tricks, particularly when presented by individuals who have been injured in service to their country.
And the coup de grace:
Facebook did not respond to requests for comment.
Well done, old chap. Indeed. Now about throttling your children and that ad revenue, you yob?
Stephen E Arnold, October 17, 2018
HSSCM Method: Update for October 10, 2018
October 10, 2018
The management methods inspired by high school science club behaviors are noteworthy. The goose calls these HSSCM methods or “high school science club management methods” to honor the behaviors of individuals who loved technology but were unfettered by such non essentials as football practice, the student council, and working as a volunteer at the retirement facility near the high school. Chemistry, math, physics, biology—the future.
Two items caught the Beyond Search goose’s attention this fine day.
First, the goose noted “Leaked Transcript of Private Meeting Contradicts Google’s Official Story on China.” The source is one of the popular real news sources associated with some NSA related information. The point of the write up, which the goose assumes is spot on, is:
On Sept. 26, a Google executive faced public questions on the censorship plan for the first time. Keith Enright told the Senate Commerce, Science and Transportation Committee that there “is a Project Dragonfly,” but said “we are not close to launching a product in China.” When pressed to give specific details, Enright refused, saying that he was “not clear on the contours of what is in scope or out of scope for that project.”
Okay, that seems clear.
And, on September 23, 2018, a Googler said:
“Right now, all we’ve done is some exploration,” Gomes told the reporter, “but since we don’t have any plans to launch something, there’s nothing much I can say about it.”
The hitch in the git along surfaces in this comment from the write up:
In July, Gomes had informed employees that the plan was to launch the search engine as soon as possible — and to get it ready to be “brought off the shelf and quickly deployed” once approval from Beijing was received.
The HSSCM method is to say different things to different audiences. That seems similar to practices followed in the high school science clubs with which I am familiar. For example, “Did you hot wire the PA system to play rock and roll during Mr. Durham’s morning announcements?” Our sci club leader said, “No.”
Did not fly.
The second high school science club management method the goose spotted appeared in the real news story “Facebook Isn’t Sorry — It Just Wants Your Data.” Facebook, a firm which has been associated with Cambridge Analytica and the phrase “I’m sorry,” allegedly has created what BuzzFeed calls a “home surveillance device.”
We noted this statement in the write up:
It’s also further confirmation that Facebook isn’t particularly sorry for its privacy failures — despite a recent apology tour that included an expensive “don’t worry, we got this” mini-documentary, full-page apology ads in major papers, and COO Sheryl Sandberg saying things like, “We have a responsibility to protect your information. If we can’t, we don’t deserve it.” Worse, it belies the idea that Facebook has any real desire to reckon with the structural issues that obviously undergird its continued privacy missteps.
The HSSCM method is to do exactly what the science club wants. Need to experiment on pets, not frogs, as part of the biology course of study? Have at it. I recall one of our science club members tried this stunt until the teacher learned that the student was expanding beyond the normal frog dissection.
These examples suggest that one just says what’s necessary to be left alone. Then move forward. Fortunately the Beyond Search goose (a member, of course) evaded being cooked.
MBA programs may not have textbooks which explain the benefits of this approach. On the other hand, maybe the schools with forward looking professors do.
Stephen E Arnold, October 10, 2018
Tracking Facebook: The Job of a Real Journalist Is Stressful, Alarming
September 30, 2018
Want to know what the life of a “real” journalist is like? Navigate to “Exposing Cambridge Analytica: ‘It’s Been Exhausting, Exhilarating, and Slightly Terrifying.’” Here in Harrod’s Creek we believe everything we read online, whether from Facebook, the GOOG, or the Guardian.
The write up is unusual because on one hand, the virtues of being curious and asking questions lead to “terrifying” experiences. On the other hand, the Guardian is just a tiny bit proud that it made the information available.
I learned:
Cadwalladr’s reporting led to the downfall of Cambridge Analytica and a public apology from Facebook’s Mark Zuckerberg who was forced to testify before congress. Facebook has since lost $120 billion from its share price.
That’s nosing into Elon Musk Tweet territory.
I knew social media was a force, but these are big numbers. Perhaps newspaper advertising will reach these heights with “stressful, alarming” assignments for the “real” journalists?
I learned:
It’s got easier every time I’ve published – sunlight is the best disinfectant etc.
Interesting idea in a world which seems to be emulating the fiction of 1984.
I learned what lubricant allowed the “real” journalist to move forward:
I have to say that the support of readers was absolutely crucial and was one of the things that enabled me to carry on. Not just because it helped give me the confidence to keep going, but also because it helped give the organization confidence. It takes a huge amount of resources and resolve for a news organization to keep publishing in the face of the kind of threats we were facing, and the support of the readers for the story and what we were trying to do really did help give my editors confidence, I think. And I’m really grateful for that.
Does this mean that the “real” newspaper was the motive force?
If so, then “real” newspapers are positive forces in today’s world and not conduits for popular culture, sports, and informed opinion.
My thought was, “I wonder if the Babylonian clay tablet brigade voiced similar sentiments when writing on sheepskin became the rage.”
Probably not.
Rah rah for the “real” journalist. Rah rah for the newspaper.
Any rah rahs for Facebook? Nah. Bro culture. Security laughing stock. Sillycon Valley.
But Cambridge Analytica? Yeah, British with a lifeline from some interesting Americans.
Stephen E Arnold, September 30, 2018