HSSCM Method: Update for October 10, 2018
October 10, 2018
The management methods inspired by high school science club behaviors are noteworthy. The goose calls these HSSCM methods or “high school science club management methods” to honor the behaviors of individuals who loved technology but were unfettered by such non-essentials as football practice, the student council, and working as a volunteer at the retirement facility near the high school. Chemistry, math, physics, biology—the future.
Two items caught the Beyond Search goose’s attention this fine day.
First, the goose noted “Leaked Transcript of Private Meeting Contradicts Google’s Official Story on China.” The source is one of the popular real news sources associated with some NSA-related information. The point of the write up, which the goose assumes is spot on, is:
On Sept. 26, a Google executive faced public questions on the censorship plan for the first time. Keith Enright told the Senate Commerce, Science and Transportation Committee that there “is a Project Dragonfly,” but said “we are not close to launching a product in China.” When pressed to give specific details, Enright refused, saying that he was “not clear on the contours of what is in scope or out of scope for that project.”
Okay, that seems clear.
And, on September 23, 2018, a Googler said:
“Right now, all we’ve done is some exploration,” Gomes told the reporter, “but since we don’t have any plans to launch something, there’s nothing much I can say about it.”
The hitch in the git along surfaces in this comment from the write up:
In July, Gomes had informed employees that the plan was to launch the search engine as soon as possible — and to get it ready to be “brought off the shelf and quickly deployed” once approval from Beijing was received.
The HSSCM method is to say different things to different audiences. That seems similar to practices followed in the high school science clubs with which I am familiar. For example, “Did you hot-wire the PA system to play rock and roll during Mr. Durham’s morning announcements?” Our sci club leader said, “No.”
Did not fly.
The second high school science club management method the goose spotted appeared in the real news story “Facebook Isn’t Sorry — It Just Wants Your Data.” Facebook, a firm which has been associated with Cambridge Analytica and the phrase “I’m sorry,” allegedly has created what BuzzFeed calls a “home surveillance device.”
We noted this statement in the write up:
It’s also further confirmation that Facebook isn’t particularly sorry for its privacy failures — despite a recent apology tour that included an expensive “don’t worry, we got this” mini-documentary, full-page apology ads in major papers, and COO Sheryl Sandberg saying things like, “We have a responsibility to protect your information. If we can’t, we don’t deserve it.” Worse, it belies the idea that Facebook has any real desire to reckon with the structural issues that obviously undergird its continued privacy missteps.
The HSSCM method is to do exactly what the science club wants. Need to experiment on pets, not frogs, as part of the biology course of study? Have at it. I recall one of our science club members tried this stunt until the teacher learned that the student was expanding beyond the normal frog dissection.
These examples suggest that one just says what’s necessary to be left alone. Then one moves forward. Fortunately the Beyond Search goose (a member, of course) evaded being cooked.
MBA programs may not have textbooks which explain the benefits of this approach. On the other hand, maybe the schools with forward-looking professors do.
Stephen E Arnold, October 10, 2018
Tracking Facebook: The Job of a Real Journalist Is Stressful, Alarming
September 30, 2018
Want to know what the life of a “real” journalist is like? Navigate to “Exposing Cambridge Analytica: ‘It’s Been Exhausting, Exhilarating, and Slightly Terrifying.’” Here in Harrod’s Creek we believe everything we read online, whether from Facebook, the GOOG, or the Guardian.
The write up is unusual because on one hand, the virtues of being curious and asking questions lead to “terrifying” experiences. On the other hand, the Guardian is just a tiny bit proud that it made the information available.
I learned:
Cadwalladr’s reporting led to the downfall of Cambridge Analytica and a public apology from Facebook’s Mark Zuckerberg who was forced to testify before congress. Facebook has since lost $120 billion from its share price.
That’s nosing into Elon Musk Tweet territory.
I knew social media was a force, but these are big numbers. Perhaps newspaper advertising will reach these heights with “stressful, alarming” assignments for the “real” journalists?
I learned:
It’s got easier every time I’ve published – sunlight is the best disinfectant etc.
Interesting idea in a world which seems to be emulating the fiction of 1984.
I learned what lubricant allowed the “real” journalist to move forward:
I have to say that the support of readers was absolutely crucial and was one of the things that enabled me to carry on. Not just because it helped give me the confidence to keep going, but also because it helped give the organization confidence. It takes a huge amount of resources and resolve for a news organization to keep publishing in the face of the kind of threats we were facing, and the support of the readers for the story and what we were trying to do really did help give my editors confidence, I think. And I’m really grateful for that.
Does this mean that the “real” newspaper was the motive force?
If so, then “real” newspapers are positive forces in today’s world and not conduits for popular culture, sports, and informed opinion.
My thought was, “I wonder if the Babylonian clay tablet brigade voiced similar sentiments when writing on sheepskin became the rage.”
Probably not.
Rah rah for the “real” journalist. Rah rah for the newspaper.
Any rah rahs for Facebook? Nah. Bro culture. Security laughing stock. Sillycon Valley.
But Cambridge Analytica? Yeah, British with a lifeline from some interesting Americans.
Stephen E Arnold, September 30, 2018
Facebook: Interesting Real News Filtering
September 29, 2018
Here in Harrod’s Creek, it is difficult to determine what is accurate and what is not. For example, allegedly a university president fiddled his pay. Then we had rumors of a novel way to recruit basketball players. News about these events was filtered because, hey, basketball is a big deal along with interesting real estate deals in River City.
We read “Facebook Users Unable to Post Story about Huge Facebook Hack on Facebook.” A real news outfit in London noticed that stories about Facebook’s most recent security lapse were not appearing on Facebook.
Another real news outfit reported that some Facebook users saw this message:
“Action Blocked: Our security systems have detected that a lot of people are posting the same content, which could mean that it’s spam. Please try a different post.”
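The wording points to a simple popularity heuristic: when many distinct accounts post identical text, the system treats the content as probable spam. A toy sketch of that kind of rule appears below; the threshold, data structure, and function names are our own assumptions, not anything Facebook has published.

```python
# Toy sketch of a "same content posted by many people" spam heuristic.
# The threshold, storage, and names are illustrative assumptions only.
import hashlib
from collections import defaultdict

DISTINCT_POSTER_LIMIT = 1000          # assumed cutoff before a post is flagged
posters_by_fingerprint = defaultdict(set)

def fingerprint(text: str) -> str:
    """Normalize and hash the post body so identical shares collapse together."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def should_block(user_id: str, text: str) -> bool:
    """Flag a post once too many distinct users have posted the same content."""
    fp = fingerprint(text)
    posters_by_fingerprint[fp].add(user_id)
    return len(posters_by_fingerprint[fp]) > DISTINCT_POSTER_LIMIT

# Example: the same breach story shared by a flood of users eventually trips the rule.
story = "Facebook security lapse affects 50 million accounts - example.com/story"
blocked = False
for uid in range(1200):
    blocked = should_block(f"user-{uid}", story)
print("Last poster blocked?", blocked)
```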
Facebook fans suggested that Facebook was not blocking a story which might put Facebook in a bad light.
Here in rural Kentucky we know that no Silicon Valley company would filter news about its own security problems.
Facebook is a fine outfit. Obviously the news about the security lapse was fake; otherwise, why would the information be blocked?
Just a misunderstanding which the 50 million plus people affected are certain to understand. What’s the big deal with regaining access to one’s account?
The Facebook service is free and just wonderful. Really wonderful.
Stephen E Arnold, September 29, 2018
Academic Sees Facebook Chasing Amazon
September 17, 2018
I assume the one trillion dollar Amazon poobah and the world’s richest hombre will not friend Facebook. The assumption is that the information in “How Facebook AI May Help to Change the Way We Shop Online in the Future” is accurate. The author is an accounting instructor at Villanova University. We do not have that type of expert in Harrod’s Creek. We do have some fast money guys from the health care outfits down the road, however.
The main point of the write up is that Facebook has some smart software which will change the way people shop. Maybe not in Harrod’s Creek, but certainly in a big city where the economic action is. (Keep in mind that insurance fraud is a core competency of some in the Commonwealth of Kentucky.)
I learned that:
The most powerful algorithm is called FBLearner Flow: Facebook could use its massive data on user preferences to anticipate the products that consumers want before consumers even realize it, and could work with retailers on predictive shipping.
Facebook also has DeepText and DeepFace. The trio of smart software adds up to a potential threat to Amazon.
The dismal performance of some facial recognition and image analysis systems is not a problem for the Facebook wizards. I learned:
DeepFace is used to identify people in photos and suggest that users tag people they know. In reality, DeepFace can recognize any face in any photograph on its own. This facial recognition algorithm is actually 97 percent accurate, incredibly even higher than humans who fall a close second at 96 percent accuracy, and the FBI at 85 percent.
The write up suggests that Facebook’s technology could, maybe, possibly, edge toward mind control.
Whatever.
My thought is that Facebook can snag more ad revenue. I think that Facebook ad gains might come at the expense of the Google. Google, unlike Amazon, seems to be drifting with Loon balloons, employee push back, and electric scooter investments. Amazon’s ads are just flywheeling up and trying to build Mr. Bezos’ much beloved momentum.
Our research suggests that Amazon is implementing a game plan that once was associated with the pre-2006 Google; that is, a number of large-scale plays for core business expansions. These range from policeware to back office financial services to replacing existing retail infrastructure with the Amazon equivalent of old school retail.
Ads, therefore, will be a billion-dollar-plus business at Amazon. Are those product listings ads or objective product summaries? That’s a question to ponder.
But ads may not become much more than just another Amazon revenue stream.
Facebook has to find revenue. Amazon, thanks to the cleverness of the happy Amazonians, is wallowing in revenue streams. Some employees may be unhappy, but most customers are thrilled with Amazon’s gentle approach to vendor lock-in.
Net net: Facebook will do ads. Facebook will do smart software. Facebook will also have to figure out how to dodge the bullets regulators are now stuffing into their regulatory weapons to tackle the “we’re sorry, we’ll do better” approach to business.
Stephen E Arnold, September 17, 2018
Facebook: The Old Is Newish Again
September 14, 2018
Social media giant Facebook has been making a very public effort to clean up its act and establish a greater sense of security for users. Even as this campaign is underway, more troubling news has come out regarding the platform. We learned the disturbing information from a recent article in The Verge, “How Autocratic Governments Use Facebook Against Their Own Citizens.”
According to the story:
“Armed groups use Facebook to find opponents and critics, some of whom have later been detained, killed or forced into exile, according to human rights groups and Libyan activists…Swaggering commanders boast of their battlefield exploits and fancy vacations, or rally supporters by sowing division and ethnic hatred. Forged documents circulate widely, often with the goal of undermining Libya’s few surviving national institutions.”
While this is indeed interesting news, we’d say that it is not just limited to autocratic regimes. Take, for example, the news that the US government would like to start wiretapping Facebook’s Messenger app. Clearly, some governments use social media for more overt evil; however, we can’t imagine a nation in the world that would overlook this powerful tool rather than consider ways to use it for its own ends.
Patrick Roland, September 14, 2018
High School Science Club Management Methods: August 30, 2018
August 30, 2018
Years ago, I learned that Google was worried about government regulation. President Trump seems to be making moves in that direction. But my topic today is high school science club management methods or HSSCM.
The first example is news about a group of Facebook staff who are concerned about the intolerant liberal culture within Facebook. Okay, Facebook is about friends and people who share interests or likes. The notion of a political faction within an online company is one more example of a potential weakness in HSSCM. The idea that an employee works for a company, has a job description, and receives money strikes me as inoperative. The problem is that the needs of the Science Club are not the needs of the people on the football team or the field hockey team. Will the lunchroom have tables for the Science Club folks and other tables for the sports teams? In my high school, the Science Club was different from the band and the student council. Snort, snort, we said, when asked to coordinate with the booster club to celebrate a big win. Snort, snort.
The second example is the story “14 Powerful Human-Rights Groups Write to Google Demanding It Kill Plans to Launch a China Search Engine.” The issue for Google and China is revenue. How will the HSSCM approach address a group of human rights organizations? I assume that these entities can issue news releases, pump out Twitter messages, and update their Facebook pages. If that sounds like the recipe for information warfare, I am not suggesting such an aggressive approach. What’s important to me is that Google will have to dip into its management methods to deal with this mini protest.
The question is, “Are high school science club management methods up to these two challenges?”
My view is, “Sure, really smart people can find clever solutions.”
On the other hand, the very management methods which made Facebook and Google the business home runs they are will have to innovate. Business school curricula may not cover how to manage revolts from unexpected sources.
Stephen E Arnold, August 30, 2018
More Administrative Action from Facebook
August 20, 2018
Rarely do we get a report from the front lines of the war on social spying and fake news. However, recently a story appeared that showcased Facebook’s heavy-handed tactics up close and personal. The article appeared in Gizmodo, titled: “Facebook Wanted to Kill This Investigative Tool.”
The story is about how one designer at Gizmodo tried creating a program that collected data on Facebook, trying to determine what the company used its data farms for. It did not go well, and the social media giant attempted to gain access to the offending account almost instantly.
“We argued that we weren’t seeking access to users’ accounts or collecting any information from them; we had just given users a tool to log into their own accounts on their own behalf, to collect information they wanted collected, which was then stored on their own computers. Facebook disagreed and escalated the conversation to their head of policy for Facebook’s Platform…”
News such as this has been slowly leaking its way into the mainstream. In short, Facebook has been attempting to crack down on offenders, but in the process might be going a little too far—this is not unlike overcorrecting a car while skidding on ice. Wall Street is more than a little worried they won’t pull out of this wreck, but some experts say it’s all just growing pains.
We think this could be another example of management decisions fueled by high school science club thinking.
Patrick Roland, August 20, 2018
Facebook: A New York City-Sized PR Problem
July 20, 2018
I read “Once nimble Facebook Trips Over Calls to Control Content.” If you are looking for this write up online, the story’s headline was changed to “What Stays on Facebook and What Goes? The Social Network Cannot Answer.” You may be able to locate the online version at this link. (No promises.) The dead tree version is on Page A1 of the July 20, 2018, edition which comes out on Thursday night. Got the timeline square?
I wanted to highlight a handful of comments in the “real” news story. Here we go with direct statements from the NYT article in red:
- The print version headline uses the phrase “once nimble.” Here in Harrod’s Creek that means stumbling bobolyne. In Manhattan, the phrase may mean something like “advertise more in the New York Times.” I am, of course, speculating.
- I marked in weird greenish yellow this statement: “Facebook still seems paralyzed over how to respond.” So much for nimble.
- Another: “Comically tripped up”. Yep, a clown’s smile on the front page of the NYT.
- My favorite: the context for being a bit out of his depth, “yet lucidity remained elusive.” Does this mean stupid, duplicitous, or something else?
- I thought Silicon Valley wunderkind were sharp as tacks. In the NYT, I read “Facebook executives’ tortured musings.” Not Saturday Night Live deep thoughts, just musings and tortured ones at that.
- How does Facebook perceive “real” journalism? Well, not the way the NYT does. I circled this phrase about Alex Jones, a luminary with some avid believers one mine drainage ditch down the road a piece, which is Kentucky talk for “some distance”: “Just being false doesn’t violate community standards” and “Infowars was a publisher with a ‘different point of view.’”
- This is a nifty sequence crafted to recycle another “real” journalist’s scoop interview with Mark Zuckerberg: “what Facebook would or would not allow on its site became even more confusing.” So, a possibly paralyzed clown who lacks lucidity is confusing.
- The “bizarre idea” word pair makes sure I understand what the NYT believes is a lack of clear thinking.
But these brief rhetorical flourishes set up this statement:
A Facebook spokeswoman [who is not identified] explained that it would be possible, theoretically, to deny the Holocaust without triggering Facebook’s hate-speech clause.
Those pesky algorithms are at work. But the person at Facebook who offered this information is not identified. Why not?
Here’s another longer statement from the NYT write up:
And what exactly constitutes imminent violence is a shifting line, the company said— it is still ‘iterating on’ its policy, and the rules may change.
I don’t want to be too dumb, but I would like to know who at the company offered the statement. A company, to my knowledge, cannot talk unless one considers firing a question at Amazon’s Alexa.
I put an exclamation point on this statement in the NYT article:
All of this fails a basic test: It’s not even coherent. It is a hodge podge of declarations and exceptions and exceptions to the exceptions.
Net net: Facebook has a public relations problem with the New York Times. Because of the influence of the “real” newspaper and its “real” journalists, that is a PR problem of some magnitude. Perhaps the point of the story is to create an opportunity for a NYT ad sales professional to explain the benefits of a full page ad across the print and online versions of the New York Times?
Stephen E Arnold, July 20, 2018
Facebook: A Fan of Infowars
July 13, 2018
I don’t know much about Infowars. I do know that the host has an interesting verbal style. The stories, however, don’t make much sense to me. I just ignore the host and the program.
However, if the information in “Facebook Proves It Isn’t Ready To Handle Fake News” is accurate, Facebook is okay with the host and the Infowars’ approach to information.
The write up reports a Facebook news expert as saying:
“I guess just for being false that doesn’t violate the community standards.” I think part of the fundamental thing here is that we created Facebook to be a place where different people can have a voice. And different publishers have very different points of view.
The Buzzfeed story makes this statement:
Despite investing considerable money into national ad campaigns and expensive mini documentaries, Facebook is not yet up to the challenge of vanquishing misinformation from its platform. As its videos and reporter Q&As take pains to note, Facebook knows the truth is messy and hard, but it’s still not clear if the company is ready to make the difficult choices to protect it.
Hey, it’s difficult for some people to deal with responsibility. Ease off. Facebook is trying hard to be better. Every day. Better.
Stephen E Arnold, July 13, 2018
Facebook: Information Governance?
July 9, 2018
Anyone else annoyed by the large number of privacy disclosures filling your inbox and slowing down your favorite Web sites? User data privacy and how companies collect and/or sell that information are big issues.
Facebook is one of the more notorious data management case studies. Despite the hand waving, it may be easy for Facebook data to be appropriated.
Josip Franjković writes about how user data can be stolen in the post “Getting Any Facebook User’s Friend List And Partial Payment Card Details.”
There are black hat and white hat hackers, the latter being the “good guys.” It is important for social media Web sites to hack themselves so they can discover any weaknesses in their systems. Franjković points out that Facebook uses a GraphQL endpoint that is accessible only to its first-party applications. He kept trying to break into the endpoint, even sending persisted queries in a loop. The same error message kept returning, but the endpoint did return information already available to the public as well as the privately held friends list.
The scarier hack was about credit card information:
“A bug existed in Facebook’s Graph API that allowed querying for any user’s payment cards details using a field named payment_modules_options. I found out about this field by intercepting all the requests made by Facebook’s Android application during registration and login flow.”
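For readers who want a more concrete picture of this kind of probing, here is a minimal sketch of replaying a persisted Graph API style query in a loop. The endpoint URL, doc_id values, token, and variable names below are hypothetical placeholders; only the payment_modules_options field name comes from the write up, and nothing here reflects Facebook’s actual interface.

```python
# Hypothetical sketch of replaying an intercepted persisted GraphQL query.
# The endpoint, doc_id values, token, and variable names are placeholders,
# not Facebook's actual interface; only payment_modules_options is from the post.
import json
import requests

GRAPH_URL = "https://graph.example.com/graphql"  # placeholder endpoint
ACCESS_TOKEN = "EAAB...REDACTED"                 # token captured from one's own session

def replay_query(doc_id: str, variables: dict) -> dict:
    """Send a persisted query, identified by doc_id, with chosen variables."""
    response = requests.post(
        GRAPH_URL,
        data={
            "access_token": ACCESS_TOKEN,
            "doc_id": doc_id,                    # server-side persisted query id
            "variables": json.dumps(variables),
        },
        timeout=10,
    )
    return response.json()

# Loop over candidate query ids, watching for one that returns data
# instead of the usual error message.
for candidate_doc_id in ["1234567890", "1234567891"]:        # made-up ids
    result = replay_query(
        candidate_doc_id,
        {"profile_id": "100000000000001",                    # made-up target id
         "field": "payment_modules_options"},
    )
    if "error" not in result:
        print("Unexpected data returned:", result)
```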
Thankfully Franjković discovered this error, and within four hours and thirteen minutes the issue was resolved. Credit card information was not stolen this time around, but how much longer until it is? We await Franjković’s analysis of Google email being available to certain third parties.
Whitney Grace, July 9, 2018