Silicon Valley Management Crises Escalate

May 10, 2019

Early in my career I worked at Booz, Allen & Hamilton. There was lots of chatter about management from the MBAs. I listened, and I learned that management was a slippery fish.

Now the engineers, mathematicians, and scientists who are in charge of a couple of successful Silicon Valley firms are dealing with slippery fish, and some of these creatures are poisonous.

Let’s look at two examples.

The first appears in “Google Employees Ask Alphabet CEO to Address Walkout.” The idea is that employees are not happy, and they want to make this clear to colleagues and the real journalists who pay attention to real news. I learned:

The plea for Page’s involvement comes after months of worker protests against the mishandling of sexual harassment incidents, along with retaliation against those who report it, including the demotion and modifications of roles that female employees who reported harassment held.

Google denies retaliation, and some of the world’s smartest people employed by the online advertising firm are unhappy.

Unhappy employees mean trouble with a capital T. There may be a Meredith Willson opportunity here.

The second has been captured in statements from Chris Hughes, one of the “founders” of Facebook. This Facebooker has been on talking head TV, but the article “Facebook Co-Founder Chris Hughes: It’s Time to Break Up Facebook” does a good job of recycling the opinion piece Mr. Hughes crafted for the New York Times. I noted:

Hughes says that Zuckerberg has “unchecked power” and influence “far beyond that of anyone else in the private sector or in government.”

Okay, a founder and “friend” of Facebook is criticizing the company. The fix is painful because breaking up is hard to do.

Okay, two examples.

The Google problem is a revolt from within. The Facebook problem is a revolt of the insiders.

Neither Google nor Facebook is handling these management challenges in a smooth, friction-free way.

Maybe it is time to call in the MBAs along with lots of lawyers to help with these Iron Man events? The high school science club approach is just not working. Sure, the money is still flowing, but like a gurgling Mauna Loa, further eruptions are inevitable. Foosball and colorful mouse pads won’t do the job. And algorithms? Nope.

Stephen E Arnold, May 10, 2019

Facebook: Yep, Privacy Is Our Business

May 7, 2019

I believe everything I read on the Internet. Here’s an example of truth, which may actually be true: “Facebook’s Contract Workers Are Looking at Your Private Posts to Train AI.” The main idea is that a post marked “private” allegedly may be perused by a contractor. In my experience, contractors are often far away from the office stuffed with supervisors. When I use contract workers, I just get the work. I don’t spend too much time micro-managing. If I hire a contractor via an online service, I don’t interact after I post an email describing what I want done. My hunch is that contract workers can do quite a few things. I don’t know because they are “contract workers.”

The write up states:

Facebook confirmed to Reuters that the content being examined by WiPro’s workers includes private posts shared to a select number of friends, and that the data sometimes includes users’ names and other sensitive information. Facebook says it has 200 such content-labeling projects worldwide, employing thousands of people in total.

Yep, private information in the hands of contractors who are “employees” of WiPro.

Privacy is our business.

Stephen E Arnold, May 7, 2019

Defriending Facebook? Harsh

April 24, 2019

Whether it was earnest advice or a public-relations ploy, we’re told Mark Zuckerberg’s recent call for regulation would not actually fix the problems with Facebook. Canada’s CBC News describes “The Case Against Facebook: a ‘Dataopoly’ with Too Much Market Power.” I was interested in reporter Ramona Pringle’s explanation of a “dataopoly;” she cites Carleton University professor Dwayne Winseck, who teaches about Internet governance:

“[Winseck] says with its behemoth scale and singular control over the data of its users, Facebook is a ‘dataopoly.’ A company with a monopoly in a traditional, non-digital industry is able to charge consumers higher prices for goods or services due to the lack of competition. In the case of a dataopoly, the results of that unrivalled power can be less privacy, degraded quality of service, and political and social consequences, writes Prof. Maurice Stucke, an antitrust expert at the University of Tennessee College of Law. With more than two billion users who have few, if any, alternatives to the massive social network and its various platforms — which also include Instagram and WhatsApp — there is little incentive for Facebook to change the way it does business. Winseck says this is clear in the company’s ‘take-it-or-leave-it terms of service.’ Even if a user is uncomfortable with some of Facebook’s practices, if they want to use the social network, they have no choice but to grin and bear it.”

On top of that, we’re reminded, Facebook keeps a tight grip on everything that crosses its platform, like the nature of its services, how advertisers can target users, and what it really does with all that juicy user data. The only real solution, Pringle insists, is the breakup of Zuckerberg’s company. Like others, this article is skeptical of Zuckerberg’s motives, noting that, for various reasons, Facebook could use some good PR about now. If this was the goal, did it backfire?

Cynthia Murrell, April 24, 2019

Facebook: A Bubbling Cauldron of PR Opportunity

April 23, 2019

I read “Facebook’s New Chief Lawyer Helped Write the Patriot Act.” Then I read “Facebook Taps Former Vulcan and Gates Ventures Exec John Pinette to Run Global Communications.” From these two real news stories, I concluded that the Facebook senior management team is circling its wagons, cleaning up the dorm room, and involving some individuals who may have been excluded from the high school science club party last year. Vulcan Ventures and the Patriot Act. Times are changing at the company which seems to struggle with privacy, legislative wrath, and trust.

Not a moment too soon.

The Guardian, an outfit eager to identify the possible frailties of humanoids in Silicon Valley, published “My TED Talk: How I Took on the Tech Titans in Their Lair” and reported via a contributor who gave a TED talk:

In the theatre, senior executives of Facebook had been “warned” beforehand. And within minutes of stepping off stage, I was told that its press team had already lodged an official complaint. In fairness, what multi-billion dollar corporation with armies of PRs, lawyers and crisis teams, not to mention, embarrassingly, our former deputy prime minister, Nick Clegg, wouldn’t want to push back on the charge that it has broken democracy? Facebook’s difficulty is that it had no grounds to challenge my statement. No counter-evidence. If it was innocent of all charges, why hasn’t Mark Zuckerberg come to Britain and answered parliament’s questions? Though a member of the TED team told me, before the session had even ended, that Facebook had raised a serious challenge to the talk to claim “factual inaccuracies” and she warned me that they had been obliged to send them my script. What factual inaccuracies, we both wondered. “Let’s see what they come back with in the morning,” she said. Spoiler: they never did.

I am not sure when the Patriot Act and Vulcan hires start work, but the Guardian write up may spin up some work for the new, fresh, clear-eyed Facebookers. Not a moment too soon. Wait. Maybe it is too late?

Stephen E Arnold, April 23, 2019

Facebook Search: Fun for Some?

April 19, 2019

Ah, Facebook. The news about millions of exposed passwords was almost lost in the buzz sparked by the now infamous “Report.” Every week, it seems, there is a Facebook goodie to delight.

Despite its modest flaws, Facebook’s search function keeps attracting attention. Mashable examines it in “Facebook’s Search Feature Has Some Pretty Creepy Suggestions.”

Allegedly the Facebook search function allowed users to search for photos of women, but not men. Inti De Ceukelaire, a Belgian security researcher, discovered that when he typed in “photos of my female friends,” he got the desired results. However, the opposite search, “photos of my male friends,” yielded memes. Risqué search phrases were also automatically suggested:

“That discrepancy is troubling enough, but it gets worse. While testing out these searches, the first automatically suggested query was “photos of my female friends in bikinis,” which returned photos of women in bikinis, as well as one image of a topless woman, which would appear to violate Facebook’s rules against nudity. Facebook removed the image following Mashable’s inquiry. Separately, “photos of my female friends at the beach” was also suggested.”

Mashable continued to test the bug and discovered more questionable searches that contained what might be thought of as a “creep” factor. Searches with “male” in the search phrase, though, were more innocuous. Facebook reports that suggested search phrases are based not on an individual user’s history but on activity across all of Facebook. In other words:

Who coded this search function? Maybe some men? Men just having fun?

Whitney Grace, April 18, 2019

Facebook: Technical Challenges Arise

April 17, 2019

I read “Facebook Suffers Blackout Again but the Hackers Have Nothing to Do with It.” What struck me and one member of the DarkCyber team as interesting was the tiny hint of Facebook’s technical ineptitude. The word “again” and then the reminder that the service flop was not caused by “hackers” sticks the point of the reportorial spear into Facebook.

We noted this statement:

Earlier in March when Facebook was down for hours some experts had pointed towards DDoS attacks (Distributed Denial of Service) which can sometimes cripple businesses. However, Facebook denied it and said in a tweet that the outage was “not related to a DDoS attack”. It had blamed a server configuration change for the outage.

A technical glitch or a distracted technical management team?

Stephen E Arnold, April 17, 2019

Facebook: Friends Are Useful

April 16, 2019

I read “Mark Zuckerberg Leveraged Facebook User Data to Fight Rivals and Help Friends, Leaked Documents Show.” I must admit I was going to write about Alphabet Google YouTube DeepMind’s smart software which classified the fire in Paris in a way that displayed links to the 9/11 attack. I then thought, “Why not revisit Microsoft’s changing story about how much user information was lost via the email breach?” But I settled on a compromise story. Facebook allegedly loses control of documents. That is a security angle. The documents reveal how high school science club management methods allegedly operate.

According to CNBC:

Facebook would reward favored companies by giving them access to the data of its users. In other cases, it would deny user-data access to rival companies or apps.

If true, the statement does not surprise me. I was in my high school science club, and I have a snapshot of our fine group of outcasts, wizards, and crazy people. Keep in mind: I was one of these exemplars of the high school.

Let’s put these allegedly true revelations in context:

  1. Facebook has amassed a remarkable track record in the last year
  2. Google, a company which contributed some staff to Facebook, seems to have some interesting behaviors finding their way into the “real news” media; for example, senior management avoiding certain meetings and generally staying out of sight
  3. Microsoft, a firm which dabbled in monopoly power, is trying to figure out how to convert its high school science club management methods in its personnel department to processes which match some employees’ expectations for workplace behavior.

What’s the view from Harrod’s Creek? Like the Lyft IPO and subsequent stock market performance, the day of reckoning does not arrive with a bang. Nope. The day creeps in on cat’s feet. The whimpering may soon follow.

Stephen E Arnold, April 16, 2019


Zuck Hunting: Investors Want to Blast a Clay Pigeon

April 14, 2019

I read “Facebook Investors Desperate to Boot Mark Zuckerberg from Chairmanship.” I wonder if these senior business professionals realize that the Zuck (Mark Zuckerberg) has taken anticipatory steps to remain in control of the Facebook privacy grinder.

The write up reports this sentence from an April 12, 2019, Securities & Exchange Commission filing:

[Zuckerberg’s] dual-class shareholdings give him approximately 60% of Facebook’s voting shares, leaving the board, even with a lead independent director, with only a limited ability to check Mr. Zuckerberg’s power,” reads the statement supporting the proposal. “We believe this weakens Facebook’s governance and oversight of management.

The write up summarizes some of the concern stakeholders have in the Zuck’s decision making.

Zuck (the “face” of Facebook) may not embrace the idea of a small step toward knocking his clay pigeons from the sky.

I learned that “Facebook isn’t a fan” of this idea.

From my vantage point in rural Kentucky, the situation seems to be:

  1. A lack of meaningful regulatory oversight and control on a company which has demonstrated a willingness to say “I am sorry.” Each time I hear these words from a Facebook professional, I think of John Cleese’s character being held upside down out of a window in “A Fish Called Wanda.”
  2. A desire to continue chugging forward in order to maintain what I call “HSSCMM” or high school science club management methods. (If you are not sure what this means, ask a high school science club student at an institution near you.)
  3. Confidence that the company’s team and its users can continue to bring the world together despite some modest evidence that Facebook causes a small amount of disruption.

The push back strikes me as an example of too little, too late. But at least there is some questioning of the HSSCMM’s efficacy.

I hear the call “pull” but that clay pigeon is flying free. The prized Zuck sails forward unscathed.

Stephen E Arnold, April 14, 2019

Looking Back: Facebook and Live Streams

April 9, 2019

Many have asked how Facebook could allow it—during the tragic mass shooting in New Zealand on March 15, the alleged perpetrator live-streamed the horror for 17 minutes. Now, CNET shares, “Facebook Explains Why its AI Didn’t Catch New Zealand Gunman’s Livestream.” Writers Erin Carson and Queenie Wong cite a post from Facebook VP Guy Rosen, and say the company just wasn’t prepared for such an event. They report:

“In order for AI to recognize something, it has to be trained on what it is and isn’t. For example, you might need thousands of images of nudity or terrorist propaganda to teach the system to identify those things. ‘We will need to provide our systems with large volumes of data of this specific kind of content, something which is difficult as these events are thankfully rare,’ Rosen said in the post. In addition, he noted that it’s a challenge for the system to recognize ‘visually similar’ images that could be harmless like live-streamed video games. ‘AI is an incredibly important part of our fight against terrorist content on our platforms, and while its effectiveness continues to improve, it is never going to be perfect,’ Rosen said. Facebook’s AI challenges also underscore how the social network relies on user reports. The social network didn’t get a user report during the alleged shooter’s live broadcast. That matters, Rosen said, because Facebook prioritizes reports about live videos.”

The first user report about this video came in 12 minutes after the stream ended. The company says fewer than 200 users viewed the video in real time, but that more than 4,000 views occurred before it was taken down.

With no vetting, no time delay, and just smart software, the shooting video was available.

Rosen does tell us how Facebook plans to address the issue going forward: continue to improve its AI’s matching technology; find a way to get user reports faster; and continue working with the Global Internet Forum to Counter Terrorism. Do these plans seem a bit nebulous to anyone else?

Three of the Five Eyes are taking steps to put sheriffs in the social media territory.

Cynthia Murrell, April 9, 2019

Making, Not Filtering, Disinformation

April 8, 2019

I spotted a link to this article on Sunday (April 7, 2019). The title of the “real news” report was “Facebook Is Asking to Be Regulated but Wants to Choose How.” The write up ostensibly was about Facebook’s realization that regulation would be good for everyone. Mark Zuckerberg wants to be able to do his good work within a legal framework.

I noted this passage in the article:

Facebook has been in the vanguard of creating ways in which both harmful content can be generated and easily sent to anyone in the world, and it has given rise to whole new categories of election meddling. Asking for government regulation of “harmful content” is an interesting proposition in terms of the American constitution, which straight-up forbids Congress from passing any law that interferes with speech under the first amendment.

I also circled this statement:

Facebook went to the extraordinary lengths of taking out “native advertising” in the Daily Telegraph. In other words, it ran a month of paid-for articles demonstrating the sunnier side of tech, and framing Facebook’s efforts to curb nefarious activities on its own platform. There is nothing wrong with Facebook buying native advertising – indeed, it ran a similar campaign in the Guardian a couple of years ago – but this was the first time that the PR talking points adopted by the company have been used in such a way.

From Mr. Zuckerberg’s point of view, he is sharing his ideas.

From the Guardian’s point of view, he is acting in a slippery manner.

From the newspapers reporting about his activities and, in the case of the Washington Post, providing him with an editorial forum, news is news.

But what’s the view from Harrod’s Creek? Let me share a handful of observations:

  1. If a person pays money to a PR firm to get information in a newspaper, that information is “news” even if it sets forth an agenda
  2. Identifying disinformation or weaponized information is difficult, it seems, for humans involved in creating “real news”. No wonder software struggles. Money may cloud judgment.
  3. Information disseminated from seemingly “authoritative” sources is not much different from the info rocks from a digital slingshot. Disgruntled tweeters and unhappy Instagramers can make people duck and respond.

For me, disinformation, reformation, misinformation, and probably regular old run-of-the-mill information is unlikely to be objective. Therefore, efforts to identify and filter these payloads are likely to be very difficult.

Stephen E Arnold, April 8, 2019
