Silicon Valley Management Method: Has Broflow Replaced Workflow?

March 23, 2018

In early March, we noted a story about Silicon Valley and evil. “How Silicon Valley Went from ‘Don’t Be Evil’ to Doing Evil” reported on the “bro” culture and a casual approach to customer privacy. There was a nod to fake news too. We noted this statement:

“[A] handful of companies [are] concentrated in one or two regions. The great progress in the 1980s and 1990s took place in a highly competitive, and dispersed, environment, not one dominated by firms that control 80 or 90 percent of key markets. Not surprisingly, the rise of the oligarchs coincides with a general decline in business startups, including in tech.”

Today we noted “Here is How Google Handles Right to Be Forgotten Requests.” We found this passage suggestive:

Witness statements submitted by Google “legal specialist” Stephanie Caro (who admitted: “I am not by training a lawyer”) for both trials explained: “The process of dealing with each delisting request is not automated – it involves individual consideration of each request and involves human judgment. Without such an individual assessment, the procedure put in place by Google would be open to substantial abuse, with the prospect of individuals, or indeed businesses, seeking to suppress search results for illegitimate reasons.”

No smart software needed, it seems. And the vaunted technical company’s workflow with regard to removal requests? Possibly “casual” or “disorganized.”

When considered against the backdrop of the Facebook–Cambridge Analytica affair, process seems less important than other tasks.

Perhaps some management expert will assign the term “bro-flow” to the organizational procedures implemented by some high profile technology firms?

Stephen E Arnold, March 23, 2018


Open Source Pandas Simplifies Data Analysis

March 20, 2018

An article at Quartz draws our attention to a potential alternative to Excel—the open source Pandas—in “Meet the Man Behind the Most Important Tool in Data Science.” Writer Dan Kopf profiles Pandas’ developer, Wes McKinney, who launched the Python tool in 2009. In 2012, Pandas’ popularity took off. Now, Kopf tells us:

Millions of people around the world use Pandas. In October 2017 alone, Stack Overflow, a website for programmers, recorded 5 million visits to questions about Pandas from more than 1 million unique visitors. Data scientists at Google, Facebook, JP Morgan, and virtually [every] other major company that analyzes data use Pandas. Most people haven’t heard of it, but for many people who do heavy data analysis—a rapidly growing group these days—life wouldn’t be the same without it. (Pandas is open source, so it’s free to use.) So what does Pandas do that is so valuable? I asked McKinney how he explains it to non-programmer friends. ‘I tell them that it enables people to analyze and work with data who are not expert computer scientists,’ he says. ‘You still have to write code, but it’s making the code intuitive and accessible. It helps people move beyond just using Excel for data analysis.’

McKinney is inspired to improve data science tools because he likes to “empower people to solve problems.” In fact, Pandas sprung from his frustration at the limitations of available tools when he first came to embrace Python. See the article to follow the developer from his time as a high school athlete to his current, full-time work on Pandas and other open source projects, as well as more on Pandas itself.
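McKinney’s “beyond Excel” point is easy to illustrate. Here is a minimal sketch (the column names and figures are invented for illustration) showing how a few lines of Pandas replace a spreadsheet formula column and a pivot table:

```python
import pandas as pd

# A small table of sales records, the kind of data often kept in a spreadsheet.
df = pd.DataFrame({
    "region": ["East", "West", "East", "West"],
    "units": [10, 7, 3, 12],
    "price": [2.5, 2.5, 4.0, 4.0],
})

# Column arithmetic replaces an Excel formula column.
df["revenue"] = df["units"] * df["price"]

# Group-and-aggregate replaces a pivot table.
totals = df.groupby("region")["revenue"].sum()
print(totals)
```

The same idioms scale to millions of rows, which is roughly where spreadsheets give out.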

Cynthia Murrell, March 20, 2018

Quote to Note: CNBC on Facebook Management

March 19, 2018

Talking head TV does not capture my attention. I did spot an interesting write up this morning. Its title? “Facebook Is Facing Its Biggest Test Ever — and Its Lack of Leadership Could Sink the Company.” Tucked in the analysis is a quote to note. Here’s the passage which I highlighted in high intensity yellow:

There’s no outside attacker bringing Facebook down. It’s a circular firing squad that stems from the company’s fundamental business model of collecting data from users, and using that data to sell targeted ads.

The phrase, which is quite nifty, is “a circular firing squad.”

The Facebook–Cambridge Analytica dust-up is interesting. Our take on the use of academics, industrial-strength intelligence analysis methods, and manipulated viewpoints will be featured in the April 3, 2018, DarkCyber video news program.

Until then, enjoy the “circular firing squad” trope. Oh, and a happy honk to the author, editor, or producer who okayed this phrase. Nifty.

Stephen E Arnold, March 19, 2018

Facebook: Now Expectations for Responsibility Are Rising

March 14, 2018

Recently, British Prime Minister Theresa May spoke out against the vengeful and often dangerous way in which social media has been utilized. According to one account she stood up for women, minorities, and other groups being disenfranchised online. Good, right? Apparently, it was a little too late, as a fiery Guardian piece told us in “Theresa May Thinks Facebook Will Police Itself? Some Hope.”

In typical British journalistic tradition, the piece heavily criticizes the PM’s statement:

“This is typical Mayspeak: it mimes determination but is devoid of substance. It’s like hoping that the alcohol industry will help to stamp out binge drinking or that food manufacturers will desist from encouraging childhood obesity. Neither industry will comply for the simple reason that their continued prosperity depends on people drinking more alcohol and consuming more sugar and fat.”

While a politician saying that they trust Facebook and social media to police themselves is laughable no matter what country you live in, it raises an interesting question. Wired recently took up the same topic with an interesting spin. While its author acknowledges Facebook’s attempts at correcting its mistakes and being a safer platform for users, the piece points out that there’s a really simple way to handle this: more transparency. Social media giants may find themselves forced to shift from “utility” mode to “responsible publisher” mode. When this occurs, the algorithms which help generate revenue may be found to have an unacceptable social downside.

Patrick Roland, March 14, 2018

Facebook Fails Discrimination Test

March 12, 2018

While racism and discrimination still plague society, the average person does not participate in them.  The Internet amplifies hatred to the point that people believe it is more powerful today than it was in the past.  Social media Web sites do their best to prevent these topics from spreading by using sentiment analytics.  Sentiment analytics are still in their infancy and, on more than one occasion, have proven to work against their intended purpose.  TechCrunch’s “Facebook’s Ad System Shown Failing to Enforce Its Own Anti-Discriminatory Policy” describes a recent example.

Facebook demands to be allowed to regulate itself when it comes to abuse of its services, such as ads.  Despite the claims that Facebook can self-regulate, current events have proven the contrary.  The article points to Facebook’s claim that it disabled its ethnic affinity ad targeting for employment, housing, and credit ads.  ProPublica ran a test case by creating fake rental housing ads.  What did it discover?  Facebook continues to discriminate:

However instead of the platform blocking the potentially discriminatory ad buys, ProPublica reports that all its ads were approved by Facebook “within minutes” — including an ad that sought to exclude potential renters “interested in Islam, Sunni Islam and Shia Islam”. It says that ad took the longest to approve of all its buys (22 minutes) — but that all the rest were approved within three minutes.

It also successfully bought ads that it judged Facebook’s system should at least flag for self-certification because they were seeking to exclude other members of protected categories. But the platform just accepted housing ads blocked from being shown to categories including ‘soccer moms’, people interested in American sign language, gay men and people interested in wheelchair ramps.

Facebook reiterated its commitment to anti-discrimination, and ProPublica responds that if an outside research team had been tasked with regulating Facebook, these ads would never have reached the Web.  Maybe Facebook should follow Google’s example and hire content curators to read every single ad to prevent the bad stuff from getting through.

Whitney Grace, March 12, 2018

Facebook Begins Censoring Content for Good and Ill

March 5, 2018

Facebook has been under a lot of scrutiny for fake news and propaganda lately. While the company has acknowledged its mistakes, the course it is taking to fix these problems should alarm people. We learned more on the social media giant’s censorship from a recent story in the Intercept, “Facebook Says It Is Deleting Accounts at the Direction of the U.S. and Israeli Governments.”

According to the story:

Facebook has been on a censorship rampage against Palestinian activists who protest the decades-long, illegal Israeli occupation, all directed and determined by Israeli officials. Indeed, Israeli officials have been publicly boasting about how obedient Facebook is when it comes to Israeli censorship orders.


Shortly after news broke earlier this month of the agreement between the Israeli government and Facebook, Israeli Justice Minister Ayelet Shaked said Tel Aviv had submitted 158 requests to the social media giant over the previous four months asking it to remove content it deemed “incitement.” She said Facebook had granted 95 percent of the requests.

This is a no-win situation for Facebook. By trying to keep questionable content off the net, it opens the door for censoring its users. A slippery slope, to be sure. If we were to guess, Facebook will make a few more missteps before correcting things appropriately.

Patrick Roland, March 5, 2018

Self Regulation: Is This a Facebook Core Competency?

March 3, 2018

Recently, British Prime Minister Theresa May spoke out against the vengeful and often dangerous way in which social media has been utilized. According to one account she stood up for women and minorities and other groups being disenfranchised online. Good, right? Apparently, it was a little too late, as a fiery Guardian piece told us in, “Theresa May Thinks Facebook Will Police Itself? Some Hope.”

In typical British journalistic tradition, the piece heavily criticizes the PM’s statement:

“This is typical Mayspeak: it mimes determination but is devoid of substance. It’s like hoping that the alcohol industry will help to stamp out binge drinking or that food manufacturers will desist from encouraging childhood obesity. Neither industry will comply for the simple reason that their continued prosperity depends on people drinking more alcohol and consuming more sugar and fat.”

While a politician saying that they trust Facebook and social media to police themselves is laughable no matter what country you live in, it raises an interesting question. Wired recently took up the same topic with an interesting spin. While its author acknowledges Facebook’s attempts at correcting its mistakes and being a safer platform for users, it points out that there’s a really simple way to handle this: more transparency. Social media giants are shrouded in secrecy and until they can be more candid and open, all we’ll have is hot air from politicians and nothing more.

But Facebook may have the gift of governance: both an art and a “like”?

Patrick Roland, March 3, 2018

Facebook Floundering Again?

March 2, 2018

It is no shock to say that Facebook has had some rough months lately. Amidst controversy over its handling of fake news and algorithms that seem to avoid friends, the company’s biggest problems are actually internal. The social media giant’s culture is beginning to gain a lot of attention, and not for good reasons. We learned more in a Slashdot excerpt, “A Facebook Employee Asked Reporter to Turn Off Phone So Facebook Couldn’t Track Its Location.”

One troubling part of the story said:

“According to his recounting of the meeting, she asked him if he had been in touch with Nunez (the Gizmodo reporter, who eventually published this and this). He denied that he had been. Then she told him that she had their messages on Gchat, which Fearnow had assumed weren’t accessible to Facebook. He was fired. “Please shut your laptop and don’t reopen it,” she instructed him.”

There is a lot of interesting stuff here, and it links to a larger Wired piece that plumbs even more of the depths. This is at odds with other reports on the company, which claim it has a winning culture and that Facebook’s corporate environment is a success. We are inclined to believe the Slashdot take on things. Likely, these positive spins are PR efforts in the wake of so many rotten news pieces. Facebook has long been an innovator, so it will be interesting to see if it can revolutionize its culture.

Patrick Roland, March 2, 2018

Pundit Unlikes Facebook

February 23, 2018

I think the author of “The #1 Reason Facebook Won’t Ever Change” has adopted a somewhat negative view of Facebook. A Greek philosopher, whom one of my slightly addled teachers identified as Heraclitus of Ephesus (nice place!), offered this observation:

“No man ever steps in the same river twice, for it’s not the same river and he’s not the same man.”

The point of the “Facebook Won’t Ever Change” essay is that Heraclitus was dead wrong. I am not sure that my wacko teacher, the deadbeats sitting around the Harrod’s Creek cast iron stove, or I believe the assertion “won’t ever change.”

Setting aside dead Greek guys who lived in a city now a must-see for those on Mediterranean cruises, what’s the “logic” behind this “won’t ever change” assertion?

Well, companies have a genetic profile. Yep, how’s that working out at GE, which is a bit of a limp noodle since Neutron Jack was in charge? (Could the same degradation at Facebook take place? If the DNA is unchanged, then GE will be roaring back, right?)

The idea is that Facebook collects data, sells ads, and performs other “services” to which the Facebook community is not privy. Therefore, the quest for money means that Facebook will keep on earning money or at least trying to earn money. (Just like GE, right?)

There are quite a few graphs which illustrate that companies in the Facebook-type sector want to be like Facebook. But the point is change.

Facebook, however, “won’t ever change.” And here’s the quote to prove it, at least to the author:

Facebook is about making money by keeping us addicted to Facebook. It always has been — and that’s why all of our angst and headlines are not going to change a damn thing.

A few observations.

First, I am delighted that Mark Zuckerberg is not altruistic, a statement offered in the “won’t ever change” essay. Okay, good to know. Heraclitus’ alleged statement suggests that today’s Zuckerberg is not tomorrow’s Zuckerberg. But Facebook is immutable; Zuckerberg is not. He is, however, unable to “change” Facebook. I suppose that’s why founders and senior managers are either forced out or burned out.

Second, government regulation could impose “guidelines” that Facebook might want to observe if it wishes to conduct business in countries wanting to “change” Facebook. The DNA can stay the same, but China or the European Union could move the mighty social ship.

Third, users—like the teens expressing their views about school shootings—could make another social media channel the go-to system. User pressure might squeeze the social vessel into a new form or split it at its seams.

Fourth, there is the life cycle thing. Facebook was a date-hunting system. Now it is, like Google, difficult to sum up. DNA, as I understand aging, does not work the way it did when the organism was a newborn. In short, Facebook will age and, at some point, find itself wedged in a warehouse for the aged. (Where is Excite now, gentle reader?)

I understand the need to offer “thought leader” type statements. The problem is that big ideas should not be based on a view of change that matches neither the reality of long-dead Greek guys nor common sense. That’s the number one reason I find the write-up similar to the statements of my ever-so-informed teacher who championed the “step into the river” factoid.

Stephen E Arnold, February 23, 2018

Big Tech Giants Not Bulletproof

February 22, 2018

It’s safe to say that the honeymoon is over for the big tech companies that use big data to the extreme. The likes of Facebook, Apple, Google, and Amazon had a rough 2017, and things aren’t looking up, according to the Forbes story, “Big Trouble for Facebook, Amazon, Google, and Apple in 2018.”

According to the story, 2017 was the year:

We realized that maybe, just maybe, FAGA (Facebook, Apple, Google, Amazon) were no different from car companies, real estate brokerages, banks, insurance companies, big pharma and other technology companies – those guys. When FAGA joined the family of “regular” companies they lost some appeal. In fact, if the trend continues, FAGA might even find themselves on the list of some of the most disrespected companies!


2017 made it clear that FAGA exists for their shareholders, partners, executives and customers, in that order. What an awakening for even the love-is-blind crowd: just say it ain’t so!  Just another company?

Things are not so rosy for tech companies that once claimed they would change the world. Net neutrality might cause them to drastically shift their business models, and a recent Vanity Fair story pulled back the curtain on Silicon Valley and revealed a hedonistic culture that drastically shifts our perception of the place. Does this mean we are in for a tide shift in tech? Mmmmm, probably not. These giants are firmly planted, but it is proof they are not bulletproof.

Patrick Roland, February 22, 2018
