Google Disintermediates Apps

May 27, 2019

Do you really want to find, download, and use a separate app when you order food, or anything else for that matter? No, of course not. Companies developing apps may push back a little, but there are other ways to make a living. Uber? Amazon delivery person until the robot-driven vehicles arrive?

“Hey, Google, Bring Me a Chalupa!” explains that Google has sucked into its system functions once the domain of the independent app. Yes, disintermediation has arrived for startups. The write up states:

Now thanks to the clever folks at Google, hangry [editor’s note: this slang appeared in the original article] people everywhere can order food delivery directly from Google Search, Maps, and Google Assistant. That doesn’t mean that a Google intern is going to show up at your door with your White Castle Crave Case or pineapple pizza. Instead the tech giant is partnering with companies that are already in the delivery game—like DoorDash, Postmates, Delivery.com, Slice, ChowNow, and more on the way.

I am not sure what “partnering” means in the thrilling world of Alphabet Google. I will leave that to you to figure out.

What seems important here in Harrod’s Creek are these issues:

  • What’s the branding? Google or the oddball service absorbed into the Google environment?
  • How will Google prioritize information about the services playing ball with the online advertising company? Maybe buy advertising to get pride of place for that Chalupa?
  • Will Google set up sweetheart deals or buy a company which is getting traction via the Google service? How will the disintermediated service feel about that? Probably the disintermediated will bond. App developer and startup service company together again?

Convenience may come at a price. Do you think Google will send the person who orders chalupas ads for related products?

Does disintermediation lead to unemployment or underemployment? That’s a positive, right?

Stephen E Arnold, May 27, 2019

Alphabet Spells Management Challenge

April 27, 2019

The Bloomberg outfit published allegedly accurate information about Google’s interesting approach to management. “Google Staffers Share Stories of ‘Systemic’ Retaliation” reports that there is a disagreement about how to run the online advertising railroad.


Was management responsible for this train wreck? Perhaps the employees were at fault. Were the staff on the train punished?

Whoo, whoo, whoo. That’s the laboring engine sound one can hear in train stations in places like Patna Station or Bayshore when one stands near the tracks.

The sounds from the Google, according to Bloomberg:

On Monday [April 22, 2019], two of those organizers, Meredith Whittaker and Claire Stapleton, wrote an email saying Google had punished them because of their activism. The two asked staffers to join them on Friday to discuss the company’s alleged actions, and during the meeting they shared more than a dozen other stories of internal retribution that they had collected over the past week. Like many meetings at Google, participants could watch via a video live-stream and submit questions and comments.

Chug, chug, chug. The Guardian newspaper sounds its whistle too.

The little engine that could continues to pull the freight for Alphabet Google senior managers. Bloomberg pointed out:

Google management publicly endorsed the employee walkout in the fall, giving the blessing for staff to vent frustration. But as dissent continued to rise inside Google, the company’s lawyers urged the U.S. government to give companies more leeway to rein in rebellious employees from organizing over workplace email. Google made that filing in a case pending before the National Labor Relations Board involving alleged retaliatory discipline against an employee. Another complaint involving alleged retaliation against staff was filed with the agency this week.

I think I hear the Alphabet Google Express announcement: “Unhappy passengers may debark at the next stop. Termination Junction. Next stop, Termination Junction.”

On one hand, a person who takes money for a job, benefits, access to Foosball tables, and a Google mouse pad has an obligation to perform work. The idea is that the employer employs, and the employee does what he or she is told to do.

On the other hand, a person who does not like the work should do what? Quit? Protest? Talk with reporters from Bloomberg? Look for another job? Undermine software that sort of works?

What’s interesting to me is that the Alphabet Google train itself may come off the rails due to management missteps. I term the approach of some Silicon Valley high technology companies the HSSCMM or High School Science Club Management Method. Sometimes it works and sometimes it appears not to work as the club members expect. What’s up with that?

Train wrecks just happen. Often with little warning. But in this case it looks to me as if one or two cracks in the drive train have appeared.

Stephen E Arnold, April 27, 2019

Crazy Consulting Baloney: Ambient Data Governance

April 13, 2019

I spotted this phrase in the capitalist’s tool. That would be Forbes Magazine, an outfit which publishes some pretty crazy management recipes. Navigate to “How To Prepare Your Company For Ambient Data Governance.” Click through the ads and the pop-ups. The reward is a write up about “ambient data governance.” I must admit that I don’t have a clue why this phrase is necessary. Governance by itself is a limp noodle. What company has governance today? Wells Fargo? Facebook? Another one might wish to nominate? You pick. Management of most companies takes shortcuts; for example, killing products after announcing them and putting the name of the product in advertisements (a nifty play at Apple, a company with governance one presumes).

Here’s a passage I noted:

Forrester recently issued a prediction for 2019, saying, “Ambient data governance will take the trauma out of old-school governance,” and predicting that “ambient data governance will prevail as a strategy to automate and intelligently scale data policy deployment while learning and adapting policies based on data consumer interaction.”

Okay, Forrester. Selling reports and consulting or is it consulting which yields reports? I just don’t know.

What does one do to become adept at “ambient data governance”? Easy. How about these astounding recommendations:

Appoint a chief data officer. Make sure this person can walk. This is the “ambient” I deduce.

Become data literate. Okay, what’s data? What’s literate mean in this context?

Evolve to an insights driven organization. Yeah, anyone at Forbes read Darwin? Intent-driven evolution is an interesting concept. The method does not work too well.

There you go. A recipe for governance. More like a recipe to write an article in the capitalist tool and hook a crazy buzzword to Forrester. SEO at work. Rev your Harley’s engine, Malcolm. Speed away from this management jargon accident.

Stephen E Arnold, April 13, 2019

Silicon Valley: Are Its Governance and Innovation Showing Signs of Deterioration?

March 30, 2019

I was zipping through the news items which assorted filter bubble robots fire at me each day. I noticed three items, which on the surface, appear to be unrelated. I asked myself, “What if there is a connection among each of these items?” Let’s take a look.


Was this driven by a Silicon Valley bro or smart software?

The first item is Apple’s admission that it cannot create a viable wireless charging device. The company has labored for years and admitted that it cannot pull off this “innovation.” “Apple Kills AirPower Charging Station, but Here Are Some Alternatives (for a Single Device)” states:

Citing technical difficulties in meeting its own standards, Apple has issued a statement announcing the long awaited AirPower wireless charging mat will not ship, ever. AirPower was announced alongside the iPhone X with a pending release date, and now, more than 550 days later, it has been cancelled.

Apple has people. Apple has money. Apple has failed. Problems exist with the butterfly keyboard. What’s happening?

The second item is about disappearing emails and messages. Some might describe these digital artifacts as evidence. The article “Some of Mark Zuckerberg’s Old Facebook Posts Have Disappeared” reports:

the social network accidentally deleted some of Zuckerberg’s old Facebook posts, including all the ones he made in 2007 and 2008.

This is interesting because I thought backups were mostly routine. Apparently this was not the case at Facebook. What’s happening?

The third item is about a failure to get one’s act together. I read “Google Accidentally Leaks Its ‘Nest Hub Max’ Smart Display.” I learned:

in a leak on its own website, Google might have accidentally revealed an upcoming product called the Nest Hub Max.

What’s happening?

Now let’s consider several hypotheses which may help me creep a bit closer to the thread linking these apparently isolated events.

  1. The three companies are not able to govern their commercial empires. A failure to deliver a product, an egregious and difficult to believe statement about “losing email”, and an inability to organize a news item—the problem is governance of the business process.
  2. The three companies seem indifferent to the implications of each firm’s individual actions: Apple’s misstatement about a device, Facebook’s continued dancing around information, and Google’s PR flub—each illustrates a deeper issue. I term it “high school science club management method.” Bright folks see what they see, and not what others see. HSSCMM at work.
  3. The three companies demonstrate the inherent weaknesses of the Silicon Valley approach: failure, ineptness or duplicitous behavior, and taking one’s eye off the ball.

Just a series of hypotheses, mind you. What if these examples are the tip of a fast melting iceberg?

Stephen E Arnold, March 30, 2019

Hashing Videos and Images Explained

March 17, 2019

A quite lucid explanation of video and image identification appears in “How Hashing Could Stop Violent Videos from Spreading.” Here’s one passage from the article:

Video hashing works by breaking down a video into key frames and giving each a unique alphanumerical signature, or hash. That hash is collected into a central database, where every video or photo that is uploaded to a platform is then compared against that dataset. The system requires a database of images and doesn’t use artificial intelligence to identify what is in an image — it only identifies a match between images and videos.
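The pipeline the article describes can be sketched in a few lines. Everything in this sketch is an illustrative assumption: the function names are hypothetical, and the exact SHA-256 digest shown here matches only byte-identical frames, whereas production systems such as PhotoDNA use robust perceptual hashes that survive re-encoding, resizing, and cropping.

```python
import hashlib

# Toy central database of signatures collected from known-bad videos.
# Real systems use perceptual hashes; SHA-256 is shown only to make the
# signature-and-lookup mechanism concrete.
known_bad_hashes: set[str] = set()

def frame_signature(frame_bytes: bytes) -> str:
    """Give one key frame a unique alphanumeric signature (hash)."""
    return hashlib.sha256(frame_bytes).hexdigest()

def register_video(key_frames: list[bytes]) -> None:
    """Collect every key frame's signature into the central database."""
    for frame in key_frames:
        known_bad_hashes.add(frame_signature(frame))

def matches_known_video(key_frames: list[bytes]) -> bool:
    """Compare an upload's key frames against the database.
    No AI identifies what is *in* a frame; this only detects a match."""
    return any(frame_signature(f) in known_bad_hashes for f in key_frames)

# Usage: register a flagged video, then screen two uploads.
register_video([b"frame-a", b"frame-b"])
print(matches_known_video([b"frame-b"]))  # True: a frame matches
print(matches_known_video([b"frame-z"]))  # False: no match
```

Note how the sketch mirrors the article’s point: the system knows nothing about content, only about whether a signature already sits in the database.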

CNN emphasizes Microsoft’s PhotoDNA technology. Information about that system may be found at this link. The write up points out that Facebook and Google use “this technology.”

One question is, “If the technology is available and in use, why are offensive videos and images finding their way into public facing, easily accessible systems?”

The answer according to an expert quoted in the CNN story is:

“The decision not to do this [implement more effective hashing filter methods] is a question of will and policy—not a question of technology.”

The answer is that platforms are one way to avoid the editorial responsibility associated with old school methods of communication; for example, wire services, newspapers, and magazines. These types of communication were not perfect, but in many cases, an editorial process prevented certain types of information from appearing in certain publications. So far, the hands off approach of some digital channels and the over hyped use of smart software have not been as effective as the hopelessly old fashioned processes used by some traditional media outlets.

So will? Policy?

Nah, money, expediency, and the high school science club approach to management.

Stephen E Arnold, March 17, 2019

Flagships Lost in a Sea of Money, Fame, and Power

February 10, 2019

I read “The Ethical Dilemma Facing Silicon Valley’s Next Generation.” The headline sounds like an undergraduate essay created by a Red Bull crazed philosophy major at Duquesne University. (I should know. I attended Duquesne when working on an advanced degree in — wait for it — medieval religious literature.)

But this essay is not going to be read by a slightly off kilter professor with a passion for Søren Aabye Kierkegaard and Augustine’s On Christian Teaching.

No. This essay is aimed at those interested in technology and the intersection of Silicon Valley, Stanford University, and the scorched earth approach of “move fast and break things” wizards.

The write up includes this observation:

Stanford is known as “The Farm” because the verdant 8,000-acre campus was once home to founder Leland Stanford’s horses, but today tech firms and venture capitalists treat the 16,000-person student body like their own minor league ball club.

And the university is now flicking the switch on the archives of the university library which contains documents like Pausanias’s description of the temple of Apollo at Delphi. Stanford’s leaders, professors, and students may have forgotten the injunction (which I have anglicized):

Know thyself or gnóthi sautón

But universities, public and private, want to be just like Stanford.

The Ringer reports:

Professors are revamping courses to address the ethical challenges tech companies are grappling with right now. And university president Marc Tessier-Lavigne has made educating students on the societal impacts of technology a tentpole of his long-term strategic plan.

I found this item of information interesting:

In 2013, Stanford began directly investing in students’ companies, much like a venture capital firm.

One would think that universities provided education. The Ringer makes this somewhat surprising statement:

Stanford and computer science programs across the country may not be adequately equipped to wade through the ethical minefield that is expanding along with tech’s influence.

Who is equipped? Consultants from McKinsey, Bain, or Booz Allen? Politicians? Perhaps universities should seek counsel from the top three officials in Virginia to add an East Coast flair to the ethical challenge? What about individual thinkers? Jeffrey Skilling (Wharton and Enron) and Martin Shkreli (the pharma bro)? Soon El Chapo (a bro-chacho) will have time on his hands once a verdict is rendered in his trial.

Courses about ethics are sprouting like flowers after April showers in a temperate zone.

I underlined in yellow this passage which is almost bittersweet:

The [ethics] course’s popularity is a sign that the gravity of the moment is weighing on many Stanford minds. Antigone Xenopoulos, a junior majoring in symbolic systems (a techie-fuzzie hybrid major that incorporates computer science, linguistics, and philosophy), is a research assistant for CS181. She wasn’t the only student who quoted a line from Spider-Man to me—with great power comes great responsibility—when referencing the current landscape. “If they’re going to give students the tools to have such immense influence and capabilities, [Stanford] should also guide those students in developing ethical compasses,” she says.

Yep, Spider-Man. Spider-Man.

Net net:

  1. Stanford is not the “problem”; Stanford is one member of a class of entities which cultivate and harvest the problem
  2. Silicon Valley has and continues to function as a high school science club without a teacher supervisor
  3. Technology, unlike a cat, cannot be put back in a bag.

Years ago I did some work for an investment bank. One of the people in a meeting was filled with the George Gilder observation about convergence. I asked this question of the group of 12 high powered people:

Do you think technology could be like gerbils or rabbits?

The question evoked silence.

The situation today is that the interaction of technology has created ecologies in which new creatures are thriving. The result is that certain facets of a pre-technology world have been crushed, killed, or left to starve by the new digital animals and their inventions.

The Ringer’s article reminded me that “ethics” and the ability to understand oneself are in danger of extinction.

As one of the investment bankers for whom I did some work was fond of saying, “Interesting. No?”

Stephen E Arnold, February 10, 2019

High School Science Club: Employee Walk About

November 1, 2018

High school science club management methods face an interesting situation. The science club has a hierarchy. The whiz kids on the lower levels of that hierarchy are not getting with the program. Allegedly a small percentage of Google’s workforce is unhappy with the handling of alleged sexual misconduct. Here in Harrod’s Creek, we assumed that members of the high school science club school of thought worried about math, Fourier transforms, and k-means. If “We’re the Organizers of the Google Walkout. Here Are Our Demands” contains accurate information, some affected by high school management methods have other interests; for example, fairness, respectful behavior, and other old fashioned ideas.

I learned:

All employees and contract workers across the company deserve to be safe.

Fancy that.

Here’s an outrageous demand:

A clear, uniform, globally inclusive process for reporting sexual misconduct safely and anonymously. The process today is not working, in no small part because HRs’ performance is assessed by senior management and directors, forcing them to put management’s interests ahead of employees reporting harassment and discrimination. The improved process should also be accessible to all: full-time employees, temporary employees, vendors, and contractors alike. Accountability, safety and an ability to report unsafe working conditions should not be dictated by employment status.

What’s next for practitioners of high school science club management? Better business processes? Executives not given to dalliances with fascinating methods of motivation? More responsible decision making? Nah, HSSCMM methods are just better.

Google’s implementation of such management methods is as interesting as the company’s progress on solving death.

Stephen E Arnold, November 1, 2018


Analytics: From Predictions to Prescriptions

October 19, 2018

I read an interesting essay originating at SAP. The article’s title: “The Path from Predictive to Prescriptive Analytics.” The idea is that outputs from a system can be used to understand data. Outputs can also be used to make “predictions”; that is, guesses or bets on likely outcomes in the future. Prescriptive analytics means that the system tells a human what to do or wires actions directly into an output. Now the output can be read by a human, but I think the key use case will be taking the prescriptive outputs and feeding them into other software systems. In short, the system decides and does. No humans really need be involved.
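The predictive-to-prescriptive step can be made concrete with a small sketch. The thresholds, field names, and inventory scenario are my own illustrative assumptions, not anything from SAP’s platform; the point is only the shape of the handoff, where a prediction becomes an action record another system can consume without a human in the loop.

```python
def predict_stockout_risk(daily_sales: float, on_hand: int) -> float:
    """Predictive step: estimate the probability of running out of
    stock within 7 days, clamped to [0, 1]."""
    days_of_cover = on_hand / daily_sales if daily_sales else float("inf")
    return max(0.0, min(1.0, 1.0 - days_of_cover / 7.0))

def prescribe_action(daily_sales: float, on_hand: int) -> dict:
    """Prescriptive step: turn the prediction into a concrete action
    record that a downstream ordering system could act on directly."""
    risk = predict_stockout_risk(daily_sales, on_hand)
    if risk > 0.5:
        # Reorder two weeks of cover; the policy is a stand-in.
        return {"action": "reorder", "quantity": int(daily_sales * 14)}
    return {"action": "none", "quantity": 0}

# "The system decides and does": the output is machine-readable,
# so no human needs to read the prediction at all.
print(prescribe_action(daily_sales=10.0, on_hand=20))
# → {'action': 'reorder', 'quantity': 140}
```

The governance question the article raises lands exactly here: once `prescribe_action` feeds an ordering system directly, who reviews the policy baked into that `if` statement?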

The write up states:

There is a natural progression towards advanced analytics – it is a journey that does not have to be on separate deployments. In fact, it is enhanced by having it on the same deployment, and embedding it in a platform that brings together data visualization, planning, insight, and steering/oversight functions.

What is the optimal way to manage systems which are dictating actions or just automatically taking actions?

The answer is, quite surprisingly, a bit of MBA consultantese: Governance.

The most obvious challenge with regards to prescriptive analytics is governance.

Several observations:

  • Governance is unlikely to provide the controls which prescriptive systems warrant. Evidence is that “governance” in some high technology outfits is in short supply.
  • Enhanced automation will pull prescriptive analytics into wide use. The reasons are ones you have heard before: Better, faster, cheaper.
  • Outfits like the Google and In-Q-Tel funded Recorded Future and DarkTrace may have to prepare for new competition; for example, firms which specialize in prescription, not prediction.

To sum up, an interesting write up. Perhaps SAP will be the go-to player in plugging prescriptive functions into its software systems?

Stephen E Arnold, October 19, 2018

Self Regulation: How Does That Work for Teen Aged Science Club Members?

June 15, 2018

I like the Platonic ideal of self regulation. Better yet, let’s try for crowd sourced regulation. Tie-dye T shirts are cool too.

Sometimes, it seems, humans are the answer. Unpaid, helpful humans. Motherboard profiles the little-sung YouTube “super users” in, “‘Are You Batman?’: How YouTube’s Volunteer Army Gets Channels Undeleted.” Writer Adrianne Jeffries opens with an anecdote in which an individual known as @Contributors_YT may have helped an unfortunate YouTube broadcaster get his channel back. She then explains:

“Increasingly, YouTube creators are getting help from anonymous YouTube super-users, including @Contributors_YT, who have access to a backchannel that allows them to escalate complaints to YouTube employees and sometimes get mistaken channel deletions or ‘false strikes’ against videos reversed. These super-users volunteer for YouTube through a company initiative that used to be called ‘YouTube Heroes’ but is now known as two separate programs, Trusted Flaggers and YouTube Contributors. They patrol the official YouTube Help Forum and social media, where many of them use TweetDeck to sift for keywords that signal distressed YouTubers. Most of the time, the volunteers simply add expertise, offering advice on everything from how to get more subscribers to technical support. They know YouTube’s Community Guidelines inside and out, and can usually figure out why action was taken and help fill in the gaps around YouTube’s notoriously poor communication with creators. Sometimes they pass along messages from YouTube staffers related to specific cases. But lately, as YouTube ramps up enforcement due to negative press coverage about the prominence of violent videos and conspiracy theories on the platform, they’ve been intervening more and more when videos or channels are incorrectly penalized. For YouTubers who get wrongly caught up in the company’s enormous, faceless content moderation machine, these volunteer crusaders are their last hope.”

See the article for several more examples of these YouTube do-gooders helping those who have been wronged by the zealous algorithm. It is worth remembering that, by now, some of these broadcasters have put years of work into their YouTube presences, and many rely on them for income. Should social media sites embrace the old school notion of editorial control and responsibility?

Nah. Nothing is more satisfying than watching self regulation in action. From Amazon reviews to comments offered to viewers of a live stream of the Hawaii volcano eruption, good judgment is on display.

Cynthia Murrell, June 15, 2018

Can PressCoin Keep News from Going Fake?

February 23, 2018

Fake news is a topic that has everyone on all sides of the aisle concerned. However, one innovative idea merging fake news and big data might have solved the problem. We learned more from a recent The Next Web story, “PressCoin: The Largest Crowdfunding Effort to Address the News Crisis.”

According to the story:

As the GDP of the PressCoin economy grows, the value of tokens will rise.

And blockchain keeps it honest. Our hope is to grow a decentralized media network reaching 100 mln engaged ‘users/prosumers’ in five years time, and to meaningfully address 10 percent of the news media industry within a couple of years of that.

PressCoin’s business strategy rests on these legs:

  • Shared Design Philosophy – open collaboration, partnership, and decentralization
  • Shared Technology Infrastructure – seamless media technology cloud, underlying big-data systems, advanced APIs for engagement and analytics
  • Shared Business Services – consumer data intelligence, monetization, enterprise sales, ecosystem partnerships, strategic relations
  • Shared Developer Network – a fertile playground for agile experiments in the news/media/journalism sphere
  • Shared Fiat/Crypto Financial Services Infrastructure – built on Cointype
  • Shared Venture Arm – foster disruption within the ecosystem

While PressCoin is just getting off the ground, big data is already sorting out whether fake news can be trusted. One interesting use case is analysis suggesting that fake news might not have had that big an impact on the 2016 election. This could be the dawn of some very insightful times.

Patrick Roland, February 23, 2018
