Is It Lights Out on the Information Superhighway?

April 26, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

We just completed a lecture about the shadow web. This is our way of describing a number of technologies specifically designed to keep law enforcement, tax authorities, and other entities charged with enforcing applicable laws in the dark.

Among the tools available are roulette services. These can be applied to domain proxies so that it is very difficult to figure out where a particular service is at a particular point in time. Tor has uttered noises about supporting the Mullvad browser and baking in a virtual private network. But there are other VPNs available, and one of the largest infrastructure service providers is under what appears to be “new” ownership. That change may create a problem for some enforcement entities. Other developers work overtime to provide services primarily to those who want to deploy what we call “traditional Dark Web sites.” Some of these obfuscation software components are available on Microsoft’s GitHub.
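
For the curious, here is a minimal Python sketch of the “roulette” idea: a client rotates through a pool of disposable front-end domains on a fixed schedule, so the service never sits behind one address for long. The domain names, the five-minute window, and the hash-based selection are placeholders I invented for illustration; this is not a description of any real tool.

    # Sketch of round-robin domain rotation ("roulette") for a proxied service.
    # All names and parameters below are hypothetical.
    import hashlib
    import time

    FRONT_DOMAINS = [
        "cdn-alpha.example",     # hypothetical proxy front
        "relay-beta.example",
        "mirror-gamma.example",
    ]
    ROTATION_SECONDS = 300       # pick a new front every five minutes

    def current_front(now: float | None = None) -> str:
        """Deterministically choose the active front domain for this time window."""
        now = time.time() if now is None else now
        window = int(now // ROTATION_SECONDS)
        digest = hashlib.sha256(str(window).encode()).hexdigest()
        return FRONT_DOMAINS[int(digest, 16) % len(FRONT_DOMAINS)]

    if __name__ == "__main__":
        print("Route requests via:", current_front())

Both the client and the hidden service compute the same schedule, which is why an investigator chasing yesterday’s domain finds nothing there today.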

I want to point to “Global Law Enforcement Coalition Urges Tech Companies to Rethink Encryption Plans That Put Children in Danger from Online Abusers.” The main idea behind the joint statement (the one to which I point is from the UK’s National Crime Agency) is:

The announced implementation of E2EE on META platforms Instagram and Facebook is an example of a purposeful design choice that degrades safety systems and weakens the ability to keep child users safe. META is currently the leading reporter of detected child sexual abuse to NCMEC. The VGT has not yet seen any indication from META that any new safety systems implemented post-E2EE will effectively match or improve their current detection methods.

From my point of view, a questionable “player” has an opportunity to make it easier to enforce laws related to human trafficking, child safety, and related crimes like child pornography. Instead, the “player” seems interested in implementing encryption that would make government enforcement more difficult, if not impossible in some circumstances.

The actions of this “player” illustrate part of a fundamental change in the Internet. What was underground is now moving above ground. The implementation of encryption in messaging applications is a big step toward making the “regular” Internet, or what some call the Clear Web, into a new version of the Dark Web. Not surprisingly, the Dark Web will not go away, but why develop Dark Web sites when Clear Web services provide communications, secrecy, the ability to transmit images and videos, and the means to perform financial transactions related to these data? Thus the Clear Web is falling into the shadows.

My team and I are not pleased when a company ignores appropriate and what we call “ethical” behavior and takes specific actions that increase risks to average Internet users. In fact, some of the “player’s” actions are specifically designed to make the player’s service more desirable to a market segment once largely focused on the Dark Web.

More than suggestions are needed in my opinion. Direct action is required.

Stephen E Arnold, April 26, 2023

What Smart Software Will Not Know and That May Be a Problem

April 26, 2023

This blog post is the work of a real, live dinobaby. No smart software involved.

I read a short item called “Who Owns History? How Remarkable Historical Footage Is Hidden and Monetized.” The main point of the article was to promote a video which makes clear that big companies are locking “extraordinary footage… behind paywalls.” The focus is on images, and I know from conversations with people with whom I worked years ago, people who managed image rights, that this is not a new game. Those companies are history; for example, BlackStar and Modern Talking Pictures. And there were others.

Images are now a volleyball, and the new spiker on the Big Dog Team is smart-software-generated imagery. I have a hunch that individuals and companies will aggregate as many of these images as possible. The images will then be subject to the classic “value adding” process and magically become for-fee assets. Image trolls will feast.

I don’t care too much about images. I do think more about textual and tabular content. The rights issue is a big one, but I came at smart software from a different angle. Smart software has to be trained, whether via a traditional human constructed corpus, a fake-o corpus courtesy of the synthetic data wizards, or some shotgun marriage of “self training” and a mash up of other methods.

But what if important information is not available to the smart software? Won’t that smart software be like a student who signs up for Differential Geometry without Algebraic Topology? Lots of effort, but that insightful student may not be equipped to keep pace with other students in the class. Is not knowing the equivalent of being uninformed or just dumb?

One of the issues I have with smart software is that some content, which I think is essential to clear thinking, is not available to today’s systems. Let me give one example. In 1963, when I was a sophomore at a weird private university, a professor urged me to read the metaphysics text by a person named A. E. Taylor. The college I attended did not have too many of Dr. Taylor’s books. There was a copy of his Aristotle and nothing else. I did some hunting and located a copy of Elements of Metaphysics, a snappy thriller.

However, Dr. Taylor wrote a number of other books. I went looking for these because I assume that the folks training smart software want to make sure the “model” has information about the nature of information and related subjects. Guess what? Project Gutenberg, the Internet Archive, and the online gem Amazon have the Aristotle book and a couple of others. FYI: You can get a copy of A. E. Taylor’s Metaphysics for $3.88, a price illustrating the esteem in which Dr. Taylor’s work is held today.
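
Checking availability is easy enough. Below is a minimal sketch of the sort of query one can run against the Internet Archive’s public advancedsearch endpoint, which returns JSON; the creator string is a guess at how the catalog lists the author, so treat it as illustrative rather than authoritative.

    # Sketch: list the text items the Internet Archive reports for an author.
    # The creator string below is an assumption about catalog spelling.
    import requests

    def list_titles(creator_query: str) -> list[str]:
        """Return titles the Internet Archive reports for a creator search."""
        resp = requests.get(
            "https://archive.org/advancedsearch.php",
            params={
                "q": f'creator:("{creator_query}") AND mediatype:texts',
                "fl[]": "title",
                "rows": 100,
                "page": 1,
                "output": "json",
            },
            timeout=30,
        )
        resp.raise_for_status()
        docs = resp.json().get("response", {}).get("docs", [])
        return [d.get("title", "") for d in docs]

    if __name__ == "__main__":
        titles = list_titles("Taylor, A. E. (Alfred Edward)")
        print(len(titles), "text items found")
        for title in titles[:10]:
            print(" -", title)

The point is not the exact count; it is that a five-minute script shows how thin the public record is for some thinkers and how thick it is for others.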

My team and I ran some queries on the smart software systems to which we have access. We learned that information from Dr. Taylor is as scarce as hen’s teeth. We shifted gears and checked out information generated by the much-loved brother of Henry James. More of William James’s books were available at bargain basement prices. A collection of essays was less than $2 on Amazon.

My point is that images are likely to be locked up behind a paywall. However, books which may be important to one’s understanding of useless subjects like ethics, perception, and information are not informing the outputs of the smart software we probed. (Yes, we mean you, gentle Bard, and you too ChatGPT.)

Does the possible omission of these types of content make a difference?

Probably not. Embrace synthetic data. The “old” content is not digitally massaged. Who cares? We are in “good enough” land. It’s like a theme park with a broken rollercoaster and some dicey carnies.

Stephen E Arnold, April 26, 2023

Google: A PR Special Operation Underway

April 25, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

First, US television on Sunday, April 16, 2023. Then assorted blog posts and articles by Google friends like Inc. Magazine. Now the British Guardian newspaper hops on the bandwagon.

Navigate to “Google Chief Warns AI Could Be Harmful If Deployed Wrongly.” Let me highlight a couple of statements in the write up and then offer a handful of observations designed intentionally to cause some humanoids indigestion.

The article includes this statement:

Sundar Pichai also called for a global regulatory framework for AI similar to the treaties used to regulate nuclear arms use, as he warned that the competition to produce advances in the technology could lead to concerns about safety being pushed aside.

Also, this gem:

Pichai added that AI could cause harm through its ability to produce disinformation.

And one more:

Pichai admitted that Google did not fully understand how its AI technology produced certain responses.

Enough. I want to shift to the indigestion inducing portion of this short essay.

First, Google is in Code Red. Why? What were the search wizards under the guidance of Sundar and Prabhakar doing for the last year? Obviously not paying attention to the activity of OpenAI. Microsoft was, and it stole the show at the hoedown in Davos. Now Microsoft has made available a number of smart services designed to surf on its marketing tsunami and provide more reasons for enterprise customers to pay for smart Microsoft software. Neither the Guardian nor Sundar seems willing to talk about the reality of Google finding itself in the position of Alta Vista, Lycos, or WebCrawler in the late 1990s and early 2000s, when Google search delivered relevant results. At least Google did until it was inspired by the Yahoo, GoTo, and Overture approach to making cash. Back to the question: Why ignore the fact that Google is in Code Red? Why not ask one half of the Sundar and Prabhakar Comedy Team how they got aced by a non-headliner act at the smart software vaudeville show?

Second, I loved the “could cause harm.” What about the Android malware issue? What about the ads which link to malware in Google search results? What about the monopolization of online advertising and pricing ads beyond the reach of many small businesses? What about the “interesting” videos on YouTube? Google has its eye on the “could” of smart software without paying much attention to the here-and-now downsides of its current business. And disinformation? What is Google doing to scrub that content from its search results? My team identified a distributor of pornography operating in Detroit. That operator’s content can be located with a single Google query. If Google cannot identify porn, how will it flag smart software’s “disinformation”?

Finally, Google for decades has made a big deal of hiring the smartest people in the world. There was a teen whiz kid in Moscow. There was a kid in San Jose with a car service to get him from high school to the Mountain View campus. There is DeepMind with its “deep” team of wizards. Now this outfit with more than 100,000 employees (more or less full-time geniuses) does not know how its software works. How will that type of software be managed by the estimable Google? The answer is, “It won’t.” Google’s ability to manage is evident in heartbreaking stories about its human relations and personnel actions. There are smart Googlers who think the software is alive. Does that person have company-paid mental health care? There are small businesses, like an online automobile site, in ruins because a Googler downchecked the site years ago for an unknown reason. The Google is going to manage something well?

My hunch is that Google wants to make sure that it becomes the primary vendor of ready-to-roll training data and microwavable models. The fact that Amazon, Microsoft, and a group of Chinese outfits are on the same information superhighway illustrates one salient fact: The PR tsunami highlights Google’s lack of positive marketing action and the taffy-pull sluggishness of demos that sort of work.

What about the media, which ask softball questions and present as substance the recommendation that the world agree on AI rules? Perhaps Google should offer to take over the United Nations or form a World Court of AI Technology? Maybe Google should just be allowed to put other AI firms out of business and keep trying to build a monopoly based on software the company doesn’t appear to understand?

The good news is that Sundar did not reprise the Paris demonstration of Bard. That only cost the company a few billion when the smart software displayed its ignorance. That was comedic, and I think these PR special operations are fodder for the spring Sundar and Prabhakar tour of major cities.

The T-shirts will not feature a dinosaur (Googzilla, I believe) freezing in a heavy snowstorm. The art can be produced using Microsoft Bing’s functions too. And that will be quite convenient if Samsung ditches Google search for Bing and its integrated smart software. To add a bit of spice to Googzilla’s catered lunch is the rumor that Apple may just go Bing. Bye, bye billions, baby, bye bye.

If that happens, Google loses: [a] a pickup truck filled with cash, [b] even more technical credibility, and [c] maybe Googzilla’s left paw and a fang. Can Sundar and Prabhakar get applause when doing one-liners with one or two performers wearing casts and sporting a tooth gap?

Stephen E Arnold, April 25, 2023

NSO Group: How Easy Are Mobile Hacks?

April 25, 2023

I am at the 2023 US National Cyber Crime Conference, and I have been asked, “What companies offer NSO-type mobile phone capabilities?” My answer is, “Quite a few.” Will I name these companies in a free blog post? Sure, just call us at 1-800-YOU-WISH.

A more interesting question is, “Why is Israel-based NSO Group the pointy end of a three meter stick aimed at mobile devices?” (To get some public information about newly recognized NSO Group (Pegasus) tricks, navigate to “Triple Threat. NSO Group’s Pegasus Spyware Returns in 2022 with a Trio of iOS 15 and iOS 16 Zero-Click Exploit Chains.” I would point out that the reference to Access Now is interesting, and a crime analyst may find a few minutes examining what the organization does, its “meetings,” and its hosting services time well spent. Will I provide that information in a free blog post? Please, call the 800 number listed above.)

Now let’s consider the question regarding the productivity of the NSO technical team.

First, Israel’s defense establishment contains many bright people and a world-class training program. What happens when you take well-educated people, the threat of war without warning, and an outstanding in-service instructional setup? The answer is, “Ideas get converted into exercises. Exercises become test code. Test code gets revised. And the functional software becomes weaponized.”

Second, the “in our foxhole” mentality persists once trained military specialists leave the formal service and enter the commercial world. As a result, individuals who studied, worked, and, in some cases, fought together set up companies. These individuals are a bit like beavers. Beavers do what beavers do. Some of these firms replicate functionality similar to that developed under the government’s watch and sell those products. Please note that NSO Group is an exception of sorts. Some of the “insights” originated when the founders were repairing mobile phones. The idea, however, is the same: learning, testing, deploying, and hiring individuals with specialized training provided by the Israeli government. Keep in mind the “in our foxhole” notion, please.

Third, directly or indirectly, important firms in Israel and, in some cases, government-assisted development programs provide: [a] money, [b] meet-up opportunities like “tech fests” in Tel Aviv, and [c] suggestions about whom to hire, partner with, consult with, or be aware of.

Do these conditions exist in other countries? In my experience, this approach to mobile technology exploits does exist elsewhere, to some degree. There are important differences, however. If you want to know what these are, you know the answer. Buzz that 800 number.

My point is that the expertise, insights, systems, and methods of what the media calls “the NSO Group” have diffused. As a result, there are more choices than ever before when it comes to exploiting mobile devices.

Where’s Apple? Where’s Google? Where’s Samsung? The firms, in my opinion, are in reactive mode, and, in some cases, they don’t know what they don’t know.

Stephen E Arnold, April 25, 2023

AI That Sort of, Kind of Did Not Work: Useful Reminders

April 24, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

I read “Epic AI Fails. A List of Failed Machine Learning Projects.” My hunch is that a write up suggesting that smart software may disappoint in some cases is not going to be a popular topic. I can hear the pooh-poohs now: “The examples used older technology.” And “Our system has been engineered to avoid that problem.” And “Our Large Language Model uses synthetic data which improves performance and the value of system outputs.” And “We have developed a meta-layer of AI which integrates multiple systems in order to produce a more useful response.”

Did I omit any promises? Perhaps “The check is in the mail” or “Our customer support team will respond to your call immediately, 24×7, and with an engineer, not a smart chatbot. Because humans, you know.”

The article from Analytics India, an online publication, provides some color on interesting flops; specifically:

  • Amazon’s recruitment system. Think discrimination against females.
  • Amazon’s Rekognition system and its identification of elected officials as criminals. Wait. Maybe those IDs were accurate?
  • Covid 19 models. Moving on.
  • Google and the diabetic retinopathy detection system. The marketing sounded fine. Candy for breakfast? Sure, why not?
  • OpenAI’s Samantha. Not as crazy as Microsoft Tay but in the ballpark.
  • Microsoft Tay. Yeah, famous self instruction in near real time.
  • Sentient Investment AI Hedge Fund. Your retirement savings? There are jobs at Wal-Mart I think.
  • Watson. Wow. Cognitive computing and Jeopardy.

The author takes a less light-hearted approach than I do. It is a useful list with helpful reminders that it is easier to write tweets and marketing collateral than to deliver smart software that lives up to the sales confections.

Stephen E Arnold, April 24, 2023

Divorcing the Google: Legal Eagles Experience a Frisson of Anticipation

April 24, 2023

No smart software has been used to create this dinobaby’s blog post.

I have poked around looking for a version or copy of the contract Samsung signed with Google for the firms’ mobile phone tie up. Based on what I have heard at conferences and read on the Internet (of course, I believe everything I read on the Internet, don’t you?), it appears that there are several major deals.

The first is the use of and access to the mindlessly fragmented Android mobile phone software. Samsung can do some innovating, but the Google is into providing “great experiences.” Why would a mobile phone maker like Samsung allow a user to manage contacts and block mobile calls without implementing a modern-day hunt for gold near Placer?

The second is the “suggestion” — mind you, the suggestion is nothing more than a gentle nudge — to keep that largely-malware-free Google Play Store front and center.

The third is the default search engine. Buy a Samsung get Google Search.

Now you know why the legal eagles are shivering when they think of litigation to redo the Google – Samsung deal. For those who believe the misinformation zipping around about Microsoft Bing displacing Google Search, my thought would be to ask yourself, “Who gains by pumping out this type of disinformation?” One answer is big Chinese mobile phone manufacturers. This is Art of War stuff, and I won’t dwell on this. What about Microsoft? Maybe, but I like to think happy thoughts about Microsoft. I say, “No one at Microsoft would engage in disinformation intended to make life difficult for the online advertising king.” Another possibility is Silicon Valley type journalists who pick up rumors, amplify them, and then comment that Samsung is kicking the tires of Bing with ChatGPT. Suddenly a “real” news outfit emits the Samsung rumor. Exciting for the legal eagles.

The write up “Samsung Can’t Dump Google for Bing As the Default Search Engine on Its Phones” does a good job of explaining the contours of a Google – Samsung tie up.

Several observations:

First, the alleged Samsung search replacement provides a glimpse of how certain information can move from whispers at conferences to headlines.

Second, I would not bet against lawyers. With enough money, contracts can be nullified, transformed, or left alone. The only option which disappoints attorneys is the one that lets sleeping dogs lie.

Third, the growing groundswell of anti-Google sentiment is noticeable. That may be a far larger problem for Googzilla than rumors about Samsung. Perceptions can be quite real, and they translate into impacts. I am tempted to quote William James, but I won’t.

Net net: If Samsung wants to swizzle a deal with an entity other than the Google, the lawyers may vibrate with such frequency that a feather or two may fall off.

Stephen E Arnold, April 24, 2023

Google: Any Day Now, Any Day Now

April 21, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

I read what could be a recycled script from the Sundar and Prabhakar Comedy Show. Although not yet a YouTube video series, the company is edging ever closer to becoming the most amusing online advertising company in Mountain View.

“Google Devising Radical Search Changes to Beat Back A.I. Rivals” is chock full of one-liners. Now these are not as memorable as Jack Benny’s “I’m thinking it over” or Abbott and Costello’s “I Don’t Know is on third,” but the Google is in the ballpark.

I liked these statements:

The tech giant is sprinting. [Exactly how does Googzilla sprint?]

Google is racing [Okay, Kentucky Derby stuff or NASCAR stuff? One goes at the speed of organisms, and the other is into the engineering approach to speed. Google is in progressive tense mode, not delivering results mode.]

“We’re excited about bringing new A.I.-powered features to search, and will share more details soon.” [I laughed at the idea of an outfit in panic and Red Alert mode getting excited. Is this like a high school science club learning that it has qualified to participate in the international math competition, or excited like members of the high school science club learning that the club will not be expelled for hijacking the principal’s morning announcements?]

“Modernizing its search engine has become an obsession at Google…” [I wonder if this is the type of obsession that pulled the Google VP to his yacht with a specialized contractor allegedly in possession of a controlled substance, or the legal eagle populating his nest, or the Google HR mastermind who made stochastic parrot the go-to phrase for discrimination and bias.]

The article contains more comedic gems. The main point is that my team and I cannot keep pace with the number of new applications of the chatbot technology. Amazon is giving the capability away free. China’s technical sector continues to beaver away adding to its formidable array of software capabilities. Plus we spotted a German outfit able to crank out interesting videos of former President Obama making fascinating statements about another former president.

The future and progressive present tenses are interesting. Other firms are outputting features, services, and products at a remarkable pace.

And what are Google’s search-sensitive professionals doing? Creating more grist for the Sundar and Prabhakar Comedy Show.

The only problem is that Google continues to talk, do PR, and promise. What’s that suggest about quantum supremacy or delivering relevant search results? I do know one thing. If I want an answer, I am going to run the query on the You.com service, thank you very much.

Stephen E Arnold, April 21, 2023

AI: Sucking Value from Those with Soft Skills

April 21, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

I read an essay called “Beyond Algorithms: Skills Of Designers That AI Can’t Replicate.” The author has a specific type of expertise. The write up explains that his unique human capabilities cannot be replicated in smart software.

I noted this somewhat poignant passage:

Being designerly takes thinking, feeling, and acting like a designer…. I used the head, heart, and hands approach for transformative sustainability learning (Orr, Sipos, et al.) to organize these designerly skills related to thinking (head), feeling (heart), and doing (hands), and offer ways to practice them.

News flash: Those who can use smart software to cut costs and get good enough outputs don’t understand “designerly.”

I have seen lawyers in meetings perspire when I described methods for identifying relevant sections of information from content sucked in as part of the discovery process. Why memorize Bates number 525 when a computing device provides that information in an explicit form? Zippy zip. The fear, in my experience, is that lawyers often have degrees in history or political science, skipped calculus, and took golf instead of computer science. The same may be said of most knowledge workers.
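
The mechanics are not exotic. Here is a minimal sketch of the kind of lookup I mean: documents keyed by Bates number with a crude keyword filter. The document store, the Bates format, and the scoring are placeholders made up for illustration, not a description of any commercial discovery platform.

    # Toy e-discovery lookup: rank documents by keyword hits, return Bates numbers.
    # Bates format and sample documents are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class DiscoveryDoc:
        bates: str   # e.g. "ABC-000525" (illustrative numbering)
        text: str

    def find_relevant(docs: list[DiscoveryDoc], terms: list[str]) -> list[tuple[str, int]]:
        """Return (Bates number, hit count) pairs for documents containing any term."""
        scored = []
        for doc in docs:
            hits = sum(doc.text.lower().count(term.lower()) for term in terms)
            if hits:
                scored.append((doc.bates, hits))
        return sorted(scored, key=lambda pair: pair[1], reverse=True)

    if __name__ == "__main__":
        corpus = [
            DiscoveryDoc("ABC-000524", "Routine invoice for office supplies."),
            DiscoveryDoc("ABC-000525", "Email discussing the indemnification clause."),
        ]
        print(find_relevant(corpus, ["indemnification"]))   # [('ABC-000525', 1)]

No memorization required, which is exactly what makes some billable-hour specialists perspire.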

The idea is that a human has “knowledge value,” a nifty phrase cooked up by Taichi Sakaiya in his MITI-infused book The Knowledge Value Revolution or a History of the Future.

The author of the essay perceives his designing skill as having knowledge value. Indeed his expertise has value to himself. However, the evolving world of smart software is not interested in humanoids’ knowledge value. Software is a way to reduce costs and increase efficiency.

The “good enough” facet of the smart software revolution devalues what makes the designer’s skill generate approbation, good-looking stuff, and cash.

No more. The AI boomlet eliminates the need to pay in time and resources for what a human with expertise can do. As soon as software gets close enough to average, that’s the end of the need for soft excellence. Yes, that means lots of attorneys will have an opportunity to study new things via YouTube videos. Journalists, consultants, and pundits without personality will be kneecapped.

Who will thrive? The answer is in the phrase “the 10X engineer.” The idea is that a person with specific technical skills to create something like an enhancement to AI will be the alpha professional.  The vanilla engineer will find himself, herself, or itself sitting in Starbucks watching TikToks.

The present technology elite will break into two segments: The true elite and the serf elite. What’s that mean for today’s professionals who are not coding transformers? Those folks will have a chance to meet new friends when sharing a Starbucks’ table.

Forget creativity. Think cheaper, not better.

Stephen E Arnold, April 21, 2023

The Google “Will” Means We Are Not Lagging Behind ChatGPT: The Coding Angle

April 20, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

I read another easily-spotted Google smart software PR initiative. Google’s professionals apparently ignore the insights of the luminary Jason Calacanis. In his “The Rise of AutoGPT and AI Anxieties,” available absolutely anywhere the energetic Mr. Calacanis can post the content, a glimpse of the Google anxiety is explained. One of Mr. Calacanis’ BFFs points out that companies with good AI use the AI to make more and better AI. The result is that those who plan, anticipate, and promise great AI products and services cannot catch up to those who are using AI to super-charge their engineers. (I refuse to use the phrase 10X engineer because it is little more than a way to say, “Smart engineers are now becoming 5X or 10X engineers.”) The idea is that “wills” and “soon” are flashing messages that say, “We are now behind. We will never catch up.”

I thought about the Thursday, April 13, 2023, extravaganza when I read “DeepMind Says Its New AI Coding Engine Is As Good As an Average Human Programmer.” The entire write up is one propeller-driven Piper Cub skywriting messages about the future. I quote:

DeepMind has created an AI system named AlphaCode that it says “writes computer programs at a competitive level.” The Alphabet subsidiary tested its system against coding challenges used in human competitions and found that its program achieved an “estimated rank” placing it within the top 54 percent of human coders. The result is a significant step forward for autonomous coding, says DeepMind, though AlphaCode’s skills are not necessarily representative of the sort of programming tasks faced by the average coder.

Mr. Calacanis and his BFFs were not talking about basic coding as the future. Their focus was on autonomous AI which can string together sequences of tasks. The angle in my lingo is “meta AI”; that is, instead of a single smart query answered by a single smart system, the instructions in natural language would be parsed by a meta-AI which would pull back separate responses, integrate them, and perform the desired task.
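
To make the contrast concrete, here is a minimal sketch of that meta-AI orchestration pattern: one natural-language request fanned out to several specialist systems, with the partial answers stitched back together. The specialist names and the call_model stub are placeholders I invented; no particular vendor API is implied.

    # Hypothetical meta-AI orchestrator: route one request to several specialist
    # models, then merge their answers. call_model stands in for a real API call.
    from typing import Callable

    def call_model(model_name: str, prompt: str) -> str:
        """Placeholder for a real model call; here it just echoes the request."""
        return f"[{model_name}] response to: {prompt}"

    def meta_orchestrate(request: str, specialists: dict[str, Callable[[str], str]]) -> str:
        # Naive decomposition: every registered specialist sees the full request.
        partial_answers = [handler(request) for handler in specialists.values()]
        # Integration step: a production system would use another model to synthesize.
        return "\n".join(partial_answers)

    if __name__ == "__main__":
        specialists = {
            "research": lambda p: call_model("research-model", p),
            "code": lambda p: call_model("code-model", p),
            "summary": lambda p: call_model("summary-model", p),
        }
        print(meta_orchestrate("Draft a report on chatbot adoption", specialists))

The pattern is simple to sketch; shipping it reliably is the hard part.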

What’s Google’s PR team pushing? Competitive programming.

Code Red? Yeah, that’s the here and now. The reality is that Google is in “will” mode. Imagine for a moment that Mr. Calacanis and his BFFs are correct. What’s that mean for Google? Will Google catch up with “will”?

Stephen E Arnold, April 20, 2023

Google Panic: Just Three Reasons?

April 20, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

I read tweets, heard from colleagues, and received articles emailed to me about Googlers’ Bard disgruntlement. In my opinion, Laptop Magazine’s summary captures the gist of the alleged wizard annoyance: “Bard: 3 Reasons Why the Google Staff Hates the New ChatGPT Rival.”

I want to sidestep the word “hate.” With 100,000 or so employees, a hefty chunk of those living in Google Land will love Bard. Other Google staff won’t care because optimizing a cache function for servers in Brazil is a world apart. The result is a squeaky cart with more squeaky wheels than a steam engine built in 1840.

The three trigger points are, according to the write up:

  1. Google Bard outputs that are incorrect. The example provided is that Bard explains how to crash a plane when the Bard user wants to land the aircraft safely. So stupid.
  2. Google (not any employees mind you) is “indifferent to ethical concerns.” The example given references Dr. Timnit Gebru, my favorite Xoogler. I want to point out that Dr. Jeff Dean does not have her on this weekend’s dinner party guest list. So unethical.
  3. Bard is flawed because Google wizards had to work fast. This is the outcome of the sort of bad judgment which has been the hallmark of Google management for some time. Imagine. Work. Fast. Google. So haste makes waste.

I want to point out that there is one big factor influencing Googzilla’s mindless stumbling and snorting. The headline of the Laptop Magazine article presents the primum mobile. Note the buzzword/sign “ChatGPT.”

Google is used to being — well, Googzilla — and now an outfit which uses some Google goodness is in the headline. Furthermore, the headline calls attention to Google falling behind ChatGPT.

Googzilla is used to winning (whether in patent litigation or in front of incredibly brilliant Congressional questioners). Now even Laptop Magazine explains that Google is not getting the blue ribbon in this particular, over-hyped but widely followed race.

That’s the Code Red. That is why the Paris presentation was a hoot. That is why the Sundar and Prabhakar Comedy Tour generates chuckles when jokes include “will,” “working on,” “coming soon”  as part of the routine.

Once again, I am posting this from the 2023 National Cyber Crime Conference. Not one of the examples we present is from Google, its systems, or its assorted innovation / acquisition units.

Googzilla for some is not in the race. And if the company is in the ChatGPT race, Googzilla has yet to cross the finish line.

That’s the Code Red. No PR, no Microsoft marketing tsunami, and no love for what may be a creature caught in a heavy winter storm. Cold, dark, and sluggish.

Stephen E Arnold, April 20, 2023
