Google: An Ad Crisis Looms from the Cancer of Short Videos

September 7, 2023

The weird orange newspaper ran a story which I found important. To read the article, you will need to pony up cash; I suggest you consider doing that. I want to highlight a couple of key points in the news story and offer a couple of observations.

An online advertising expert looks out his hospital window and asks, “I wonder if the cancer in my liver will be cured before the cancer is removed from my employer’s corporate body?” The answer may be, “Liver cancer has a five-year survival rate between 13 and 43 percent (give or take a few percentage points).” Will the patient get back to Foosball and off-site meetings? Is that computer capable of displaying TikTok videos? Thanks, Mother MJ. No annoying red appeal banners today.

The article “Shorts Risks Cannibalising Core YouTube Business, Say Senior Staff” contains an interesting statement (although one must take it with a dollop of mustard and some Dead Sea salt):

Recent YouTube strategy meetings have discussed the risk that long-form videos, which produce more revenue for the company, are “dying out” as a format, according to these people.

I am suspicious of quotes from “these people.” Nevertheless, let’s assume that the concern at the Google is real, like the news from “these people.”

The idea is that Google has been asleep at the switch as TikTok (the China-linked short video service) became a go-to destination for people seeking information. Yep, some young people search TikTok for information, not just tips on self-harm and body dysmorphia. Google’s reaction was slow and predictable: Me too me too me too. Thus, Google rolled out “Shorts,” a TikTok clone, and began pushing it to its YouTube faithful.

The party was rolling along until “these people” sat down and looked at viewing time for longer videos and the ad revenue from shorter videos. Another red alert siren began spinning up.

The orange newspaper story asserted:

In October last year, YouTube reported its first-ever quarterly decline in ad revenue since the company started giving its performance separately in 2020. In the following two quarters, the platform reported further falls compared with the same periods the previous year.

With a decline in longer videos, the Google cannot insert as many ads. If people watch shorter videos, Google has fewer ad opportunities. Although Google would love to pump ads into 30-second videos, viewers (users) might decide to feed their habit elsewhere. And where, one may ask? How about TikTok or the would-be cage fighter’s Meta service?

Several observations:

  1. Any decline in ad revenue is a force multiplier at the Google. The costs of running the outfit are difficult to control. Google has not been the best outfit in the world at creating new, non-ad revenue streams in the last 25 years. That original pay-to-play inspiration has had legs, but with age, knees and hips wear out. Googzilla is not as spry as it used to be, and its bright idea department has not found sustainable new revenue able to make up for a decline in traditional Google ad revenue… yet.
  2. The cost of video is tough to weasel out of Google’s financial statements. The murky “cloud” makes it easy to shift some costs to the enabler of the magical artificial intelligence push at the company. In reality, video is a black hole of costs. Storage, bandwidth, legal compliance, creator hassles, and overhead translate to more ads. Long videos are one place to put ads every few minutes. But when the videos are short like those cutting shapes dance lessons, the “short” is a killer proposition.
  3. YouTube is a big deal. Depending on whose silly traffic estimates one believes, YouTube is as big a fish in terms of eyeballs as Google.com search. Google search is under fire from numerous directions. Prabhakar Raghavan has not mounted much of a defense to the criticisms directed at Google search’s genuine inability to deliver relevant search results. Now the YouTube ad money flow is drying up like streams near Moab.

Net net: YouTube has become a golden goose. But short videos are a cancer, and who can make foie gras out of a cancerous liver?

Stephen E Arnold, September 7, 2023

Gannett: Whoops! AI Cost Cutting Gets Messy

September 6, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Gannett, the “real” news bastion of excellence, experimented with smart software. The idea is that humanoids are expensive, unreliable, and tough to manage. Software — especially smart software — is just “set it and forget it.”

A young manager / mother appears in distress after her smart software robot spilled the milk. Thanks, MidJourney. Not even close to what I requested.

That was the idea in the Gannett carpetland. How did that work out?

“Gannett to Pause AI Experiment after Botched High School Sports Articles” reports:

Newspaper chain Gannett has paused the use of an artificial intelligence tool to write high school sports dispatches after the technology made several major flubs in articles in at least one of its papers.

The estimable Gannett organization’s effort generated some online buzz. The CNN article adds:

The reports were mocked on social media for being repetitive, lacking key details, using odd language and generally sounding like they’d been written by a computer with no actual knowledge of sports.

That statement echoes my views of MBAs with zero knowledge of business making bonehead management decisions. Gannett is well managed; therefore, the executives are not responsible for the decision to use smart software to cut costs and expand the firm’s “real” news coverage.

I wonder if the terminated staff would volunteer to return to work to write “real” news? You know. The hard stuff like high school sports articles.

Stephen E Arnold, September 6, 2023

Generative AI: Not So Much a Tool But Something Quite Different

August 24, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Thirty years ago I had an opportunity to do a somewhat peculiar job. I had written for a publisher in the UK a version of a report my team and I prepared about Japan’s investments in its Fifth Generation Computer Revolution or some such government effort. A wealthy person who owned a medium-sized financial firm asked me if I would comment on a book called The Meaning of the Microcosm. “Sure,” I said.

This tiny, cute technology creature has just crawled from the ocean, and it is looking for lunch. Who knew that it could morph into a much larger and more disruptive beast? Thanks, MidJourney. No review committee for me this morning.

What I described was technology’s Darwinian behavior. I am not sure I was breaking new ground, but it seemed safe for me to point to how a technology survived. Therefore, I argued in a private report to this wealthy fellow that betting on the winning technology would make one rich. I tossed in an idea that I have thought about for many years; specifically, as technologies battle to “survive,” the technologies evolve and mutate. The angle is simple: Predicting how a technology mutates is a tricky business. Mutations can be tough to spot or just pop up. Change just says, “Hello, I am here.”

I thought about this “book commentary project” when I read “How ChatGPT Turned Generative AI into an Anything Tool.” The article makes a number of interesting observations. Here’s one I noted:

But perhaps inadvertently, these same changes let the successors to GPT3, like GPT3.5 and GPT4, be used as powerful, general-purpose information-processing tools—tools that aren’t dependent on the knowledge the AI model was originally trained on or the applications the model was trained for. This requires using the AI models in a completely different way—programming instead of chatting, new data instead of training. But it’s opening the way for AI to become general purpose rather than specialized, more of an “anything tool.”

I am not sure that “anything tool” is a phrase with traction, but it captures the idea of a technology that began as a sea creature, morphed, and then crawled out of the ocean looking for something to eat. The current hungry technology is smart software. Many people see the potential of combining repetitive processes with smart software in order to combine functions, reduce costs, or create alternatives to traditional methods of accomplishing a task. A good example is the use college students are making of the “writing” ability of free or low-cost services like ChatGPT or You.com.
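The cited essay’s “programming instead of chatting” idea is easy to make concrete. Here is a minimal sketch, assuming the 2023-era openai Python package and an API key in the environment; the model name, the prompt, and the summarization task are illustrative choices of mine, not details from the article.

```python
# A sketch of "programming instead of chatting": the model becomes a callable
# information-processing step fed new data, not a chat partner.
# Assumes the `openai` package and an OPENAI_API_KEY environment variable;
# the model name and the task are illustrative assumptions.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

def summarize(text: str) -> str:
    """Send arbitrary new text to the model and return a one-sentence summary."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "Summarize the user's text in one sentence."},
            {"role": "user", "content": text},
        ],
        temperature=0,
    )
    return response["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(summarize("Shorts risks cannibalising core YouTube business, say senior staff."))
```

The point of the sketch is the one the essay makes: the value comes from new data piped through the model at run time, not from whatever the model memorized during training.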

But more is coming. As I recall, in my discussion of the microcosm book, I echoed Mr. Gilder’s point that small-scale systems and processes can have profound effects on larger systems and society as a whole. But a technology “innovation” like generative AI is simultaneously “small” and “large.” Perspective and point of view are important in software. Plus, the innovations of the transformer and the larger applications of generative AI to college essays illustrate the scaling impact.

What makes AI interesting for me at this time is that genetic / Darwinian change is occurring across the scale spectrum. On one hand, developers are working to create big applications; for instance, SaaS solutions that serve millions of users. On the other hand, shifting from large language models to smaller, more efficient methods of getting smart aims to reduce costs and speed up the functioning of the plumbing.

The cited essay in Ars Technica is on the right track. However, the examples chosen are, it seems to me, ignoring the surprises the iterations of the technology will deliver. Is this good or bad? I have no opinion. What is important is that wild and crazy ideas about control and regulation strike me as bureaucratic time wasting. Millions of years ago the trick was to get out of the way of the hungry creature from the ocean of ones and zeros, then figure out how to catch the creature and have it for dinner, turn its body parts into jewelry which can be sold online, or process the beastie into a heat-and-serve meal at Trader Joe’s.

My point is that the generative innovations do not comprise a “tool.” We’re looking at something different, semi-intelligent, and evolving with speed. Will it be “let’s have lunch” or “one is lunch”?

Stephen E Arnold, August 24, 2023

Amazon: You Are Lovable… to Some I Guess

August 21, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Three “real news” giants have published articles about the dearly beloved outfit Amazon. My hunch is that the publishers were trepidatious when the “real” reporters turned in their stories. I can hear the “Oh, my goodness. A negative Amazon story.” Not to worry. It is unlikely that the company will buy ad space in the publications.

A young individual finds that the giant who runs an alleged monopoly is truly lovable. Doesn’t everyone? MidJourney, after three tries I received an original image somewhat close to my instructions.

My thought is that executives at the companies publishing negative information about the lovable Amazon fear hearing this upon coming home from work: “You published this about Amazon. What if our Prime membership is cancelled? What if our Ring doorbell is taken offline? And did you think about the loss of Amazon videos? Of course not, you are just so superior. Fix your own dinner tonight. I am sleeping in the back bedroom.”

The first story is “How Amazon’s In-House First Aid Clinics Push Injured Employees to Keep Working.” Imagine. Amazon creating a welcoming work environment in which injured employees are supposed to work. Amazon is pushing into healthcare. The article states:

“What some companies are doing, and I think Amazon is one of them, is using their own clinics to ‘treat people’ and send them right back to the job, so that their injury doesn’t have to be recordable,” says Jordan Barab, a former deputy assistant secretary at OSHA who writes a workplace safety newsletter.

Will Amazon’s other health care units operate in a similar way? Of course not.

The second story is “Authors and Booksellers Urge Justice Dept. to Investigate Amazon.” Imagine. Amazon exploiting its modest online bookstore and its instant print business to take sales away from the “real” publishers. The article states:

On Wednesday [August 16, 2023], the Open Markets Institute, an antitrust think tank, along with the Authors Guild and the American Booksellers Association, sent a letter to the Justice Department and the Federal Trade Commission, calling on the government to curb Amazon’s “monopoly in its role as a seller of books to the public.”

Wow. Unfair? Some deliveries arrive in a day. A Kindle book pops up in the incredibly cluttered and reader-hostile interface in seconds. What’s not to like?

The third story is from the “real news outfit” MSN, which recycles the estimable CNBC “talking heads.” This story is “Amazon Adds a New Fee for Sellers Who Ship Their Own Packages.” The happy family of MSN and CNBC reports:

Beginning Oct. 1, members of Amazon’s Seller Fulfilled Prime program will pay the company a 2% fee on each product sold, according to a notice sent to merchants … The e-commerce giant also charges sellers a referral fee between 8% and 15% on each sale. Sellers may also pay for things like warehouse storage, packing and shipping, as well as advertising fees.
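
To put the quoted fee stack in concrete terms, here is a back-of-the-envelope sketch; the $100 sale price and the 15 percent referral rate are assumptions chosen for illustration, and the storage, shipping, and advertising charges mentioned in the notice are left out.

```python
# Hypothetical illustration of the quoted fee structure for a Seller
# Fulfilled Prime merchant. Only the 2% program fee and the 8%-15%
# referral range come from the cited report; the price and the 15% rate
# are assumptions for illustration.
sale_price = 100.00                 # assumed item price in dollars
program_fee = 0.02 * sale_price     # new 2% Seller Fulfilled Prime fee
referral_fee = 0.15 * sale_price    # referral fee, 8%-15% depending on category

total_fees = program_fee + referral_fee
print(f"Fees on a ${sale_price:.2f} sale: ${total_fees:.2f} "
      f"({total_fees / sale_price:.0%}), before storage, shipping, and ads.")
```

On those assumed numbers, $17 of a $100 sale goes to Amazon before the seller touches warehouse, shipping, or advertising costs.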

What’s the big deal?

To an admirer who grew up relying on a giant company, no problem.

Stephen E Arnold, August 21, 2023

The ISP Ploy: Heck, No, Mom. I Cannot Find My Other Sock?

August 16, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Before I retired, my team and I were doing a job for the US Senate. One day at lunch we learned that Google could not provide employment and salary information to a government agency housed in the building in which we were working. The talk, as I recall, was tinged with skepticism. If a large company issues paychecks and presumably files forms with the Internal Revenue Service, records about who was paid and how much should have been available. Google allowed many people to find answers, but the company could not find its employment data. The way things work in Washington, DC, to the best of my recollection, a large company with considerable lobbying help and a flock of legal eagles can make certain processes slow. As staff rotate, certain issues get pushed down the priority pile and some — not every one, of course — fade away.

A young teen who will mature into a savvy ISP tells his mom, “I can’t find my other sock. It is too hard for me to move stuff and find it. If it turns up, I will put it in the laundry.” This basic play is one of the keys to the success of the Internet Service Provider the bright young lad runs today. Thanks, MidJourney. You were back online and demonstrating gradient malfunctioning. Perhaps you need a bit of the old gain of function moxie?

I thought about this “inability” to deliver information when I read “ISPs Complain That Listing Every Fee Is Too Hard, Urge FCC to Scrap New Rule.” I want to focus on one passage in the article and suggest that you read the original report. Keep in mind my anecdote about how a certain big tech outfit handles some US government requests.

Here’s the snippet from the long source document:

…FCC order said the requirement to list “all charges that providers impose at their discretion” is meant to help broadband users “understand which charges are part of the provider’s rate structure, and which derive from government assessments or programs.” These fees must have “simple, accurate, [and] easy-to-understand name[s],” the FCC order said. “Further, the requirement will allow consumers to more meaningfully compare providers’ rates and service packages, and to make more informed decisions when purchasing broadband services. Providers must list fees such as monthly charges associated with regulatory programs and fees for the rental or leasing of modem and other network connection equipment,” the FCC said.

Three observations about the information in the passage:

  1. The argument is identical to that illustrated by the teen in the room filled with detritus. Crap everywhere makes finding easy for the occupant and hard for anyone else. Check out Albert Einstein’s desk on the day he died. Crap piled everywhere. Could he find what he needed? According to his biographers, the answer is, “Yes.”
  2. The idea that a commercial entity which bills its customers does not have the capacity to print out the little row entries in an accounting system is lame in my opinion. The expenses have to be labeled and reported. Even if they are chunked like some of the financial statements crafted by the estimable outfits Amazon and Microsoft, someone has the notes or paper for these items. I know some people who could find these scraps of information; don’t you?
  3. The wild and crazy government agencies invite this type of corporate laissez faire behavior. Who is in charge? Probably not the government agency if some recent anti-trust cases are considered as proof of performance.

Net net: Companies want to be able to fiddle the bills. Period. Printing out comprehensive product and service prices reduces the gamesmanship endemic in the online sector.

Stephen E Arnold, August 16, 2023

Sam AI-Man: A Big Spender with Trouble Ahead?

August 15, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

$700,000 per day. That’s an interesting number if it is accurate. “ChatGPT In Trouble: OpenAI May Go Bankrupt by 2024, AI Bot Costs Company $700,000 Every Day” states that the number is the number. What’s that mean? First, forget salaries, general and administrative costs, the much-loved health care for humans, and the oddments one finds on balance sheets. (What was that private executive flight to Tampa Bay?)

A young entrepreneur realizes he cannot pay his employees. Thanks, MidJourney, whom did you have in your digital mind?

I am a dinobaby, but I can multiply. The total is $255,500,000. I want to ask about money (an investment, of course) from Microsoft, how the monthly subscription fees are floating the good ship ChatGPT, and the wisdom of hauling an orb to scan eyeballs from place to place. (Doesn’t that take away from watching the bourbon caramel cookies reach their peak of perfection? My hunch is, “For sure.”)
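
For the record, the total is just the daily figure annualized; a tiny sketch, taking the reported $700,000 per day at face value and assuming it holds constant for 365 days:

```python
# Annualize the reported daily compute cost; assumes the $700,000/day
# figure is accurate and constant across a 365-day year.
daily_cost = 700_000
annual_cost = daily_cost * 365
print(f"${annual_cost:,}")  # -> $255,500,000
```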

The write up reports:

…the shift from non-profit to profit-oriented, along with CEO Sam Altman’s lack of equity ownership, indicates OpenAI’s interest in profitability. Although Altman might not prioritize profits, the company does. Despite this, OpenAI hasn’t achieved profitability; its losses reached $540 million since the development of ChatGPT.

The write up points out that Microsoft’s interest in ChatGPT continues. However, the article observes:

Complicating matters further is the ongoing shortage of GPUs. Altman mentioned that the scarcity of GPUs in the market is hindering the company’s ability to enhance and train new models. OpenAI’s recent filing for a trademark on ‘GPT-5’ indicates their intention to continue training models. However, this pursuit has led to a notable drop in ChatGPT’s output quality.

Another minor issue facing Sam AI-Man is that legal eagles are circling. The Zuck dumped his pet Llama as open source. The Google chugs along in its Googley way, and Anthropic has “clawed” its way into visibility.

Net net: Sam AI-Man may find that he will have an opportunity to explain how the dial on the garage heater got flipped from Hot to Fan Only.

Stephen E Arnold, August 15, 2023

Killing Horses? Okay. Killing Digital Information? The Best Idea Ever!

August 14, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Fans at the 2023 Kentucky Derby were able to watch horses being killed. True, the sport of kings parks vehicles and has people stand around so the termination does not spoil a good day at the races. It seems logical to me that killing information is okay too. Personally I want horses to thrive without brutalization, mint juleps notwithstanding, and in my opinion, information deserves preservation. Without some type of intentional or unintentional information preservation, what would those YouTuber videos about ancient technology have to display and describe?

In “In the Age of Culling” — an article in the online publication tedium.co — I noted a number of ideas which resonated with me. The first is one of the subheads in the write up; to wit:

CNet pruning its content is a harbinger of something bigger.

The basic idea in the essay is that killing content is okay, just like killing horses.

The article states:

I am going to tell you right now that CNET is not the first website that has removed or pruned its archives, or decided to underplay them, or make them hard to access. Far from it.

The idea is that eliminating content creates an information loss. If one cannot find some item of content, that item of content does not exist for many people.

I urge you to read the entire article.

I want to shift the focus from the tedium.co essay slightly.

With digital information being “disappeared,” the culling cuts away research, some types of evidence, and collective memory. But what happens when a handful of large US companies effectively shape the information used to train smart software? Checking facts becomes more difficult because people “believe” a machine more than a human in many situations.

Two girls looking at a museum exhibit in 2028. The taller girl says, “I think this is what people used to call a library.” The shorter girl asks, “Who needs this stuff? I get what I need to know online. Besides, this looks like a funeral to me.” The taller girl replies, “Yes, let’s go look at the plastic dinosaurs. When you put on the headset, the animals are real.” Thanks, MidJourney, for not including the word “library” or depicting the image I requested. You are so darned intelligent!

Consider the power information filtering and weaponizing conveys to those relying on digital information. The statement “harbinger of something bigger” is correct. But if one looks forward, the potential for selective information may be the flip side of forgetting.

Trying to figure out “truth” or “accuracy” is getting more difficult each day. How does one talk about a subject when those in conversation have learned about Julius Caesar from a TikTok video and perceive a problem with tools created to sell online advertising?

This dinobaby understands that cars are speeding down the information highway, and their riders are in a reality defined by online life. I am reluctant to name the changes which suggest this somewhat negative view of learning. One believes what one experiences. If those experiences are designed to generate clicks, reduce operating costs, and shape behavior — what does the information landscape look like?

No digital archives? No past. No awareness of information weaponization? No future. Were those horses really killed? Were those archives deleted? Were those Shakespeare plays removed from the curriculum? Were the tweets deleted?

Let’s ask smart software. No thanks, I will do dinobaby stuff despite the efforts to redefine the past and weaponize the future.

Stephen E Arnold, August 14, 2023

MBAs, Lawyers, and Sociology Majors Lose Another Employment Avenue

August 4, 2023

Note: Dinobaby here: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid. Services are now ejecting my cute little dinosaur gif. Like my posts related to the Dark Web, the MidJourney art appears to offend someone’s sensibilities in the datasphere. If I were not 78, I might look into these interesting actions. But I am and I don’t really care.

Some days I find MBAs, lawyers, and sociology majors delightful. On others I fear for their future. One promising avenue of employment has now been cut off. What’s the job? Avocado peeler in an ethnic restaurant. Some hearty souls channeling Euell Gibbons may eat these as nature delivers them. Others prefer a toast delivery vehicle or maybe a dip to accompany a meal in an ethnic restaurant or while making a personal vlog about the stresses of modern life.

“Chipotle’s Autocado Robot Can Prep Avocados Twice as Fast as Humans” reports:

The robot is capable of peeling, seeding, and halving a case of avocados significantly faster than humans, and the company estimates it could cut its typical 50-minute guacamole prep time in half…

When an efficiency expert from a McKinsey-type firm or a second-tier thinker from a mid-tier consulting firm reads this article, there is one obvious line of thought the wizard will follow: Replace some of the human avocado peelers with a robot. Projecting into the future while under the influence of spreadsheet fever, the wizard will see that an upgrade to the robot’s software will enable it to perform other jobs in the restaurant or food preparation center; for example, taco filler or dip crafter.

Based on this actual factual write up, I have concluded that some MBAs, lawyers, and sociology majors will have to seek another pathway to their future. Yard sale organizer, pet sitter, and possibly the life of a hermit remain viable options. Oh, the hermit will have GoFundMe and BuyMeaCoffee pages. Perhaps a T-shirt or a hat?

Stephen E Arnold, August 4, 2023

Netflix Has a Job Opening. One Job Opening to Replace Many Humanoids

July 27, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

I read “As Actors Strike for AI Protections, Netflix Lists $900,000 AI Job.” Obviously the headline is about AI, money, and entertainment. Is the job “real”? Like so much of the output of big companies, it is difficult to determine how much is clickbait, how much is surfing on “real” journalists’ thirst for the juicy info, and how much is trolling. Yep, trolling. Netflix drives a story about AI’s coming to Hollywood.

The write up offers Hollywood verbiage and makes an interesting point:

The [Netflix job] listing points to AI’s uses for content creation: “Artificial Intelligence is powering innovation in all areas of the business,” including by helping them to “create great content.” Netflix’s AI product manager posting alludes to a sprawling effort by the business to embrace AI, referring to its “Machine Learning Platform” involving AI specialists “across Netflix.”

The machine learning platform, or MLP, is an exercise in cost control, profit maximization, and presaging the future. If smart software can generate new versions of old content, whip up acceptable facsimiles, and eliminate insofar as possible the need for non-elite humans — what’s not clear?

The $900,000 may be code for “Smart software can crank out good enough content at lower cost than traditional Hollywood methods.” Even the TikTok and YouTube “stars” face an interesting choice: [a] Figure out how to offload work to smart software or [b] learn to cope with burnout, endless squabbles with gatekeepers about money, and the anxiety of becoming a has-been.

Will humans, even talented ones, be able to cope with the pressure smart software will exert on the production of digital content? Like the junior attorney and the cannon fodder at blue chip consulting companies, content creators will find that AI is moving from spitting out high school essays to more impactful outputs.

One example is the integration of smart software into workflows. The jargon about this enabling use of smart software is fluid. The $900,000 job focuses on something that those likely to be affected can understand: A good enough script and facsimile actors and actresses with a mouse click.

But the embedded AI promises to rework the back office processes and the unseen functions of humans just doing their jobs. My view is that there will be $900K per year jobs but far fewer of them than there are regular workers. What is the future for those displaced?

Crafting? Running yard sales? Creating fine art?

Stephen E Arnold, July 27, 2023

Ethics Are in the News — Now a Daily Feature?

July 27, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

It is déjà vu all over again, or it seems like it. I read “Judge Finds Forensic Scientist Henry Lee Liable for Fabricating Evidence in a Murder Case.” Yep, that is the story. Scientist Lee allegedly has a knack for “non-fiction”; that is, making up stuff or arranging items in a special way. One of my relatives founded Hartford, Connecticut, in 1635. I am not sure he would have been on board with this make-stuff-up approach to data. (According to our family lore, John Arnold was into beating people with a stick.) Dr. Lee is a big wheel because he worked on the 1995 running-through-airports trial. The cited article includes this interesting sentence:

[Scientist] Lee’s work in several other cases has come under scrutiny…

No one is watching. A noted scientist helps himself to the cookies in the lab’s cookie jar. He is heard mumbling, “Cookies. I love cookies. I am going to eat as many of these suckers as I can because I am alone. And who cares about anyone else in this lab? Not me.” Chomp chomp chomp. Thanks, MidJourney. You depicted an okay scientist but refused to create an image of a great leader whom I identified by proper name. For this I paid money?

Let me mention three ethics incidents which for one reason or another hit my radar:

  1. MIT accepting cash from every young person’s friend Jeffrey Epstein. He allegedly killed himself. He’s off the table.
  2. The Harvard ethics professor who made up data. She’s probably doing consulting work now. I don’t know if she will get back into the classroom. If she does it might be in the Harvard Business School. Those students have a hunger for information about ethics.
  3. The soon-to-be-departed president of Stanford University. He may find a future using ChatGPT or an equivalent to write technical articles and angling for a gig on cable TV.

What do these allegedly true incidents tell us about the moral fiber of some people in positions of influence? I have a few ideas. Now the task is remediation. When John Arnold chopped wood in Hartford, justice involved ostracism, possibly a public shaming, or rough justice played out to the theme from Hang ‘Em High.

Harvard, MIT, and Stanford: Aren’t universities supposed to set an example for impressionable young minds? What are the students learning? Anything goes? Prevaricate? Cut corners? Grub money?

Imagine sweatshirts with the college logo and these words on the front and back of the garment. Winner. Some at Amazon, Apple, Facebook, Google, Microsoft, and OpenAI might wear them to the next off-site. I would wager that one turns up in the Rayburn House Office Building wellness room.

Stephen E Arnold, July 27, 2023
