Publishers and Remora: Choose the Right Host and Stop Complaining, Please

October 20, 2023

This essay is the work of a dumb humanoid. No smart software involved.

Today, let’s reflect on the suckerfish or remora. The fish attaches itself to a shark and feeds on scraps of the host’s meals or nibbles on the other parasites living on their food truck. Why think about a fish with a sucking disk on its head?

Navigate to “Silicon Valley Ditches News, Shaking an Unstable Industry.” The article reports as “real” news:

Many news companies have struggled to survive after the tech companies threw the industry’s business model into upheaval more than a decade ago. One lifeline was the traffic — and, by extension, advertising — that came from sites like Facebook and Twitter. Now that traffic is disappearing.

Translation: No traffic, no clicks. No clicks and no traffic mean reduced revenue. Why? The days of printed newspapers and magazines are over. Forget the costs of printing and distributing. Think about people visiting a Web site. No traffic means that advertisers will go where the readers are. Want news? Fire up a mobile phone and graze on the information available. Sure, some sites want money, but most people find free services. I like France24.com, but there are options galore.

Wikipedia provides a snap of a remora attached to a scuba diver. Smart remora hook on to a fish with presence.

The shift in content behavior has left traditional publishing companies with a challenge: generating revenue. Newspapers and specialized news services have tried a number of tactics over the years. The problem is that the number of people who will pay for content is large, but finding those people and getting them to spit out a credit card is expensive. At the same time, generating “real” news is expensive as well.

In 1992, James B. Twitchell published Carnival Culture: The Trashing of Taste in America. The book offered insight into how America has embraced showmanship information. Dr. Twitchell’s book appeared more than 30 years ago. Today Google, Meta, and TikTok (among other digital-first outfits) amplify the lowest common denominator of information. “Real” publishing aimed higher.

The reluctant adjustment by “white shoe” publishing outfits was to accept traffic and advertising revenue from users who relied on portable surveillance devices. Now the traffic generators have realized that “attention magnet” information is where the action is. Plus smart software operated by do-it-yourself experts provides a flow of information which the digital services can monetize. A digital “mom” will block the most egregious outputs. The goal is good enough.

The optimization of content shaping now emerging from high-technology giants is further marginalizing the “real” publishers.

Almost 45 years ago, the president of a company with a high revenue online business database asked me, “Do you think we could pull our service off the timesharing vendors and survive?” The idea was that a product popular on an intermediary service could be equally popular as a standalone commercial digital product.

I said, “No way.”

The reasons were obvious to me because my team had analyzed this question over the hill and around the barn several times. The intermediary aggregated information. Aggregated information acts like a magnet. A single online information resource does not have the same magnetic pull. Therefore, the cost to build traffic would exceed the financial capabilities of the standalone product. That’s why commercial database products were rolled up by large outfits like Reed Elsevier and a handful of other companies.

Maybe the fix for the plight of the New York Times and other “real” publishers anchored in print is to merge, and fast. However, further consolidation of newspapers and book publishers takes time. As the New York Times “our hair is on fire” article points out:

Privately, a number of publishers have discussed what a post-Google traffic future may look like, and how to better prepare if Google’s A.I. products become more popular and further bury links to news publications… “Direct connections to your readership are obviously important,” Ms. LaFrance [Adrienne LaFrance, the executive editor of The Atlantic] said. “We as humans and readers should not be going only to three all-powerful, attention-consuming mega platforms to make us curious and informed.” She added: “In a way, this decline of the social web — it’s extraordinarily liberating.”

Yep, liberating. “Real” journalists can do TikToks and YouTube videos. A tiny percentage will become big stars and make big money until they don’t. The senior managers of “shaky” “real” publishing companies will innovate. Unfortunately, start-ups spawned by “real” publishing companies face the same daunting odds as any start-up: a brutal attrition rate.

Net net: What will take the place of the old-school approach to newspapers, magazines, and books? My suggestion is to examine smart software and the popular content on YouTube. One example is the MeidasTouch “network” on YouTube. Professional publishers, take note. Newspaper and magazine publishers may also want to look at what Ben Meiselas and cohorts have achieved. Want a less intellectual approach to information dominance? Ask a teenager about TikTok.

Yep, liberating, because some of those in publishing will have to adapt. When X.com or another high-technology alleged monopoly changes direction, the suckerfish has to go along for the ride or face a somewhat inhospitable environment, hunger, and probably a hungry predator from a bottom-feeding investment group.

Stephen E Arnold, October 20, 2023

Do Teens Read or Screen Surf? Yes, Your Teens

October 2, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

I am glad I am old. I read “Study Reveals Some Teens Receive 5,000 Notifications Daily, Most Spend Almost Two Hours on TikTok.” The write up is a collection of factoids. I don’t know if these are verifiable, but taken as a group, the message is tough to swallow. Here’s a sample of the data:

  • Time spent on TikTok: Two hours a day, or 38 percent of daily online use. Why? “Reading and typing are exhausting.”
  • 20 percent of the teenies in the sample receive more than 500 notifications a day. A small percentage get 5,000 per day.
  • 97 percent of teenies were on their phone during the school day.

The future is in the hands of the information gatekeepers and quasi-monopolies, not parents and teachers, it seems.

What will a population of swipers, scrollers, and kick-backers do?

My answer is, “Not much other than information grazing.”

Sheep need herders and border collies nipping at their heels.

Thus, I am glad I am old.

Stephen E Arnold, October 2, 2023

What Is for Lunch? A Digital Hot Dog or Burger?

September 28, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

I read “EU Warns Elon Musk after Twitter Found to Have Highest Rate of Disinformation.” My hunch is that the European Union did not plan on getting into the swamps of epistemological thought. But there the EU is. Knee deep. The write up includes a pointer to research about “disinformation.”


“Do I want a digital hot dog or a digital burger?” The young decider must choose when grazing online. He believes he likes both hot dogs and burgers. But what is the right choice? Mom will tell him. Thanks, MidJourney, you gradient descent master.

The cited article states:

On Twitter, she [European commissioner Věra Jourová] said “disinformation actors were found to have significantly more followers than their non-disinformation counterparts and tend to have joined the platform more recently than non-disinformation users”.

The challenge in my mind is one that occupied Henri Bergson. Now the EU wants to codify what information is “okay” and what information is “not okay.” The point of view becomes important. The actual “disinformation” is “information.” Therefore, the EU wants to have the power to make the determination. 

Is it possible the EU wants to become the gatekeeper? Is information blocked or deleted “gone”? What about those who believe the “disinformation”?  Pretty exciting and probably a bit problematic if the majority of a population believe the “disinformation” to be accurate. How does one resolve this challenge?

Another committee meeting to neutralize “disinformation” and the technologies facilitating dissemination? Sounds like a good next step? What’s for lunch?

Stephen E Arnold, September 28, 2023

Trust in an Online World: Very Heisenbergian Issue

September 12, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Digital information works a bit like a sandblaster. The idea is that a single grain of sand has little impact. But use a gizmo that pumps out a stream of sand grains at speed, and you have a different type of tool. The flow of online information is similar. No one gets too excited about one email or one short video. But pump out lots of these and the result is different.


The sales person says, “You can have this red sofa for no money down.” The pitch is compelling. The sales person says, “You can read about our products on Facebook and see them in TikToks.” The husband and wife don’t like red sofas. But Facebook and TikTok? Thanks, MidJourney, continue your slide down the gradient descent.

After more than 20 years of unlimited data flow, one can observe the effects in many places. I have described some of these effects in my articles which appeared in specialist publications, my monographs, and my lectures. I want to focus on one result of the flow of electronic information; that is, the erosion of social structures. Online is not the only culprit, but for this short essay, it will serve my purpose.

The old chestnut that information is power is correct. Another truism is that the more information, the more transparency. That’s not a spot-on statement.

“Poll: Americans Believe AI Will Hurt Elections” explains how flows of information have allegedly eroded trust in the American democratic process. The write up states:

Half of Americans expect misinformation spread by AI to impact who wins the 2024 election — and one-third say they’ll be less trusting of the results because of artificial intelligence…

The allegedly accurate factoid can be interpreted in several ways. First, the statement about lack of trust may be disinformation. The idea is that the process of voting will be manipulated. Second, a person can interpret the factoid as the truth about how information erodes a social concept. Third, the statement can be viewed as an error, like those which make peer-reviewed articles suspect or non-reproducible.

The power of information in this case is to view the statement as one of the grains of sand shot from the body shop’s sandblaster. If one pumps out enough “data” about a bad process, why wouldn’t a person just accept the statements as accurate? Propaganda, weaponized information, and online advertising work this way.

Each reader has to figure out how to interpret the statement. As the body of accessible online information expands, think of those items as sand grains. Now let’s allow smart software to “learn” from the sand grains.

At what point does the dividing line between what’s accurate and what’s not disappear?

Net net: Online information erodes trust. But it is not just trust which is affected. It is the thought process required to determine what is synthetic and what is “real.” Reality consists of flows of online information. Well, that’s an issue, isn’t it?

Net net: The new reality is uncertainty. The act of looking changes things. Very quantum and quite impactful on the social fabric in my opinion.

Stephen E Arnold, September 12, 2023

Content Moderation: Modern Adulting Is Too Much Work

August 28, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Content moderation requires editorial policies. Editorial policies cost money. Editorial policies must be communicated. Editorial policies must be enforced by individuals trained in what information is in bounds or out of bounds. Commercial database companies had editorial policies. One knew what was “in” Compendex, Predicasts, Business Dateline, and similar commercial databases. Some of these professional publishers have worked to keep the old-school approach in place to serve their customers. Other online services dumped the editorial-policies approach to online information because it was expensive and silly. I think that lax or no editorial policies are a bad idea. One can complain about how hard a professional online service was or is to use, but one knows the information placed into the database.


“No, I won’t take out the garbage. That’s a dirty job,” says the petulant child. Thanks, MidJourney, you did not flash me the appeal message this morning.

Fun fact. Business Dateline, originally created by the Courier Journal & Louisville Times, was the first online commercial database to include corrections to stories made by the service’s sources. I am not sure if that policy is still in place. I think today’s managers will have cost in mind. Extras like accuracy are going to be erased by the belief that the more information one has, the less a mistake means.

I thought about adulting and cost control when I read “Following Elon Musk’s Lead, Big Tech Is Surrendering to Disinformation.” The “real” news story reports:

Social media companies are receding from their role as watchdogs against political misinformation, abandoning their most aggressive efforts to police online falsehoods in a trend expected to profoundly affect the 2024 presidential election.

Creating, producing, and distributing electronic information works when those involved have a shared belief in accuracy, appropriateness, and the public good. Once those old-fashioned ideas are discarded, what’s the result? From my point of view, look around. What does one see in different places in the US and elsewhere? What can be believed? What is socially acceptable behavior?

When one defines adulting in terms of cost, civil life is eroded in my opinion. Defining responsibility in terms of one’s self interest is one thing that seems to be the driving force of many decisions. I am glad I am a dinobaby. I am glad I am old. At least we tried to enforce editorial policies for ABI/INFORM, Business Dateline, the Health Reference Center, and the other electronic projects in which I was involved. Even our early Internet service ThePoint (Top 5% of the Internet) which became part of Lycos many years ago had an editorial policy.

Ah, the good old days when motivated professionals worked to provide accurate, reliable reference information. For those involved in those projects, I thank you. For those like the companies mentioned in the cited WaPo story, your adulting is indeed a childish response to an important task.

What is the fix? One approach is the Chinese government / TikTok paying Oracle to moderate TikTok content. I wonder what the punishment for doing a “bad” job is. Is this the method to make “correct” decisions? The surveillance angle is an expensive solution. What’s the alternative?

Stephen E Arnold, August 28, 2023


India: Where Regulators Actually Try, or Seem to Try

August 22, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

I read “Data Act Will Make Digital Companies Handle Info under Legal Obligation.” The article reports that India’s regulators are beavering away in an attempt to construct a dam to stop certain flows of data. The write up states:

Union Minister of State for Electronics and Information Technology Rajeev Chandrasekhar on Thursday [August 17, 2023] said the Digital Personal Data Protection Act (DPDP Act) passed by Parliament recently will make digital companies handle the data of Indian citizens under absolute legal obligation.

What about certain high-technology companies operating with somewhat flexible methods? The article uses the phrase “punitive consequences of high penalty and even blocking them from operating in India.”


US companies’ legal eagles take off. Destination? India. MidJourney captures 1950s grade school textbook art quite well.

This passage caught my attention because nothing quite like it has progressed in the US:

The DPDP [Digital Personal Data Protection] Bill is aimed at giving Indian citizens a right to have his or her data protected and casts obligations on all companies, all platforms be it foreign or Indian, small or big, to ensure that the personal data of Indian citizens is handled with absolute (legal) obligation…

Will this proposed bill become law? Will certain US high-technology companies comply? I am not sure of the answer, but I have a hunch that a dust up may be coming.

Stephen E Arnold, August 22, 2023

Why Encrypted Messaging Is Getting Love from Bad Actors

August 17, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

The easier it is to break the law or circumvent regulations, the more people will give into their darker nature. Yes, this is another of Arnold’s Laws of Online along with online data flows erode ethical behavior. I suppose the two “laws” go together like Corvettes and fuel stops, tattoos and body art, or Barbie and Ken dolls.

“Banks Hit with $549 Million in Fines for Use of Signal, WhatsApp to Evade Regulators’ Reach” explains a behavior I noticed when I was doing projects for a hoop-de-do big time US financial institution.

Let’s jump back in time to 2005: I arrived for a meeting with the bank lugging my lecture equipment. As I recall, I had a couple of laptops, my personal LCD projector, a covey of connectors, and a couple of burner phones and SIMs from France and the UK.


“What are you looking at?” queries the young financial analyst on the sell side. I had interrupted a young, whip-smart banker who was organizing her off-monitoring client calls. I think she was deciding which burner phone and pay-as-you-go SIM to use to pass a tip about a major financial deal to a whale. Thanks, MidJourney. It only took three times for your smart software to show mobile phones. Outstanding C minus work. Does this MBA CFA look innocent to you? She does to me. Doesn’t every banker have multiple mobile phones?

One bright bank type asked upon entering the meeting room as I was stowing and inventorying my gear after a delightful taxi ride from the equally thrilling New York Hilton, “Why do you have so many mobile phones?” I explained that I used the burners in my talks about cyber crime. The intelligent young person asked, “How do you connect them?” I replied, “When I travel, I buy SIMs in other countries. I also purchase them if I see a US outfit offering a pay-as-you-go SIM.” She did not ask how I masked my identity when acquiring SIMs, and I did not provide any details like throwing the phone away after one use.

Flash forward two months. This time it was a different conference room. My client had his assistant and the bright young thing popped into the meeting. She smiled and said, “I have been experimenting with the SIMs and a phone I purchased on Lexington Avenue from a phone repair shop.”

“What did you learn?” I asked.

She replied, “I can do regular calls on the mobile the bank provides. But I can do side calls on this other phone.”

I asked, “Do you call clients on the regular phone or the other phone?”

She said, “I use the special phone for special clients.”

Remember this was late 2005.

The article, dated August 8, 2023, appeared 18 years after I learned how quickly bright young things can suck in an item of information and apply it to transferring information supposedly regulated by a US government agency. That’s when I decided to make my Arnold Law about people breaking the law when it is really easy one of my go-to sayings.

The write up stated:

U.S. regulators on Tuesday announced a combined $549 million in penalties against Wells Fargo and a raft of smaller or non-U.S. firms that failed to maintain electronic records of employee communications. The Securities and Exchange Commission disclosed charges and $289 million in fines against 11 firms for “widespread and longstanding failures” in record-keeping, while the Commodity Futures Trading Commission also said it fined four banks a total of $260 million for failing to maintain records required by the agency.

How long has a closely regulated sector like banking been “regulated”? A long time.

I want to mention that I have been talking about getting around regulations which require communication monitoring for a long time. In fact, I will do so again in October 2023 at the Massachusetts / New York Association of Crime Analysts conference. In my keynote, I will update my remarks about Telegram and its expanding role in cyber and regular crime. I will also point out how these encrypted messaging apps have breathed new, more secure life into certain criminal activities. We have an organic ecosystem of online-facilitated crime, crime that is global, not a local stick up at a convenience store at 3 am on a rainy Thursday morning.

What does this news story say about regulatory action? What does it make clear about behavior in financial services firms?

I, of course, have no idea. Just like some of the regulatory officers at financial institutions and some regulatory agencies.

Stephen E Arnold, August 17, 2023

Does Information Filtering Grant the Power to Control People and Money? Yes, It Does

August 15, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

I read an article which I found interesting because it illustrates how filtering works. “YouTube Starts Mass Takedowns of Videos Promoting Harmful or Ineffective Cancer Cures.” The story caught my attention because I have seen reports that the US Food & Drug Administration has been trying to explain its use of language in the midst of the Covid anomaly. The problematic word is “quips.” The idea is that official-type information was not intended as more than a “quip.” I noted the explanations as reported in articles similar to “Merely Quips? Appeals Court Says FDA Denunciations of Iv$erm#ctin Look Like Command, Not Advice.” I am not interested in either the cancer or FDA intentions per se.


Two bright engineers built a “filter machine.” One of the engineers (the one with the hat) says, “Cool. We can accept a list of stop words or a list of urls on a watch list and block the content.” The other says, “Yes, and I have added a smart module so that any content entering the Info Shaper is stored. We don’t want to lose any valuable information, do we?” The fellow with the hat says, “No one will know what we are blocking. This means we can control messaging to about five billion people.” The co-worker says, “It is closer to six billion now.” Hey, MidJourney, despite your troubles with the outstanding Discord system, you have produced a semi-useful image a couple of weeks ago.

The idea which I circled in True Blue was:

The platform will also take action against videos that discourage people from seeking professional medical treatment as it sets out its health policies going forward.

I interpreted this to mean that Alphabet Google is now implementing what I would call editorial policies. The mechanism for deciding what content is “in bounds” and what content is “out of bounds” is not clear to me. In the days when there were newspapers and magazines and non-AI generated books, there were people of a certain type and background who wanted to work in departments responsible for defining and implementing editorial policies. In the days before digital online services destroyed the business models upon which these media depended were destroyed, the editorial policies operated as an important component of information machines. Commercial databases had editorial policies too. These policies helped provide consistent content based on the guidelines. Some companies did not make a big deal out of the editorial policies. Other companies and organizations did. Either way, the flow of digital content operated like a sandblaster. Now we have experienced 25 years of Wild West content output.
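The gatekeeping machine the image caption jokes about is not exotic. A minimal sketch of such an “Info Shaper” might look like the following; the stop words, watch-list URLs, and sample posts are hypothetical, and this is not a description of Google’s actual (non-public) mechanism:

```python
# Hypothetical "Info Shaper": block content matching a stop-word list or a
# URL watch list, but quietly archive whatever gets blocked (per the caption,
# nothing "valuable" is ever really thrown away).

STOP_WORDS = {"miracle", "cure-all"}          # hypothetical blocked terms
URL_WATCH_LIST = {"badsite.example.com"}      # hypothetical blocked sources

blocked_archive = []  # suppressed items are stored, not lost

def shape(item: dict) -> bool:
    """Return True if the item may be shown; archive and suppress otherwise."""
    words = set(item["text"].lower().split())
    if words & STOP_WORDS or item["url"] in URL_WATCH_LIST:
        blocked_archive.append(item)
        return False
    return True

posts = [
    {"url": "news.example.com", "text": "New therapy enters clinical trials"},
    {"url": "badsite.example.com", "text": "This miracle cure-all works"},
]
visible = [p for p in posts if shape(p)]
```

The point of the sketch is how little machinery is needed: whoever maintains the two lists decides what five or six billion people see, and the audience has no visibility into either list.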

Why do I — a real and still-alive dinobaby — care about the allegedly accurate information in “YouTube Starts Mass Takedowns of Videos Promoting Harmful or Ineffective Cancer Cures”? Here are three reasons:

  1. Control of information has shifted from hundreds of businesses and organizations to a few; therefore, some of the Big Dogs want to make certain they can control information. Who wants a fake cancer cure? Like other types of straw men, most people say yes to this type of filtering. A/B testing can “prove” that people want this type of filtering, I would suggest.
  2. The mechanisms to shape content have been a murky subject for Google and other high-technology companies. If the “Mass Takedowns” write up is accurate, Google is making explicit its machine to manage information. In a society in which many people lack certain capabilities in information analysis and the skills to check the provenance of information, information flow is going to operate in a “frame” defined by a commercial enterprise.
  3. The different governmental authorities appear to be content to allow a commercial firm to become the “decider in chief” when it comes to information flow. With concentration and consolidation comes power in my opinion.

Is there a fix? No, because I am not sure that independent thinking individuals have the “horsepower” to redirect the direction the big machine is heading.

Why did I bother to write this? My hope is that someone starts thinking about the implications of a filtering machine. If one does not have access to certain information like a calculus book, most people cannot solve calculus problems. The same consequence follows when information is simply not available. Ban books? Sure, great idea. Ban information about a medication? Sure, great idea. Ban discourse on the Internet? Sure, great idea.

You may see where this type of thinking leads. If you don’t, may I suggest you read Alexis de Tocqueville’s Democracy in America. You can find a copy at this link. (Verified on August 15, 2023, but it may be disappeared at any time. And if you can’t read it, you will not know what the savvy French guy spelled out in the mid 19th century.) If you don’t know something, then the information does not exist and will not have an impact on one’s “thinking.”

One final observation to young people, although I doubt I have any youthful readers: “Keep on scrolling.”

Stephen E Arnold, August 15, 2023

 

Killing Horses? Okay. Killing Digital Information? The Best Idea Ever!

August 14, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Fans at the 2023 Kentucky Derby were able to watch horses killed. True, the sport of kings parks vehicles and has people stand around so the termination does not spoil a good day at the races. It seems logical to me that killing information is okay too. Personally I want horses to thrive without brutalization with mint juleps, and in my opinion, information deserves preservation. Without some intentional or unintentional preservation of information, what would those YouTuber videos about ancient technology have to display and describe?

In “In the Age of Culling” — an article in the online publication tedium.co — I noted a number of ideas which resonated with me. The first is one of the subheads in the write up; to wit:

CNet pruning its content is a harbinger of something bigger.

The basic idea in the essay is that killing content is okay, just like killing horses.

The article states:

I am going to tell you right now that CNET is not the first website that has removed or pruned its archives, or decided to underplay them, or make them hard to access. Far from it.

The idea is that eliminating content creates an information loss. If one cannot find some item of content, that item of content does not exist for many people.

I urge you to read the entire article.

I want to shift the focus from the tedium.co essay slightly.

With digital information being “disappeared,” the culling cuts away research, some types of evidence, and collective memory. But what happens when a handful of large US companies effectively shape the information used to train smart software? Checking facts becomes more difficult because people “believe” a machine more than a human in many situations.


Two girls looking at a museum exhibit in 2028. The taller girl says, “I think this is what people used to call a library.” The shorter girl asks, “Who needs this stuff. I get what I need to know online. Besides this looks like a funeral to me.” The taller girl replies, “Yes, let’s go look at the plastic dinosaurs. When you put on the headset, the animals are real.” Thanks MidJourney for not including the word “library” or depicting the image I requested. You are so darned intelligent!

Consider the power that information filtering and weaponization convey to those relying on digital information. The statement “harbinger of something bigger” is correct. Looking forward, the potential for selective information may be the flip side of forgetting.

Trying to figure out “truth” or “accuracy” is getting more difficult each day. How does one talk about a subject when those in conversation have learned about Julius Caesar from a TikTok video and perceive a problem with tools created to sell online advertising?

This dinobaby understands that cars are speeding down the information highway, and their riders are in a reality defined by online. I am reluctant to name the changes which suggest this somewhat negative view of learning. One believes what one experiences. If those experiences are designed to generate clicks, reduce operating costs, and shape behavior — what’s the information landscape look like?

No digital archives? No past. No awareness of information weaponization? No future. Were those horses really killed? Were those archives deleted? Were those Shakespeare plays removed from the curriculum? Were the tweets deleted?

Let’s ask smart software. No thanks, I will do dinobaby stuff despite the efforts to redefine the past and weaponize the future.

Stephen E Arnold, August 14, 2023

Useful Cloud Market Share Data: Accurate? Well, Close Enough for Horseshoes

August 9, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Anyone looking for a handy summary of data about big cloud players will find “AWS vs Google Cloud vs Microsoft Azure” worth reading. The article mentions the big folks and includes some data about smaller (although large) players; for example, Oracle. Trigger warning: The article uses the term “hyperscalers” which I find a bit rizzy for my rhetorical spice cupboard.

Here are three representative items from the article. For more numbers, navigate to the original, please.

  1. Amazon’s worldwide [cloud] market share is 34 percent.
  2. The Google Cloud (bless those kind Googlers) is a bold 10 percent.
  3. Microsoft “cloud” [a fuzzy wuzzy nebulous and undefined word] surpassed $110 billion in annual revenue for 2022 and Azure accounted for $55 billion of the $110 billion.
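As a back-of-the-envelope check on the figures quoted above (using only the numbers as the article reports them):

```python
# Figures quoted in the article, in billions of USD, for 2022.
microsoft_cloud_revenue_b = 110
azure_revenue_b = 55

# Azure's share of Microsoft's reported "cloud" revenue.
azure_share = azure_revenue_b / microsoft_cloud_revenue_b
print(f"Azure is {azure_share:.0%} of Microsoft's reported cloud revenue")

# Note: the Amazon (34%) and Google (10%) figures are shares of the whole
# cloud market, a different denominator, so the three percentages quoted
# in the list cannot be summed or compared directly.
```

So Azure accounts for exactly half of what Microsoft labels “cloud,” which is why the fuzziness of that label matters when comparing against the AWS and Google Cloud market-share figures.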

Why is the cloud a big money maker? The article has an answer: Generative AI. Okay, that’s a good reason. I think there may be other factors as well.

If you collect these types of data, you will find the short write up a good reference point for a few months.

Stephen E Arnold, August 9, 2023
