Click Counting: It Is 1992 All Over Again

March 31, 2025

Dinobaby says, "No smart software involved. That's for 'real' journalists and pundits."

I love it when search engine optimization experts, online marketing executives, and drum beaters for online advertising talk about clicks, clickstreams, and click metrics. Ho ho ho.

I think I was involved in creating a Web site called Point (The Top 5% of the Internet). The idea was simple: Curate and present a directory of the most popular sites on the Internet. It was a long shot because the team did not want to profile drugs, sex, and a number of other illicit Web sites for the directory. The idea was that in 1992 or so, no one had a Good Housekeeping Seal of Approval-type directory. There was Yahoo, but if one poked around, some interesting Web sites would display in their low resolution, terrible bandwidth glory.

To my surprise, the idea worked and the team wisely exited the business when someone a lot smarter than the team showed up with a check. I remember fielding questions about “traffic”. There was the traffic we used to figure out what sites were popular. Then there was traffic we counted when visitors to Point hit the home page and read profiles of sites with our Good Housekeeping-type of seal.

I want to share that from those early days of the Internet, the counting of clicks was pretty sketchy. Scripts could rack up clicks in a heartbeat. Site operators just lied or cooked up reports that served up a reality in terms of tasty little clicks.

Why are clicks bogus? I am not prepared to explain the dark arts of traffic boosting, which today is greatly aided by scripts instantly generated by smart software. Instead I want to highlight this story in TechCrunch: "YouTube Is Changing How YouTube Shorts Views Are Counted." The article does a good job of explaining how one monopoly is responding to its soaring costs and the slow and steady erosion of its search Nile River of money.

The write up says:

YouTube is changing how it counts views on YouTube Shorts to give creators a deeper understanding of how their short-form content is performing

I don’t know much about YouTube. But I recall watching little YouTubettes, which bear a remarkable resemblance to TikTok’s weaponized data bursts, just start playing. Baffled, I would watch a couple of seconds, check that my “autoplay” was set to off, and then kill the browser page. YouTubettes are not for me.

Most reasonable people would want to know several things about their own or any YouTubette; for example:

  1. How many times a YouTubette began to play and was terminated in less than five seconds
  2. How many times a YouTubette was viewed from start to bitter end
  3. How many times a YouTubette was replayed in its entirety by a single user
  4. What devices were used
  5. How many YouTubettes were “shared”
  6. How these data points compare against total clicks, whether short plays or full views
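The six questions above boil down to simple aggregations over play events. A minimal Python sketch, using hypothetical event records (the field names are my invention, not YouTube's):

```python
from collections import Counter

# Hypothetical play events; field names are invented for illustration.
events = [
    {"user": "u1", "watched_s": 3,  "length_s": 30, "device": "phone",  "shared": False},
    {"user": "u1", "watched_s": 30, "length_s": 30, "device": "phone",  "shared": False},
    {"user": "u2", "watched_s": 30, "length_s": 30, "device": "tablet", "shared": True},
    {"user": "u2", "watched_s": 30, "length_s": 30, "device": "tablet", "shared": False},
]

total_plays = len(events)
abandoned = sum(1 for e in events if e["watched_s"] < 5)               # item 1
completed = sum(1 for e in events if e["watched_s"] >= e["length_s"])  # item 2
per_user_completes = Counter(e["user"] for e in events
                             if e["watched_s"] >= e["length_s"])
full_replayers = sum(1 for n in per_user_completes.values() if n > 1)  # item 3
devices = Counter(e["device"] for e in events)                         # item 4
shares = sum(1 for e in events if e["shared"])                         # item 5
completion_rate = completed / total_plays                              # item 6

print(abandoned, completed, full_replayers, shares, completion_rate)
```

The point of the sketch: the raw events to answer every one of these questions exist on the platform side; reporting only "plays" is a choice, not a limitation.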

You get the idea. Google has these data, and the wonderfully wise but stressed firm is now counting “short views” as what I describe as the reality: Knowing exactly how many times a YouTubette was played start to finish.

According to the write up:

With this update, YouTube Shorts will now align its metrics with those of TikTok and Instagram Reels, both of which track the number of times your video starts or replays. YouTube notes that creators will now be able to better understand how their short-form videos are performing across multiple platforms. Creators who are still interested in the original Shorts metric can view it by navigating to “Advanced Mode” within YouTube Analytics. The metric, now called “engaged views,” will continue to allow creators to see how many viewers choose to continue watching their Shorts. YouTube notes that the change won’t impact creators’ earnings or how they become eligible for the YouTube Partner Program, as both of these factors will continue to be based on engaged views rather than the updated metric.

Okay, responding to the competition from one other monopolistic enterprise. I get it. Okay, Google will allegedly provide something for a creator of a YouTubette to view for insight. And the change won’t impact what Googzilla pays a creator. Do creators really know how Google calculates payments? Google knows. With the majority of the billions of YouTube videos (short and long) getting a couple of clicks, the “popularity” scheme boils down to what we did in 1992. We used whatever data was available, did a few push ups, and pumped out a report.

Could Google follow the same road map? Of course not. In 1992, we had no idea what we were doing. But this is 2025 and Google knows exactly what it is doing.

Advertisers will see click data that do not reflect what creators want to see and what viewers of YouTubettes and probably other YouTube content really want to know: How many people watched the video from start to finish?

Google wants to sell ads at perhaps the most difficult point in its 20-plus-year history. Autoplay inflates clicks. “Hey, the video played. We count it.” Can you conceptualize that statement? I can.

Let’s not call this new method “weaponization.” That’s too strong. Let’s describe this as “shaping” or “inflating” clicks.

Remember. I am a dinobaby and usually wrong. No high technology company would disadvantage a creator or an advertiser. Therefore, this change is no big deal. Will it help Google deal with its current challenges? You can now ask Google AI questions answered by its most sophisticated smart software for free.

Is that an indication that something is not good enough to cause people to pay money? Of course not. Google says “engaged views” are still important. Absolutely. Google is just being helpful.

Stephen E Arnold, March 31, 2025

Google: Android and the Walled Garden

March 31, 2025

Dinobaby says, "No smart software involved. That's for 'real' journalists and pundits."

In my little corner of the world, I do not see Google as “open.” One can toss around the idea 24×7, and I won’t change my mind. Despite its unusual approach to management, the company has managed to contain the damage from Xooglers’ yip yapping about the company. Xoogler.co is focused on helping people. I suppose there are versions of Sarah Wynn-Williams “Careless People” floating around. Few talk much about THE Timnit Gebru “parrot” paper. Google is, it seems, just not the buzz generator it was in 2006, the year the decline began to accelerate in my opinion.

We have another example of “circling the wagons” strategy. It is a doozy.

“Google Moves All Android Development Behind Closed Doors” reports, with some “real” writing and recycling of Google-generated slick talk, an interesting shift in the world of the little green man icon:

Google had to merge the two branches, which lead to problems and issues, so Google decided it’s now moving all development of Android behind closed doors

How many versions of messaging apps did Google have before it decided that “let many flowers bloom” was not in line with the sleek profile the ageing Google wants to flaunt on Wall Street?

The article asks a good question:

…if development happens entirely behind closed doors, with only the occasional code drop, is the software in question really open source? Technically, the answer is obviously ‘yes’ – there’s no requirement that development take place in public. However, I’m fairly sure that when most people think of open source, they think not only of occasionally throwing chunks of code over the proverbial corporate walls, but also of open development, where everybody is free to contribute, pipe in, and follow along.

News flash from the dinobaby: Open source software, when bandied about by folks who don’t worry too much about their mom missing a Social Security check, means:

  1. We don’t want to chase and fix bugs. Make it open source and let the community do it for free.
  2. We probably have coded up something that violates laws. By making it open source, we are really benefiting those other developers and creating opportunities for innovation.
  3. We can use the buzzword “open source” and jazz the VCs with a term that is ripe with promise for untold riches.
  4. A student thinks: I can make my project into open source and maybe it will help me get a job.
  5. A hacker thinks: I can get “cred” by taking my exploit and coding a version that penetration testers will find helpful and possibly not discover the backdoor.

I have not exhausted the kumbaya about open source.

It is clear that Google is moving in several directions, a luxury only Googzillas have:

First, Google says, “We will really, really, cross my fingers and hope to die, share code … just like always.”

Second, Google can add one more oxen drawn wagon to its defensive circle. The company will need it when the licensing terms for Android include some very special provisions. Of course, Google may be charitable and not add additional fees to its mobile OS.

Third, it can wave the “we good managers” flag.

Fourth, as the write up correctly notes:

…Darwin, the open source base underneath macOS and iOS, is technically open source, but nobody cares because Apple made it pretty much worthless in and of itself. Anything of value is stripped out and not only developed behind closed doors, but also not released as open source, ensuring Darwin is nothing but a curiosity we sometimes remember exists. Android could be heading in the same direction.

I think the “could” is a hedge. I penciled in “will.” But I am a dinobaby. What do I know?

Stephen E Arnold, March 31, 2025

Apple CEO Chases Chinese AI and Phone Sales

March 31, 2025

While the hullabaloo about staking claims in China’s burgeoning market has died down, Big Tech companies still want pieces of the Chinese pie (or dumpling, a better metaphor here). An example of Big Tech wanting to entrench itself in the Chinese market is Apple. Mac Rumors reports that Apple CEO Tim Cook was recently in China, where he complimented start-up Deepseek on its AI models, in the story “Apple CEO Tim Cook Praises China’s Deepseek.”

While Cook didn’t say he would pursue a partnership with Deepseek, he was impressed with its AI models. He called them excellent because Deepseek delivers AI models with high performance capabilities at lower costs and compute requirements. Deepseek’s research has been compared to OpenAI’s for achieving similar results with fewer resources.

When Cook visited China, he reportedly made an agreement with Alibaba Group to integrate its Qwen models into Apple Intelligence. There are also rumors that Apple is speaking with Baidu about providing LLMs for the Chinese market.

Does this mean that Tim Apple hopes he can use Chinese smart tech in the iPhone and make that more appealing to Chinese users? Hmmmm.

Cook conducted more business during his visit:

In addition to his comments on AI, Cook announced plans to expand Apple’s cooperation with the China Development Research Foundation, alongside continued investments in clean energy development. Throughout his visit, Cook posted updates on the Chinese social media platform Weibo, showcasing a range of Apple products being used in classrooms, creative environments, and more.

Cook’s comments mark a continuation of Apple’s intensified focus on the Chinese market at a time when the company is facing declining iPhone shipments and heightened competition from domestic brands. Apple’s smartphone shipments in China are believed to have fallen by 25% year-over-year in the fourth quarter of 2024, while annual shipments dropped 17% to 42.9 million units, placing Apple behind local competitors Vivo and Huawei.

It’s evident that Apple continues to want a piece of the Chinese dumpling, but also seeks to incorporate Chinese technology into its products. Subtle, Tim Apple, subtle.

Whitney Grace, March 31, 2025

Programmers: The Way of the Dodo Bird?

March 27, 2025

Another dinobaby blog post. Eight decades and still thrilled when I point out foibles.

Let’s just assume that the US economy is A-OK. One discipline is indispensable now and in the future. What is it? The programmer.

Perhaps not if the information in “Employment for Computer Programmers in the U.S. Has Plummeted to Its Lowest Level Since 1980—Years Before the Internet Existed” is accurate.

The write up states:

There are now fewer computer programmers in the U.S. than there were when Pac-Man was first invented—years before the internet existed as we know it. Computer-programmer employment dropped to its lowest level since 1980, the Washington Post reported, using data from the Current Population Survey from the Bureau of Labor Statistics. There were more than 300,000 computer-programming jobs in 1980. The number peaked above 700,000 during the dot-com boom of the early 2000s but employment opportunities have withered to about half that today. U.S. employment grew nearly 75% in that 45-year period, according to the Post.

What’s interesting is that the article makes a classification decision I wasn’t expecting; specifically:

Computer programmers are different from software developers, who liaise between programmers and engineers and design bespoke solutions—a much more diverse set of responsibilities compared to programmers, who mostly carry out the coding work directly. Software development jobs are expected to grow 17% from 2023 to 2033, according to the Bureau of Labor Statistics. The bureau meanwhile projects about a 10% decline in computer programming employment opportunities from 2023 to 2033.

Let’s go with the distinction.

Why are programmers’ jobs disappearing? The write up has the answer:

There has been a 27.5% plummet in the 12-month average of computer-programming employment since about 2023—coinciding with OpenAI’s introduction of ChatGPT the year before. ChatGPT can handle coding tasks without a user needing more detailed knowledge of the code being written. The correlation between the decline of programmer jobs and the rise of AI tools signals to some experts that the burgeoning technology could begin to cost some coding experts their jobs.
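The "12-month average" in that passage is a trailing rolling mean over monthly employment figures. A sketch of the computation, using an invented monthly series rather than the actual BLS data:

```python
def rolling_12mo_average(monthly):
    """Trailing 12-month mean for each month with a full year of history."""
    return [sum(monthly[i - 11:i + 1]) / 12 for i in range(11, len(monthly))]

# Invented figures (thousands of jobs), not the real BLS series:
# a steady decline of 2 per month over three years.
monthly = [600 - 2 * i for i in range(36)]

avg = rolling_12mo_average(monthly)
decline = (avg[0] - avg[-1]) / avg[0]  # fractional drop across the window
print(round(avg[0], 1), round(avg[-1], 1), round(decline, 3))
```

The rolling mean smooths month-to-month noise, which is why a sustained 27.5% drop in that average signals a real trend rather than a blip; the correlation with ChatGPT's launch, of course, is still just a correlation.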

Now experts are getting fired? Does that resonate with everyone? Experts.

There is an upside if one indulges in a willing suspension of disbelief. The write up says:

Programmers will be required to perform complicated tasks, Krishna argued, and AI can instead serve to eliminate the simpler, time-consuming tasks those programmers would once need to perform, which would increase productivity and subsequently company performance.

My question: “Did AI contribute to this article?” In my opinion, something is off. It might be the dependence on the Bureau of Labor Statistics and “real” newspapers as sources for the numbers. Would a high school debate teacher give the green light to the logic of categorizing and linking those heading for the termination guillotine with those on the path to carpet land? The use of AI hype as fact is interesting as well.

I am thrilled to be a dinobaby.

Stephen E Arnold, March 27, 2025

The Future of Programming in an AI Spruik World

March 26, 2025

Software engineers are, reasonably, concerned about losing their jobs to AI. Australian blogger Clinton Boys asks, "How Will LLMs Take Our Jobs?" After reading several posts by programmers using LLMs for side projects, he believes such accounts suggest where we are headed. He writes:

"The consensus seems to be that rather than a side project being some sort of idea you have, then spend a couple of hours on, maybe learn a few things, but quickly get distracted by life or a new side project, you can now just chuck your idea into the model and after a couple of hours of iterating you have a working project. To me, this all seems to point to the fact that we are currently in the middle of a significant paradigm shift, akin to the transition from writing assembly to compiled programming languages. A potential future is unfolding before our eyes in which programmers don’t write in programming languages anymore, but write in natural language, and generative AI handles the grunt work of actually writing the code, the same way a compiler translates your C code into machine instructions."

Perhaps. But then, he ponders, will the job even fit the title of "engineer"? Will the challenges and creative potential many love about this career vanish? And what would they do then? Boys suggests several routes one might take, with the caveat that a realistic path forward would probably blend several of these. He recognizes one could simply give up and choose a different career entirely. An understandable choice, if one can afford to start over. If not, one might join the AI cavalcade by learning how to create LLMs and/or derive value from them. It may also be wise to climb the corporate ladder—managers should be safer longer, Boys expects. Then again one might play ostrich:

"You could also cross your fingers and hope it pans out differently — particularly if, like me you find the vision of the future spruiked by the most bullish LLM proponents a little ghoulish and offensive to our collective humanity."

Always an option, we suppose. I had to look up the Australian term "spruik." According to Wordsmith.org, it means "to make an elaborate speech, especially to attract customers." Fitting. Finally, Boys says, one could bet on software connoisseurs of the future. Much as some now pay more for hand-made pastries or small-batch IPAs, some clients may be willing to shell out for software crafted the old-fashioned way. One can hope.

Cynthia Murrell, March 26, 2025

YouTube: Another Big Cost Black Hole?

March 25, 2025

Another dinobaby blog post. Eight decades and still thrilled when I point out foibles.

I read “Google Is in Trouble… But This Could Change Everything – and No, It’s Not AI.” The write up makes the case that YouTube is Google’s big financial opportunity. I agree with most of the points in the write up. The article says:

Google doesn’t clearly explain how much of the $40.3 billion comes from the YouTube platform, but based on their description and choice of phrasing like “primarily include,” it’s safe to assume YouTube generates significantly more revenue than just the $36.1 billion reported. This would mean YouTube, not Google Cloud, is actually Google’s second-biggest business.

Yep, financial fancy dancing is part of the game. Google is using its financial reports as marketing to existing stakeholders and investors who want a part of the still-hot, still-dominant Googzilla. The idea is that the Google is stomping on the competition in the hottest sectors: The cloud, smart software, advertising, and quantum computing.

A big time company’s chief financial officer enters his office after lunch and sees a flood of red ink engulfing his work space. Thanks, OpenAI, good enough.

Let’s flip the argument from “Google has its next big revenue oil gusher” to the cost of that oil field’s infrastructure.

An article appeared in mid-February 2025. I was surprised that the information in that write up did not generate more buzz in the world of Google watchers. “YouTube by the Numbers: Uncovering YouTube’s Ghost Town of Billions of Unwatched, Ignored Videos” contains some allegedly accurate information. Let’s assume that these data, like most information about online activity, are close enough for horseshoes or purely notional. I am not going to summarize the methodology. Academics come up with interesting ways to obtain information about closely guarded big company products and services.

The write up says:

the research estimates a staggering 14.8 billion total videos on YouTube as of mid-2024. Unsurprisingly, most of these videos are barely noticed. The median YouTube upload has just 41 views, with 4% garnering no views at all. Over 74% have no comments and 89% have no likes.
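The skew those numbers describe, a tiny head and an enormous tail, is easy to see in a toy sample. The figures below are invented for illustration, not the study's data:

```python
import statistics

# Invented view counts illustrating a long-tail distribution,
# not the study's actual sample.
views = [0, 0, 3, 12, 41, 55, 90, 250, 1_000, 2_500_000]

median_views = statistics.median(views)   # half the uploads sit at or below this
zero_share = views.count(0) / len(views)  # fraction with no views at all
mean_views = statistics.fmean(views)      # dragged far upward by the one hit

print(median_views, zero_share, round(mean_views))
```

This is why a platform can report enormous aggregate view counts while the median upload languishes at a few dozen views: the mean and the median tell opposite stories.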

Here are a couple of other factoids about YouTube as reported in the Techspot article:

The production values are also remarkably modest. Only 14% of videos feature a professional set or background. Just 38% show signs of editing. More than half have shaky camerawork, and audio quality varies widely in 85% of videos. In fact, 40% are simply music tracks with no voice-over.

And another point I found interesting:

Moreover, the typical YouTube video is just 64 seconds long, and over a third are shorter than 33 seconds.

The most revealing statement in the research data appears in this passage:

… a senior researcher [said] that this narrative overlooks a crucial reality: YouTube is not just an entertainment hub – it has become a form of digital infrastructure. Case in point: just 0.21% of the sampled videos included any kind of sponsorship or advertising. Only 4% had common calls to action such as liking, commenting, and subscribing. The vast majority weren’t polished content plays but rather personal expressions – perhaps not so different from the old camcorder days.

Assuming the data are reasonably good, Google has built plumbing whose cost will rival that of the firm’s investments in search and its cloud.

From my point of view, cost control is going to become as important as moving as quickly as possible to the old-school broadcast television approach to content. Hit shows on YouTube will do what is necessary to attract an audience. The audience will be what advertisers want.

Just as Google search has degraded to a popular “experience,” not a resource for individuals who want to review and extract high value information, YouTube will head the same direction. The question is, “Will YouTube’s pursuit of advertisers mean that the infrastructure required to permit free video uploads and storage remains sustainable?”

Imagine being responsible for capital investments at the Google. The Cloud infrastructure must be upgraded and enhanced. The AI infrastructure must be upgraded and enhanced. The quantum computing and other technology-centric infrastructures must be upgraded and enhanced. The adtech infrastructure must be upgraded and enhanced. I am leaving out some of the Google’s other infrastructure intensive activities.

The main idea is that the financial person is going to have a large job paying for hardware, software, maintenance, and telecommunications. This is a different cost from technical debt. These are on-going and constantly growing costs. Toss in unexpected outages, and what does the bean counter do? One option is to quit; another is to do the Zen thing to avoid having a stroke when reviewing the cost projections.

My take is that a hit in search revenue is likely to add to the firm’s financial challenges. The path to becoming the modern version of William Paley’s radio empire may be in Google’s future. The idea that everything is in the cloud is being revisited by companies due to cost and security concerns. Does Google host some sketchy services on its Cloud?

YouTube may be the hidden revenue gem at Google. I think it might become the infrastructure cost leader among Google’s stellar product line up. Big companies like Google don’t just disappear. Instead the black holes of cost suck them closer to a big event: Costs rise more quickly than revenue.

At this time, Google has three cost black holes. One hopes none is the one that makes Googzilla join the ranks of the street people of the online world.

Net net: Google will have to give people what they will watch. The lowest common denominator will emerge. The costs flood the CFO’s office. Just ask Gemini what to do.

Stephen E Arnold, March 25, 2025

The Gentle Slide Down the Software Quality Framework

March 21, 2025

Yep, another dinobaby original.

I listened to a podcast called “The WAN Show,” featuring a couple of technology buffs who sell T-shirts, mugs, and screwdrivers. What was notable about the program, which is available on Apple Podcasts, was the opening story. In a nutshell, the two fellows made clear some problems with Apple’s hardware. The key statement made by one of the fellows was, “I will pay my way to Cupertino and help you Apple engineers to fix the problems. I will do it for free.” A number of people younger than I believe that an individual can overcome a bureaucracy.

Someone is excited about taking the long slide down in software quality. Thanks, OpenAI, definitely good enough.

I forgot about the comment and the pitch to buy a backpack until I read “Apple’s Software Quality Crisis: When Premium Hardware Meets Subpar Software.” The write up hit upon some of the WAN grousing and introduced a number of ideas about Apple’s management focus.

Here’s a comment from the write up I circled:

The performance issues don’t stop at sluggish response times. During these use cases, my iPad overheated, making it uncomfortable to hold or even rest the palm on, raising concerns about potential long-term hardware damage. What made this particularly frustrating is that these aren’t third-party applications pushing the hardware to its limits. These are Apple’s own applications that should be theoretically optimized for their hardware. After demonstrating the issues in person to Apple Store staff (that were courteous and professional), the support representative that was handling my case suggested a hardware replacement. However, after further discussion, we both concluded this was likely a software problem rather than a hardware defect.

To a dinobaby like me, I interpreted the passage as saying, “The problem can’t be fixed. Suck it up, buttercup.”

I then discovered more than 1,000 comments on the “Apple’s Software Quality Crisis” article. I scanned them and then turned to one of the ever reliable smart software systems to which I have access and asked, “What are the main themes of the 1,000 comments?”

Here’s what the smart software output, and, please, keep in mind that smart software hallucinates, goes bonkers, and, if a product of Google, really has trouble with cheese-related prompts. The four points output are:

  • Persistent Bugs: Users report long-standing issues, such as date-handling errors in Contacts that have remained unresolved for years.
  • Declining User Experience: There’s a sentiment that recent design changes, like the macOS Settings app, have led to a less intuitive user experience.
  • Inconsistent Quality Across Platforms: Some users feel that Apple’s software quality has become comparable to other platforms, lacking the distinctiveness it once had.
  • Ineffective Bug Reporting: Concerns are raised about Apple’s bug reporting system, with users feeling their feedback doesn’t lead to timely fixes.

Okay, we have a sample based on one podcast, one blog essay, and a number of randos who have commented on the “Apple’s Software Quality Crisis” article. Let me offer several observations:

  1. Apple, like Amazon, Facebook (Metazuck or whatever), Google, and Microsoft cannot deliver software that does much more than achieve the status of “good enough.” Perhaps size and the limitations of humans contribute to this widespread situation?
  2. The problem is not fixable because new software comes out and adds to the woes of the previous software. Therefore, the volume of problems goes up, and there is neither money nor time to pay down the technical debt. In my experience, this means that a slow descent on a quite fungible gradient occurs. The gravity of technical debt creates the issues the complaining individuals identify.
  3. The current economic and regulatory environment does not punish these organizations for their products and services. The companies’ managers chug along, chase their bonuses, and ignore the gentle drift to quite serious problems between the organizations and their customers.

So what? Sorry, I have no solutions. Many of the “fixes” require deep familiarity with the original software. Most fixes are wrappers because rewrites take too long or the information required to fix one thing and not break two others is not available.

Welcome to the degrading status quo.

Stephen E Arnold, March 21, 2025

Good News for AI Software Engineers. Others, Not So Much

March 20, 2025

Another dinobaby blog post. No AI involved, which could be good or bad depending on one’s point of view.

Spring is on the way in rural Kentucky. Will new jobs sprout like the dogwoods? Judging from the last local business event I attended, the answer is, “Yeah, maybe not so much.”

But there is a bright spot in the AI space. “ChatGPT and Other AI Startups Drive Software Engineer Demand” says:

AI technology has created many promising new opportunities for software engineers in recent years.

That certainly appears to apply to the workers in the smart software power houses and the outfits racing to achieve greater efficiency via AI. (Does “efficiency” translate to non-AI specialist job reductions?)

Back to the good news. The article asserts:

Many sectors have embraced digital transformation as a means of improving efficiency, enhancing customer experience, and staying competitive. Industries like manufacturing, agriculture, and even construction are now leveraging technologies like the Internet of Things (IoT), artificial intelligence (AI), and machine learning. Software engineers are pivotal in developing, implementing, and maintaining these technologies, allowing companies to streamline operations and harness data analytics for informed decision-making. Smart farming is just one example that has emerged as a significant trend where software engineers design applications that optimize crop yields through data analysis, weather forecasting, and resource management.

Yep, the efficiency word again. Let’s not dwell on the secondary job losses, shall we? This is a good news blog post.

The essay continues:

The COVID-19 pandemic drastically accelerated the shift towards remote work. Remote, global collaboration has opened up exciting opportunities for most professionals, but software engineers are a major driving factor of that availability in any industry. As a result, companies are now hiring engineers from anywhere in the world. Now, businesses are actively seeking tech-savvy individuals to help them leverage new technologies in their fields. The ability to work remotely has expanded the horizons of what’s possible in business and global communications, making software engineering an appealing path for professionals all over the map.

I liked the “hiring engineers from anywhere in the world.” That offers upsides, like cost savings relative to US AI staff. It also creates a downside: a remote worker might be a bad actor laboring to exfiltrate high value data after slipping past a clueless hiring process.

Also, the Covid reference, although a bit dated, reminds people that the return-to-work movement is a way to winnow staff. I assume the AI engineer will not be terminated, but those unlucky enough to be in the sights of certain DOGE and McKinsey-type consultants may not be spared.

As I said, this is a good news write up. Is it accurate? No comment. What about efficiency? Sure, fewer humans means lower costs. What about engineers who cannot or will not learn AI? Yeah, well.

Stephen E Arnold, March 20, 2025

AI: Apple Intelligence or Apple Ineptness?

March 20, 2025

Another dinobaby blog post. No AI involved, which could be good or bad depending on one’s point of view.

I read a very polite essay with some almost unreadable graphs. “Apple Innovation and Execution” says:

People have been claiming that Apple has forgotten how to innovate since the early 1980s, or longer – it’s a standing joke in talking about the company. But it’s also a question.

Yes, it is a question. Slap on your Apple goggles and look at the world from the fan boy perspective. AI is not a thing. Siri is a bit wonky. The endless requests to log in to use Facetime and other Apple services are, from an objective point of view, a bit stupid. The annual iPhone refresh. Er, yeah, now what are the functional differences again? The Apple car? Er, yeah.

Is that an innovation worm? Is that a bad apple? One possibility is that innovation worm is quite happy making an exit and looking for a better orchard. Thanks, You.com “Creative.” Good enough.

The write up says:

And ‘Apple Intelligence’ certainly isn’t going to drive a ‘super-cycle’ of iPhone upgrades any time soon. Indeed, a better iPhone feature by itself was never going to drive fundamentally different growth for Apple

So why do something which makes the company look stupid?

And what about this passage?

And the failure of Siri 2 is by far the most dramatic instance of a growing trend for Apple to launch stuff late. The software release cycle used to be a metronome: announcement at WWDC in the summer, OS release in September with everything you’d seen. There were plenty of delays and failed projects under the hood, and centres of notorious dysfunction (Apple Music, say), and Apple has always had a tendency to appear to forget about products for years (most Apple Watch faces don’t support the key new feature in the new Apple Watch) but public promise were always kept. Now that seems to be slipping. Is this a symptom of a Vista-like drift into systemically poor execution?

Some innovation worms are probably gnawing away inside the Apple. Apple’s AI. Easy to talk about. Tough to convert marketing baloney crafted by art history majors into software of value to users in my opinion.

Stephen E Arnold, March 20, 2025

AI: Meh.

March 19, 2025

It seems consumers can see right through the AI hype. TechRadar reports, “New Survey Suggests the Vast Majority of iPhone and Samsung Galaxy Users Find AI Useless—and I’m Not Surprised.” Both iPhones and Samsung Galaxy smartphones have been pushing AI onto their users. But, according to a recent survey, 73% of iPhone users and 87% of Galaxy users respond to the innovations with a resounding “meh.” Even more would refuse to pay for continued access to the AI tools. Furthermore, very few would switch platforms to get better AI features: 16.8% of iPhone users and 9.7% of Galaxy users. In fact, notes writer Jamie Richards, fewer than half of users report even trying the AI features. He writes:

“I have some theories about what could be driving this apathy. The first centers on ethical concerns about AI. It’s no secret that AI is an environmental catastrophe in motion, consuming massive amounts of water and emitting huge levels of CO2, so greener folks may opt to give it a miss. There’s also the issue of AI and human creativity – TechRadar’s Editorial Associate Rowan Davies recently wrote of a nascent ‘cultural genocide‘ as a result of generative AI, which I think is a compelling reason to avoid it. … Ultimately, though, I think AI just isn’t interesting to the everyday person. Even as someone who’s making a career of being excited about phones, I’ve yet to see an AI feature announced that doesn’t look like a chore to use or an overbearing generative tool. I don’t use any AI features day-to-day, and as such I don’t expect much more excitement from the general public.”

No, neither do we. If only investors would catch on. The research was performed by phone-reselling marketplace SellCell, which surveyed over 2,000 smartphone users.

Cynthia Murrell, March 19, 2025
