IBM Watson: Creative Re-Explaining

February 25, 2022

I read “IBM Charts New Brand Direction With Campaign Built Around Creativity.”

The article contains an interesting statement allegedly articulated by Jonathan Adashek, CCO and SVP of marketing and communications at IBM:

Adashek said IBM has historically had trouble articulating a clear and unifying purpose for a business as sprawling and multifaceted as the 110-year-old enterprise giant has become. But with business moves like the Kyndryl spinoff helping to strengthen the company’s core focus on growth areas like artificial intelligence and hybrid cloud computing, IBM decided it was time to boil down its public-facing message.

Does this mean the Watson “anti creativity” era has been left behind?

Nope. Here’s some evidence:

Ogilvy global chief creative officer Liz Taylor said the concept for the campaign evolved out of the idea that a certain type of creative thinking is central to the business projects that many IBM clients are attempting to tackle—and that the company’s range of enterprise tech and consulting services can help with that. “It really started in the sort of notion of this era of creativity is the defining currency of business,” Taylor said. “It’s not necessarily creativity in the way I might think of my job, but our audience is just increasingly responsible for creating and executing visions for how to compete in this new world.”

Yep, IBM is creative: clever contracts related to a certain nation state in the good old WW2 era, addressing cancer and then telling that effort, “You are history,” and now a type of creativity different from that delivered by Madison Avenue types.

Yep, “not necessarily creativity in the way I might think of my job,” which is to explain that IBM fuels creativity.

Logical? Not necessarily. Did you know that IBM’s creativity allowed it to acquire a Microsoft Azure consulting firm called Neudesic? Buying innovation and a revenue stream for a semi-successful cloud provider? Yes. Creative? Sure.

Stephen E Arnold, February 25, 2022

After Main Street Retail, Amazon Targets Big Blue

February 24, 2022

Amazon is making it easy to abandon mainframes for its cloud services, Data Center Knowledge reveals in “AWS Is Out to Kill Mainframes.” In other words, IBM. AWS Mainframe Modernization allows companies to transfer their operations to AWS and either morph legacy applications into Java-based services or keep existing code with few changes. The service promises to automate the process with development, testing, and deployment tools. Though some folks are still mainframe aficionados, others see those systems as decidedly out of date. Writer Max Smolaks admits mainframes excel at processing power, security, and uptime. However, he explains:

“These systems are incredibly expensive and difficult to maintain, and the pool of people qualified to deal with their legacy software is shrinking all the time. AWS has been trying to get customers off mainframes and into its data centers for years. This time, the company says it has built a runtime environment that provides all the necessary compute, memory, and storage to run both refactored and replatformed applications while automatically handling capacity provisioning, security, load balancing, scaling, and application health monitoring. Since this is all done via public cloud, there are no upfront costs, and customers only pay for the amount of compute provisioned.”
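How the hand-off works in practice is not spelled out in the article. For the curious, here is a hypothetical sketch of what provisioning such a managed runtime might look like through boto3 and the Mainframe Modernization (“m2”) API. The client name and engine types follow AWS’s published documentation, but every parameter value below is an illustrative assumption, not a tested recipe:

```python
# Hypothetical sketch: provisioning an AWS Mainframe Modernization runtime.
# Method and member names follow the published "m2" API; the values are
# invented for illustration, and the service was preview-only at this writing.
import boto3

m2 = boto3.client("m2", region_name="us-east-1")

# A managed runtime environment: AWS supplies the compute, memory, and
# storage described in the quote above.
env = m2.create_environment(
    name="legacy-batch-env",
    engineType="microfocus",      # "replatform" path: keep existing COBOL
    instanceType="M2.m5.large",   # pay only for what is provisioned
)

# The application definition points at migrated artifacts in S3
# (the bucket and file below are placeholders).
app = m2.create_application(
    name="billing-app",
    engineType="microfocus",
    definition={"s3Location": "s3://example-bucket/billing-app-def.json"},
)
print(env["environmentId"], app["applicationId"])
```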

Mainframe Modernization is not yet fully deployed across the globe, but it is available for preview in certain regions. We are reminded that the concept of remastering legacy software has been tried before:

“A similar model has been promoted by other companies, like the Swiss startup LzLabs, which has been developing a product called Software-Defined Mainframe since 2011, based on its own COBOL and Java interoperability architecture. Going in a different direction, the Open Mainframe Project founded in 2015 is attempting to make existing machines a lot more useful, by teaching them to run Linux, rather than proprietary operating systems like z/OS.”

Smolaks notes folks have been foretelling the death of the mainframe for decades now. Will this AWS initiative be the one to finally vanquish it?

Cynthia Murrell, February 24, 2022

Google Maps: A Pithy Abstract

February 23, 2022

If you want to see an example of a very interesting précis, navigate to this Twitter sequence. Google’s terms of service for Google Maps are very Googley. You can read the stipulations at this link. A Twitter user posting under the name “pzakrzewski” offers this summary:

Don’t use it.

The “it” is Google Maps.

Never fear, pzakrzewski. I am not able to figure out [a] how to find a location, [b] locate Street View, or [c] navigate to the Cuba Libre restaurant, once ostracized because the establishment did not play the Google local or Google ads game.

I agree with the spirit and intent of your excellent distillation of Googley writing.

Stephen E Arnold, February 23, 2022

Department of Defense: Troubling News about Security

February 21, 2022

It looks like a lack of resources and opaque commercial cloud providers are two factors hampering the DOD’s efforts to keep the nation cyber-safe. Breaking Defense discusses recent research from the Pentagon’s Director of Operational Test and Evaluation (DOT&E) in, “Pentagon’s Cybersecurity Tests Aren’t Realistic, Tough Enough: Report.” We encourage anyone interested in this important topic to check out the article and/or the report itself. Reporter Jaspreet Gill summarizes:

“[The report] states DoD should refocus its cybersecurity efforts on its cyber defender personnel instead of focusing primarily on the technology associated with cyber tools, networks and systems, and train them to face off against more real threats earlier in the process. For now, cybersecurity ‘Red Teams’ are stretched too thin and the ones that do test military systems are doing it with one hand tied behind their back compared to what actual adversaries would do, the report said.”

Enabling these teams to do their best work would mean giving them more time on the network to test vulnerabilities, more extensive toolsets, realistic rules of engagement, and better end-to-end planning, the report explains. In addition, it states, cybersecurity training must be expanded to include mission defense teams, system users, response-action teams, commanders, and network operators. We also learn that current funding practices effectively prohibit setting up offices dedicated to cyber technology effectiveness and training. Seriously? See the write-up for more recommendations that should be obvious.

The following bit is particularly troubling in this age of increasing privatization and corporate power. Gill informs us:

“The assessment also found DoD’s cyber concerns increasingly mirror those in the commercial sector due to increasing reliance on commercial products and infrastructure, especially with cloud services. The report recommends the Pentagon renegotiate contracts with commercial cloud providers and establish requirements for future contracts. ‘The DOD increasingly uses commercial cloud services to store highly sensitive, classified data, but current contracts with cloud vendors do not allow the DOD to independently assess the security of cloud infrastructure owned by the commercial vendor, preventing the DOD from fully assessing the security of commercial clouds. Current and future contracts must provide for threat-realistic, independent security assessments by the DOD of commercial clouds, to ensure critical data is protected.’”

Well yes—again that seems obvious. Public-private partnerships should be enacted with a dash of common sense. Unfortunately, that can be difficult to come by amidst bureaucracy.

Cynthia Murrell, February 21, 2022

Google Minus: Putting Wood Behind Confusion

February 17, 2022

I read “Google+ Is Dead Again, Maybe for Good This Time.” Here in my redoubt in rural Kentucky, the social network thing has not been a thing. We do try to keep track of some of the Googley management decisions.

The write up explains that Google+ was terminated, sort of, in 2019. Then the article notes that Google Minus became Currents, also a backwater in this here hollow. But here’s the summary of Google management’s ability to create helpful services and serve its customers:

Google said that the introduction of the Spaces group chat app last year negated the need for Currents, so it plans to wind it down starting in 2023. Before it does that, however, it will add new capabilities to Spaces to accommodate some of Currents’ social features… As was the case with Google+, the usage and purpose of Currents was likely unclear to many users. That issue extends to Spaces, as well, unfortunately. As Google described it last year, Spaces is an evolution of Rooms but is a part of Google Chat that’s designed for group messaging, much like Slack. Got it?

Actually, no. I do understand the management acumen behind these modifications. No big time revenue, no wood.

Stephen E Arnold, February 17, 2022

Coinbase and the Super Bowl: How About Seamless Scaling

February 14, 2022

I read “Crypto Exchange Coinbase’s Website Crashes After Screening QR-Code Super Bowl Ad.” Apparently the Coinbase Super Bowl ad worked. Lots of clicks made it clear that, despite the chatter about seamless scaling, the firm’s Web site fell over. But there’s good news. According to the article:

However, Coinbase Chief Product Officer Surojit Chatterjee announced Monday that the platform is now up and running.  “Coinbase just saw more traffic than we’ve ever encountered, but our teams pulled together and only had to throttle traffic for a few minutes. We are now back and ready for you at http://drops.coinbase.com. Humbled to have been witness to this,” Chatterjee said in a tweet.
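Coinbase has not said how its throttle worked; a token bucket is one garden-variety way to shed excess requests while keeping a site upright. A minimal sketch, with made-up rates:

```python
# Minimal token-bucket throttle: requests are served while tokens remain
# and rejected (throttled) once the bucket runs dry. Rates are invented
# for illustration; Coinbase has not published its actual mechanism.
import time

class TokenBucket:
    def __init__(self, rate: float, capacity: float):
        self.rate = rate              # tokens replenished per second
        self.capacity = capacity      # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True               # request served
        return False                  # request throttled

bucket = TokenBucket(rate=100.0, capacity=200.0)  # hypothetical: 100 req/s
print("served" if bucket.allow() else "throttled")
```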

Yep, the cloud is magic. I am tempted to mention the misstep related to the theft of billions in Bitcoin. You can read the wordy New Yorker explanation at this link.

Does this build confidence in cloud computing? Sure.

Stephen E Arnold, February 14, 2022

Google: Maximizing Profit via Education Policy?

February 14, 2022

A few days ago, I summarized one of the policy changes which benefit Google, put academic researchers in a budget pickle, and change the rules for certain types of university research. A year ago, Google wrote a chipper Googley blog post letting those in academia know that their institutions would have to pay up for storage after crossing the 100 terabyte boundary. You can read that Beyond Search essay at this link.

I spotted this article on the blog-alternative service Medium (as in medium, the way one orders a grilled mushroom sandwich): “Learn from Google’s Data Engineers: Dimensional Data Modeling Is Dead.” If Google wants something dead, Google delivers.

You can grind through this essay with interesting sentences like this one: “In early days of computing, storage cost a premium; as much as $90,000 in 1985.” Helpful, right?

Here’s the meatiest statement in the write up by an honest-to-goodness Googler:

The cost of 1gb of Google Cloud storage per month is just 2 cents.

Let’s assume that this metric is on the money. What about those fees Google will charge or plans to charge the colleges and universities disabused of the “free” storage? When one is faced with losing irreplaceable or uncopyable volumes of data in the time available before deletion, what does one do?

Answer: Probably pay up.

How much money will Google make? I am not sure. Why not use some downtime to figure out how the one-gigabyte-for-$0.02 math works out? Start your calculations using these helpful parameters. Be Googley, of course. And remember: Google explained what would happen a year ago in a nifty, smiley-face type essay. Helpful? For sure.
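For those who skip the downtime exercise, here is the back-of-the-envelope version using the Googler’s own two-cent figure. The overage volume is a made-up example, not a number from Google:

```python
# Back-of-the-envelope math with the quoted $0.02 per GB per month.
# The 50 TB overage is hypothetical; real campuses will vary.
price_per_gb_month = 0.02
overage_tb = 50
overage_gb = overage_tb * 1024          # 1 TB = 1,024 GB
monthly = overage_gb * price_per_gb_month
print(f"{overage_tb} TB past the 100 TB cap = ${monthly:,.0f}/month, "
      f"${monthly * 12:,.0f}/year")
# -> 50 TB past the 100 TB cap = $1,024/month, $12,288/year
```

Multiply by thousands of institutions, and the sunset of “free” storage starts to look like a tidy revenue stream.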

Will this move be profitable? Yep. Yep.

Stephen E Arnold, February 14, 2022

Microsoft: Engineering Insecurity

February 11, 2022

I read the happy words in “Former Amazon Exec Inherits Microsoft’s Complex Cybersecurity Legacy in Quest to Solve ‘One of the Greatest Challenges of our Time.’”

Bringing together existing groups from across the company, the new organization numbers 10,000 people including existing and open positions, representing more than 5% of the tech giant’s nearly 200,000 employees.

Microsoft has 200,000 employees and 10,000 of them are working to deal with the “greatest challenge” of our time. How many might be willing to share information with bad actors for cash? How many might make a coding error, plan to go back and fix it, and then forget? How many are working to deal with the security issues which keep Steve Gibson chortling when he explains a problem for a listener to the Security Now podcast?

Now that macros have been disabled, a massive security issue has been addressed. Quick action which took more than two decades to wrestle to the ground. Plus, there’s the change in what one can permit Defender to defend. This is an outstanding move for those who locate and test specialized service software. Helpful? Well, sort of.

But the big things to me are update processes, Exchange, and the MSFT fluffy clouds. For me, no answers yet.

Some of the security issues are unknown unknowns. I am not sure there is a solution, but a former Amazon executive is on a quest just like those chronicled by the noted futurist Miguel de Cervantes Saavedra, who described the antics of an individual with certain firmly held ideas about windmills.

Stephen E Arnold, February 11, 2022

Tech Giants: Are There Reasons for Complaining about Tiny Component Vendors?

February 8, 2022

I read “Tiny Chips, Big Headaches.” The write up is interesting, and it comes [a] after record earnings and [b] before the antitrust cowboys begin their roundup. I found this paragraph notable:

But there is growing anxiety that as cloud-computing networks have become larger and more complex, they are still dependent, at the most basic level, on computer chips that are now less reliable and, in some cases, less predictable. In the past year, researchers at both Facebook and Google have published studies describing computer hardware failures whose causes have not been easy to identify. The problem, they argued, was not in the software — it was somewhere in the computer hardware made by various companies.

The write up concludes that fixes are “a little bit like changing an engine while an airplane is still flying.” This statement is attributed to Gary Smerdon, a wizard at TidalScale.
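The Facebook and Google studies describe hunting for these flaky “mercurial” cores by re-running deterministic computations and flagging any disagreement. Here is a toy sketch of that screening principle; real screeners use carefully targeted instruction mixes, not a hash loop, so treat this as illustration only:

```python
# Toy illustration of the screening idea in the Google and Facebook
# silent-data-corruption studies: the same deterministic computation,
# run repeatedly on the same core, must always produce the same answer.
# Any disagreement implicates the hardware, not the software.
import hashlib

def workload(seed: int) -> str:
    # Deterministic compute kernel: identical input, identical hash.
    data = bytes((seed * i) % 256 for i in range(1_000_000))
    return hashlib.sha256(data).hexdigest()

def core_is_consistent(runs: int = 5, seed: int = 42) -> bool:
    results = {workload(seed) for _ in range(runs)}
    return len(results) == 1

if __name__ == "__main__":
    print("consistent" if core_is_consistent()
          else "possible silent corruption")
```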

Let’s step back.

The alleged technology monopolies are eager to cement their market dominance. One way to do this is to become like AMD: smart people paying other people to fabricate their silicon and assemble their gizmos. It stands to reason that really smart people like those at the tech giants want to gain control and be like Apple. Apple went its own direction and seems to have a lucrative alleged monopoly and some fascinating deals with people like a certain online advertising outfit for search.

What’s the argument for becoming more like Henry Ford’s River Rouge operation? That’s the one that ingested iron ore at one end of the facility and output automobiles at the other end. Today the raw material is user clicks, and the outputs are monetization of messages to the users or the crafting of subscription services that are tough to resist.

My take on the reasons for pointing the finger at third parties is that this is mostly about shifting blame. This method was evident when Mr. Zuckerberg said Apple’s “privacy” policy created some headwinds. Sure, the Zuckbook has other headwinds, but the point is that it is useful to focus blame elsewhere.

However, the write up advances a point which I found interesting. I want to direct your attention to one statement in the passage quoted above: “The problem… was not in the software.”

Now that is an interesting observation about software. The general rule is that software has flaws. Maybe Steve Gibson can generate “perfect” software for SpinRite, but how many at the alleged technology monopolies follow his practices? I would assert that few at the alleged technology monopolies know what his method is; in wizard logic, if certain wizards don’t know something, it clearly is not worth knowing in the first place.

How do I interpret the statement that “the problem … was not in the software”?

Hubris, thy manifestation is those who believe their software was not a problem.

Ho, ho, ho.

My concern is that presenting an argument that failures in uptime are someone else’s problem invites the conclusion, “Well, we will be more like Apple. Hasta la vista, Intel.”

Personally I don’t care what the alleged technology monopolies do. Trouble looms for these outfits regardless of the direction in which I look. What annoys me is that the Gray Lady is pretty happy telling the alleged technology monopolies’ story.

The problem is not the software. The problem is the human thing: Reformation, disinformation, and misinformation as stealth weapons in the battle for continued market dominance.

Stephen E Arnold, February 8, 2022

Data Federation? Loser. Go with a Data Lake House

February 8, 2022

I have been seeing the phrase “data lake house” or “datalake house.” I noted some bold claims about a new data lake house approach in “Managed Data Lakehouse Startup Onehouse Launches with $8M in Funding.” The write up states:

One of the flagship features of Onehouse’s lakehouse service is a technology called incremental processing. It allows companies to start analyzing their data soon after it’s generated, which is difficult when using traditional technologies.

The write up adds:

The company’s lakehouse service automatically optimizes customers’ data ingestion workflows to improve performance, the startup says. Because the service is delivered via the cloud on a fully managed basis, customers don’t have to manage the underlying infrastructure.
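Onehouse does not publish its internals, but its founder created Apache Hudi, and “incremental processing” is a core Hudi feature: a job reads only the commits that arrived after a checkpoint instead of rescanning the whole table. A hedged PySpark sketch; the table path and begin timestamp are hypothetical:

```python
# Incremental read in Apache Hudi (the open source project behind
# Onehouse's service): pull only records committed after a checkpoint.
# The S3 path and instant time below are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("incremental-read").getOrCreate()

new_rows = (
    spark.read.format("hudi")
    .option("hoodie.datasource.query.type", "incremental")
    .option("hoodie.datasource.read.begin.instanttime", "20220207000000")
    .load("s3://example-lake/orders")
)
new_rows.createOrReplaceTempView("new_orders")
spark.sql("SELECT COUNT(*) AS fresh_rows FROM new_orders").show()
```

Whether the managed service adds more than this open source capability is presumably what the $8 million is meant to demonstrate.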

The idea, of course, is that traditional methods of handling data are [a] slow, [b] expensive, and [c] difficult to implement.

The premise is that the data lake house delivers more efficient use of data and a way to “future proof the data architected for machine learning / data science down the line.”

When I read this, I thought of Vivisimo’s explanation of its federating method. IBM bought Vivisimo, and I assume that it is one of the ingredients in IBM’s secret big data sauce. MarkLogic (once eyed by the Google as a possible acquisition) also suggested in one presentation I sat through that its system would ingest data and allow near real-time access to that data. One person in the audience was affiliated with the US Library of Congress, and that individual seemed quite enthused about MarkLogic. And there are companies which facilitate data manipulation; for example, Kofax and its data connectors.

From my point of view, the challenge is that today large volumes of data are available. These data have to be moved from point A to point B. Ideally the data do not require transformation. At some point in the flow, data in motion can be processed. There are firms which offer real-time or near real-time data analytics; for example, Trendalyze.com.

Conversion, moving, saving, and then doing something “more” with the data remain challenges. Maybe Onehouse has the answer?

Stephen E Arnold, February 8, 2022
