Some Happy, Some Sad in Seattle Over Cloud Deal Review

July 12, 2018

I know little about the procurement skirmishes fought over multi-billion-dollar deals for cloud services. The pragmatic part of my experience suggests that the last thing most statement-of-work and contract processes produce is efficient, cost-effective contracts. Quite a few COTRs, lawyers, super grades, and mere SETAs depend on three things:

  1. Complex, lengthy processes; that is, work-producing tasks
  2. Multiple vendors; for example, how many databases does one agency need? Answer: Many, many databases. Believe me, there are many great reasons ranging from the way things work in Washington to legacy systems which will never be improved in my lifetime.
  3. Politics. Ah, yes, lobbyists, special interests, friends of friends, and sometimes the fact that a senior official knows that a person once worked at a specific outfit.

When I read, “Deasy Pauses on JEDI Cloud Acquisition,” I immediately thought about the giant incumbent database champions like IBM Federal Systems and Oracle’s government operations unit.

Department of Defense CIO Dana Deasy wants a “full top down, bottom up review” of the JEDI infrastructure acquisition.

But there was a moment of reflection when I realized that this procurement tussle will have a significant impact on the Seattle area. You know, Seattle, the city which has delivered Microsoft Bob and the Amazon mobile phone.

Microsoft and Amazon are in the cloud business. Microsoft is the newcomer, but it is the outfit which has the desktops of many government agencies. Everyone loves SharePoint. The Department of Defense could not hold a briefing without PowerPoint.

Let’s not forget Amazon. That is the platform used by most government workers, their families, and possibly their friends if that Amazon account slips into the wild. Who could exist in Tysons Corner or Gaithersburg without Amazon delivering essential foods such as probiotic supplements for the dog?

Microsoft is probably thrilled that the JEDI procurement continues to be a work in progress. Amazon, on the other hand, is likely to be concerned that its slam dunk in the government cloud game has been halted by procedural thunderstorms.

Thus, part of Seattle is really happy. Another part of Seattle is not so happy.

Since I don’t have a dog in this fight, my hunch is that little in Washington, DC, changes from administration to administration.

But this Seattle dust up will be interesting to watch. I think it will have a significant impact on Amazon and Microsoft. IBM Federal Systems and Oracle will be largely unscathed.

Exciting procurement activity is underway. Defense Department CIO Dana Deasy’s promise of a “full top down, bottom up review” sounds like the words to a song I have heard many times.

With $10 billion in play, how long will that review take? My hunch is that it will introduce the new CIO to a new concept, “government time.”

Stephen E Arnold, July 12, 2018

Google Cloud: Dissipating with a Chance for Unsettled Weather

July 4, 2018

I love Google. It’s relevant. I am not sure the folks at CNBC share my enthusiasm. Navigate to “Google Cloud’s COO Has Left after Less Than a Year.” To be exact, I think Diane Bryant, Google Cloud Chief Operating Officer, was a Googler for about 13 months. In Internet dog years, that’s a long time, is it not? Maybe not? Here’s a different employment number: Seven months.

I highlighted this passage:

Bryant’s hire was a win for the search giant’s cloud business, which is widely seen as No. 3 in the public cloud market, behind Amazon and Microsoft. As the relative newcomer in the space, Google Cloud’s challenge has been to prove its capabilities to large businesses, though Greene has said that there are no more “deal blockers” in the way of new contracts.

Fact, snark, digital corned beef hash?

I don’t know. I continue to wonder if Alphabet Google’s approach to management is going to allow the company to keep pace with and then surpass the Bezos buck machine.

I will be reviewing my Amazon research at the September Telestrategies ISS LE and intelligence conference in Washington, DC. I will focus on both management and technical tactics.

I am not sure there will be a reference to Google until I have a sense that it is managed for sustainable innovation, in the cloud and on the ground, as it were.

Stephen E Arnold, July 4, 2018

Munich Migrates To Windows 10

March 28, 2018

Despite the superiority of other operating systems, Microsoft Windows still tops the list as a popular enterprise office tool.  Windows PCs have easy user interfaces and applications for easy management, and they are understood at a near universal level.  It did not come as a surprise when Munich, Germany, decided to migrate to Windows 10, reports Silicon in “Munich Approves 49.3 Million Euro Windows 10 Migration Plan.”

Munich’s city council decided to spend nearly 50 million euros to migrate its computer systems to Microsoft Windows 10.  This is the council’s first major overhaul since 2004, when it began implementing a Linux desktop program.  Linux was the open source software of choice, adopted to reduce the city’s dependency on Microsoft.

The “LiMux” programme saw a customised version of Ubuntu Linux rolled out to about 14,800 of the city’s 29,000 users and LibreOffice used by more than 15,000, in a bid to reduce the government’s dependence upon Microsoft.  In 2012 then-mayor Christian Ude said LiMux had saved Munich more than €4m in licensing costs.  The rollout was completed in 2013, nearly 10 years after it began, but a political shift the following year saw leadership turn in favour of a return to Windows.

The transition back to Microsoft comes with a change in the city council’s leadership.  Mayor Dieter Reiter pushed for a return to Microsoft licenses, and he won.  The Windows migration cost of over 49 million euros is only part of the 89 million euro IT overhaul that is in progress, an overhaul which also includes retraining and testing staff.

The Munich city council will not be migrating to Microsoft Office, which would incur an even higher price tag.  Munich will instead continue to use LibreOffice because of the staff’s familiarity with it and its custom templates.  The city council also hopes to move toward cloud applications.

As with anything related to politics, opposing parties are critical of the return to Microsoft and say it wastes money.  Nothing new on that end, and the squabbling points to organizational problems that run deeper than the choice of operating system.

Whitney Grace, March 28, 2018

IBM Releases Power9 AI and Machine Learning Chip

February 12, 2018

Make no mistake, the new AI processor from IBM has Watson written all over it, but it does move the software into new territory. We get a glimpse from the brief write-up, “IBM Has a New Chip for AI and Machine Learning,” at IT Pro Portal. The new chip, dubbed Power9, is now available through IBM’s cloud portal and through third-party vendors and is built into the new AC922 platform. (See here for a more detailed discussion of both Power9 and the AC922.) Writer Sead Fadilpašić quotes market analyst Patrick Moorhead, who states:

Power9 is a chip which has a new systems architecture that is optimized for accelerators used in machine learning. Intel makes Xeon CPUs and Nervana accelerators and NVIDIA makes Tesla accelerators. IBM’s Power9 is literally the Swiss Army knife of ML acceleration as it supports an astronomical amount of IO and bandwidth, 10X of anything that’s out there today.

That is strong praise. Fadilpašić also quotes IBM’s Brad McCredie, who observes:

Modern workloads are becoming accelerated and the Nvidia GPU is a common accelerator. We have seen this trend coming. We built a deep relationship with them and a partnership between the Power system and the GPU. We have a unique bus that runs between the processor and the GPU and has 10x peak bandwidth over competitive systems.

Will the Power9 live up to its expectations? We suspect IBM has reason to hope for success here.

Cynthia Murrell, February 12, 2018

Moving Legacy Apps to the Cloud: Failure Looms

January 25, 2018

I read “How to Move Legacy Applications to the Cloud.” The write-up is interesting because, in my opinion, what’s offered will guarantee failure. As I recall from English 101, taught by a certain PhD who believed in formulaic writing, a “how to” consists of a series of steps. These should be explained in detail and, if possible, illustrated with word pictures or diagrams. Get it wrong. Get an F.

This “How to Move Legacy” essay would probably warrant a rewrite.

Here’s an example of a real-life situation.

A small company has a mainframe running a COBOL application to keep track of products. The company has a client-server system to keep track of money. The company has a Windows system to provide data to the company’s Web site, which resides on its ISP’s system.

Okay, read this article and tell me how to move this real-life system to the cloud.

Telling me to pick a cloud provider which meets my needs is not a how to, gentle reader. It is hoo-hah which would make an English 101 instructor fret.
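For contrast, here is a minimal sketch, in Python, of what even a first-pass inventory of that hypothetical shop might look like before anyone picks a provider. Every name, field, and value is invented for illustration:

    # First-pass migration inventory for the hypothetical three-system shop.
    # All names, fields, and values are invented for illustration.
    systems = [
        {"name": "product tracking", "platform": "mainframe COBOL",
         "docs": "none found", "original_developer_available": False},
        {"name": "accounting", "platform": "client-server",
         "docs": "partial", "original_developer_available": False},
        {"name": "web data feed", "platform": "Windows, pushes to ISP-hosted site",
         "docs": "sparse", "original_developer_available": True},
    ]

    for s in systems:
        blockers = []
        if s["docs"] in ("none found", "sparse"):
            blockers.append("documentation gap")
        if not s["original_developer_available"]:
            blockers.append("reverse engineering required")
        print(f'{s["name"]}: {", ".join(blockers) or "no obvious blockers"}')

Even this trivial exercise surfaces blockers; a genuine how-to would have to address each one.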

Some companies cannot move legacy applications to the cloud. In my experience there are several reasons:

First, the person who coded the legacy system may not be around anymore and reverse engineering a legacy system may not be something the current IT staff and its consultants are keen to tackle.

Second, figuring out how three working systems actually function takes time and money. Note the money part. Documentation is often sparse. The flow diagrams are long gone. Why spend the money? I would love to hear the reasons from the soon-to-be-terminated project manager.

Third, savvy managers ask, “What’s the payoff?” Marketing generalizations should not be the “facts” marshaled to fund a cloud migration effort. If there are good reasons, these can be quantified or backed up with verifiable information, not opinions from a vendor.
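To make “quantified” concrete, here is a minimal sketch of the payback arithmetic a savvy manager might demand. All figures are hypothetical, not drawn from the article:

    # Hypothetical payback calculation for a cloud migration.
    # Every figure below is invented for illustration.
    migration_cost = 850_000           # one-time: re-engineering, testing, retraining
    current_annual_run_cost = 400_000  # mainframe + client-server + hosting today
    cloud_annual_run_cost = 310_000    # projected cloud spend after migration

    annual_savings = current_annual_run_cost - cloud_annual_run_cost
    if annual_savings <= 0:
        print("No payback: the cloud option costs more to run.")
    else:
        print(f"Payback period: {migration_cost / annual_savings:.1f} years")

If a vendor cannot fill in numbers like these with verifiable data, the “payoff” is marketing, not a business case.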

Articles which explain how to move legacy systems without facts, details, and a coherent plan of attack are not too helpful here in Harrod’s Creek.

Stephen E Arnold, January 25, 2018

FNaS Or Fake News As A Service Is Now A Thing

January 24, 2018

The above acronym “FNaS” is our own invention for “fake news as a service”; if you did not catch on, it is a play on SaaS, or software as a service.  We never thought that this was a possible job, but someone saw a niche and filled it.  Unhinged Group’s article “Fake News ‘As A Service’ Booming Among Cybercrooks” describes how this has become a new market for ne’er-do-wells.  It does make sense that fake news would be a booming business, because there are many organizations and people who want to take advantage of the public’s gullibility.

This is especially true for political and religious groups, which have strong incentives to sway those in power.  Digital Shadows, however, conducted a research survey and discovered that fake news services are also hired to damage reputations and cause financial distress for organizations through disinformation campaigns.

How does this work?

The firm’s research stated that these services are often associated with “Pump and Dump” scams, schemes that aggressively promote penny stocks to inflate their prices before the inevitable crash and burn. Scammers buy low, hope that their promotions let them sell high, then flee with their loot and little regard for other investors.

A cryptocurrency variant of the same schemes has evolved and involves gradually purchasing major shares in altcoin (cryptocurrencies other than Bitcoin) and drumming up interest in the coin through posts on social media. The tool then trades these coins between multiple accounts, driving the price up, before selling to unsuspecting traders on currency exchanges looking to buy while prices are still rising.

Analysis of one “Pump and Dump” service discovered that it made the equivalent of $326,000 for its ne’er-do-well operators in less than two months.  Even worse, Digital Shadows found more than ten services that sell social media bot software for as little as $7.
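A toy model suggests why the arithmetic is attractive to the crooks. The linear “each wave of promotion bumps the price” assumption and every number below are invented for illustration:

    # Toy model of the "pump and dump" arithmetic described above.
    # The 25% bump per promotion wave and all figures are invented.
    coins_bought = 100_000   # altcoins accumulated quietly
    entry_price = 0.02       # dollars per coin at entry
    price = entry_price

    # Hype posts and wash trades between the scammer's own accounts
    # drive the quoted price upward, wave by wave.
    for _ in range(10):
        price *= 1.25

    profit = coins_bought * (price - entry_price)
    print(f"Exit price: ${price:.4f}; paper profit: ${profit:,.0f}")

The profit, of course, comes out of the pockets of the unsuspecting traders who buy at the top.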

It is not difficult to create a fake “legitimate” news site.  All it takes is a fake domain, cloning services, and backlinking to exploit these fake news stories.  Legitimate news outlets and retailers are also targets.  Anyone and anything can be a target.

Whitney Grace, January 24, 2018

We Are Without a Paddle on Growing Data Lakes

January 18, 2018

The pooling of big data is commonly known as a “data lake.” While this technique was first met with excitement, it is beginning to look like a problem, as we learned in a recent InfoWorld story, “Use the Cloud to Create Open, Connected Data Lakes for AI, Not Data Swamps.”

According to the story:

A data scientist will quickly tell you that the data lake approach is a recipe for a data swamp, and there are a few reasons why. First, a good amount of data is often hastily stored, without a consistent strategy in place around how to organize, govern and maintain it. Think of your junk drawer at home: Various items get thrown in at random over time, until it’s often impossible to find something you’re looking for in the drawer, as it’s gotten buried.

This disorganization leads to the second problem: users are often not able to find the dataset once ingested into the data lake.

So, how does one take aggregate data from a stagnant swamp to a lake one can traverse? According to Scientific Computing, the secret lies in separating the search function into two pieces: finding and searching. Combine this thinking with InfoWorld’s logic of using the cloud, and suddenly these massive swamps can be drained.
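A minimal sketch of the idea, assuming metadata is recorded the moment a dataset enters the lake. The function names and datasets below are invented for illustration:

    # Minimal sketch: register metadata at ingestion so datasets stay findable.
    catalog = {}  # dataset id -> governance metadata

    def ingest(dataset_id, owner, description, tags):
        """Record owner, description, and tags as the data lands in the lake."""
        catalog[dataset_id] = {"owner": owner,
                               "description": description,
                               "tags": set(tags)}

    def find(tag):
        """'Finding': a cheap metadata lookup, no content scan required."""
        return [d for d, meta in catalog.items() if tag in meta["tags"]]

    ingest("sales_2017_raw", "finance", "Raw point-of-sale exports", ["sales", "raw"])
    ingest("sales_2017_clean", "analytics", "Deduplicated sales data", ["sales", "curated"])
    print(find("sales"))  # ['sales_2017_raw', 'sales_2017_clean']

Expensive content searching then runs only against the datasets the cheap metadata lookup turns up, which is one way to read the “finding and searching” split.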

Patrick Roland, January 18, 2018

Amazon Cloud Injected with AI Steroids

January 17, 2018

Amazon, Google, and Microsoft are huge cloud computing rivals.  Amazon wants to keep up with the competition, says Fortune in the article “Amazon Reportedly Beefing Up Cloud Capabilities In The Cloud.”  Amazon is “beefing up” its cloud performance by injecting it with more machine learning and artificial intelligence.  The online retail giant is doing this by teaming up with AI-focused startups Domino Data Lab and DataRobot.

Individuals mostly use cloud computing for backups and the ability to access their files from anywhere.  Businesses use it to run their applications and store data, but as cloud computing becomes more standard, they want to run machine learning tasks and big data analysis as well.

Amazon’s new effort is code-named Ironman and is aimed at completing tasks for companies focused on insurance, energy, fraud detection, and drug discovery, The Information reported. The services will be offered to run on graphic processing chips made by Nvidia as well as so-called field programmable gate array chips, which can be reprogrammed as needed for different kinds of software.

Nvidia and other high-performance chip manufacturers such as Advanced Micro Devices and Intel are ecstatic about the competition because it means more cloud operators will purchase their products.  Amazon Web Services is one of Amazon’s fastest-growing businesses and continues to bring in profits.

Whitney Grace, January 17, 2018

Cloud Computing Resources: Cost Analysis for Machine Learning

December 8, 2017

Information about the cost of performing a specific task in a cloud computing setup can be tough to get. Reliable cross-platform, apples-to-apples cost analyses are even more difficult to obtain.

A tip of the hat to the author of “Machine Learning Benchmarks: Hardware Providers.” The article includes some useful data about the costs of performing tasks on the cloud services available from Amazon, Google, Hetzner, and IBM.

My suggestion is to make a copy of the article.

The big surprise: Amazon was the high-cost service. Google is less expensive.

One downside: No Microsoft costs.
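For readers who want to reproduce this kind of comparison themselves, here is a minimal sketch of the apples-to-apples arithmetic. The provider names, hourly rates, and run times are placeholders, not the article’s figures:

    # Normalize provider pricing to cost per benchmark run.
    # Rates and run times are placeholders, not the article's data.
    providers = {
        "Provider A": {"rate_per_hour": 3.06, "hours_per_run": 4.0},
        "Provider B": {"rate_per_hour": 2.48, "hours_per_run": 4.5},
        "Provider C": {"rate_per_hour": 1.90, "hours_per_run": 6.0},
    }

    def cost_per_run(p):
        return p["rate_per_hour"] * p["hours_per_run"]

    for name, p in sorted(providers.items(), key=lambda kv: cost_per_run(kv[1])):
        print(f"{name}: ${cost_per_run(p):.2f} per benchmark run")

Note that the lowest hourly rate does not automatically win; a slower machine can cost more per completed task, which is why per-task numbers beat per-hour numbers.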

Stephen E Arnold, December 8, 2017

Healthcare Analytics Projected to Explode

November 21, 2017

There are many factors influencing the growing demand for healthcare analytics: pressure to lower healthcare costs, demand for more personalized treatment, the emergence of advanced analytic technology, and the impact of social media.  PR Newswire takes a look at how the market is expected to explode in the article “Healthcare Analytics Market To Grow At 25.3% CAGR From 2013 To 2024: Million Insights.”  Other important factors that influence healthcare costs are errors in medical products, workflow shortcomings, and, possibly the biggest, the need for cost-effective measures that do not compromise care.
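As a quick sanity check on what that headline growth rate implies, compounding 25.3 percent annually over the 2013 to 2024 window works out to roughly a twelvefold increase in market size:

    # What a 25.3% CAGR over 2013-2024 implies for market size.
    cagr = 0.253
    years = 2024 - 2013  # 11 years
    multiple = (1 + cagr) ** years
    print(f"Implied growth multiple: {multiple:.1f}x")  # about 12x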

Analytics are supposed to be able to help and/or influence all of these issues:

Based on the component, the global healthcare analytics market is segmented into services, software, and hardware. Services segment held a lucrative share in 2016 and is anticipated to grow steady rate during the forecast period. The service segment was dominated by the outsourcing of data services. Outsourcing of big data services saves time and is cost effective. Moreover, Outsourcing also enables access to skilled staff thereby eliminating the requirement of training of staff.

Cloud-based delivery is anticipated to grow and become the most widespread analytics platform for healthcare.  It allows remote access, avoids complicated infrastructure, and offers real-time data tracking.  Adopting analytics platforms helps curb the rising problems, from cost to workforce to treatment, that the healthcare industry faces now and will face in the future.  While these systems are being implemented, the harder part is making sure workers are correctly trained to use them.

Whitney Grace, November 21, 2017
