IBM Releases Power9 AI and Machine Learning Chip

February 12, 2018

Make no mistake, the new AI processor from IBM has Watson written all over it—but it does move the software into new territory. We get a glimpse from the brief write-up, “IBM Has a New Chip for AI and Machine Learning” at IT Pro Portal. The new chip, dubbed Power9, is now available through IBM’s cloud portal and through third-party vendors and is built into the new AC922 platform. (See here for a more detailed discussion of both Power9 and the AC922.) Writer Sead Fadilpašić quotes market analyst Patrick Moorhead, who states:

Power9 is a chip which has a new systems architecture that is optimized for accelerators used in machine learning. Intel makes Xeon CPUs and Nervana accelerators and NVIDIA makes Tesla accelerators. IBM’s Power9 is literally the Swiss Army knife of ML acceleration as it supports an astronomical amount of IO and bandwidth, 10X of anything that’s out there today.

That is strong praise. Fadilpašić also quotes IBM’s Brad McCredie, who observes:

Modern workloads are becoming accelerated and the Nvidia GPU is a common accelerator. We have seen this trend coming. We built a deep relationship with them and a partnership between the Power system and the GPU. We have a unique bus that runs between the processor and the GPU and has 10x peak bandwidth over competitive systems.

Will the Power9 live up to its expectations? We suspect IBM has reason to hope for success here.

Cynthia Murrell, February 12, 2018

Moving Legacy Apps to the Cloud: Failure Looms

January 25, 2018

I read “How to Move Legacy Applications to the Cloud.” The write up is interesting because in my opinion what’s offered will guarantee failure. As I recall from English 101 taught by a certain PhD who believed in formulaic writing, a “how to” consists of a series of steps. These should be explained in detail and, if possible, illustrated with word pictures or diagrams. Get it wrong. Get an F.

This “How to Move Legacy” essay would probably warrant a rewrite.

Here’s an example of a real life situation.

A small company has a mainframe running a Cobol application to keep track of products. The company has a client server system to keep track of money. The company has a Windows system to provide data to the company’s Web site which resides on its ISP’s system.

Okay, read this article and tell me how to move this real life system to the cloud.

Telling me to pick a cloud provider which meets my needs is not a how to, gentle reader. It is hoo-hah which would make an English 101 instructor fret.

Some companies cannot move legacy applications to the cloud. In my experience there are several reasons:

First, the person who coded the legacy system may not be around anymore and reverse engineering a legacy system may not be something the current IT staff and its consultants are keen to tackle.

Second, figuring out how three working systems interact takes time and money. Note the money part. Documentation is often sparse. The flow diagrams are long gone. Why spend the money? I would love to hear the reasons from the soon-to-be-terminated project manager.

Third, savvy managers ask, “What’s the payoff?” Marketing generalizations should not be the “facts” marshaled to fund a cloud migration effort. If there are good reasons, these can be quantified or backed up with verifiable information, not opinions from a vendor.

Articles which explain how to move legacy systems without facts, details, and a coherent plan of attack are not too helpful here in Harrod’s Creek.

Stephen E Arnold, January 25, 2018


FNaS Or Fake News As A Service Is Now A Thing

January 24, 2018

The above acronym “FNaS” is our own invention for “fake news as a service”; if you did not catch on, it is a play on SaaS, or software as a service.  We never thought this was a possible job, but someone saw a niche and filled it.  The Unhinged Group article “Fake News ‘As A Service’ Booming Among Cybercrooks” describes this new market for ne’er-do-wells.  It does make sense that fake news would be a booming business, because there are many organizations and people who want to take advantage of the public’s gullibility.

This is especially true for political and religious actors who want to sway public opinion.  Digital Shadows, however, conducted a research survey and discovered that fake news services are hired to damage reputations and cause financial distress for organizations through disinformation campaigns.

How does this work?

The firm’s research stated that these services are often associated with “Pump and Dump” scams, schemes that aggressively promote penny stocks to inflate their prices before the inevitable crash and burn. Scammers buy low, hope that their promotions let them sell high, then flee with their loot and little regard for other investors.


A cryptocurrency variant of the same schemes has evolved and involves gradually purchasing major shares in altcoin (cryptocurrencies other than Bitcoin) and drumming up interest in the coin through posts on social media. The tool then trades these coins between multiple accounts, driving the price up, before selling to unsuspecting traders on currency exchanges looking to buy while prices are still rising.
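The wash-trading mechanic described above can be sketched as a toy simulation (all account names, prices, and the markup are hypothetical, not figures from the Digital Shadows research):

```python
# Toy illustration of the wash-trading mechanic described above:
# a manipulator trades a thinly traded coin between accounts it
# controls, so each self-trade prints a slightly higher price.

def wash_trade(start_price, accounts, rounds, markup=1.05):
    """Return the tape of printed prices after `rounds` self-trades."""
    tape = [start_price]
    price = start_price
    for i in range(rounds):
        seller = accounts[i % len(accounts)]
        buyer = accounts[(i + 1) % len(accounts)]
        price = round(price * markup, 6)  # buy your own coin at a markup
        tape.append(price)
        print(f"{seller} -> {buyer} at {price}")
    return tape

tape = wash_trade(0.01, ["acct_a", "acct_b", "acct_c"], rounds=10)
```

No real demand is involved; the rising tape is the bait for the outside buyers who get dumped on at the end.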

Analysis of one “Pump and Dump” service found that it generated the equivalent of $326,000 for its ne’er-do-well operators in less than two months.  Even worse, Digital Shadows found more than ten services that sell social media bot software for as low as $7.

It is not difficult to create a fake “legitimate” news site.  All it takes is a fake domain, cloning services, and backlinking to amplify the fake news stories.  Genuine news outlets and retailers are also targets.  Anyone and anything can be a target.

Whitney Grace, January 24, 2018

We Are Without a Paddle on Growing Data Lakes

January 18, 2018

The pooling of big data is commonly known as a “data lake.” While this technique was first met with excitement, it is beginning to look like a problem, as we learned in a recent InfoWorld story, “Use the Cloud to Create Open, Connected Data Lakes for AI, Not Data Swamps.”

According to the story:

A data scientist will quickly tell you that the data lake approach is a recipe for a data swamp, and there are a few reasons why. First, a good amount of data is often hastily stored, without a consistent strategy in place around how to organize, govern and maintain it. Think of your junk drawer at home: Various items get thrown in at random over time, until it’s often impossible to find something you’re looking for in the drawer, as it’s gotten buried.

This disorganization leads to the second problem: users are often not able to find the dataset once ingested into the data lake.

So, how does one take aggregated data from a stagnant swamp to a lake one can traverse? According to Scientific Computing, the secret lies in separating the search function into two pieces: finding and searching. When you combine this thinking with InfoWorld’s logic of using the cloud, suddenly these massive swamps can be drained.
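The junk-drawer problem quoted above is, at bottom, a missing-metadata problem. One common remedy is to attach a small catalog record to every dataset at ingest time, so that “finding” can happen before any expensive searching. A minimal sketch (the field names and datasets are our own assumptions, not from the article):

```python
import time

catalog = {}  # dataset name -> metadata record

def ingest(name, source, owner, schema, tags):
    """Register a dataset's metadata at ingest time so it can be found later."""
    catalog[name] = {
        "source": source,
        "owner": owner,
        "schema": schema,          # column name -> type
        "tags": sorted(tags),
        "ingested_at": time.strftime("%Y-%m-%d"),
    }

def find(tag):
    """'Finding': narrow candidates by tag before any deeper search."""
    return [name for name, meta in catalog.items() if tag in meta["tags"]]

ingest("web_clicks_2018", "weblogs", "analytics",
       {"url": "str", "ts": "int"}, {"clickstream", "raw"})
ingest("crm_accounts", "salesforce", "sales",
       {"id": "int", "name": "str"}, {"customers"})
print(find("clickstream"))  # -> ['web_clicks_2018']
```

Without the `ingest` step, the lake still holds the data; it just cannot answer the `find` question, which is the swamp condition the article warns about.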

Patrick Roland, January 18, 2018


Amazon Cloud Injected with AI Steroids

January 17, 2018

Amazon, Google, and Microsoft are huge cloud computing rivals.  Amazon wants to keep up with the competition, says Fortune, in the article “Amazon Reportedly Beefing Up Cloud Capabilities In The Cloud.”  Amazon is “beefing up” its cloud performance by injecting it with more machine learning and artificial intelligence.  The online retail giant is doing this by teaming up with AI-based startups Domino Data Lab and DataRobot.

Individuals mostly use cloud computing for backups and for access to their files from anywhere.  Businesses use it to run their applications and store data, but as cloud computing becomes more standard, they also want to run machine learning tasks and big data analysis.

Amazon’s new effort is code-named Ironman and is aimed at completing tasks for companies focused on insurance, energy, fraud detection, and drug discovery, The Information reported. The services will be offered to run on graphic processing chips made by Nvidia as well as so-called field programmable gate array chips, which can be reprogrammed as needed for different kinds of software.

Nvidia and other high-performance chip manufacturers, such as Advanced Micro Devices and Intel, are ecstatic about the competition because it means more cloud operators will purchase their products.  Amazon Web Services is one of the company’s fastest growing units and continues to bring in the profits.

Whitney Grace, January 17, 2018

Cloud Computing Resources: Cost Analysis for Machine Learning

December 8, 2017

Information about the cost of performing a specific task in a cloud computing set up can be tough to get. Reliable cross platform, apples-to-apples cost analyses are even more difficult to obtain.

A tip of the hat to the author of “Machine Learning Benchmarks: Hardware Providers.” The article includes some useful data about the costs of performing tasks on the cloud services available from Amazon, Google, Hetzner, and IBM.

My suggestion is to make a copy of the article.

The big surprise: Amazon was the high-cost service; Google was less expensive.
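The apples-to-apples arithmetic behind comparisons like the article’s is simple: cost per task is the hourly instance rate times the wall-clock hours the benchmark takes on that instance. The rates below are hypothetical placeholders, not the article’s figures:

```python
# Hypothetical per-hour GPU instance rates (placeholders, not real prices).
rates = {"provider_a": 3.06, "provider_b": 2.48, "provider_c": 1.90}

def cost_per_task(rate_per_hour, wall_clock_hours):
    """Cost of one benchmark run: hourly rate times wall-clock time."""
    return rate_per_hour * wall_clock_hours

hours = 4.0  # the same benchmark workload run on every provider
for name, rate in sorted(rates.items(), key=lambda kv: kv[1] * hours):
    print(f"{name}: ${cost_per_task(rate, hours):.2f}")
```

The catch, as the article implies, is the wall-clock term: the same model can take different numbers of hours on different hardware, so a cheaper hourly rate does not automatically mean a cheaper task.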

One downside: No Microsoft costs.

Stephen E Arnold, December 8, 2017

Healthcare Analytics Projected to Explode

November 21, 2017

There are many factors influencing the growing demand for healthcare analytics: pressure to lower healthcare costs, demand for more personalized treatment, the emergence of advanced analytic technology, and the impact of social media.  PR Newswire takes a look at how the market is expected to explode in the article “Healthcare Analytics Market To Grow At 25.3% CAGR From 2013 To 2024: Million Insights.”  Other important factors that influence healthcare costs are errors in medical products, workflow shortcomings, and, possibly the biggest, finding cost-effective measures that do not compromise care.

Analytics are supposed to be able to help and/or influence all of these issues:

Based on the component, the global healthcare analytics market is segmented into services, software, and hardware. Services segment held a lucrative share in 2016 and is anticipated to grow steady rate during the forecast period. The service segment was dominated by the outsourcing of data services. Outsourcing of big data services saves time and is cost effective. Moreover, Outsourcing also enables access to skilled staff thereby eliminating the requirement of training of staff.

Cloud-based delivery is anticipated to grow and become the most widespread analytics platform for healthcare.  It allows remote access, avoids complicated infrastructure, and offers real-time data tracking.  Adopting analytics platforms helps curb the rising problems, from cost to workforce to treatment, that the healthcare industry faces now and will face in the future.  While these systems are being implemented, the harder part is determining how readily workers can be correctly trained to use them.

Whitney Grace, November 21, 2017

MongoDB Position upon Filing IPO

November 9, 2017

This article at Datamation, “MongoDB’s Mongo Moment,” suggests MongoDB is focused on the wrong foe. As the company filed for its $100 million IPO, its CEO Dev Ittycheria observed that competitor Oracle is “vulnerable” because it has lost appeal to developers. However, writer Matt Asay asserts developers never were very fond of Oracle, and that MongoDB’s real competition is AWS (Amazon Web Services). He posits:

As mentioned, however, the real question isn’t about MongoDB’s impact on Oracle, any more than MySQL had a material impact on Oracle. No, the question is how relevant MongoDB is to the growing population of modern applications. Quite simply: this is where the action is. As VMware executive (and former MongoDB executive) Jared Rosoff reasons, ‘Old workloads grew one database server at a time. New workloads add tens or hundreds of servers at a time.’

Indeed, as MongoDB vice president of cloud products Sahir Azam told me in an interview, ‘We see a higher percentage of legacy RDBMS moving to MongoDB. Tens of billions of spend that has traditionally gone to Oracle and other SQL vendors is now moving to open source RDBMS and MongoDB with app refreshes and new apps.’

Mongo has a significant advantage over AWS, writes Asay, in the flexibility it offers developers. He also notes the increased spending power developers are now enjoying within enterprises should help the company. One potential pitfall—Mongo spends way too much on marketing, which could cause investors to shy away. On the whole, however, Asay believes MongoDB is navigating a shifting market wisely. See the article for more on the company’s approach and some criticisms it has received. Founded in 2007, MongoDB is based in New York City and employs over 800 workers in locations around the world.

Cynthia Murrell, November 9, 2017

Big Data Less Accessible for Small and Mid-Size Businesses

October 31, 2017

Even as the term “Big Data” grows stale, small and medium-sized businesses (SMBs) are being left behind in today’s data-driven business world. The SmartData Collective examines the issue in “Is Complexity Strangling the Real-World Benefits of Big Data for SMB’s?” Writer Rehan Ijaz supplies this example:

Imagine a local restaurant chain fighting to keep the doors open as a national competitor moves into town. The national competitor will already have a competent Cloud Data Manager (CDM) in place to provide insight into what should be offered to customers, based on their past interactions. A multi-million-dollar technology is affordable, due to scale, for a national chain. The same can’t be said for a smaller, mom and pop type restaurant. They’ve relied on their gut instinct and hometown roots to get them this far, but it may not be enough in the age of Big Data. Large companies are using their financial muscle to get information from large data sets, and take targeted action to outmaneuver local competitors.

Pointing to an article from Forbes, Ijaz observes that the main barrier for these more modestly sized enterprises is not any hesitation about the technology itself, but rather a personnel issue—their existing marketing employees were not hired for their IT prowess, and even the most valuable data requires analysis to be useful. Few SMBs are eager to embrace the cost and disruption of hiring data scientists and reorganizing their marketing teams; they have to be sure it will be worth the trouble.

Ijaz hopes that the recent increase in scalable, cloud-based analysis solutions will help SMBs with these challenges. The question, he notes, is whether it is too late for many SMBs to recover from their late foray into Big Data.

Cynthia Murrell, October 31, 2017

HP Enterprise Spins Software Division into Micro Focus International

October 23, 2017

It would seem that the saga of HP’s lamented 2011 Autonomy acquisition is now complete—Reuters announces, “Hewlett Packard Enterprise to Complete Software Spin-Off.” Reporter Salvador Rodriguez explains:

The enterprise software businesses, which include the widely used ArcSight security platform, have been merged with Micro Focus International Plc (MCRO.L), a British software company. HPE was formed when the company once known as Hewlett-Packard split into HPE and HP Inc in November 2015.


The spin-off comes as HPE adjusts to the rapid shift of corporate computing to cloud services offered by the likes of Amazon.com Inc (AMZN.O) and Microsoft Corp (MSFT.O). HPE aims to cater specifically to customers running services both on their own premises and in the cloud, said Ric Lewis, senior vice president of HPE’s cloud software group, in an interview.


The spin-off marks the end of HP’s unhappy tangle with Autonomy, which it acquired for $11 billion in an aborted effort to transform HP into an enterprise software leader. The ink was barely dry on the much-criticized deal when the company took an $8.8 billion writedown on it.

But wait, the story is not over quite yet—the legal case that began when HP sued Autonomy’s chief officers continues. Apparently, that denouement is now HPE’s to handle. As for Micro Focus, Rodriguez reports it will now be run by former HPE Chief Operating Officer Chris Hsu, who plans to focus on growth through acquisitions. Wait… wasn’t that what started this trouble in the first place?

Cynthia Murrell, October 23, 2017
