March 3, 2014
We like to think we’ve left old computing formats in the past, but the Financial Review points out that this is a misconception in “Cloud Computing Still Has a Mainframe Lining.” Organizations and governments in Australia are spouting cloud-based policies left and right. The Australian Information Industry Association commissioned KPMG to estimate how the cloud could benefit the nation’s GDP; the study showed the nation would gain between $2 billion and $3 billion.
The estimate has been criticized because KPMG did not consider the benefits of on-site computing, though KPMG was not asked to include them. Australia is not even close to getting rid of its on-site setups. The move to the cloud is coming, but slowly.
So mainframes will be around for a while:
“It is worth also noting that server platforms such as the mainframe – pronounced dead several times over the years – continue to play a critical role in most of Australia’s largest enterprises and government agencies, especially with core financial systems at the heart of the economy.
Australian Government Information Office data says mainframe spending among federal government agencies, for example, has remained around 6 per cent to 7 per cent of total government ICT expenditure for the past few years.”
Are we throwing the baby out with the bath water? For many organizations it is cheaper to remain with on-site computing than to switch every function over to the cloud. Face it: the current generation is entrenched in on-site computing. Cloud computing will take over, eventually.
February 13, 2014
Hybrid clouds combine a public cloud service with a private cloud system. CMS Wire says this is a trend that will continue to grow in 2014 and covers the latest in its article, “Hybrid Clouds for SharePoint: Great, but Not for Everyone.”
The article says:
“The focus has not only been the public cloud, but also the hybrid cloud, which combines public cloud services (like Office 365) and applications / storage located in a private cloud. According to Gartner, it’s this hybrid cloud model that will really find its wings in 2014. Gartner actually predicts that by 2017 over half of the mainstream organizations will have a hybrid cloud.”
Stephen E. Arnold is a longtime leader in search and often covers SharePoint on his information service, ArnoldIT.com. SharePoint and the cloud are a common topic on ArnoldIT.com, as users are intrigued by the Office 365 release. And while the jury is still out on concerns like security and ease of use, the cloud is a trend that is here to stay. The cost of storage continues to drop, and users are more and more interested in supported services to streamline workflow.
Emily Rae Aldridge, February 13, 2014
February 7, 2014
The cloud assists businesses with users on the go as well as people who are dealing with the inevitable device crash. Amazon fully embraced the opportunities the cloud presented and debuted Amazon Web Services. Now, according to Maureen O’Gara of Sys-Con Media, “Mark Logic Leverages Amazon” with a new layer of cloud services.
MarkLogic Corporation is adding a new level of cloud services, starting with its MarkLogic Server, which will allow customers to use its widgetry on a pay-per-hour basis.
Users will have the chance to take advantage of the features MarkLogic offers:
“The patented server, also certified on VMware’s virtualization platform, which lets users implement clouds on self-managed hardware, is generally used for custom publishing, search-based applications, content analytics, unified information access, metadata catalogs and threat intelligence systems.
It provides state-of-the-art features such as location awareness, real-time search and a shared-nothing cluster architecture that supports high performance against petabyte-scale databases.”
Once the cloud services are up, what will Amazon and MarkLogic learn from the new offerings? How will clients learn to adapt the software for new uses? The sky is the limit, and the clouds have hundreds of new experiences to try out.
Whitney Grace, February 07, 2014
February 4, 2014
IBM’s Watson is proceeding to the cloud. Apparently, though, the journey is proving more challenging than expected. The Register reports, “IBM’s Watson-as-a-Cloud: Is it a Bird? Is it a Plane? No, it’s Another Mainframe.” Writer Jack Clark peers through the marketing hype, maintaining that Watson does not translate to the cloud as easily as IBM would have us believe.
The key to Watson’s functionality is its DeepQA analysis engine, which uses an amalgam of Apache Hadoop, Apache UIMA, and other tools to achieve machine learning. This means, says Clark, that more work than one might expect must be done to get set up with the cloudy Watson.
“Applying DeepQA to any new domain requires adaptation in three areas:
*Content adaptation involves organizing the domain content for hypothesis and evidence generation, modeling the context in which questions will be generated
*Training adaptation involves adding data in the form of sample training questions and correct answers from the target domain so that the system can learn appropriate weights for its components when estimating answer confidence
*Functional adaptation involves adding new domain-specific question analysis, candidate generation, hypothesis scoring and other components.
Think of a mainframe. Watson seems a lot like one of those, as it preferences long-term relationships, an undisclosed financial outlay, and lock-in-by-default as this technology is only fielded by IBM. That’s not a terribly bad thing, mind, as for some organisations a tool like this could be useful. But it does mean you are right to be sceptical when IBM starts portraying Watson as a cloud product that’s easy to get started with.”
The article reports that IBM is working on a lab that will help firms in Silicon Valley craft Watson-related apps. That may lead to easier transitions in the future, but in the meantime, any company considering adopting Watson-as-a-cloud should go in understanding that there will be much work to do before reaping the benefits of the famous AI’s wisdom.
Cynthia Murrell, February 04, 2014
January 31, 2014
Sometimes a company can grow too fast for its own good. Take the case of DigitalOcean, which eWeek describes in its piece, “Scrubbing Data a Concern in the Digital Ocean Cloud.” It was recently discovered that the cloud hosting firm was not automatically scrubbing user data after every deletion of a virtual machine (VM) instance—not good for security. Apparently, the young company once scrubbed after each VM destroy request, but changed that policy as its growth ballooned.
Writer Sean Michael Kerner tells us:
“As Digital Ocean’s utilization went up, the company found that the scrubbing activity was degrading performance and decided to make it an option that API users needed to manually activate. [DigitalOcean CEO Moisey] Uretsky told eWEEK that even though the data scrubbing has an impact, it is now a cost that his company will bear.
Digital Ocean grew very quickly in 2013, to at least 7,000 Web-facing servers in June 2013, up from only 100 in December 2012, according to Netcraft. One of the reasons for the rapid rise has been Digital Ocean’s aggressive pricing, which starts at $5 for a server with 512MB of memory and a 20GB solid-state drive for a month of cloud service.”
At least the company is taking responsibility for, and learning from, the mistake. Not only is DigitalOcean now faithfully scrubbing every deleted VM instance in sight, Uretsky also specified that his company is hastening to make other changes based on customer feedback. They also, he noted, pledge not to reveal customer data to third parties. The imprudent scrub-optional policy only affected certain DigitalOcean API users, and it does not appear from the article that any programmers were harmed. Headquartered in New York City, DigitalOcean graduated from the TechStars startup accelerator program in 2012.
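The excerpt above notes that scrubbing had become an option API users had to switch on themselves. As a rough illustration only, here is a minimal Python sketch of what opting in looked like against DigitalOcean’s v1 API that was current in early 2014; the endpoint path and the scrub_data parameter are recalled from that era and should be treated as assumptions, not as documentation of the current API.

```python
import requests

# Hypothetical credentials and droplet ID, for illustration only.
CLIENT_ID = "your_client_id"
API_KEY = "your_api_key"
DROPLET_ID = 12345

# DigitalOcean's v1 API (early 2014) exposed destroy as a GET call.
# Passing scrub_data explicitly asks the platform to zero the underlying
# disk before it is reused, the step that was briefly opt-in.
response = requests.get(
    f"https://api.digitalocean.com/droplets/{DROPLET_ID}/destroy/",
    params={
        "client_id": CLIENT_ID,
        "api_key": API_KEY,
        "scrub_data": "true",
    },
)
print(response.json())
```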
Cynthia Murrell, January 31, 2014
January 30, 2014
Amazon Web Services is a good way to store code and other data, but it can get a little pricey. Before you upload your stuff to the Amazon cloud, check out Heap’s article, “How We Estimated Our AWS Costs Before Shipping Any Code.” Heap is an iOS and Web analytics tool that captures every user interaction. The Heap team decided to build it because no existing product offered ad hoc analysis of a user’s entire activity. Before they started working on the project, the team needed to estimate their AWS costs to decide whether the idea was a sustainable business model.
They needed to figure out how much data a single user interaction generates, and then where that data would live and what to store it on. The calculations showed that, for the business model to work, a single visit would have to come in at an average of one-third of a cent to be worthwhile for clients.
CPU cores, compression, and reserved instances reduced costs, but a few unexpected factors inflated them (a back-of-the-envelope version of the arithmetic appears after the excerpt):
1. “AWS Bundling. By design, no single instance type on AWS strictly dominates another. For example, if you decide to optimize for cost of memory, you may initially choose cr1.8xlarge instances (with 244GB of RAM). But you’ll soon find yourself outstripping its paltry storage (240 GB of SSD), in which case you’ll need to switch to hs1.8xlarge instances, which offer more disk space but at a less favorable cost/memory ratio. This makes it difficult to squeeze savings out of our AWS setup.
2. Data Redundancy. This is a necessary feature of any fault-tolerant, highly available cluster. Each live data point needs to be duplicated, which increases costs across the board by 2x.”
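As a rough sketch of how such an estimate fits together, here is a minimal Python calculation of cost per visit. Every constant (events per visit, bytes per event, storage price, replication factor) is a hypothetical placeholder rather than one of Heap’s actual figures; the point is only how the pieces multiply.

```python
# Back-of-the-envelope AWS cost-per-visit estimate.
# Every constant below is a made-up placeholder for illustration.

EVENTS_PER_VISIT = 50            # captured interactions per visit
BYTES_PER_EVENT = 1_000          # storage footprint of one event
REPLICATION_FACTOR = 2           # each data point is duplicated for fault tolerance
MONTHS_RETAINED = 12             # how long a visit's data stays queryable

SSD_COST_PER_GB_MONTH = 0.40     # rough amortized cost of instance-attached SSD
COMPUTE_COST_PER_VISIT = 0.0005  # share of instance-hours spent ingesting and querying

def cost_per_visit() -> float:
    """Return estimated dollars spent on one visit's data over its lifetime."""
    stored_gb = EVENTS_PER_VISIT * BYTES_PER_EVENT * REPLICATION_FACTOR / 1e9
    storage_cost = stored_gb * SSD_COST_PER_GB_MONTH * MONTHS_RETAINED
    return storage_cost + COMPUTE_COST_PER_VISIT

if __name__ == "__main__":
    cents = cost_per_visit() * 100
    print(f"Estimated cost per visit: {cents:.3f} cents")
```

With these placeholder numbers the estimate lands around a tenth of a cent per visit, comfortably under the one-third-of-a-cent threshold; swapping in real instance prices and event volumes is what turns the sketch into a business decision.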
Heap’s formula is an easy and intuitive way to estimate pricing for Amazon Web Services. Can it be applied to other cloud services?
Whitney Grace, January 30, 2014
January 30, 2014
SharePoint Online is getting good reviews, and it is a tempting move for many organizations. However, it is not as simple as just changing platforms. In order to have a successful transition, a little pre-planning is essential. Read more in the ITWeb article, “Are you Ready for SharePoint in the Cloud?”
The article begins:
“We’ve all heard lately how migrating a business system, application or solution to the cloud is going to make our lives so much easier and save us money, but is this in fact the case? In principle, cloud might already make sense to you, but let’s explore some practical considerations that need to be taken into account if you’re not sure whether you should be moving to SharePoint in the cloud.”
Stephen E. Arnold is a longtime search expert and a follower of the ups and downs of SharePoint. He shares the latest news and trends through ArnoldIT.com. His SharePoint coverage shows that customers are eager to adopt the cloud, and the hype is plentiful, but a well-planned switchover will ultimately be the key to an organization’s success.
Emily Rae Aldridge, January 30, 2014
January 3, 2014
With the changes in mobile computing and cloud computing, the weaknesses in SharePoint are being exposed. Add to that the fact that SharePoint does not function on the Mac platform, and many organizations are looking for alternative solutions. CMS Wire looks at one option, Huddle, in its article, “Will Huddle Note App Hammer Another Nail in SharePoint’s Coffin?”
The article begins:
“Huddle has just announced the general release of Huddle Note, a new iOS application that enables users to create content in the cloud, share it inside or outside the firewall and collaborate with other workers on documents — all from a mobile device. Taking all Huddle’s functionality into account, the company’s management claims it provides a viable alternative to SharePoint.”
Stephen E. Arnold is a longtime leader in search and the brains behind ArnoldIT.com. He gives a lot of attention to SharePoint and SharePoint alternatives. Most organizations will need some piece of enterprise software, and SharePoint is the most common. Stay tuned for the latest SharePoint news and ways to improve your enterprise infrastructure.
Emily Rae Aldridge, January 3, 2014
December 26, 2013
For Obama’s 2012 re-election campaign, his team broke down data silos and moved all the data to a cloud repository. The team built Narwhal, a shared data store interface for all of the campaign’s applications. Narwhal was dubbed “Obama’s White Whale” because it is the almost mythical technology that federal agencies have been trying to develop for years. While Obama may be hanging out with Queequeg and Ishmael, there is a more viable solution for the cloud, says GCN’s article, “Big Metadata: 7 Ways To Leverage Your Data In the Cloud.”
Data silo migration may appear to be a daunting task, but it is not impossible to do. The article states:
“Fortunately, migrating agency data to the cloud offers IT managers another opportunity to break down those silos, integrate their data and develop a unified data layer for all applications. In this article, I want to examine how to design metadata in the cloud to enable the description, discovery and reuse of data assets in the cloud. Here are the basic metadata description methods (what I like to think of as the “Magnificent Seven” of metadata!) and how to apply them to data in the cloud.”
The list runs down seven considerations for moving to the cloud: identification, static and dynamic measurement, degree scales, categorization, relationships, and commentary. The only things standing in the way of trashing data silos are security and privacy. While the list is useful, it is pretty basic, textbook information that applies to metadata in any situation. What makes it so special for the cloud?
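To make those description methods concrete, here is a minimal Python sketch of a metadata record for a single data asset in a cloud catalog. Only the facet categories come from the article’s list; every field name and value is invented for illustration.

```python
# Hypothetical metadata record for one data asset in a cloud catalog.
# The seven facets mirror the article's list; all values are made up.
asset_metadata = {
    "identification": {
        "id": "urn:example:dataset:census-2013",  # globally unique identifier
        "title": "2013 Census Extract",
    },
    "static_measurement": {"size_gb": 42, "row_count": 3_100_000},
    "dynamic_measurement": {"queries_last_30_days": 870},
    "degree_scales": {"data_quality": "4 of 5", "sensitivity": "moderate"},
    "categorization": {"domain": "demographics", "format": "CSV"},
    "relationships": {"derived_from": "urn:example:dataset:census-raw"},
    "commentary": "Nightly refresh; geographic fields normalized to FIPS codes.",
}

# A catalog search can then filter on any facet,
# for example all moderate-sensitivity assets:
if asset_metadata["degree_scales"]["sensitivity"] == "moderate":
    print(asset_metadata["identification"]["title"])
```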
Whitney Grace, December 26, 2013
December 17, 2013
The Yahoo Finance article titled “DataStax Tests Enterprise-Grade Cassandra Database on Google Compute Engine” discusses the recent collaboration between Google and DataStax engineers. The results of the tests were positive, with expected response times, operational consistency, and strong disk I/O performance under load.
The article explains the tests of DataStax Enterprise with Google Compute Engine:
“which recently became generally available to all developers. The combination of DataStax Enterprise and Google Compute Engine allows companies to deploy their critical applications on the Google Cloud Platform and grow their data to incredible levels while making sure they remain online at all times. DataStax and Google engineers collaborated to test and validate the scalability, reliability and performance of mission-critical online applications that are built on DataStax Enterprise with Google Compute Engine.”
DataStax boasts over 300 customers for its work powering big data apps, including Adobe, eBay, and Netflix. The collaboration with Google is meant to ease the use of DataStax in the cloud. Senior vice-president David Kloc of DataStax voiced his confidence in the new relationship, calling the platform “more reliable than ever before.” He has no reason to be humble; the NoSQL database DataStax sells combines Apache Cassandra with enterprise search and visual management.
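For readers wondering what building on DataStax Enterprise’s Cassandra core looks like from application code, here is a minimal Python sketch using the open source cassandra-driver. The node address, keyspace, and table are hypothetical placeholders, and nothing here reflects the specific Google Compute Engine test configuration.

```python
from datetime import datetime
from cassandra.cluster import Cluster

# Hypothetical contact point: the internal IP of a Cassandra node on Compute Engine.
cluster = Cluster(["10.240.0.5"])
session = cluster.connect()

# Hypothetical keyspace and table for a simple event log.
session.execute(
    "CREATE KEYSPACE IF NOT EXISTS demo WITH replication = "
    "{'class': 'SimpleStrategy', 'replication_factor': 3}"
)
session.execute(
    "CREATE TABLE IF NOT EXISTS demo.events ("
    "user_id text, ts timestamp, action text, PRIMARY KEY (user_id, ts))"
)

# Write one event and read it back.
session.execute(
    "INSERT INTO demo.events (user_id, ts, action) VALUES (%s, %s, %s)",
    ("user-123", datetime.utcnow(), "login"),
)
for row in session.execute(
    "SELECT * FROM demo.events WHERE user_id = %s", ("user-123",)
):
    print(row.user_id, row.ts, row.action)

cluster.shutdown()
```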
Chelsea Kerwin, December 17, 2013