February 21, 2017
Competition continues in the realm of cloud technology. Amigo Bulls released an article, Can Google Cloud Really Catch Up With The Cloud Leaders?, that highlights how Google Cloud trails Amazon Web Services and Microsoft Azure. However, the piece also mentions some recent wins for Google. One way Google is gaining steam is through new clients: it signed Spotify, and even some of Apple’s iCloud services are moving to Google Cloud. The article summarizes the current state,
Alphabet Inc’s-C (NSDQ:GOOG) Google cloud has for a long time lived in relative obscurity. Google Cloud results do not even feature on the company’s quarterly earnings report the way AWS does for Amazon (NSDQ:AMZN) and Azure for Microsoft (NSDQ:MSFT). This appears somewhat ironic considering that Google owns one of the largest computer and server networks on the planet to handle tasks such as Google Search, YouTube, and Gmail. Further, the Google Cloud Platform is actually cheaper than offerings by the two market leaders.
Enterprise accounts with legacy systems will likely choose Microsoft as a no-brainer, given the familiarity factor and connectivity. Since the enterprise sector will make up a large portion of cloud customers, Amazon is probably Google’s toughest competition. Spotify apparently moved from Amazon to Google because of quality tools, including machine learning, and excellent customer service. We will continue following whether Google Cloud climbs as high in the sky as its peers.
Megan Feil, February 21, 2017
February 6, 2017
The article on Reuters titled Oracle-NetSuite Deal May Be Sweetest for Ellison emphasizes the perks of being an executive chairman like Oracle’s Larry Ellison. Ellison ranks as the third richest person in America and fifth in the world. The article suggests that his fortune of over $50B is often considered to mingle with Oracle’s $160B in a way that makes, if no one else, at least Reuters very uncomfortable. The article does offer some context for the recent acquisition of NetSuite, for which Oracle paid a 44% premium on a company in which Ellison owns a 45% stake.
NetSuite was founded by an ex-Oracle employee, bankrolled by Ellison. While Oracle concentrated on selling enterprise software to giant corporations, the upstart focused on servicing small and medium-sized companies using the cloud. The two companies’ businesses have increasingly overlapped as larger customers have become comfortable using web-based software.
As a result, it makes strategic sense to combine the two firms. And the process seems to have been handled right, with a committee of independent Oracle directors calling the shots.
The article also points out that such high premiums aren’t all that unusual; Salesforce.com recently paid a 56% premium for Demandware. But in this case, things are complicated by Ellison’s potential conflict of interest. If Oracle had invested more in its cloud business or in NetSuite earlier, say four or five years ago, it would not find itself forking over just under $10B now.
Chelsea Kerwin, February 6, 2017
February 1, 2017
The article titled Google Cloud Platform Releases New Database Services, Fighting AWS and Azure for Corporate Customers on GeekWire suggests that Google’s corporate offerings have been weak in the area of database management. Compared to Amazon Web Services and Microsoft Azure, Google is only wading into the somewhat monotonous arena of corporate database needs. The article goes into detail on the offerings,
Cloud SQL, Second Generation, is a service offering instances of the popular MySQL database. It’s most comparable to AWS’s Aurora and SQL Azure, though there are some differences from SQL Azure; Microsoft also allows running a MySQL database on Azure. Google’s Cloud SQL supports MySQL 5.7, point-in-time recovery, automatic storage resizing, and one-click failover replicas, the company said. Cloud Bigtable is a NoSQL database, the same one that powers Google’s own search, analytics, maps, and Gmail.
The Cloud Bigtable database is built to handle major workloads of 100+ petabytes, and it integrates with tools such as Hadoop and Spark. It will be fun to see what happens as Google’s new service offerings hit the ground running. How will Amazon and Microsoft react? Will price wars arise? If so, only good can come of it, at least for corporate consumers.
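For readers unfamiliar with how a Bigtable-style NoSQL store differs from MySQL, here is a minimal sketch of the wide-column data model in plain Python. This is not the Cloud Bigtable client API; the class and method names (`WideColumnTable`, `put`, `get`, `scan_prefix`) are invented for illustration only.

```python
# Illustrative sketch of the wide-column data model behind Bigtable-style
# NoSQL stores. NOT the actual Cloud Bigtable API; names are invented.

class WideColumnTable:
    """Rows keyed by a row key; each cell lives under (family, qualifier)."""

    def __init__(self):
        self._rows = {}  # row_key -> {(family, qualifier): value}

    def put(self, row_key, family, qualifier, value):
        self._rows.setdefault(row_key, {})[(family, qualifier)] = value

    def get(self, row_key, family, qualifier):
        return self._rows.get(row_key, {}).get((family, qualifier))

    def scan_prefix(self, prefix):
        # Bigtable keeps rows sorted by key, so prefix scans are cheap.
        return sorted(k for k in self._rows if k.startswith(prefix))


table = WideColumnTable()
table.put("com.example/page1", "anchor", "text", "Example link")
table.put("com.example/page2", "anchor", "text", "Another link")
print(table.get("com.example/page1", "anchor", "text"))  # Example link
print(table.scan_prefix("com.example/"))
```

The design choice worth noticing is that the schema lives in the row key: applications that need fast range scans (as in search or analytics workloads) encode their access pattern into the key itself.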
Chelsea Kerwin, February 1, 2017
January 18, 2017
Big Data and Cloud Computing were supposed to make it easier for the C-suite to make billion-dollar decisions. But it seems things have started to fall apart.
In an article published by Forbes titled The Data Warehouse Has Failed, Will Cloud Computing Die Next?, the author says:
A company that sells software tools designed to put intelligence controls into data warehousing environments says that traditional data warehousing approaches are flaky. Is this just a platform to spin WhereScape wares, or does Whitehead have a point?
WhereScape, a key player in data warehousing, is admitting that the buzzwords of the IT industry are fizzling out. Big Data is being generated in abundance, but companies are still unsure what to do with the enormous amounts of data they produce.
Large corporations that have already invested heavily in Big Data have yet to find any ROI. As the author points out:
Data-led organizations have no idea how good their data is. CEOs have no idea where the data they get actually comes from, who is responsible for it, etc., yet they make multi-million-pound decisions based on it. Big data is making the situation worse, not better.
It looks like Big Data and Cloud Computing, like 3D printing before them, may end up as just fizzled-out tech-world buzzwords.
Vishal Ingole, January 18, 2017
December 15, 2016
The cloud was supposed to save organizations a bundle on servers, but now we learn from Datamation that “Enterprises Struggle with Managing Cloud Costs.” The article cites a recent report from Dimensional Research and cloud-financial-management firm Cloud Cruiser, which tells us, for one thing, that 92 percent of organizations surveyed now use the cloud. Researchers polled 189 IT pros at Amazon Web Services (AWS) Global Summit in Chicago this past April, where they also found that 95 percent of respondents expect their cloud usage to expand over the next year.
However, organizations may wish to pause and reconsider their approach before throwing more money at cloud systems. Writer Pedro Hernandez reports:
Most organizations are suffering from a massive blind spot when it comes to budgeting for their public cloud services and making certain they are getting their money’s worth. Nearly a third of respondents said that they aren’t proactively managing cloud spend and usage, the study found. A whopping 82 percent said they encountered difficulties reconciling bills for cloud services with their finance departments.
‘The top challenge with the continuously growing public cloud resource is the ability to manage allocation usage and costs,’ stated the report. ‘IT and Finance continue to have difficulty working together to ascertain and allocate public cloud usage, and IT continues to struggle with technologies that will gather and track public cloud usage information.’ …
David Gehringer, principal at Dimensional Research, believes it’s time for enterprises to quit treating the cloud differently and adopt IT monitoring and cost-control measures similar to those used in their own data centers.
The report also found that top priorities for respondents included cost and reporting at 54 percent, performance management at 46 percent, and resource optimization at 45 percent. It also found that cloud demand is driven by application development and testing, at 59 percent, and big data/analytics, at 31 percent.
The cloud is no longer a shiny new invention but rather an integral part of most organizations. We would do well to approach its management and funding as we would any other resource. The original report is available, with registration, here.
Cynthia Murrell, December 15, 2016
December 12, 2016
Peer-to-peer file sharing gets a boost with AlphaReign, a new torrent sharing site that enables registered users to share files anonymously using a Distributed Hash Table (DHT).
TorrentFreak, in an article titled Alphareign: DHT Search Engine Takes Public Torrents Private, says:
AlphaReign.se is a new site that allows users to find torrents gathered from BitTorrent’s ‘trackerless’ Distributed Hash Table, or DHT for short. While we have seen DHT search engines before, this one requires an account to gain access.
The biggest issue for most torrent sites is the Digital Millennium Copyright Act (DMCA), which compels sites and search engines to remove infringing results from their result pages where possible. Because content and torrent indexes on AlphaReign are accessible only to registered users, seeders and leechers can share files with less risk to themselves.
Though most files shared through torrents are copyrighted materials like movies, music, software and books, torrents are also used by people who want to share large files without being spied upon.
AlphaReign also manages to address a persistent issue faced by torrent sites:
AlphaReign’s new software allows users to search the DHT network on their own devices, with help from peers. Such a system would remain online even if the website itself goes down.
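The mechanism behind that claim is Kademlia-style DHT lookup, in which the nodes "closest" to a torrent's info-hash under the XOR metric are responsible for its peer list. A minimal sketch, using SHA-1 IDs as BitTorrent's DHT does (the peer names and helper functions here are invented for illustration, not AlphaReign's actual code):

```python
# Minimal sketch of Kademlia-style DHT lookup as used by BitTorrent's
# "trackerless" mode. Illustrative only; not AlphaReign's implementation.
import hashlib

def node_id(name):
    """Derive a 160-bit ID, as BitTorrent's DHT does with SHA-1."""
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big")

def closest_nodes(target, nodes, k=3):
    """The k nodes 'closest' to the target under the XOR metric are the
    ones that store (or know about) the peer list for that info-hash."""
    return sorted(nodes, key=lambda n: n ^ target)[:k]

# Twenty hypothetical peers and one torrent's info-hash.
peers = [node_id("peer-%d" % i) for i in range(20)]
info_hash = node_id("some-torrent")
print(closest_nodes(info_hash, peers))
```

Because any peer can compute the same distances, no central tracker (or website) is needed for the lookup to keep working, which is why the index survives the site going down.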
In the past, popular torrent search engines like YTS, KickAssTorrents, The Pirate Bay, and Torrentz have been shut down owing to pressure from law enforcement agencies. However, if AlphaReign manages to do what it claims, torrent users are going to be delighted.
Vishal Ingole, December 12, 2016
December 9, 2016
GE (General Electric) makes appliances, such as ovens, ranges, microwaves, washers, dryers, and refrigerators. Outside the appliance market, its expertise appears to end. Fast Company tells us that GE wants to branch out into new markets in the story “GE Wants To Be The Next Artificial Intelligence Powerhouse.”
GE is a multi-billion dollar company and they have the resources to invest in the burgeoning artificial intelligence market. They plan to employ two new acquisitions and bring machine learning to the markets they already dominate. GE first used machine learning in 2015 with Predix Cloud, which recorded industrial machinery sensor patterns. It was, however, more of a custom design for GE than one with a universal application.
GE purchased Bit Stew Systems, a company with a platform similar to Predix Cloud that collects industrial data, and Wise.io, a company that applies machine-learning techniques originally developed for astronomy to streamline customer support systems. Predix already has a string of customers and has seen much growth:
Though young, Predix is growing fast, with 270 partner companies using the platform, according to GE, which expects revenue on software and services to grow over 25% this year, to more than $7 billion. Ruh calls Predix a “significant part” of that extra money. And he’s ready to brag, taking a jab at IBM Watson for being a “general-purpose” machine-learning provider without the deep knowledge of the industries it serves. “We have domain algorithms, on machine learning, that’ll know what a power plant is and all the depth of that, that a general-purpose machine learning will never really understand,” he says.
GE is tackling healthcare and energy issues with Predix. GE is proving it can do more than make a device that can heat up a waffle; the company can affect the energy, metal, plastic, and computer systems used to heat the waffle. It is much like how a mason jar maker went on to create tools used in space.
Whitney Grace, December 9, 2016
November 1, 2016
Technology conferences are the thing to do when you want to launch a product, advertise a new business, network, or get a general consensus about the tech industry. Multiple conferences revolving around different aspects of the tech industry are held each month. In October 2016, Pubcon took place in Las Vegas, Nevada, and had a very good turnout. The thing that makes a convention, though, is the guests. Pubcon did not disappoint: on the third day, Google’s search expert Gary Illyes delivered the morning keynote. (Apparently, Illyes also holds the title Chief of Sunshine and Happiness at Google.) Outbrain summed up the highlights of Pubcon 2016’s third day in “Pubcon 2016 Las Vegas: Day 3.”
Illyes spoke about search infrastructure, urging people to switch to HTTPS. His biggest argument for HTTPS was that it protects users from “annoying scenarios” and is good for UX. Google is also pushing for more mobile-friendly Web sites: it will remove the “mobile friendly” label from search results, and AMP can be used to make a user-friendly site. There is even bigger news about page ranking in the Google algorithm:
Our systems weren’t designed to get two versions of the same content, so Google determines your ranking by the desktop version only. Google is now switching to a mobile-first index. Gary explained that there are still a lot of issues with this change, as they are losing a lot of good signals from desktop pages that don’t exist on mobile. Google created a separate mobile index, which will be its primary index. Desktop will be a secondary index that is less up to date.
As for ranking and spam, Illyes explained that Google is using human evaluators to better understand modified searches; RankBrain was not mentioned much; he wants to release the Panda algorithm; and Penguin will demote bad links in search results. Google will also release “Google O” for voice search.
It looks like Google is trying to clean up search results and adapt to the growing mobile market, old news and new at the same time.
October 31, 2016
Cloud computing allows users to access their files or hard drive from multiple devices at multiple locations. Fog computing, on the other hand, is something else entirely. Fog computing is the latest buzzword in the tech world and pretty soon it will be in the lexicon. If you are unfamiliar with fog computing, read Forbes’s article, “What Is Fog Computing? And Why It Matters In Our Big Data And IoT World.”
According to the article, smartphones are “smart” because they receive and share information with the cloud. The biggest problem with cloud computing is bandwidth, that is, slow Internet speeds. The United States ranks 35th in the world for bandwidth speed, contrary to the belief that it is the most advanced country in the world. Demand for faster speeds increases every day. Fog computing, also known as edge computing, seeks to resolve the problem by grounding data. How does one “ground” data?
What if the laptop could download software updates and then share them with the phones and tablets? Instead of using precious (and slow) bandwidth for each device to individually download the updates from the cloud, they could utilize the computing power all around us and communicate internally.
Fog computing makes accessing data faster, more efficient, and more reliable by serving it from a local area rather than routing to the cloud and back. IBM and Cisco Systems are developing projects that would push computing to more local resources, such as routers, devices, and sensors.
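The laptop-shares-the-update scenario above boils down to caching at the edge. Here is a toy sketch of the idea; the class names, payload, and device counts are invented for illustration:

```python
# Toy illustration of the fog/edge idea: a local node fetches an update
# from the "cloud" once, then serves nearby devices itself, saving
# wide-area bandwidth. All names and numbers here are invented.

class CloudOrigin:
    def __init__(self):
        self.downloads = 0  # count of slow, wide-area fetches

    def fetch(self, name):
        self.downloads += 1
        return "payload:" + name

class FogNode:
    """Edge device (e.g., a laptop) that caches for its local peers."""
    def __init__(self, origin):
        self.origin = origin
        self.cache = {}

    def get(self, name):
        if name not in self.cache:           # only the first request hits the cloud
            self.cache[name] = self.origin.fetch(name)
        return self.cache[name]              # later requests stay local

cloud = CloudOrigin()
edge = FogNode(cloud)
for _device in range(5):                     # five phones/tablets update
    edge.get("firmware-v2")
print(cloud.downloads)  # 1 (one cloud download instead of five)
```

Five devices updating through the edge node cost one wide-area download instead of five, which is exactly the bandwidth argument the article makes.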
Considering that there are security issues with housing data in a third party’s digital storage, a more local solution can be attractive. Kind of like back in the old days, when people kept their data on their own machines.
October 20, 2016
Data silos have become a permanent part of the landscape. Even if data reside in a cloud, some data are okay for certain people to access; other data are off limits. Whether the data silo is a result of access controls or of an enthusiastic marketer keeping a one-off storage device in a cubicle desk drawer, we have silos.
I read “Battling Data Silos: 3 Tips to Finance and Operations Integration.” This is a very good example of providing advice which is impossible to implement. If I were to use the three precepts in an engagement, I have a hunch that a barrel of tar and some goose feathers will be next to my horse and buggy.
What are the “tips”? Here you go.
- Conduct a data discovery audit.
- Develop a plan.
- And my fave, “Realize the value of the cloud for high performance and scalability.”
Here we go, gentle reader.
The cost of a data discovery audit can be high. The cost in time, effort, and lost productivity means that most data audits are limp wash rags. Few folks in an organization know what data are where, who manages those data, and what limits are placed on the data. Figuring out the answers to these questions in a company with 25 people is tough; try to do it for a government agency with dozens of locations and hundreds of staff and contractors. Automated audits can help, but there may be unforeseen consequences of sniffing out who has what. The likelihood of a high-value data discovery audit without considerable preparation, budgeting, and planning is zero. Most data audits, like software audits, never reach the finish line without a trip to the emergency room.
The notion of a plan for consolidating data is okay. Folks love meetings with coffee and food. A plan allows a professional to demonstrate that work has been accomplished. The challenge, of course, is to implement the plan. That’s another kettle of fish entirely. MBA think does not deliver much progress toward eliminating silos, which proliferate like tweets about zombies.
The third point is value. Yep, value. What is value? I don’t know. Cloud value can be demonstrated for specific situations. But migrating data to a cloud while making sure no regulatory, legal, or common-sense problems arise is a work in progress. Data management, content controls, and security tasks nudge cloud functions toward one outcome: yet another data silo.
Yep, YADS. Three breezy notions crater due to the gravitational pull of segmented content repositories under the control of folks who absolutely love silos.
Stephen E Arnold, October 20, 2016