GE Now Manufactures Artificial Intelligence

December 9, 2016

GE (General Electric) makes appliances such as ovens, ranges, microwaves, washers, dryers, and refrigerators.  Outside the appliance market, its expertise appears to end.  Fast Company tells us that GE wants to branch out into new markets in "GE Wants To Be The Next Artificial Intelligence Powerhouse."

GE is a multi-billion-dollar company with the resources to invest in the burgeoning artificial intelligence market.  It plans to use two new acquisitions to bring machine learning to the markets it already dominates.  GE first used machine learning in 2015 with the Predix Cloud, which recorded industrial machinery sensor patterns.  It was, however, more of a custom design for GE than one with universal application.

GE purchased Bit Stew Systems, a company with a platform similar to the Predix Cloud except that it collects industrial data, and Wise.io, a company that uses technology developed for astronomy to streamline customer support systems.  Predix already has a string of customers and has seen much growth:

Though young, Predix is growing fast, with 270 partner companies using the platform, according to GE, which expects revenue on software and services to grow over 25% this year, to more than $7 billion. Ruh calls Predix a “significant part” of that extra money. And he’s ready to brag, taking a jab at IBM Watson for being a “general-purpose” machine-learning provider without the deep knowledge of the industries it serves. “We have domain algorithms, on machine learning, that’ll know what a power plant is and all the depth of that, that a general-purpose machine learning will never really understand,” he says.

GE is tackling issues in healthcare and energy with Predix.  GE is proving it can do more than make a device that can heat up a waffle.  The company can affect the energy, metal, plastic, and computer systems used to heat the waffle.  It is rather like how the maker of mason jars went on to build tools that will be used in space.

Whitney Grace, December 9, 2016

Google Gives Third Day Keynote at Pubcon

November 1, 2016

Technology conferences are the thing to do when you want to launch a product, advertise a new business, network, or gauge the general consensus of the tech industry.  Multiple conferences revolving around different aspects of the tech industry are held each month.  In October 2016, Pubcon took place in Las Vegas, Nevada, and it had a very good turnout.  The thing that makes a convention, though, is the guests.  Pubcon did not disappoint: on the third day, Google's search expert Gary Illyes delivered the morning keynote.  (Apparently, Illyes also holds the title Chief of Sunshine and Happiness at Google.)  Outbrain summed up the highlights of Pubcon 2016's third day in "Pubcon 2016 Las Vegas: Day 3."

Illyes spoke about search infrastructure, suggesting that people switch to HTTPS.  His biggest push for HTTPS was that it protects users from "annoying scenarios" and is good for UX.  Google is also pushing for more mobile-friendly Web sites.  It will remove the "mobile friendly" label from search results, and AMP can be used to make a user-friendly site.  There is even bigger news about page ranking in the Google algorithm:

Our systems weren’t designed to get two versions of the same content, so Google determines your ranking by the Desktop version only. Google is now switching to a mobile version first index. Gary explained that there are still a lot of issues with this change as they are losing a lot of signals (good ones) from desktop pages that don’t exist on mobile. Google created a separate mobile index, which will be its primary index. Desktop will be a secondary index that is less up to date.

As for ranking and spam, Illyes explained that Google is using human evaluators to better understand modified searches.  Rankbrain was not mentioned much, he wants to release the Panda algorithm, and Penguin will demote bad links in search results.  Google will also release "Google O" for voice search.

It looks like Google is trying to clean up search results and adapt to the growing mobile market, old news and new at the same time.

Whitney Grace, November 1, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Be Prepared for Foggy Computing

October 31, 2016

Cloud computing allows users to access their files or hard drive from multiple devices at multiple locations.  Fog computing, on the other hand, is something else entirely.  Fog computing is the latest buzzword in the tech world and pretty soon it will be in the lexicon.  If you are unfamiliar with fog computing, read Forbes’s article, “What Is Fog Computing? And Why It Matters In Our Big Data And IoT World.”

According to the article, smartphones are "smart" because they receive and share information with the cloud.  The biggest problem with cloud computing is bandwidth, that is, slow Internet speeds.  The United States is 35th in the world for bandwidth speed, which is contrary to the belief that it is the most advanced country in the world.  Demand for faster speeds increases every day.  Fog computing, also known as edge computing, seeks to resolve the problem by grounding data.  How does one "ground" data?

What if the laptop could download software updates and then share them with the phones and tablets? Instead of using precious (and slow) bandwidth for each device to individually download the updates from the cloud, they could utilize the computing power all around us and communicate internally.

Fog computing makes accessing data faster, more efficient, and more reliable by working from the local area rather than routing to the cloud and back.  IBM and Cisco Systems are developing projects that would push computing to more local areas, such as routers, devices, and sensors.
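To make the idea concrete, here is a minimal sketch (my own illustration, not from the Forbes article) of one machine on the local network acting as an edge cache: it downloads an update from the cloud once and then serves it to nearby phones and tablets over HTTP.  The URL and file name are placeholders.

```python
import http.server
import socketserver
import urllib.request

UPDATE_URL = "https://updates.example.com/firmware.bin"  # placeholder cloud URL
LOCAL_COPY = "firmware.bin"
PORT = 8000

# Step 1: pull the update down from the cloud a single time.
urllib.request.urlretrieve(UPDATE_URL, LOCAL_COPY)

# Step 2: serve the current directory so other devices on the LAN can grab the
# cached copy from http://<this-machine>:8000/firmware.bin instead of the cloud.
with socketserver.TCPServer(("", PORT), http.server.SimpleHTTPRequestHandler) as httpd:
    httpd.serve_forever()
```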

Considering that there are security issues with housing data on a third party's digital storage unit, it would be better to find a more local solution.  Kind of like back in the old days, when people housed their data on their own machines.

Whitney Grace, October 31, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Data Silos: Here to Stay

October 20, 2016

Data silos have become a permanent part of the landscape. Even if data reside in a cloud, some data are okay for certain people to access. Other data are off limits. Whether the data silo is a result of access controls or because an enthusiastic marketer has a one-off storage device in his or her cubicle's desk drawer, we have silos.

I read “Battling Data Silos: 3 Tips to Finance and Operations Integration.” This is a very good example of providing advice which is impossible to implement. If I were to use the three precepts in an engagement, I have a hunch that a barrel of tar and some goose feathers would be next to my horse and buggy.

What are the “tips”? Here you go.

  1. Conduct a data discovery audit.
  2. Develop a plan.
  3. And my fave: “Realize the value of the cloud for high performance and scalability.”

Here we go, gentle reader.

The cost of a data discovery audit can be high. The costs in time, effort, and lost productivity mean that most data audits are limp wash rags. Few folks in an organization know what data are where, who manages those data, and the limits placed on the data. Figuring out the answers to these questions in a company with 25 people is tough. Try to do it for a government agency with dozens of locations and hundreds of staff and contractors. Automated audits can help, but there may be unforeseen consequences of sniffing who has what. The likelihood of a high-value data discovery audit without considerable preparation, budgeting, and planning is zero. Most data audits, like software audits, never reach the finish line without a trip to the emergency room.

The notion of a plan for consolidating data is okay. Folks love meetings with coffee and food. A plan allows a professional to demonstrate that work has been accomplished. The challenge, of course, is to implement the plan. That’s another kettle of fish entirely. MBA think does not deliver much progress toward eliminating silos which proliferate like tweets about zombies.

The third point is value. Yep, value. What is value? I don’t know. Cloud value can be demonstrated for specific situations. But the thought of migrating data to a cloud and then making sure that no regulatory, legal, or common sense problems arise is a work in progress. Data management, content controls, and security tasks nudge cloud functions toward one approach: yet another data silo.

Yep, YADS. Three breezy notions crater due to the gravitational pull of segmented content repositories under the control of folks who absolutely love silos.

Stephen E Arnold, October 20, 2016

Google Cloud, Azure, and AWS Differences

October 18, 2016

With so many options for cloud computing, it can be difficult to decide which one to use for your personal or business files.  Three of the most popular cloud computing options are Amazon Web Services (AWS), Google Cloud Platform, and Microsoft Azure.  Beyond pricing, the main differences lie in what services they offer and what they name them.  Site Point did us a favor with its article comparing the different cloud services: "A Side-By-Side Comparison Of AWS, Google Cloud, And Azure."

Cloud computing has the great benefit of offering flexible pricing options, but those prices can get very intricate depending on how much processing power you need, how many virtual servers you deploy, where they are deployed, and so on.  AWS, Azure, and Google Cloud do offer canned solutions along with individual ones.

AWS has the most extensive service array, but it is also the most expensive.  It is best to decide how you want to use cloud computing because prices will vary based on usage, and each service has its specializations.  All three are good for scalable computing on demand, but Google is less flexible in its offerings, although its pricing is easier to understand.  Amazon has the most robust storage options.

When it comes to big data:

This requires very specific technologies and programming models, one of which is MapReduce, which was developed by Google, so maybe it isn’t surprising to see Google walking forward in the big data arena by offering an array of products — such as BigQuery (managed data warehouse for large-scale data analytics), Cloud Dataflow (real-time data processing), Cloud Dataproc (managed Spark and Hadoop), Cloud Datalab (large-scale data exploration, analysis, and visualization), Cloud Pub/Sub (messaging and streaming data), and Genomics (for processing up to petabytes of genomic data). Elastic MapReduce (EMR) and HDInsight are Amazon’s and Azure’s take on big data, respectively.
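As a taste of what working with one of these services looks like, here is a minimal sketch (not from the Site Point article) that runs a query against BigQuery with Google's Python client.  It assumes the google-cloud-bigquery package is installed, credentials are configured in the environment, and it uses a public sample dataset.

```python
from google.cloud import bigquery

# The client picks up the project and credentials from the environment.
client = bigquery.Client()

query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 10
"""

# query() starts the job; result() waits for it and returns an iterator of rows.
for row in client.query(query).result():
    print(row["name"], row["total"])
```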

Without getting too much into the nitty gritty, each of the services has its strengths and weaknesses.  If one of the canned solutions does not work for you, read the fine print to learn how cloud computing can help your project.

Whitney Grace, October 18, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Explain Cloud Analytics Like I Am Five

October 10, 2016

One of Reddit's popular subreddits is "explain like I'm five," where redditors post questions they have about science, math, computing, engineering, politics, and other topics.  Outside of reading textbooks or questioning experts in person, this subreddit allows intellectual discussion on a global basis…as long as the comments remain mature.  "Explain like I'm five" is like the popular For Dummies book series.

While Internet forums and Reddit itself have made the series semi-obsolete, For Dummies books are still a reliable reference tool when you don’t want to search and scroll on the Internet.  As companies move towards cloud-based systems, you can be sure there will be a slew of cloud computing For Dummies books.

Open Source Magazine shares that "Cloud Analytics for Dummies" is available for free download!

Cloud analytics is dramatically altering business intelligence. Some businesses will capitalize on these promising new technologies and gain key insights that’ll help them gain competitive advantage. And others won’t.  Whether you’re a business leader, an IT manager, or an analyst, we want to help you and the people you need to influence with a free copy of “Cloud Analytics for Dummies,” the essential guide to this explosive new space for business intelligence.

For Dummies books usually retail for around twenty dollars, so this offer gets you a free, updated manual on the growing cloud analytics field and saves you a few dollars.

Whitney Grace, October 10, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Is the Cloud Really Raining Dollar Signs?

October 5, 2016

Cloud computing offers people the ability to access their files from any place in the world as long as they have a good Internet connection and a cloud account.  Many companies are transferring their mainframes to the cloud so their employees can work remotely.  Individuals love having their files, especially photos and music, on the cloud for instantaneous access.  It is a fast-growing IT business, and Forbes reports that "Gartner Predicts $111B In IT Spend Will Shift To Cloud This Year Growing To Be $216B By 2020."

It is predicted that within the next five years more companies will shift their inner workings to the cloud, which will directly and indirectly affect more than one trillion dollars projected to be spent on IT.  Application software spending is expected to shift 37% toward more cloud usage, and business process outsourcing is expected to grow 43%, all by 2020.

Why wait for 2020 to see the final results, however?  2016 already has seen a lot of cloud growth and even more is expected before the year ends:

$42B in Business Process Outsourcing IT spend, or 35% of the total market, is forecast to shift to the cloud this year. 25% of the application software spending is predicted to shift to the cloud this year, or $36B.
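A quick back-of-envelope calculation (my own arithmetic, not Gartner's) shows the total market sizes implied by those two figures:

```python
# Implied market totals, derived from the percentages and dollar figures quoted above.
bpo_cloud_shift = 42e9   # $42B of BPO spend forecast to shift to the cloud
bpo_share = 0.35         # ...said to be 35% of the total BPO market
app_cloud_shift = 36e9   # $36B of application software spend shifting
app_share = 0.25         # ...said to be 25% of application software spending

print(f"Implied total BPO market:         ${bpo_cloud_shift / bpo_share / 1e9:.0f}B")  # ~$120B
print(f"Implied total app software spend: ${app_cloud_shift / app_share / 1e9:.0f}B")  # ~$144B
```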

Gartner is a respected research firm, and these numbers predict hefty growth (here is the source).  The cloud shift will surely affect more than one trillion dollars.  The bigger question is whether cloud security will improve enough by 2020 for more companies to shift in that direction.

Whitney Grace, October 5, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph


Can Analytics Be Cloud Friendly?

August 24, 2016

One of the problems with storing data in the cloud is that it is difficult to run analytics.  Sure, you can run tests to determine the usage of the cloud, but analyzing the data stored in the cloud is another story.  Program developers have been trying to find a solution to this problem, and the open source community has developed some software that might be the ticket.  Ideata wrote about one such Apache project in "Apache Spark-Comparing RDD, Dataframe, and Dataset."

Ideata is a data software company that built many of its headlining products on the open source software Apache Spark.  The company has been using Apache Spark since 2013 and enjoys it because it offers rich abstractions and allows developers to build complex workflows and perform easy data analysis.

Apache Spark works like this:

Spark revolves around the concept of a resilient distributed dataset (RDD), which is a fault-tolerant collection of elements that can be operated on in parallel. An RDD is Spark’s representation of a set of data, spread across multiple machines in the cluster, with API to let you act on it. An RDD could come from any datasource, e.g. text files, a database via JDBC, etc. and can easily handle data with no predefined structure.

It can be used as the basis for a user-friendly cloud analytics platform, especially if you are familiar with what can go wrong with a dataset.
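For readers who want to see the difference in practice, here is a minimal PySpark sketch (my own illustration, not code from the Ideata article) that handles the same small dataset first as an RDD and then as a DataFrame.  It assumes a local Spark installation; the sensor readings are made up.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("rdd-vs-dataframe").getOrCreate()
sc = spark.sparkContext

# RDD: a schema-less, fault-tolerant collection of elements spread across the cluster.
readings = sc.parallelize([("sensor-1", 70.2), ("sensor-2", 68.9), ("sensor-1", 71.4)])
averages = (readings
            .mapValues(lambda v: (v, 1))
            .reduceByKey(lambda a, b: (a[0] + b[0], a[1] + b[1]))
            .mapValues(lambda s: s[0] / s[1]))
print(averages.collect())

# DataFrame: the same data with a schema, letting Spark optimize the query.
df = spark.createDataFrame(readings, ["sensor", "reading"])
df.groupBy("sensor").avg("reading").show()

spark.stop()
```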

Whitney Grace, August 24, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Gartner Declares Microsoft a Winner

August 12, 2016

I read “Microsoft Is a Leader in 18 Gartner Magic Quadrants, Including Cloud Infrastructure as a Service.” Those folks at Microsoft should be darned proud of themselves. Receiving  A grades in 18 Gartner Magic Quadrants is remarkable.

I noted this passage in the write up:

Microsoft is the only cloud computing vendor that is a Magic Quadrant Leader in all of the major cloud services categories, including IaaS, Platform as a Service (PaaS), and Software as a Service (SaaS). These ratings place Microsoft in an enviable position above Amazon AWS, Salesforce, and Google. Looking at the following chart, we can see that Microsoft is a Leader in fully 18 different Magic Quadrants.

Yes, Microsoft stomps on Amazon. I can hear the chant "We're number one" now, even though I am in Harrod's Creek, Kentucky.

What are those 18 Magic Quadrants? I think this is the list, but I could be wrong. My view is that Gartner's experts are never, ever, ever incorrect in their objective analyses of leading vendors. Perish the thought that the Magic Quadrant is influenced by any subjective element. I shudder to think how subjectivity influencing ratings would rock the myriad consultants wherever they may work.

The 18 Magic Quadrants:

  • Application development life cycle management or ADLM
  • Business intelligence and analytics platforms or BIAP
  • Cloud infrastructure as a service or CaaS
  • CRM customer engagement center or CRMCEC
  • Data warehouse and data management solutions for analytics or DWaDMSfA
  • Disaster recovery as a service or DRaaS
  • Enterprise content management or ECM
  • Horizontal portals or HP (Please, do not confuse the leadership outfit Microsoft with the struggling Hewlett Packard)
  • Identity as a service or IDaaS
  • Mobile application development platforms or MADP
  • Operational database management systems or ODBMS
  • Public cloud storage services or PCSS
  • Sales force automation or SFA
  • Unified communications or UC (Not to be confused with Google ooze)
  • Web conferencing or WC (Please, be careful with this acronym in the UK)
  • X86 server virtualization infrastructure or XSVI

Frankly, the best acronym on this list, which is filled with impressive acronyms, is DWaDMSfA. However, I quite like UC, which may be pronounced "uck," and WC. But for the connotation of a loo, WC is outstanding. I know that Microsoft is the all-time champ of the enterprise.

Perhaps Amazon will pick up its marbles and focus on space travel and selling weird voice-activated surveillance devices? Kudos to Microsoft for its stellar and totally objective achievement.

Stephen E Arnold, August 12, 2016

IBM Cognitive Storage Creates a Hierarchy of Data Value

August 5, 2016

The article titled "IBM Introduces Cognitive Storage" on eWeek reveals advances in storage technology. It may sound less sexy than big data, but it is an integral part of our ability to sort and retrieve data based on the metric of data value. If a computer can determine a hierarchy of data value, it can also locate and archive unimportant data, freeing up space for more relevant data. The article explains,

“In essence, the concept helps computers to learn what to remember and what to forget, IBM said… “With rising costs in energy and the explosion in big data, particularly from the Internet of Things, this is a critical challenge as it could lead to huge savings in storage capacity, which means less media costs and less energy consumption… if 1,000 employees are accessing the same files every day, the value of that data set should be very high.”

Frequency of use is a major factor in determining data value, so IBM created trackers to monitor this sort of metadata. Interestingly, the article states that IBM’s cognitive computing was inspired by astronomy. An astronomer would tag incoming data sets from another galaxy as “highly important” or less so. So what happens to the less important data? It isn’t destroyed, but rather relegated to what Charles King of Pund-IT calls a “deep freeze.”
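A toy sketch (hypothetical, not IBM's code) of the frequency-of-use idea: count accesses per file and map the count to a storage tier, with rarely touched data headed for the deep freeze.  The thresholds and file names are illustrative.

```python
from collections import Counter

access_counts = Counter()

def record_access(path):
    """Increment the counter each time a file is read; this is the tracked metadata."""
    access_counts[path] += 1

def storage_tier(path):
    """Map access frequency to a storage tier; the thresholds are illustrative."""
    count = access_counts[path]
    if count >= 1000:
        return "hot"          # keep on fast, expensive storage
    if count >= 10:
        return "warm"         # standard storage
    return "deep freeze"      # archived, not deleted

# A report read by 1,000 employees ranks as high-value data.
for _ in range(1000):
    record_access("/shared/quarterly_report.xlsx")
record_access("/archive/galaxy_scan_raw.dat")

print(storage_tier("/shared/quarterly_report.xlsx"))  # hot
print(storage_tier("/archive/galaxy_scan_raw.dat"))   # deep freeze
```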


Chelsea Kerwin, August 5, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

