Can Analytics Be Cloud Friendly?

August 24, 2016

One of the problems with storing data in the cloud is that it is difficult to run analytics on it.  Sure, you can run tests to measure cloud usage, but analyzing the data stored in the cloud is another story.  Developers have been trying to find a solution to this problem, and the open source community has produced some software that might be the ticket.  Ideata wrote about the newest Apache software in “Apache Spark: Comparing RDD, Dataframe, and Dataset.”

Ideata is a data software company that built many of its headlining products on the open source framework Apache Spark.  The company has used Apache Spark since 2013 because it offers rich abstractions, lets developers build complex workflows, and makes data analysis easy.

Apache Spark works like this:

Spark revolves around the concept of a resilient distributed dataset (RDD), which is a fault-tolerant collection of elements that can be operated on in parallel. An RDD is Spark’s representation of a set of data, spread across multiple machines in the cluster, with API to let you act on it. An RDD could come from any datasource, e.g. text files, a database via JDBC, etc. and can easily handle data with no predefined structure.

It can be used as the basis for a user-friendly cloud analytics platform, especially if you are familiar with what can go wrong with a dataset.
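The lazy, lineage-based evaluation described above can be caricatured in a few lines of plain Python. This is a toy sketch, not Spark itself; the ToyRDD class and its methods are invented for illustration:

```python
# Toy sketch of the lazy, lineage-based evaluation behind Spark's RDDs.
# Plain Python, not Spark; ToyRDD and its methods are invented for illustration.

class ToyRDD:
    """Holds a recipe (lineage) of transformations; nothing runs until collect()."""

    def __init__(self, data, lineage=None):
        self._data = data
        self._lineage = lineage or []   # list of (op_name, function) pairs

    def map(self, fn):
        # Transformations return a new ToyRDD; no work happens yet.
        return ToyRDD(self._data, self._lineage + [("map", fn)])

    def filter(self, pred):
        return ToyRDD(self._data, self._lineage + [("filter", pred)])

    def collect(self):
        # Actions replay the recorded lineage over the source data.
        out = list(self._data)
        for op, fn in self._lineage:
            if op == "map":
                out = [fn(x) for x in out]
            else:  # "filter"
                out = [x for x in out if fn(x)]
        return out

rdd = ToyRDD(range(10)).map(lambda x: x * x).filter(lambda x: x % 2 == 0)
print(rdd.collect())  # the even squares of 0..9
```

In real Spark the lineage additionally records how to recompute lost partitions across machines, which is where the "resilient" and "distributed" parts come in.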

Whitney Grace, August 24, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Gartner Declares Microsoft a Winner

August 12, 2016

I read “Microsoft Is a Leader in 18 Gartner Magic Quadrants, Including Cloud Infrastructure as a Service.” Those folks at Microsoft should be darned proud of themselves. Receiving A grades in 18 Gartner Magic Quadrants is remarkable.

I noted this passage in the write up:

Microsoft is the only cloud computing vendor that is a Magic Quadrant Leader in all of the major cloud services categories, including IaaS, Platform as a Service (PaaS), and Software as a Service (SaaS). These ratings place Microsoft in an enviable position above Amazon AWS, Salesforce, and Google. Looking at the following chart, we can see that Microsoft is a Leader in fully 18 different Magic Quadrants.

Yes, Microsoft stomps on Amazon. I can hear the chant “We’re number one” now, even though I am in Harrod’s Creek, Kentucky.

What are those 18 Magic Quadrants? I think this is the list, but I could be wrong. My view is that Gartner’s experts are never, ever, ever incorrect in their objective analyses of leading vendors. Perish the thought that the Magic Quadrant is influenced by any subjective element. I shudder to think how subjectivity influencing ratings would rock the myriad consultants wherever they may work.

The 18 Magic Quadrants:

Application development life cycle management or ADLM

Business intelligence and analytics platforms or BIAP

Cloud infrastructure as a service or CaaS

CRM customer engagement center or CRMCEC

Data warehouse and data management solutions for analytics or DWaDMSfA

Disaster recovery as a service or DRaaS

Enterprise content management or ECM

Horizontal portals or HP (Please, do not confuse the leadership outfit Microsoft with the struggling Hewlett Packard)

Identity as a service or IDaaS

Mobile application development platforms or MADP

Operational database management systems or ODBMS

Public cloud storage services or PCSS

Sales force automation or SFA

Unified communications or UC (Not to be confused with Google ooze)

Web conferencing or WC (Please, be careful with this acronym in the UK)

x86 server virtualization infrastructure or XSVI.

Frankly, the best acronym on this list, which is filled with impressive acronyms, is DWaDMSfA. However, I quite like UC, which may be pronounced “uck,” and WC. But for the connotation of a loo, WC is outstanding. I know that Microsoft is the all-time champ of the enterprise.

Perhaps Amazon will pick up its marbles and focus on space travel and selling weird voice activated surveillance devices? Kudos to Microsoft for its stellar and totally objective achievement.

Stephen E Arnold, August 12, 2016

IBM Cognitive Storage Creates a Hierarchy of Data Value

August 5, 2016

The article titled “IBM Introduces Cognitive Storage” on eWeek reveals advances in storage technology. It may sound less sexy than big data, but storage is an integral part of our ability to sort and retrieve data based on the metric of data value. A computer that can determine a hierarchy of data value can also locate and archive unimportant data, freeing up space for more relevant data. The article explains,

“In essence, the concept helps computers to learn what to remember and what to forget, IBM said… “With rising costs in energy and the explosion in big data, particularly from the Internet of Things, this is a critical challenge as it could lead to huge savings in storage capacity, which means less media costs and less energy consumption… if 1,000 employees are accessing the same files every day, the value of that data set should be very high.”

Frequency of use is a major factor in determining data value, so IBM created trackers to monitor this sort of metadata. Interestingly, the article states that IBM’s cognitive computing was inspired by astronomy. An astronomer would tag incoming data sets from another galaxy as “highly important” or less so. So what happens to the less important data? It isn’t destroyed, but rather relegated to what Charles King of Pund-IT calls a “deep freeze.”
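The value-by-frequency idea can be sketched in a few lines of plain Python. This is a toy illustration, not IBM's cognitive storage method; the thresholds, tier names, and file names are invented:

```python
# Toy sketch (not IBM's method): rank files into storage tiers by how
# often they are accessed, the "data value" metric described above.
from collections import Counter

# Simulated access log: each entry is one file access.
access_log = ["report.pdf"] * 1000 + ["old_backup.tar"] * 2 + ["budget.xls"] * 150

def assign_tiers(log, hot_threshold=500, warm_threshold=50):
    """Return {filename: tier}; rarely touched files go to the 'deep freeze'."""
    counts = Counter(log)
    tiers = {}
    for name, hits in counts.items():
        if hits >= hot_threshold:
            tiers[name] = "hot"          # fast, expensive media
        elif hits >= warm_threshold:
            tiers[name] = "warm"
        else:
            tiers[name] = "deep freeze"  # archived, not destroyed
    return tiers

print(assign_tiers(access_log))
```

A real system would fold in more metadata than raw counts (who accessed the file, how recently, domain-specific tags like the astronomers' "highly important"), but the tiering principle is the same.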

Chelsea Kerwin, August 5, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Need a Mentor? See Here

August 3, 2016

Does your business need a mentor? How about any students or budding entrepreneurs you know? Such a guide can be invaluable, especially to a small business, but Google and Bing may not be the best places to pose that query. Business magazine Inc. has rounded up “Ten Top Platforms for Finding a Mentor in 2016.” Writer John Boitnott introduces the list:

“Many startup founders have learned that by working with a mentor, they enjoy a collaboration through which they can learn and grow. They usually also gain access to a much more experienced entrepreneur’s extensive network, which can help as they seek funding or gather resources. For students, mentors can provide the insight they need as they make decisions about their future. One of the biggest problems entrepreneurs and students have, however, is finding a good mentor when their professional networks are limited. Fortunately, technology has come up with an answer. Here are nine great platforms helping to connect mentors and mentees in 2016.”

Boitnott lists the following mentor-discovery resources: Music platform Envelop offers workshops for performers and listeners. Mogul focuses on helping female entrepreneurs via a 24/7 advice hotline. From within classrooms, iCouldBe connects high-school students to potential mentors. Also for high-school students, iMentor is specifically active in low-income communities. MentorNet works to support STEM students through a community of dedicated mentors, while the free, U.K.-based Horse’s Mouth supports a loosely-organized platform where participants share ideas. Also free, Find a Mentor matches potential protégés with adult mentors. SCORE supplies tools like workshops and document templates for small businesses. Cloud-based MentorCity serves entrepreneurs, students, and nonprofits, and it maintains a free online registry where mentors can match their skill sets to the needs of inquiring minds.

Who knew so much professional guidance was out there, made possible by today’s technology, and much of it for free?  For more information on each entry, see the full article.

Cynthia Murrell, August 3, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Salesforce Blackout

July 27, 2016

Salesforce.com is a cloud computing company that earns most of its revenue from customer relationship management software and has acquired several commercial social networking apps.  According to PC World, Salesforce recently had a blackout, detailed in “Salesforce Outage Continues in Some Parts of the US.”  In early May, Salesforce was down for over twelve hours due to a file integrity issue in the NA14 database.

The outage occurred in the morning, with limited services restored later in the evening.  Salesforce divides its customers into instances.  The NA14 instance serves North America; many of the customers who complained via Twitter are located in the US.

The exact details were:

“The database failure happened after “a successful site switch” of the NA14 instance “to resolve a service disruption that occurred between 00:47 to 02:39 UTC on May 10, 2016 due to a failure in the power distribution in the primary data center,” the company said.  Later on Tuesday, Salesforce continued to report that users were still unable to access the service. It said it did not believe “at this point” that it would be able to repair the file integrity issue. Instead, it had shifted its focus to recovering from a prior backup, which had not been affected by the file integrity issues.”

Power outages like this are to be expected, and they will recur in the future.  Technology is only as reliable as its circuit breakers and power supply.  This is why it is recommended to back up your files in more than one place.
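The multi-location advice can be sketched in a few lines. This is a minimal illustration, not a production backup tool; the function names and the checksum-verify approach are our own:

```python
# Small sketch of "back up your files in more than one place":
# copy a file to several destinations and verify each copy by checksum.
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path):
    """Hex digest of a file's contents, used to verify copies."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def backup(src, destinations):
    """Copy src into every destination directory; True if all copies verify."""
    src = Path(src)
    expected = sha256(src)
    ok = True
    for dest_dir in destinations:
        copy = Path(dest_dir) / src.name
        shutil.copy2(src, copy)          # copy contents and metadata
        ok = ok and (sha256(copy) == expected)
    return ok

# Demo with temporary directories standing in for two storage locations.
with tempfile.TemporaryDirectory() as a, tempfile.TemporaryDirectory() as b:
    src = Path(a) / "records.csv"
    src.write_text("id,value\n1,42\n")
    print(backup(src, [b]))  # True when the copy's checksum matches
```

The same pattern scales up: one destination on local media, one in a second cloud provider, so a single outage like NA14's never takes out the only copy.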

Whitney Grace, July 27, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

The Watson Update

July 15, 2016

IBM invested a lot of resources, time, and money into developing the powerful artificial intelligence computer Watson.  The company has been trying for years to justify the expense as well as make money off its invention, mostly by having Watson try every conceivable industry that could benefit from big data, from cooking to medicine.  We finally have an update on Watson, says ZDNet in the article “IBM Talks About Progress on Watson, OpenPower.”

Watson is a cognitive computing system that learns, supports natural user interfaces, values user expertise, and evolves with new information.  Evolving is the most important trait, because it allows Watson to keep gaining experience and learning.  When Watson was first developed, IBM fed it general domain knowledge, then built Watson Discovery to find answers to specific questions.  This has been used in the medical field to digest all the information created there and apply it to practice.

IBM also did this:

“Most recently IBM has been focused on making Watson available as a set of services for customers that want to build their own applications with natural question-and-answer capabilities. Today it has 32 services available on the Watson Developer Cloud hosted on its Bluemix platform-as-a-service… Now IBM is working on making Watson more human. This includes a Tone Analyzer (think of this as a sort spellchecker for tone before you send that e-mail to the boss), Emotion Analysis of text, and Personality Insights, which uses things you’ve written to assess your personality traits.”
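The “spellchecker for tone” idea can be caricatured with a small word-list scorer. This is a toy sketch in plain Python, not the Watson Tone Analyzer service or its API; the word lists and function name are invented:

```python
# Toy tone check (not IBM's Tone Analyzer): flag harsh wording in an
# e-mail draft before it goes to the boss.
HARSH = {"stupid", "useless", "incompetent", "lazy"}
POLITE = {"please", "thanks", "appreciate", "regards"}

def tone_report(text):
    """Return the harsh and polite words found, plus a simple send/hold verdict."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return {
        "harsh": sorted(words & HARSH),
        "polite": sorted(words & POLITE),
        "send_ok": not (words & HARSH),   # hold the message if anything harsh
    }

draft = "This plan is useless. Please rethink it, thanks."
print(tone_report(draft))
```

The real service scores tone along learned emotional and stylistic dimensions rather than fixed word lists, but the workflow, analyze the draft, surface the problem phrases, then decide whether to send, is the same.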

Cognitive computing has come very far since Watson won Jeopardy.  Pretty soon the technology will be more integrated into our lives.  The bigger question is: how will it change society and how we live?

Whitney Grace, July 15, 2016

There is a Louisville, Kentucky Hidden Web/Dark Web meet up on July 26, 2016. Information is at this link: http://bit.ly/29tVKpx.

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Publicly Available Information Is Considered Leaked When on Dark Web

July 7, 2016

What happens when publicly available information is leaked to the Dark Web? This happened recently with staff contact information from the University of Liverpool, according to the article “Five secrets about the Dark Web you didn’t know” from CloudPro. This piece speaks to the perception that the Dark Web is a risky place even for already publicly available information. The author reports on how the information was compromised,

“A spokeswoman said: “We detected an automated cyber-attack on one of our departmental online booking systems, which resulted in publically available data – surname, email, and business telephone numbers – being released on the internet. We take the security of all university-related data very seriously and routinely test our systems to ensure that all data is protected effectively. We supported the Regional Organised Crime Unit (TITAN) in their investigations into this issue and reported the case to the Information Commissioner’s Office.”

Data security only continues to grow in importance as a concern for large enterprises and organizations. This incident is an interesting case; it was the only story we had not seen published again and again, and it illustrates the public perception of the Dark Web as a playground for illicit activity. It raises the question of which online landscapes are considered public versus private.

Megan Feil, July 7, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

The Computer Chip Inspired by a Brain

July 6, 2016

Artificial intelligence is humanity’s attempt to replicate the complicated thought processes of our own brains through technology.  IBM is trying to duplicate the human brain and has been successful in many ways with its supercomputer Watson.  TechRepublic reports that IBM has another success under its belt, except to what end?  Check out the article, “IBM’s Brain-Inspired Chip TrueNorth Changes How Computers ‘Think,’ But Experts Question Its Purpose.”

IBM’s TrueNorth is the first computer chip with a one-million-neuron architecture.  The chip is a collaboration between Cornell University and IBM under the DARPA SyNAPSE program, using $100 million in public funding.  Most computer chips use the von Neumann architecture, but the TrueNorth chip better replicates the human brain.  TrueNorth is also more energy efficient.
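The building block of chips like TrueNorth is the spiking neuron: instead of executing instructions, each unit accumulates input, leaks charge over time, and fires when a threshold is crossed. Here is a generic leaky integrate-and-fire sketch, a textbook model rather than IBM's actual circuit, with made-up parameter values:

```python
# Leaky integrate-and-fire neuron: membrane potential accumulates input,
# leaks each step, and emits a spike when it crosses a threshold.
def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Return the time steps (indices) at which the neuron fires."""
    v = 0.0          # membrane potential
    spikes = []
    for t, current in enumerate(inputs):
        v = v * leak + current   # leak, then integrate this step's input
        if v >= threshold:
            spikes.append(t)
            v = 0.0              # reset after firing
    return spikes

# Steady weak input: the neuron fires periodically rather than every step.
print(simulate_lif([0.3] * 20))
```

The energy story follows from this model: a neuron that receives no input does no work, unlike a von Neumann processor fetching and executing instructions every cycle.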

What is the purpose of the TrueNorth chip, however?  IBM created an elaborate ecosystem that uses many state of the art processes, but people are still wondering what the real world applications are:

“ ‘…it provides ‘energy-efficient, always-on content generation for wearables, IoT devices, smartphones.’ It can also give ‘real-time contextual understanding in automobiles, robotics, medical imagers, and cameras.’ And, most importantly, he said, it can ‘provide volume-efficient, unprecedented neural network acceleration capability per unit volume for cloud-based streaming processing and provide volume, energy, and speed efficient multi-modal sensor fusion at an unprecedented neural network scale.’”

Other applications include cyber security, other defense goals, and large-scale computing and hardware running on the cloud.  While there might be practical applications, people still want to know why IBM made the chip.

” ‘It would be as if Henry Ford decided in 1920 that since he had managed to efficiently build a car, we would try to design a car that would take us to the moon,’ [said Nir Shavit, a professor at MIT’s Computer Science and Artificial Intelligence Laboratory]. ‘We know how to fabricate really efficient computer chips. But is this going to move us towards Human quality neural computation?’ Shavit fears that its simply too early to try to build neuromorphic chips. We should instead try much harder to understand how real neural networks compute.’”

Why would a car need to go to the moon?  It would be fun to go to the moon, but it doesn’t serve a practical purpose (unless we build a civilization on the moon, although we are a long way from that).  It continues:

” ‘The problem is,’ Shavit said, ‘that we don’t even know what the problem is. We don’t know what has to happen to a car to make the car go to the moon. It’s perhaps different technology that you need. But this is where neuromorphic computing is.’”

In other words, it is the theoretical physics of computer science.

Whitney Grace,  July 6, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

The Cloud: Yep, Flying Blind Is Fun

July 5, 2016

Most of the information technology disasters I know about have a common characteristic. Ready for it? Managers did not do their job. The reasons ranged from a lack of informed decision making (this is a nice way of saying “stupid”) to a desire to leave the details to others (this is a nice way of saying “shirk responsibility”). Example: Portland, Oregon’s incompetence.

I thought about information technology crash and burns when I read “75 Percent of IT Pros Lack Visibility into Their Hybrid Clouds.” What I think the write up is trying to say is, “Folks with clouds don’t know what’s happening in the mist and haze.” The desire to get out of having computer systems down the hall is an admirable one. When I fiddled with the IBM mainframe at my third-rate university in the 1960s, who would have wanted one of these puppies at home? The cloud is little more, in my opinion, than a return to the mainframe type approach to computing of a half century ago. Life is just easier with a smartphone.

The write up reports:

The study from cloud governance specialist Netwrix reveals that almost 65 percent of organizations do not have complete visibility into user, IT and third-party activity in their IT infrastructure. In addition 75 percent of respondents have partial or no visibility into their cloud and hybrid IT environments. The survey of over 800 people across 30 industries worldwide shows a large majority of respondents (78 percent) saying they are unaware or only partly aware of what is happening across their unstructured data and file storage.

The painful reality is that people who are supposed to be professional struggle to know what the heck is going on with their cloud computing systems. MBAs and failed middle school teachers as well as bright young sprouts from prestigious university computer science programs have this characteristic too.

Understanding the limits of one’s own knowledge is a difficult undertaking. The confidence with which some “pros” talk about nifty technology is remarkable. Escalating costs, annoyed customers, grousing colleagues, and outright failure are highly likely outcomes.

Whether it is the baloney about figuring out the context of a user query or an F-35 aircraft that cannot be serviced by a ground crew, these are examples of how arrogance and human behavior ensure information technology excitement.

Change human behavior, or go with a Google and Facebook style smart system? Interesting choice, or is it a dilemma?

Stephen E Arnold, July 5, 2016

Amazon AWS Jungle Snares Some Elasticsearch Functions

July 1, 2016

Elastic’s Elasticsearch has become one of the go-to open source search and retrieval solutions. Based on Lucene, the system has put the heat on some of the other open-source-centric search vendors. However, search is a tricky beastie.

Navigate to “AWS Elasticsearch Service Woes” to get a glimpse of some of the snags which can poke holes in one’s rip-stop hiking garb. The problems are not surprising. One does not know what issues will arise until a search system is deployed and the lucky users are banging away with their queries or a happy administrator discovers that Button A no longer works.

The write up states:

We kept coming across OOM issues due the JVMMemoryPresure spiking and inturn the ES service kept crapping out. Aside from some optimization work, we’d more than likely have to add more boxes/resources to the cluster which then means more things to manage. This is when we thought, “Hey, AWS have a service for this right? Let’s give that a crack?!”. As great as having it as a service is, it certainly comes with some fairly irritating pitfalls which then causes you to approach the situation from a different angle.

One approach is to use templates to handle shard management in AWS Elasticsearch. Sample templates are provided in the write up. The fix does not address every issue. The article also links to a reindexing tool called es-tool.
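An index template of this general sort pre-assigns shard and replica counts to any index matching a pattern, so new indices come up with sane settings. This is a minimal sketch in the 2.x-era Elasticsearch API style; the template name, index pattern, and counts are illustrative, not the article's actual values:

```json
PUT _template/logs_template
{
  "template": "logs-*",
  "settings": {
    "number_of_shards": 3,
    "number_of_replicas": 1
  }
}
```

Because AWS Elasticsearch does not expose the nodes' configuration files, settings like these generally have to be applied through the HTTP API rather than elasticsearch.yml, which is part of the trade-off the author describes.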

The most interesting comment in the article in my opinion is:

In hindsight I think it may have been worth potentially sticking with and fleshing out the old implementation of Elasticsearch, instead of having to fudge various things with the AWS ES service. On the other hand it has relieved some of the operational overhead, and in terms of scaling I am literally a couple of clicks away. If you have large amounts of data you pump into Elasticsearch and you require granular control, AWS ES is not the solution for you. However if you need a quick and simple Elasticsearch and Kibana solution, then look no further.

My takeaway: do some thinking about the strengths and weaknesses of Amazon AWS before chopping through the Bezos cloud jungle.

Stephen E Arnold, July 1, 2016

Next Page »