HPE IDOL Released with Natural Language Processing Capabilities Aimed at Enterprise-Level Tasks

June 16, 2017

The article titled “Hewlett Packard Enterprise Enriches HPE IDOL Machine Learning Engine With Natural Language Processing” on SDTimes discusses the enhancements to HPE IDOL. The challenge in creating an effective interactive experience based on Big Data for enterprise-class inquiries lies in the sheer complexity of those inquiries; additional issues arise around context, specificity, and source validation. The article examines the new and improved model,

HPE Natural Language Question Answering deciphers the intent of a question and provides an answer or initiates an action drawing from an organization’s own structured and unstructured data assets, in addition to available public data sources to provide actionable, trusted answers and business critical responses… HPE IDOL Natural Language Question Answering is a core feature of the new HPE IDOL 11.2 software release that features four key capabilities for natural language processing for the enterprise.

These capabilities are the IDOL Answer Bank (with pre-set reference questions), Fact Bank (with structured and unstructured data extraction abilities), Passage Extract (for text-based summaries), and Answer Server (for question analysis and integration of the other three areas). The goal is natural conversation between people and computers, an “information exchange”. The four capabilities work together to deliver a complex answer with the utmost accuracy and relevance.
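In rough outline, that division of labor might look like the following Python sketch. To be clear, this is not the IDOL API: every function name here is hypothetical and merely stands in for the role the article assigns to each component, with the Answer Server acting as the coordinator.

```python
# Hypothetical sketch of how the four IDOL 11.2 capabilities might cooperate.
# None of these names are real IDOL APIs; each function stands in for the
# role the article describes.

def answer_bank_lookup(question):
    """Answer Bank role: match against curated, pre-set reference questions."""
    curated = {"what is our refund policy?": "Refunds are issued within 30 days."}
    answer = curated.get(question.lower())
    return {"answer": answer, "confidence": 0.95} if answer else None

def fact_bank_query(question):
    """Fact Bank role: pull a discrete fact extracted from structured
    and unstructured data sources."""
    facts = {"q2 revenue": "$1.2M"}  # toy stand-in for an extracted fact store
    for key, value in facts.items():
        if key in question.lower():
            return {"answer": value, "confidence": 0.8}
    return None

def passage_extract(question):
    """Passage Extract role: fall back to a summarizing text passage."""
    return {"answer": "Closest matching passage from the indexed documents...",
            "confidence": 0.4}

def answer_server(question):
    """Answer Server role: analyze the question, consult the other three
    components, and return the highest-confidence candidate."""
    candidates = [f(question) for f in
                  (answer_bank_lookup, fact_bank_query, passage_extract)]
    return max((c for c in candidates if c), key=lambda c: c["confidence"])

print(answer_server("What was Q2 revenue?"))  # the Fact Bank answer wins here
```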

Chelsea Kerwin, June 16, 2017

The Big Dud

April 24, 2017

Marketers periodically need a fancy term to sell technologies to large companies. Big Data, with Hadoop as its workhorse, was one such term. After years of marketing, adopters have yet to see any results, let alone any ROI.

Datanami recently published an article titled “Hadoop Has Failed Us, Tech Experts Say,” in which the author says:

Many companies still run mainframe applications that were originally developed half a century ago. But thanks to better mousetraps like S3 (for storage) and Spark (for processing), Hadoop will be relegated to niche and legacy statuses going forward.

One of the primary concerns with Hadoop is that only a handful of people know how to use it. For data scientists to make head or tail of the data, precise queries and mining need to be performed. The dearth of experts, however, is hampering the efforts of companies that want to make Big Data work for them. Other frameworks are trying to overcome the problems posed by Hadoop, but many companies have already adopted it and are stuck with it. And just like many fads, Big Data might fade into oblivion.
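As an aside, the “better mousetraps” named in the quote, S3 for storage and Spark for processing, illustrate why Hadoop lost ground: the equivalent job takes a few lines rather than a MapReduce pipeline. A minimal PySpark sketch follows; the bucket, path, and column names are hypothetical, and reading from S3 assumes the cluster already has the hadoop-aws connector and AWS credentials configured.

```python
from pyspark.sql import SparkSession

# Minimal sketch of the S3 + Spark pairing mentioned in the article.
# Bucket, path, and column names are hypothetical; s3a:// access assumes
# the hadoop-aws connector and AWS credentials are already configured.
spark = SparkSession.builder.appName("s3-spark-sketch").getOrCreate()

# Read raw event data straight from object storage -- no HDFS cluster needed.
events = spark.read.csv("s3a://example-bucket/events/*.csv",
                        header=True, inferSchema=True)

# The kind of precise query the article says Hadoop made hard:
# a per-region count, expressed in a few lines.
(events.groupBy("region")
       .count()
       .orderBy("count", ascending=False)
       .show())

spark.stop()
```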

Vishal Ingole, April 24, 2017

AI Might Not Be the Most Intelligent Business Solution

April 21, 2017

Big data was the buzzword a few years ago, but now artificial intelligence is the tech jargon of the moment.  While big data was a more plausible solution for companies trying to mine information from their digital data, AI is proving difficult to implement.  Forbes discusses AI difficulties in the article, “Artificial Intelligence Is Powerful Stuff, But Difficult To Scale To Real-Life Business.”

There is a lot of excitement brewing around machine learning and AI business possibilities, but while the technology is ready for use, workers are not.  People need to be prepped and taught how to use AI and machine learning technology; without proper training, the rollout will hurt a company’s bottom line.  The problem comes from companies deploying digital solutions without changing the way they conduct business.  Workers cannot adapt to changes instantly.  They need to feel they are part of the solution instead of being shoved aside by the latest technological trend.

Dr. David Bray, CIO of the Federal Communications Commission, said:

The growth of AI may shift thinking in organizations. ‘At the end of the day, we are changing what people are doing,’ Bray says. ‘You are changing how they work, and they’re going to feel threatened if they’re not bought into the change. It’s almost imperative for CIOs to really work closely with their chief executive officers, and serve as an internal venture capitalist, for how we bring data, to bring process improvements and organizational performance improvements – and work it across the entire organization as a whole.’

Artificial intelligence and machine learning are an upgrade not only to a company’s technology but also to how the company conducts business.  Business processes will need to be updated to integrate the new technology, as will the way workers use and interface with it.  Businesses will continue to face problems if they think that changing the technology, but not their procedures, is the final solution.

Whitney Grace, April 21, 2017

UK Big Brother Invades More Privacy

April 18, 2017

The United Kingdom has been compared to George Orwell’s 1984 dystopia before, especially in the last two decades as its surveillance technology has proliferated.  Once more, UK citizens face privacy invasion, reports the Guardian in “UK Public Faces Mass Invasion Of Privacy As Big Data And Surveillance Merge.”  The UK’s Surveillance Camera Commissioner, Tony Porter, expressed his worry that government regulators are unable to keep up with technological advances.

Big data combined with video surveillance, facial recognition technology, and the profuse deployment of ever more cameras is making it harder to protect individuals’ privacy.  People are being recorded 24/7, often without their knowledge.  Another worry is that police are not being vigilant with private information; for example, license plate data has not been deleted after the two-year retention limit.

Porter wants changes to be made in policies and wants people to be aware of the dangers:

Porter’s new strategy, published on Tuesday, points out that an overwhelming majority of people currently support the use of CCTV in public places. But he questions whether this support can continue because of the way surveillance is changing.

‘I’m worried about overt surveillance becoming much more invasive because it is linked to everything else,’ Porter said. ‘You might have a video photograph of somebody shopping in Tesco. Now it is possible to link that person to their pre-movements, their mobile phone records, any sensor detectors within their house or locality. As smart cities move forward, these challenges are so much greater for people like myself. And members of the public need to decide whether they are still happy with this.’

Porter admitted that advanced surveillance technology has allowed law enforcement to arrest terrorists and track down missing people, but it can still lead to worse privacy invasions.  Porter hopes his new three-year strategy will inform authorities about how technology will impact privacy.

The good thing about surveillance technology is that it can track down bad guys, but it can be harmful to innocent citizens.  The BBC should run some PSAs about video surveillance and privacy to keep citizens informed.  I suggest they do not make them as scary as this one about electricity.

Whitney Grace, April 18, 2017

Big Data: The Crawfish Approach to Meaningful Information

March 21, 2017

Have you ever watched a crawfish (sometimes called a crawdad or a crayfish) get away from trouble? The freshwater crustaceans can go backwards. Members of the Astacidae can be found in parts of the South, so you will have to wander in a Georgia swamp to check out the creature’s behavior.

The point is that crawfish go backwards to protect themselves and achieve their tiny, lobster-like goals. Big-time consultants also crawfish in order to sell more work and provide “enhanced” insight into a thorny business or technical problem other consultants have created.

To see this in action, navigate to “The Conundrum of Big Data.” A super consultant explains that Big Data is not exactly the home run, silver bullet, or magic potion some lesser consultants said Big Data would be. I learned:

Despite two decades of intensive IT investment in data [mining] applications, recent studies show that companies continue to have trouble identifying metrics that can predict and explain performance results and/or improve operations. Data mining, the process of identifying patterns and structures in the data, has clear potential to identify prescriptions for success but its wide implementation fails systematically. Companies tend to deploy ‘unsupervised-learning’ algorithms in pursuit of predictive metrics, but this automated [black box] approach results in linking multiple low-information metrics in theories that turn out to be improbably complex.

Big surprise. For folks who are not trained in the nuts and bolts of data analysis and semi-fancy math, Big Data is a giant vacuum cleaner for money. The cash has to pay for “experts,” plumbing, software, and more humans. The outputs are often fuzzy-wuzzy probabilities which more “wizards” interpret. Think of a Greek religious authority looking at the ancient equivalent of road kill.
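The quoted complaint, that automated mining of many low-information metrics produces improbably complex theories, is easy to reproduce. In the following sketch every input is pure noise, yet the best of 200 candidate metrics still shows a respectable in-sample correlation with the outcome; that is the false signal a black-box search will happily report.

```python
import numpy as np

# Sketch of the failure mode the quoted passage describes: search enough
# low-information metrics and one will "predict" the outcome by chance.
rng = np.random.default_rng(0)

n_obs, n_metrics = 50, 200
metrics = rng.normal(size=(n_obs, n_metrics))   # 200 metrics of pure noise
outcome = rng.normal(size=n_obs)                # outcome unrelated to any of them

# In-sample correlation of each metric with the outcome.
corrs = np.array([np.corrcoef(metrics[:, j], outcome)[0, 1]
                  for j in range(n_metrics)])

best = np.argmax(np.abs(corrs))
print(f"best 'predictor': metric {best}, |r| = {abs(corrs[best]):.2f}")
# Typically prints |r| around 0.4 -- a seemingly useful signal that is
# guaranteed to evaporate on new data, since everything here is noise.
```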

The write up cites the fizzle that was Google Flu Trends. Cough. Cough. But even that sneeze could be fixed with artificial intelligence. Yep, when smart humans make mistakes, send in smart software. That will work.

In my opinion, the highlight of the write up was this passage:

When it comes to data, size isn’t everything because big data on their own cannot just solve the problem of ‘insight’ (i.e. inferring what is going on). The true enablers are the data-scientists and statisticians who have been obsessed for more than two centuries to understand the world through data and what traps lie in wait during this exercise. In the world of analytics (AaaS), it is agility (using science, investigative skills, appropriate technology), trust (to solve the client’s real business problems and build collateral), and ‘know-how’ (to extract intelligence hidden in the data) that are the prime ‘assets’ for competing, not the size of the data. Big data are certainly here but big insights have yet to arrive.

Yes. More consulting is needed to make those payoffs arrive. But first, hire more advisers. What could possibly go wrong? Cough. Sneeze. One goes forwards with Big Data by going backwards for more analysis.

Stephen E Arnold, March 21, 2017

ScyllaDB Version 1.3 Available

March 8, 2017

According to Scylla, its latest release is currently the fastest NoSQL database. We learn about the update from SiliconAngle’s article, “ScyllaDB Revamps NoSQL Database in 1.3 Release.” To support the claim, the company points to a performance benchmark test executed with the Yahoo Cloud Serving Benchmark. That test compared ScyllaDB to the open source Cassandra database and found Scylla to be 4.6 times faster than a standard Cassandra cluster.

Writer Mike Wheatley elaborates on the product:

ScyllaDB’s biggest differentiator is that it’s compatible with the Apache Cassandra database APIs. As such, its creators claim that ScyllaDB can be used as a drop-in replacement for Cassandra itself, offering users the benefit of improved performance and scale that comes from the integration with a light key/value store.

The company says the new release is geared toward development teams that have struggled with Big Data projects, and it claims a number of performance advantages over more traditional development approaches, including the following (the drop-in claim gets a quick test in the sketch after the list):

* 10X throughput of baseline Cassandra – more than 1,000,000 CQL operations per second per node

* Sub-1msec 99% latency

* 10X per-node storage capacity over Cassandra

* Self-tuning database: zero configuration needed to max out hardware

* Unparalleled high availability, native multi-datacenter awareness

* Drop-in replacement for Cassandra – no additional scripts or code required
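That last claim is straightforward to try, because ScyllaDB speaks the Cassandra CQL wire protocol: the standard Python cassandra-driver should connect unchanged, with only the contact point differing. A minimal sketch, using a hypothetical keyspace and table and assuming a node listening on the default CQL port:

```python
from cassandra.cluster import Cluster

# Because ScyllaDB speaks the Cassandra CQL wire protocol, the stock
# cassandra-driver connects unchanged; only the contact point differs.
# The keyspace and table names here are hypothetical.
cluster = Cluster(["127.0.0.1"])   # address of a Scylla (or Cassandra) node
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS demo
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.set_keyspace("demo")
session.execute(
    "CREATE TABLE IF NOT EXISTS events (id int PRIMARY KEY, payload text)")

session.execute("INSERT INTO events (id, payload) VALUES (%s, %s)",
                (1, "hello from CQL"))
for row in session.execute("SELECT id, payload FROM events"):
    print(row.id, row.payload)

cluster.shutdown()
```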

Wheatley cites Scylla’s CTO, who points to better integration with graph databases and improved support for Thrift, Date Tiered Compaction Strategy, Large Partitions, Docker, and CQL tracing. I notice the company is hiring as of this writing. Don’t let the Tel Aviv location of Scylla’s headquarters stop you from applying if you don’t happen to live nearby; they note that their developers can work from anywhere in the world.

Cynthia Murrell, March 8, 2017

New Technologies Meet Resistance in Business

March 3, 2017

Trying to sell a state-of-the-art, next-gen search and content processing system can be tough. In the article “Most Companies Slow to Adopt New Business Tech Even When It Can Help,” Digital Trends demonstrates that a reluctance to invest in something new is not confined to search. Writer Bruce Brown cites the Trends vs. Technologies 2016 report (PDF) from Capita Technology Solutions and Cisco. The survey polled 125 ICT (information and communications technology) decision-makers working in insurance, manufacturing, finance, and the legal industry. More in-depth interviews were conducted with a dozen of these folks, spread evenly across those fields.

Most higher-ups acknowledge the importance of keeping on top of, and investing in, worthy technological developments. However, that awareness does not inform purchasing and implementation decisions as one might expect. Brown specifies:

The survey broke down tech trends into nine areas, asking the surveyed execs if the trends were relevant to their business, if they were being implemented within their industry, and more specifically if the specific technologies were being implemented within their own businesses. Regarding big data, for example, 90 percent said it was relevant to their business, 64 percent said it was being applied in their industry, but only 39 percent reported it being implemented in their own business. Artificial intelligence was ranked as relevant by 50 percent, applied in their industry by 25 percent, but implemented in their own companies by only 8 percent. The Internet of Things had 70 percent saying it is relevant, with 50 percent citing industry applications, but a mere 30 percent use it in their own business. The study analyzed why businesses were not implementing new technologies that they recognized could improve their bottom line. One of the most common roadblocks was a lack of skill in recognizing opportunities within organizations for the new technology. Other common issues were the perception of security risks, data governance concerns, and the inertia of legacy systems.

The survey also found the stain of mistrust, with 82 percent of respondents sure that much of what they hear about tech trends is pure hype. It is no surprise, then, that they hesitate to invest resources and impose change on their workers until they are convinced benefits will be worth the effort. Perhaps vendors would be wise to dispense with the hype and just lay out the facts as clearly as possible; potential customers are savvier than some seem to think.

Cynthia Murrell, March 3, 2017


Comprehensive, Intelligent Enterprise Search Is Already Here

February 28, 2017

The article on Sys-Con Media titled “Delivering Comprehensive Intelligent Search” examines the accomplishments of World Wide Technology (WWT) in building a better search engine for the business organization. The Enterprise Search Project Manager and the Manager of Enterprise Content at WWT discovered that the average employee wastes over a full week each year looking for the information needed to do their work. The article details how they approached a solution for enterprise search,

We used the Gartner Magic Quadrants and started talks with all of the Magic Quadrant leaders. Then, through a down-selection process, we eventually landed on HPE… It wound up being that we went with the HPE IDOL tool, which has been one of the leaders in enterprise search, as well as big data analytics, for well over a decade now, because it has a very extensible platform, something that you can really scale out and customize and build on top of.

Trying to replicate what Google delivers in an enterprise is a complicated task because of how siloed data is in the typical organization. The new search solution offers vast improvements in presenting employees with the relevant information, and all of the relevant information, preventing major time waste through comprehensive and intelligent search.

Chelsea Kerwin, February 28, 2017

The Game-Changing Power of Visualization

February 8, 2017

Data visualization may be hitting at just the right time. Data Floq shared an article highlighting the latest thinking, “Data Visualisation Can Change How We Think About The World.” As the article mentions, we are primed for it biologically: the human eye and brain comfortably process 10 to 12 separate images per second. Considering the output, visualization provides the ability to rapidly incorporate new data sets, remove metadata, and increase performance. Data visualization is not without challenges. The article explains,

Perhaps the biggest challenge for data visualisation is understanding how to abstract and represent abstraction without compromising one of the two in the process. This challenge is deep rooted in the inherent simplicity of descriptive visual tools, which significantly clashes with the inherent complexity that defines predictive analytics. For the moment, this is a major issue in communicating data; the Chartered Management Institute found that 86% of 2,000 financiers surveyed in late 2013 were still struggling to turn volumes of data into valuable insights. There is a need for people to understand what led to the visualisation, each stage of the process that led to its design. But, as we adopt more and more data, this is becoming increasingly difficult.

Is data visualization changing how we think about the world, or is the existence of big data the culprit? We would argue data visualization is simply a tool to present data; it is a product rather than an impetus for a paradigm shift. This piece is right, however, in bringing attention to the conflict between detail and accessibility of information. We cannot help but think the meaning lies in balancing both.
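That detail-versus-accessibility conflict is easy to see in miniature. In the following matplotlib sketch (the data is synthetic and purely illustrative), the raw scatter preserves a short anomaly that the tidier binned summary largely averages away:

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic data standing in for a real series: noisy, with a short anomaly.
rng = np.random.default_rng(1)
x = np.arange(200)
y = np.sin(x / 20) + rng.normal(scale=0.4, size=x.size)
y[120:125] += 3                     # a brief burst worth noticing

fig, (raw, summary) = plt.subplots(1, 2, figsize=(10, 3))

# Detail: every point is visible, including the anomaly -- but it is noisy.
raw.scatter(x, y, s=5)
raw.set_title("Raw data (detail)")

# Abstraction: binned means read cleanly, but the burst is muted by averaging.
bins = y.reshape(20, 10).mean(axis=1)
summary.plot(np.arange(20) * 10 + 5, bins, marker="o")
summary.set_title("Binned means (abstraction)")

plt.tight_layout()
plt.show()
```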

Megan Feil, February 8, 2017

Counter Measures to Money Laundering

January 30, 2017

Apparently, money laundering has become a very complicated endeavor, with tools like Bitcoin “washers” available via the Dark Web. Other methods include trading money for gaming or other virtual currencies and “carding.”  ZDNet discusses law enforcement’s efforts to keep up in, “How Machine Learning Can Stop Terrorists from Money Laundering.”

It will not surprise our readers to learn authorities are turning to machine learning to cope with new money laundering methods. Reporter Charlie Osborne cites the CEO of cybersecurity firm ThetaRay, Mark Gazit, when she writes:

By taking advantage of Big Data, machine learning systems can process and analyze vast streams of information in a fraction of the time it would take human operators. When you have millions of financial transactions taking place every day, ML provides a means for automated pattern detection and potentially a higher chance of discovering suspicious activity and blocking it quickly. Gazit believes that through 2017 and beyond, we will begin to rely more on information and analytics technologies which utilize machine learning to monitor transactions and report crime in real time, which is increasingly important if criminals are going to earn less from fraud, and terrorism groups may also feel the pinch as ML cracks down on money laundering.
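One common way to implement the kind of automated pattern detection Gazit describes is unsupervised anomaly detection over transaction features. Here is a stripped-down sketch using scikit-learn's IsolationForest on synthetic data; the features, amounts, and contamination rate are all illustrative assumptions, not anyone's production rules.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Toy stand-in for transaction features: [amount, transfers_per_day].
# Real AML systems use far richer features; these are illustrative only.
rng = np.random.default_rng(42)
normal = np.column_stack([rng.lognormal(3, 0.5, 1000),   # typical amounts
                          rng.poisson(2, 1000)])          # typical frequency
laundering = np.array([[9500, 40], [9900, 35], [9800, 50]])  # structured bursts
transactions = np.vstack([normal, laundering])

# Unsupervised: the model learns what "ordinary" looks like and flags the rest.
model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(transactions)   # -1 = anomaly, 1 = normal

flagged = np.where(labels == -1)[0]
print(f"flagged {flagged.size} of {transactions.shape[0]} transactions")
print("the injected bursts show up among the flags:", flagged[-3:])
```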

Of course, criminals will not stop improving their money-laundering game, and authorities will continue to develop tools to thwart them. Just one facet of the cybersecurity arms race.

Cynthia Murrell, January 30, 2017
