
Artificial Intelligence: A Jargon Mandala to Understand the Universe of Search

October 12, 2015

I read “Lux: Useful Sankey Diagram on AI.” A Sankey diagram, according to the Sankey Diagrams site, “says more than 1,000 pie charts.” The assumption is, of course, that a pie chart presents meaningful data. In the energy sector, Sankey diagrams are used to visualize flows in complex systems. It helps to have numbers when one is working towards a Sankey map, but if real data are not close at hand, one can fudge up some data.

Here’s the Sankey diagram in the write up:


You can see an almost legible version at this link.

What the diagram suggests is that certain information access and content processing functions flow into data mining, machine learning, and statistics. If you are a fan of multidimensionality, the arrow of time may flow in the reverse direction; that is, from data mining, machine learning, and statistics to affective computing, cognitive computing, computational discovery, image and video analytics, language translation, navigation, recommender systems, and speech recognition.

The intermediary state, tinted a US currency green, provides intermediating operations or conditions; for example, anomaly detection, collaborative filtering, computer eavesdropping, computer vision, pattern recognition, NLP, path planning, clustering, deep learning, dimensionality reduction, network graphical models, online reinforcement learning, pattern similarity, probabilistic modeling, regression, and, my favorite, search algorithms.
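For readers who want to poke at the idea rather than admire the poster, here is a minimal sketch of how such a flow chart can be generated with the plotly library in Python. The node labels and flow weights below are illustrative placeholders of mine, not values taken from the Lux chart.

```python
# Minimal Sankey sketch with plotly. Labels and flow weights are made up
# for illustration, not pulled from the Lux diagram.
import plotly.graph_objects as go

labels = [
    "Speech recognition", "Image and video analytics", "Recommender systems",  # end uses
    "NLP", "Pattern recognition", "Search algorithms",                          # intermediaries
    "Data mining", "Machine learning", "Statistics",                            # foundations
]

fig = go.Figure(go.Sankey(
    node=dict(label=labels, pad=15, thickness=20),
    link=dict(
        source=[0, 1, 2, 3, 4, 5],  # indexes into labels
        target=[3, 4, 5, 6, 7, 8],
        value=[2, 3, 1, 4, 3, 2],   # fudged weights, as the post anticipates
    ),
))
fig.write_html("ai_jargon_sankey.html")
```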

The diagram, like the wild and crazy chemical imagery for Watson, seems to be a way to:

  1. Collect a number of discrete operations
  2. Arrange the operations into some orderly framework
  3. Allow the viewer to perceive relationships or the potential for relationships among the operations.

In short, skip the wild and crazy presentations by search and content processing vendors about how search enables broader and, hence, more valuable activities. Search is relegated to an entry in the intermediating column of the Sankey diagram.

My thought is that some folks will definitely love the idea that the many different specialties of content processing can be presented in a mandala which invites contemplation and consideration.

The diagram makes clear that when a company wants to know what one can do with the different and often clever operations one can perform with content, the answer may be, “Make a poster and hang it on the wall.”

In terms of applications, the chart makes quite explicit that some clever team will have to put the parts in order. Does this remind you of building a Star Wars character from Lego blocks?

The construct is the value, not the individual enabling blocks.

Stephen E Arnold, October 12, 2015

List of Data Visualization Players

August 17, 2015

I read “CI Radar Delivers New Competitive Intelligence Coverage of the Data Visualization Market.” The story, which describes a tracking and monitoring tool from a competitive intelligence firm, contained a little chunk of information: a list of the players which the competitive intelligence firm considers important in the Hollywoodization of analytic system outputs. Who loves a great chart? Certainly generals, mid-tier consultants, and MBA students.

Here’s the list of data visualization players:

  • Adobe (ah, the magic of the creative cloud)
  • Advizor Solutions
  • Afs Technologies
  • BeyondCore
  • Birst
  • Centrifuge Systems
  • Chartio
  • ClearStory Data
  • DataHero
  • Datameer
  • Datawatch
  • Dell (visualization and not laptops?)
  • Domo
  • Dundas
  • GoodData
  • Halo
  • iDashboards (maybe free for academics?)
  • Inetsoft Technology
  • Infor (I think of this outfit as a CRM vendor)
  • Informatica (now owned by Permira)
  • Information Builders
  • International Business Machines (IBM) (which unit of IBM?)
  • Jinfonet Software
  • Logi Analytics
  • Looker
  • Manthan
  • Microsoft (my goodness)
  • Microstrategy
  • OpenText (is this the Actuate or the Talend acquisition?)
  • Panorama Software
  • Pentaho (don’t forget this is Hitachi)
  • Phocas Software
  • ProfitBase
  • Prognoz
  • Pyramid Analytics
  • Qlik
  • RapidMiner
  • Roambi
  • Salesforce (a surprise to me)
  • SAP (interesting?)
  • SAS (also interesting?)
  • Sisense
  • Splunk (a bit of a surprise)
  • Synerscope
  • Tableau Software
  • Teradata (Is this Rainstor, ThinkBig or another chunk of acquired technology?)
  • ThoughtSpot
  • TIBCO (is this Spotfire?)
  • Viur.

I would point out that some of the key players in the law enforcement and intelligence community are not included. Why would a consulting firm want to highlight the companies which are pioneering next generation, dynamic, interactive, and real time visualization tools? The list is incomplete from my vantage point. How long will it be before Forrester, Gartner, and other mid-tier firms roll out a magic wave rhomboid explaining what these companies are doing to be “players”?

Stephen E Arnold, August 17, 2015

Looking Towards 2015’s Data Trends

March 5, 2015

Here we go again! Another brand new year, and it is time to predict where data will take us. For the past few years it has been all about big data, and while big data has a solid base, other parts of data science are coming into the limelight. While LinkedIn is a social network for professionals, one can also read articles there on career advice, hot topics, and new trends in various fields. Kurt Cagle is a data science expert who has written on the topic for over ten years. His recent article, “Ten Trends In Data Science In 2015,” was posted on LinkedIn in December.

He calls the four data science areas the Data Cycle: analysis, awareness, governance, and acquisition. From Cagle’s perspective, 2014 saw big data mature, data visualization software come into high demand, and semantics grow. He predicts 2015 will hold much of the same:

“…with the focus shifting more to the analytics and semantic side, and Hadoop (and Map/Reduce without Hadoop) becoming more mainstream. These trends benefit companies looking for a more comprehensive view of their information environment (both within and outside the company), and represent opportunities in the consulting space for talented analysts, programmers and architects.”

Data visualization is going to get even bigger in the coming year. Hybrid data stores with more capabilities will become more common, semantics will grow even larger and specialist companies will be bought up, and there will be more competition for Hadoop. Cagle also predicts that work will be done on a universal query language and that data analytics will move beyond standard SQL.
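Cagle’s mention of “Map/Reduce without Hadoop” refers to the programming pattern outliving the platform. Here is a toy sketch of that pattern in plain Python; it is an illustration of the idea, not production code, and the sample documents are invented.

```python
# A toy "Map/Reduce without Hadoop" word count. Purely illustrative; a real
# job would shard the input and run the mappers in parallel across machines.
from collections import Counter
from functools import reduce

documents = [
    "big data is here to stay",
    "data visualization is in demand",
    "semantics is growing",
]

def mapper(doc):
    # Map step: emit word counts for one document.
    return Counter(doc.split())

def reducer(acc, partial):
    # Reduce step: merge partial counts from each mapper.
    acc.update(partial)
    return acc

word_counts = reduce(reducer, map(mapper, documents), Counter())
print(word_counts.most_common(3))
```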

His ending observations explain that data silos will be phased into open data platforms, making technology easier for people to use and making systems better able to work with one another.

Whitney Grace, March 05, 2015
Sponsored by, developer of Augmentext

Opening Watson to the Masses

March 4, 2015

IBM is struggling financially, and one of the ways it hopes to pull itself out of the swamp is to find new applications for its supercomputers and software. One way it is trying to cash in on Watson is to enable cognitive computing apps. EWeek alerts open source developers, coders, and friendly hackers that IBM has released a bunch of beta services: “13 IBM Services That Simplify The Building Of Cognitive Watson Apps.”

IBM now allows all software geeks the chance to add their own input to cognitive computing. How?

“Since its creation in October 2013, the Watson Developer Cloud (WDC) has evolved into a community of over 5,000 partners who have unlocked the power of cognitive computing to build more than 6,000 apps to date. With a total of 13 beta services now available, the IBM Watson Group is quickly expanding its developer ecosystem with innovative and easy-to-use services to power entirely new classes of cognitive computing apps—apps that can learn from experience, understand natural language, identify hidden patterns and trends, and transform entire industries and professions.”

The thirteen new IBM services involve language, text processing, analytical tools, and data visualization. These services can be applied to a wide range of industries and fields, improving the way people work and interact with their data. While it is easy to imagine the practical applications, it remains to be seen how they will actually be used.
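For developers wondering what consuming one of these beta services might look like, here is a hypothetical sketch of a REST call in Python. The endpoint, credential, and payload shape are placeholders of my own, not IBM’s documented API; the Watson Developer Cloud documentation is the place for the real contract.

```python
# Hypothetical sketch of calling a Watson Developer Cloud beta service over
# REST. The URL, key, and payload are placeholders, not IBM's documented API.
import requests

WDC_URL = "https://example.watson-service.invalid/v1/analyze"  # placeholder
API_KEY = "YOUR_API_KEY"                                        # placeholder

def analyze_text(text):
    response = requests.post(
        WDC_URL,
        json={"text": text},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(analyze_text("Watson, find the hidden patterns here."))
```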

Whitney Grace, March 04, 2015
Sponsored by, developer of Augmentext

Short Honk: Impressive WiFi Signal Analysis

February 15, 2015

Navigate to this Imgur link. There are visualizations of a WiFi signal. Fascinating. I had no idea that the best signals were found on tops of wave “clouds.”


Excellent idea and work.

Stephen E Arnold, February 15, 2015

Four Visualization Tools to Choose From

February 12, 2015

MakeUseOf offers us a list of graphic-making options in its “4 Data Visualization Tools for Captivating Data Journalism.” Writer Brad Jones describes four options, ranging from the quick and easy to more complex solutions. The first entry, Tableau Public, may be the best place for new users to start. The write-up tells us:

“Data visualization can be a very complex process, and as such the programs and tools used to achieve good results can be similarly complex. Tableau Public, at first glance, is not — it’s a very accommodating, intuitive piece of software to start using. Simply import your data as a text file, an Excel spreadsheet or an Access database, and you’re up and running.

“You can create a chart simply by dragging and dropping various dimensions and measures into your workspace. Figuring out exactly how to produce the sort of visualizations you’re looking for might take some experimentation, but there’s no great challenge in creating simple charts and graphs.

“That said, if you’re looking to go further, Tableau Public can cater to you. It’ll take some time on your part to really understand the breadth of what’s on offer, but it’s a matter of learning a skill rather than the program itself being difficult to use.”

The next entry is Google Fusion Tables, which helpfully links to other Google services, and much of its process is automated. The strengths of Infoactive are its ability to combine datasets and a wealth of options to create cohesive longer content. Rounding out the list is R, which Jones warns is “obtuse and far from user friendly”; it even requires a working knowledge of JavaScript and its own proprietary language to make the most of its capabilities. However, he says there is simply nothing better for producing exactly what one needs.
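For anyone who lands at the do-it-yourself end of Jones’s spectrum, the same import-a-file-then-chart-it workflow can be expressed in a few lines of Python with pandas and matplotlib. This is a sketch under assumptions: the file name and the “region” and “revenue” columns are hypothetical.

```python
# The "import a file, plot a dimension against a measure" workflow by hand.
# The CSV name and its columns ("region", "revenue") are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("sales.csv")                    # text file or spreadsheet export
summary = df.groupby("region")["revenue"].sum()  # the "drag and drop" step

summary.plot(kind="bar", title="Revenue by region")
plt.tight_layout()
plt.savefig("revenue_by_region.png")
```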

Cynthia Murrell, February 12, 2015

Sponsored by, developer of Augmentext

WikiGalaxy: Interactive Visualization

February 5, 2015

Short honk: A visualization of some Wikipedia articles is available at this link.


The visualization includes a search box. It is helpful. I did not understand the dots of light that flew across the display. The display held my attention for a short period of time.

Stephen E Arnold, February 5, 2015

IBM on Skin Care

January 19, 2015

Watson has been going to town in different industries, putting to use its massive artificial brain. It has been working in the medical field interpreting electronic medical record data. According to Open Health News, IBM has used its technology in other medical ways: “IBM Research Scientists Investigate Use Of Cognitive Computing-Based Visual Analytics For Skin Cancer Image Analysis.”

IBM partnered with Memorial Sloan Kettering to use cognitive computing to analyze dermatological images to help doctors identify cancerous states. The goal is to help doctors detect cancer earlier. Skin cancer is the most common type of cancer in the United States, but diagnostic expertise varies. It takes experience to be able to detect cancer, and cognitive computing might take out some of the guesswork.

“Using cognitive visual capabilities being developed at IBM, computers can be trained to identify specific patterns in images by gaining experience and knowledge through analysis of large collections of educational research data, and performing finely detailed measurements that would otherwise be too large and laborious for a doctor to perform. Such examples of finely detailed measurements include the objective quantification of visual features, such as color distributions, texture patterns, shape, and edge information.”
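As an illustration only, and not IBM’s pipeline, the sort of “objective quantification of visual features” the quote describes can be approximated in a few lines of Python: a coarse color histogram plus a gradient-based edge measure for a single image. The file name is a placeholder.

```python
# Toy quantification of visual features for one image: per-channel color
# histograms and an average edge strength. Not IBM's method; the file name
# is a placeholder.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("lesion.jpg").convert("RGB"), dtype=float)

# Color distribution: a coarse 16-bin histogram per RGB channel.
color_hist = [np.histogram(img[..., c], bins=16, range=(0, 255))[0]
              for c in range(3)]

# Edge information: mean gradient magnitude of the grayscale image.
gray = img.mean(axis=2)
gy, gx = np.gradient(gray)
edge_strength = float(np.hypot(gx, gy).mean())

print("mean edge strength:", edge_strength)
```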

IBM is already a leader in visual analytics, and in preliminary tests the new skin cancer project showed 97% sensitivity (cancerous lesions correctly flagged) and 95% specificity (benign lesions correctly ruled out). That translates to cognitive computing being accurate.

Could the cognitive computing be applied to identifying other cancer types?

Whitney Grace, January 19, 2015
Sponsored by, developer of Augmentext

Searchblox Announces New Visualization Method for Search Results

December 29, 2014

The brief article on Searchblox titled “A Visualization Is Worth a Thousand Search Results” relates the addition of visualization to the Elasticsearch-based system, Searchblox. Searchblox is an open source enterprise content search engine founded in 2003. Its customers span more than 25 countries and include Harley Davidson, Capital One Investments, Kellogg, and the US Department of Justice, to name just a few. The article discusses the latest advancement of visualization with a note on how to use the new plugin and how it works. The article states,

“Create a visualization from your search results using our new AngularJS database plugin and discover unique insights from your data. The AngularJS plugin integrates the raw/d3js javascript library to create visualizations on the fly for your analysis, content marketing and infographic needs. After you set up your collections, simply install the plugin and configure the required filters and database columns to display.

Once the data grid is configured you can see the search results in a grid format.”

The article stipulates that the plugin is best suited for data from CSV files and databases. The ability to see results as a graphic rather than a list is certainly promising, especially for people who are visual learners. There are several nifty chart options available, and for all of them the user can specify which fields of the data to display.
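The general idea behind the plugin is to run a faceted query and hand the resulting buckets to a charting library. Below is a generic sketch of that pattern against a plain Elasticsearch endpoint in Python; it is my illustration, not the Searchblox plugin’s own API, and the index name and field are placeholders.

```python
# Generic faceted-query-to-chart-data sketch against Elasticsearch. The
# index name and field are placeholders; this is not the Searchblox plugin.
import requests

query = {
    "size": 0,
    "aggs": {"by_category": {"terms": {"field": "category.keyword", "size": 10}}},
}

response = requests.post(
    "http://localhost:9200/my-collection/_search",  # placeholder index
    json=query,
    timeout=10,
)
response.raise_for_status()

buckets = response.json()["aggregations"]["by_category"]["buckets"]
chart_data = [(b["key"], b["doc_count"]) for b in buckets]
print(chart_data)  # hand this to d3.js, matplotlib, or the grid of your choice
```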

Chelsea Kerwin, December 29, 2014

Sponsored by, developer of Augmentext

Centrifuge Analytics v3 Promises Better Understanding of Big Data Through Visualization

December 23, 2014

The article on WMC Action News 5 titled “Centrifuge Analytics v3 is Now Available - Large Scale Data Discovery Never Looked Better” promotes the availability of Centrifuge Analytics v3, a product that enables users to see the results of their data analysis like never before. This intuitive, efficient tool helps users dig deeper into the meaning of their data. Centrifuge Systems has gained a reputation in data discovery software, particularly in the fields of cyber security, counter-terrorism, homeland defense, and financial crimes analysis among others. Chief Executive Officer Simita Bose is quoted in the article,

“Centrifuge exists to help customers with critical missions, from detecting cyber threats to uncovering healthcare fraud…Centrifuge Analytics v3 is an incredibly innovative product that represents a breakthrough for big data discovery.” “Big data is here to stay and is quickly becoming the raw material of business,” says Stan Dushko, Chief Product Officer at Centrifuge Systems. “Centrifuge Analytics v3 allows users to answer the root cause and effect questions to help them take the right actions.”

The article also lists several of the perks of Centrifuge Analytics v3, including that it is easy to deploy in multiple settings from a laptop to the cloud. It also offers powerful visuals in a fully integrated background that is easy for users to explore, and even add to if source data is complete. This may be an answer for companies who have all the big data they need, but don’t know what it means.

Chelsea Kerwin, December 23, 2014

Sponsored by, developer of Augmentext
