CyberOSINT

Looking Towards 2015’s Data Trends

March 5, 2015

Here we go again! Another brand new year, and it is time to predict where data will take us. For the past few years it has been all about big data, and while big data remains a solid base, other parts of data science are coming into the limelight. While LinkedIn is a social network for professionals, one can also read articles there on career advice, hot topics, and emerging trends. Kurt Cagle is a data science expert who has written on the topic for over ten years. His recent article, “Ten Trends In Data Science In 2015,” was posted on LinkedIn in December.

He groups data science into four areas he calls the Data Cycle: analysis, awareness, governance, and acquisition. From Cagle’s perspective, 2014 saw big data mature, data visualization software reach high demand, and semantics grow. He predicts 2015 will hold much of the same:

“…with the focus shifting more to the analytics and semantic side, and Hadoop (and Map/Reduce without Hadoop) becoming more mainstream. These trends benefit companies looking for a more comprehensive view of their information environment (both within and outside the company), and represent opportunities in the consulting space for talented analysts, programmers and architects.”

Data visualization is going to get even bigger in the coming year. Hybrid data stores with more capabilities will become more common, semantics will grow even larger (with specialist companies being bought up), and there will be more competition for Hadoop. Cagle also predicts that work will be done on a universal query language and that data analytics will move beyond standard SQL.

His closing observations explain that data silos will be phased into open data platforms, making technology easier not just for people to use but also for systems to interoperate with one another.

Whitney Grace, March 05, 2015
Sponsored by ArnoldIT.com, developer of Augmentext

Opening Watson to the Masses

March 4, 2015

IBM is struggling financially, and one of the ways it hopes to pull itself out of the swamp is to find new applications for its supercomputers and software. One way it is trying to cash in on Watson is by enabling cognitive computing apps. EWeek alerts open source developers, coders, and friendly hackers that IBM has released a batch of beta services: “13 IBM Services That Simplify The Building Of Cognitive Watson Apps.”

IBM now offers software geeks the chance to add their own input to cognitive computing. How?

“Since its creation in October 2013, the Watson Developer Cloud (WDC) has evolved into a community of over 5,000 partners who have unlocked the power of cognitive computing to build more than 6,000 apps to date. With a total of 13 beta services now available, the IBM Watson Group is quickly expanding its developer ecosystem with innovative and easy-to-use services to power entirely new classes of cognitive computing apps—apps that can learn from experience, understand natural language, identify hidden patterns and trends, and transform entire industries and professions.”

The thirteen new IBM services involve language, text processing, analytical tools, and data visualization. They can be applied to a wide range of industries and fields, improving the way people work and interact with their data. While it is easy to imagine the practical applications, it remains to be seen how they will actually be used.

Whitney Grace, March 04, 2015
Sponsored by ArnoldIT.com, developer of Augmentext

Short Honk: Impressive WiFi Signal Analysis

February 15, 2015

Navigate to this Imgur link. There are visualizations of a WiFi signal. Fascinating. I had no idea that the best signals were found on tops of wave “clouds.”


Excellent idea and work.

Stephen E Arnold, February 15, 2015

Four Visualization Tools to Choose From

February 12, 2015

MakeUseOf offers us a list of graphic-making options in its “4 Data Visualization Tools for Captivating Data Journalism.” Writer Brad Jones describes four options, ranging from the quick and easy to more complex solutions. The first entry, Tableau Public, may be the best place for new users to start. The write-up tells us:

“Data visualization can be a very complex process, and as such the programs and tools used to achieve good results can be similarly complex. Tableau Public, at first glance, is not — it’s a very accommodating, intuitive piece of software to start using. Simply import your data as a text file, an Excel spreadsheet or an Access database, and you’re up and running.

“You can create a chart simply by dragging and dropping various dimensions and measures into your workspace. Figuring out exactly how to produce the sort of visualizations you’re looking for might take some experimentation, but there’s no great challenge in creating simple charts and graphs.

“That said, if you’re looking to go further, Tableau Public can cater to you. It’ll take some time on your part to really understand the breadth of what’s on offer, but it’s a matter of learning a skill rather than the program itself being difficult to use.”

The next entry is Google Fusion Tables, which helpfully links to other Google services and automates much of the process. The strengths of Infoactive are its ability to combine datasets and a wealth of options for creating cohesive, longer content. Rounding out the list is R, which Jones warns is “obtuse and far from user friendly”; making the most of its capabilities requires a working knowledge of JavaScript and of R’s own language. However, he says there is simply nothing better for producing exactly what one needs.
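Jones’s point about scripted tools like R is that the chart is ultimately just code over aggregated data. Here is a minimal sketch of that workflow, in Python rather than R, with a made-up CSV (the column names and values are hypothetical, and a text-mode “chart” stands in for real graphics):

```python
import csv
import io
from collections import Counter

# Hypothetical dataset: one row per story, with a "region" column.
raw = io.StringIO(
    "region,count\n"
    "North,3\nSouth,5\nNorth,2\nEast,4\n"
)

# Aggregate values per region, as a charting tool does before drawing.
totals = Counter()
for row in csv.DictReader(raw):
    totals[row["region"]] += int(row["count"])

# A crude text bar chart; R or Tableau would render this graphically.
for region, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{region:<6} {'#' * total} ({total})")
```

The payoff of the scripted route is repeatability: rerun the same few lines on next week’s CSV and the chart updates itself.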

Cynthia Murrell, February 12, 2015

Sponsored by ArnoldIT.com, developer of Augmentext

WikiGalaxy: Interactive Visualization

February 5, 2015

Short honk: A visualization of some Wikipedia articles is available at this link.


The visualization includes a search box. It is helpful. I did not understand the dots of light that flew across the display. The display held my attention for a short period of time.

Stephen E Arnold, February 5, 2015

IBM on Skin Care

January 19, 2015

Watson has been going to town in different industries, putting to use its massive artificial brain. It has been working in the medical field interpreting electronic medical record data. According to Open Health News, IBM has used its technology in other medical ways: “IBM Research Scientists Investigate Use Of Cognitive Computing-Based Visual Analytics For Skin Cancer Image Analysis.”

IBM partnered with Memorial Sloan Kettering to use cognitive computing to analyze dermatological images and help doctors identify cancerous states. The goal is to help doctors detect cancer earlier. Skin cancer is the most common type of cancer in the United States, but diagnostic expertise varies. It takes experience to be able to detect cancer, and cognitive computing might take out some of the guesswork.

“Using cognitive visual capabilities being developed at IBM, computers can be trained to identify specific patterns in images by gaining experience and knowledge through analysis of large collections of educational research data, and performing finely detailed measurements that would otherwise be too large and laborious for a doctor to perform. Such examples of finely detailed measurements include the objective quantification of visual features, such as color distributions, texture patterns, shape, and edge information.”

IBM is already a leader in visual analytics, and the new skin cancer project achieved 97% sensitivity and 95% specificity in preliminary tests. In other words, the cognitive computing approach is accurate.
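Those two figures can be unpacked with a back-of-the-envelope example (the 1,000-lesion sample below is hypothetical; only the 97% and 95% rates come from the article):

```python
# Hypothetical screening of 1,000 lesions, 100 of them actually malignant,
# at the reported 97% sensitivity and 95% specificity.
actual_positive, actual_negative = 100, 900

true_positives = round(0.97 * actual_positive)      # cancers correctly flagged
true_negatives = round(0.95 * actual_negative)      # benign correctly cleared
false_negatives = actual_positive - true_positives  # cancers missed
false_positives = actual_negative - true_negatives  # benign flagged anyway

print(f"missed cancers: {false_negatives}")  # 3
print(f"false alarms: {false_positives}")    # 45
```

Sensitivity bounds how many cancers slip through; specificity bounds how many healthy patients get an unnecessary scare.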

Could cognitive computing be applied to identifying other cancer types?

Whitney Grace, January 19, 2015
Sponsored by ArnoldIT.com, developer of Augmentext

Searchblox Announces New Visualization Method for Search Results

December 29, 2014

The brief article on Searchblox titled “A Visualization Is Worth a Thousand Search Results” relates the addition of visualization to the Elasticsearch-based system Searchblox. Searchblox is an open source enterprise content search engine founded in 2003. Its customers span more than 25 countries and include Harley Davidson, Capital One Investments, Kellogg, and the US Department of Justice, to name just a few. The article discusses this latest advancement with a note on how to use the new plugin and how it works. The article states,

“Create a visualization from your search results using our new AngularJS database plugin and discover unique insights from your data. The AngularJS plugin integrated the raw/d3js javascript library to create visualizations on the fly for your analysis, content marketing and infographic needs. After you set up your collections, simply install the plugin and configure the required filters and database columns to display.

Once the data grid is configured you can see the search results in a grid format.”

The article stipulates that the plugin is best suited for data from CSV files and databases. The ability to see results as a graphic rather than a list is certainly promising, especially for visual learners. There are several nifty chart options available, each of which lets the user specify the fields for their data.

Chelsea Kerwin, December 29, 2014

Sponsored by ArnoldIT.com, developer of Augmentext

Centrifuge Analytics v3 Promises Better Understanding of Big Data Through Visualization

December 23, 2014

The article on WMC Action News 5 titled “Centrifuge Analytics v3 is Now Available – Large Scale Data Discovery Never Looked Better” promotes the availability of Centrifuge Analytics v3, a product that enables users to see the results of their data analysis like never before. This intuitive, efficient tool helps users dig deeper into the meaning of their data. Centrifuge Systems has gained a reputation in data discovery software, particularly in the fields of cyber security, counter-terrorism, homeland defense, and financial crimes analysis, among others. Chief Executive Officer Simita Bose is quoted in the article,

“Centrifuge exists to help customers with critical missions, from detecting cyber threats to uncovering healthcare fraud…Centrifuge Analytics v3 is an incredibly innovative product that represents a breakthrough for big data discovery.” “Big data is here to stay and is quickly becoming the raw material of business,” says Stan Dushko, Chief Product Officer at Centrifuge Systems. “Centrifuge Analytics v3 allows users to answer the root cause and effect questions to help them take the right actions.”

The article also lists several of the perks of Centrifuge Analytics v3, including that it is easy to deploy in settings ranging from a laptop to the cloud. It also offers powerful visuals in a fully integrated environment that is easy for users to explore, and even to add to if the source data is complete. This may be an answer for companies that have all the big data they need but don’t know what it means.

Chelsea Kerwin, December 23, 2014

Sponsored by ArnoldIT.com, developer of Augmentext

More Metadata: Not Needed Metadata

November 21, 2014

I find the metadata hoo hah fascinating. Indexing has been around a long time. If you want to dig into the complexities of metadata, you may find the table from InfoLibCorp.com helpful:

[table from InfoLibCorp.com]

Mid-tier consulting firms often do not use the products or systems their “experts” recommend. Consultants in indexing do create elaborate diagrams that make my eyes glaze over.

Some organizations generate metadata without considering what is required. As a result, outputs from these systems can present mind-bogglingly complex options to the user. A report displaying multiple layers of metadata can be difficult to understand.

My thought is that before giving the green light to promiscuous metadata generation, some analysis and planning may be useful. The time lost trying to figure out which metadata is relevant to a particular issue can be considerable.

But consultants and vendors are indeed impressed with flashy graphics. Too many times no one has a clue what the graphics are trying to communicate. The worst offenders are companies that sell visual sizzle to senior managers. The goal is a gasp from the audience when the Hollywood style visualizations are presented. Pass the popcorn. Skip the understanding.

Stephen E Arnold, November 21, 2014

Presto the Software Formerly Known as JackBe

October 30, 2014

The information page titled “What You Can Do With: Presto on Software AG Products” provides an overview of the data-combining software known as JackBe until its acquisition by Software AG. JackBe is now Presto! (Exclamation point optional.) Information flow since March 2014 has been modest. The page covers some of the software’s capabilities, such as in-memory analytics, visualization, and data mashing. It states,

“Presto combines data from any source for data visualizations. Accessing the original data—directly from data warehouses, news feeds, social media, existing BI systems, streaming big data, even Excel spreadsheets—lets business users respond to changing conditions as they happen. Presto’s “point-click-connect” assembly tool, Wires, makes it easy to bring together and manipulate data from multiple existing systems into meaningful data visualizations. Simple, powerful data mashing means IT and power users can create new apps and dashboards in hours—even minutes…”

Software AG began in 1969 in Germany and acquired JackBe in 2013. According to the Company History page, the deal was awarded the title of Strategic M&A Deal of the Year by the Association for Corporate Growth. Other acquisitions include the Apama Complex Event Processing Platform, alfabet AG, and Longjump.

Chelsea Kerwin, October 30, 2014

Sponsored by ArnoldIT.com, developer of Augmentext
