Norman Rockwell Museum Leaves Behind Twentieth Century

April 18, 2014

Norman Rockwell is best known for his artwork depicting mid-twentieth-century American life and for his Saturday Evening Post covers. Collector’s items carrying his artwork, from plates to wall calendars, are making their way to thrift stores as his fans age. Computerworld’s article, “How Big Data Helped The Norman Rockwell Museum Grow Revenue,” details how data offered insights into reaching a younger audience.

The Norman Rockwell Museum used DigiWorks’s services to capture transactional data and then create a shopping profile for individual users to recommend products.

“The principle is simple: If you offer someone something they need, right before they need it, they’re more likely to buy. To do that, you need to understand your customers as individuals. If a customer once bought something for an infant six months ago, you need to understand what that means.”
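The “right before they need it” principle can be pictured as a simple rule-based recommender. The sketch below is purely illustrative; the catalog, the follow-up delays, and the 60-day window are invented for the example and have nothing to do with DigiWorks’s actual system:

```python
from datetime import date, timedelta

# Hypothetical catalog mapping a purchased category to a likely follow-up
# purchase and the typical delay before the customer needs it.
FOLLOW_UPS = {
    "infant clothing": ("toddler clothing", timedelta(days=270)),
    "wall calendar": ("wall calendar", timedelta(days=365)),
}

def recommend(purchases, today):
    """Given (category, purchase_date) pairs, suggest items a customer
    is likely to need soon -- the 'right before they need it' idea."""
    suggestions = []
    for category, bought_on in purchases:
        follow_up = FOLLOW_UPS.get(category)
        if follow_up is None:
            continue
        item, delay = follow_up
        due = bought_on + delay
        # Only recommend within a 60-day window before the expected need.
        if timedelta(0) <= due - today <= timedelta(days=60):
            suggestions.append(item)
    return suggestions

# An infant purchase roughly eight months ago triggers a toddler suggestion.
history = [("infant clothing", date(2013, 10, 1))]
print(recommend(history, date(2014, 6, 1)))  # ['toddler clothing']
```

That is the whole trick: interpret what an old purchase implies about the customer today.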

Big data has saved the Norman Rockwell Museum from obscurity and even grown its revenue. Could it do the same for the old masters?

Whitney Grace, April 18, 2014
Sponsored by, developer of Augmentext

New Data Integration Tool from BA Insight

April 11, 2014

A new data integration platform promises to simplify the process of deploying search-driven applications, save organizations time and money, and improve security. BA Insight posts, “BA Insight Announces Knowledge Integration Platform 2014 for Rapid Implementation of Search-Driven Applications.” No definition of “knowledge” is included, however.

The press release specifies:

“The BAI Knowledge Integration Platform turns enterprise search engines into knowledge engines by transforming the way information is found to get the right information to the right people at the right time. It has the flexibility to function as a comprehensive solution or be implemented in a phased approach to meet growing organizational needs. The platform consists of three robust engines:

*User Experience Engine – drives remarkable user experiences for finding and exploring knowledge or experts via an extensible engine and a library of powerful components

*Content Intelligence Engine – increases findability using automated classification, metadata generation, and text analytics

*Content Connectivity Engine – provides secure connectivity to a wide variety of content systems, enabling unified views of all knowledge assets”
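As a rough miniature of what the Content Intelligence Engine’s “automated classification, metadata generation, and text analytics” might involve, consider tagging a document with categories whose keyword lists it matches. The categories and keywords below are invented for illustration and are not BA Insight’s:

```python
import re

# Toy category taxonomy for automated classification (illustrative only).
CATEGORIES = {
    "energy": {"drilling", "exploration", "pipeline"},
    "finance": {"revenue", "earnings", "forecast"},
}

def generate_metadata(text):
    """Return simple machine-generated metadata for a document."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    tags = sorted(cat for cat, keywords in CATEGORIES.items() if words & keywords)
    return {"tags": tags, "unique_words": len(words)}

doc = "Quarterly earnings rose as exploration and drilling expanded."
print(generate_metadata(doc)["tags"])  # ['energy', 'finance']
```

Real platforms use far richer text analytics, but the output is the same in kind: metadata that improves findability.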

The press release notes that several prominent global companies are using this platform, including the Apache Corporation. (No, that has nothing to do with open source software; it is a huge energy-exploration enterprise.) The write-up also emphasizes that the platform builds on an organization’s existing infrastructure to present users with an integrated view of their data.

BA Insight aims to make enterprise search more comprehensive and easier to use. Founded in 2004, the company is headquartered in Boston with offices in Chicago, Washington, DC, and Sacramento, California.

Cynthia Murrell, April 11, 2014

Sponsored by, developer of Augmentext

Microsoft Office Graph for Oslo Limits Bothersome Data

April 11, 2014

Just what we need—another way to shield folks from information they’d rather not see. Microsoft helps move us in that direction, this time within the enterprise. We learn about the hidden data-narrowing technology in “Social Enterprise, Machine Learning Meet in Microsoft’s Office Graph, Oslo” at eWeek. Oslo is a mobile app created to give users “an at-a-glance view of collaborative Office documents and activities.” The role of Office Graph is to narrow the data stream. Writer Pedro Hernandez tells us:

“Office Graph, while tucked ‘under the hood and never exposed to the user,’ helps users avoid information overload and focus on the task at hand by delivering ‘really personalized and relevant views of their world,’ according to Julia White, general manager of Microsoft Office. This ‘intelligence layer,’ which integrates with SharePoint, Exchange, Lync, Yammer and Office, is the basis of the company’s upcoming Oslo app. Oslo is a mobile-optimized app that ‘cuts through the noise by showing you what you need to know today, and even what’s likely to be important in the near future,’ stated Ashok Kuppusamy, a Microsoft FAST group program manager, in a blog post.”

The app should be available to users of Office 365 within the year. Some of Oslo’s features do sound helpful. For example, since many of us are better at remembering people’s names than project titles or keywords, users can search by colleague name. One can also see what content has been shared, liked, viewed, or modified. But I wonder—do people really need algorithms deciding what to include in “relevant views of their world”?
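One way to picture the kind of filtering an “intelligence layer” might do is a signal-weighted ranking over colleague activity. Microsoft has not disclosed Office Graph’s internals; the weights and event format below are invented for illustration:

```python
# Hypothetical weights for colleague actions on shared documents.
WEIGHTS = {"modified": 4, "shared": 3, "liked": 2, "viewed": 1}

def rank_documents(events):
    """events: (document, action) pairs from colleagues.
    Return documents ordered by total weighted activity."""
    scores = {}
    for doc, action in events:
        scores[doc] = scores.get(doc, 0) + WEIGHTS.get(action, 0)
    return sorted(scores, key=scores.get, reverse=True)

activity = [
    ("roadmap.docx", "modified"),
    ("roadmap.docx", "viewed"),
    ("budget.xlsx", "liked"),
]
print(rank_documents(activity))  # ['roadmap.docx', 'budget.xlsx']
```

Even this toy version shows the trade-off: whatever weights the algorithm picks decide what counts as “relevant” in your world.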

Cynthia Murrell, April 11, 2014

Sponsored by, developer of Augmentext

OpenCalais Has Big Profile Users

April 2, 2014

OpenCalais is an open source project that creates rich semantic data by applying natural language processing and other analytical methods through a Web service interface. That is a simple explanation for a powerful piece of software. OpenCalais was originally part of ClearForest, but Thomson Reuters acquired the project in 2007. Instead of marketing OpenCalais as proprietary software, Reuters allowed it to remain open. OpenCalais has since become valued open source metadata software, used everywhere from blogs to specialized museum collections.
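In practice, a caller sends text to the web service and parses the structured entities it returns. The JSON field names below are hypothetical stand-ins for the sake of a runnable example, not OpenCalais’s actual response schema:

```python
import json

# A simplified, made-up response of the kind a semantic-tagging
# web service might return for a piece of text.
sample_response = json.dumps({
    "entities": [
        {"type": "Company", "name": "Thomson Reuters"},
        {"type": "City", "name": "New York"},
    ]
})

def extract_entities(raw_json, entity_type):
    """Pull out the names of all entities of a given type."""
    payload = json.loads(raw_json)
    return [e["name"] for e in payload["entities"] if e["type"] == entity_type]

print(extract_entities(sample_response, "Company"))  # ['Thomson Reuters']
```

The entities become the metadata that powers tagging, linking, and navigation on the sites that use the service.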

There are many notables who use OpenCalais, and a sample can be found in “The List Of OpenCalais Implementations Grows.”

OpenCalais is excited about the new additions to the list:

“Add 10 to the list of innovative sites and services that use OpenCalais to reduce costs, deliver compelling content experiences and mine the social web for insight. See our press release for more details on each. We are thrilled to recognize the following new sites and services that are changing the way we engage with news and the social Web. They join a growing number of others in media, publishing, blogging, and news aggregation who use OpenCalais.”

Among them are The New Republic, Al Jazeera’s English blogging news networks, Slate Magazine’s blogging network, and I*heart* Sea. Not only do news Web sites use OpenCalais, but news aggregation apps do as well, including Feedly, DocumentCloud, and OpenPublish. Expect the list to grow even longer and consider OpenCalais for your own metadata solution.

Whitney Grace, April 02, 2014
Sponsored by, developer of Augmentext

Digging for Data Gold

April 1, 2014

Tech Radar has an article that suggests an idea we have never heard before: “How Text Mining Can Help Your Business Dig Gold.” Be mindful that was a sarcastic comment. It is already common knowledge that text mining is an advantageous tool for learning about customers, products, new innovations, market trends, and other patterns. One of big data’s main aims is capturing that information from an organization’s data. The article explains how much text data is created in a single minute, with some interesting facts (2.46 million Facebook posts, wow!).

It suggests understanding the type of knowledge you wish to capture and finding software with a user-friendly dashboard. It ends on this note:

“In summary, you need to listen to what the world is trying to tell you, and the premier technology for doing so is “text mining.” But, you can lean on others to help you use this daunting technology to extract the right conversations and meanings for you.”

The entire article is an overview of what text mining can do and how it is beneficial. It does not go beyond basic explanations of how to mine the gold in the data; that will require further reading. We suggest a follow-up article that explains how text mining can also lead to fool’s gold.
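For a tiny taste of the “listening” the article describes, here is a minimal text-mining sketch: count the most frequent meaningful terms in a batch of customer comments to spot an emerging topic. The stopword list is deliberately minimal and the feedback is made up:

```python
from collections import Counter
import re

# A tiny stopword list; real text mining uses much larger ones.
STOPWORDS = {"the", "a", "is", "and", "to", "it", "was", "my"}

def top_terms(comments, n=3):
    """Return the n most common non-stopword terms across all comments."""
    words = []
    for comment in comments:
        words += [w for w in re.findall(r"[a-z]+", comment.lower())
                  if w not in STOPWORDS]
    return [term for term, _ in Counter(words).most_common(n)]

feedback = [
    "The battery life is short",
    "Battery drains fast and the screen is dim",
    "Short battery life ruined it for me",
]
print(top_terms(feedback))  # 'battery' ranks first
```

Three complaints, one clear signal. Scale that to millions of posts per minute and the gold (and the fool’s gold) appears.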

Whitney Grace, April 01, 2014
Sponsored by, developer of Augmentext

The Enigma App

April 1, 2014

Information can be an enigma, which is probably why the developers named their new app just that. Visiting the Enigma Web site opens on a picture of either New York or London with the headline “navigate the world of public data.” It is an intriguing idea, one you might think could be accomplished with a search engine or an academic database. Then again, when you consider how time-consuming that process is, it would be handy to have a search engine that did most of the work for you.

Enigma was built as a solution to this problem. The company says:

“Enigma is amassing the largest collection of public data produced by governments, universities, companies, and organizations. Concentrating all of this data provides new insights into economies, companies, places and individuals.”

Enigma’s services do come with a fee, however. Public data search and quick analytics are free with sign-up, but API access and online support require plans that start at $195/month. The data search must be gold when you consider that many of these records are publicly available. It is worth exploring to see how the service differs from a basic search engine, though signing up is a chore; the registration page is finicky.

Whitney Grace, April 01, 2014
Sponsored by, developer of Augmentext

Darpa Prods Big Data Experts

March 29, 2014

I read “Darpa Calls for Advanced Big Data Ideas.” If the write-up is accurate, Darpa is not on board with the marketing innovations around Big Data, whatever the term means. Darpa wants more. According to the TechRadar story:

According to V3, DARPA director Arati Prabhakar told a briefing on emerging threats with the House Armed Services Committee’s Subcommittee on Intelligence that it is looking to come up with some advanced big data ideas. She said that DARPA is creating a new set of cyber security capabilities that will ensure that networked information is trustworthy.

Addressing “big data” may be easier if those talking about it would define the term and the context in which the phrase is being used. Those who chant “Big Data,” including Darpa, are just empowering the sales people, the self-appointed experts, and the failed middle school teachers who write “reports” for mid-tier consulting firms.

Stephen E Arnold, March 29, 2014

Tibco Connects with Popular Big Data Repositories

March 29, 2014

Tibco continues to grow. The Wall Street Journal’s Market Watch reveals, “TIBCO Expands Connectivity to Key Big Data Sources.” Now, users of the company’s Spotfire data analysis platform can connect directly to big data storehouses at Cloudera, Hortonworks, and Pivotal. The press release quotes VP of Spotfire product strategy, Lars Bauerle:

“Our ability to connect directly to these data sources, conduct in-database analysis, and mash-up the data in the worlds of Hadoop and others puts Spotfire in prime position for enterprises looking to get the most out of their data assets. Spotfire now further embraces data access in all forms, including Big Data architecture, enabling our customers to derive significantly greater value from their existing data.”

Cloudera development VP Tim Stevens added:

“Cloudera and TIBCO Big Data technologies complement one another by adding significant value to our joint customers’ IT environments. Until now, analytics and Hadoop have separately been two of the most significant enterprise technologies of the last few years. As these technologies come together in Spotfire, we see an opportunity for organizations to reap great business value as they build out their enterprise data hubs.”

Launched in 1997 and offering a range of infrastructure and business intelligence software, Tibco is based in Palo Alto, California. The company is so sure of the competitive edge granted by its BI software that it has trademarked the phrase “two-second advantage.”

Cynthia Murrell, March 29, 2014

Sponsored by, developer of Augmentext

Open Data Collection with Ushahidi

March 28, 2014

The crowdsourced data collection platform Ushahidi, now assisting activists worldwide, was first created to facilitate public accountability and social activism during crises in its home nation, Kenya. Not surprisingly, Ushahidi is also the name of the non-profit behind the open-source project. An interview with the organization’s director of data projects, Chris Albon, discusses the platform. The article prefaces the dialogue with a brief explanation:

“In a nutshell, it allows citizens to make reports in a collaborative way, creating crowdsourced interactive maps. With a very intelligent approach, Ushahidi gives citizens the possibility to use the web, their smartphones and even SMS to gather data, which makes this technology accessible almost everywhere and for everyone. Originally created in Kenya to serve as an instrument for social activism and public accountability in crisis situations, the software has proven to be a great companion worldwide in bringing advocacy campaigns to a successful end. The team behind Ushahidi has not only created a world-changing technology but also they share it with others since it is released as Open Source.”
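The report-gathering described above can be pictured as a simple aggregation of citizen reports by location, a toy stand-in for the crowdsourced mapping the Ushahidi platform does at scale (the report format and sample data are invented):

```python
from collections import defaultdict

def aggregate_reports(reports):
    """Group incoming citizen reports (from web, app, or SMS alike)
    by location, ready to plot on an interactive map."""
    by_location = defaultdict(list)
    for report in reports:
        by_location[report["location"]].append(report["text"])
    return dict(by_location)

incoming = [
    {"location": "Nairobi", "text": "Polling station opened late"},
    {"location": "Mombasa", "text": "Long queues"},
    {"location": "Nairobi", "text": "Ballots ran out"},
]
clusters = aggregate_reports(incoming)
print(len(clusters["Nairobi"]))  # 2
```

The value is in the aggregate: individually minor reports become a visible pattern once they are placed on a map together.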

Albon tells us that the core Ushahidi platform is now being used in 159 countries and has been translated into 35 languages; its users range from small election-monitoring non-profits to global organizations tracking disaster-relief efforts. Journalists also make use of the platform. Albon notes that his group helped build iHub in Nairobi, an “innovation hub” and community workspace designed to facilitate collaboration and community growth. See the article for more on this and Ushahidi’s other projects: Crowdmap, Swiftriver, Ping, and BRCK. The interview wraps up with something to look forward to: the next generation of the Ushahidi core platform, v3, is on its way.

Cynthia Murrell, March 28, 2014

Sponsored by, developer of Augmentext

Splunk and Tableau Developed Connector to Analyze Machine-Generated Data

March 27, 2014

The article on TechWorld titled “Tableau Folds Splunk Data Into Business Analysis” shares information on the new connector, developed in partnership by Tableau and Splunk, that enables analysis of machine-generated data. The collaboration allows for a better understanding of product analytics and customer experience, since Splunk’s software collects data on what customers do when they visit a website. The article explains:

“The new driver for Tableau expands the scope of how Splunk data can be used by the enterprise. It imports data captured by Splunk into Tableau’s data processing and visualization environment. As a result, business analysts can merge the event data generated by servers with other sources of data, which would potentially provide new insights into customer behavior or corporate operations…The connector is a ODBC (Open Database Connectivity) driver that is included in the Tableau 8.1.4 maintenance release.”

Splunk’s software was initially used mostly for finding issues in a system, but the addition of analysis tools broadened its abilities. Now, instead of just noting trouble spots on a website, the software is used to discover patterns in customer behavior. The article uses the example of users filling shopping carts on a website but not making purchases; managers use Splunk’s software to pinpoint the issue causing that lack of follow-through. Whether the partnership of Tableau and Splunk will pay off remains to be seen.
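The cart-abandonment scenario boils down to joining machine-generated event data with business data, which is the connector’s whole payoff. This stand-in uses an in-memory SQLite database rather than the actual ODBC driver, with invented tables, to show the kind of join an analyst would run in Tableau:

```python
import sqlite3

# In-memory database standing in for Splunk events joined to business data.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE events (customer_id INTEGER, action TEXT);
    CREATE TABLE customers (id INTEGER, region TEXT);
    INSERT INTO events VALUES (1, 'cart_abandoned'), (2, 'purchase');
    INSERT INTO customers VALUES (1, 'EMEA'), (2, 'APAC');
""")

# Merge server-generated events with customer records: where are
# shoppers abandoning their carts?
rows = con.execute("""
    SELECT c.region, COUNT(*)
    FROM events e
    JOIN customers c ON c.id = e.customer_id
    WHERE e.action = 'cart_abandoned'
    GROUP BY c.region
""").fetchall()
print(rows)  # [('EMEA', 1)]
```

Once event data sits alongside business data, questions like this are one query (or one drag-and-drop in a visualization tool) away.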

Chelsea Kerwin, March 27, 2014

Sponsored by, developer of Augmentext
