OpenText Opens Advanced Content Analytics Market

February 14, 2011

Following in the footsteps of other vendors, OpenText has entered the advanced content analytics market.

“OpenText Licensing Agreement Brings Advanced Content Analytics to Market” reveals a tie-up between OpenText and the National Research Council of Canada. The idea is that new content analytics innovations will be added to the ECM Suite and made available by spring 2011, improving the suite’s data mining and analysis. The key point is:

“Content analytics is the key to extracting business value from social media and text-rich online and enterprise information sources, an essential technology for marketing, online commerce, customer service, and improved search and Web experience. Given the mind-boggling growth in information volumes, no wonder uptake is booming, powered by rapid technical advances from leading-edge vendors such as OpenText.”

Content Analytics will perform data mining that uncovers and displays relationships between businesses and other facts, finding information that a normal search engine would miss. This agreement is the first step in OpenText’s plan to apply Content Analytics across all of its enterprise content management suite products.

Whitney Grace, February 14, 2011

Freebie

Textalyser Highlighted on Podcast

February 9, 2011

Text analysis was mentioned on the podcast No Agenda, which is hosted by Adam Curry (professional broadcast journalist) and John C. Dvorak (technology and business columnist). The No Agenda team runs certain text through Textalyser and uses the output to identify “memes”; that is, words or phrases designed to be magnetic and persist in a conversation.

You can give Textalyser, the tool No Agenda mentioned, a try by navigating to http://textalyser.net/. There are two Web-accessible modes. First, you can paste a chunk of text into the Analysis Box on the Web page, and the system will generate a report. Shown below is a portion of the Textalyser report for one of my 2010 for-fee columns.

[Image: excerpt of a Textalyser report]

The report includes word frequency counts, a word length summary, and two-, three-, and four-word phrase frequency lists.
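
Textalyser’s implementation is not public, but a report of this shape is easy to approximate with Python’s standard library. Here is a minimal sketch, assuming a simple lowercase tokenizer; the function name and tokenization rules are my own, not the tool’s:

```python
import re
from collections import Counter

def analyze(text, max_phrase_len=4, top_n=10):
    """Rough Textalyser-style report: word frequencies, average word
    length, and two- to four-word phrase frequencies."""
    words = re.findall(r"[a-z']+", text.lower())
    report = {
        "word count": len(words),
        "average word length": sum(map(len, words)) / max(len(words), 1),
        "word frequency": Counter(words).most_common(top_n),
    }
    # Phrase (n-gram) frequencies for n = 2, 3, 4.
    for n in range(2, max_phrase_len + 1):
        grams = [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]
        report[f"{n}-word phrases"] = Counter(grams).most_common(top_n)
    return report

if __name__ == "__main__":
    sample = ("content analytics is the key to extracting business value "
              "from content analytics and more content analytics")
    for section, value in analyze(sample).items():
        print(section, value)
```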

The service carries this identification notice: V 1.05 help Traduction Nieruchomości Magazine interactif Umarex Airsoft + Paintball. For more information about the service, you can navigate to this link and leave a message.

Stephen E Arnold, February 9, 2011

Freebie

Synthesys Platform Beta Available

February 7, 2011

Digital Reasoning alerted us last week that a new beta program for the Synthesys Platform is available. The company bills itself as one of the leaders in complex, large-scale unstructured data analytics. We have interviewed the founder of Digital Reasoning in our Search Wizards Speak series on ArnoldIT.com; those interviews are available here and here. Digital Reasoning is one of the leaders in making next-generation analytics available via cloud, on-premises, and hybrid methods.

[Image © Digital Reasoning, 2011]

This platform version of Digital Reasoning’s software will give beta users immediate API-level access to the firm’s analytics, as well as to tools that will be added over the course of the beta program.

Matthew Russell, vice president of engineering at Digital Reasoning, said:

We are excited to introduce Synthesys Platform to the market. By allowing users to upload their data into the cloud for analysis, many more users will get the opportunity to experience next generation data analytics while exploring their own data.

Digital Reasoning Systems (www.digitalreasoning.com) solves the problem of information overload by providing the tools people need to understand relationships between entities in vast amounts of unstructured and structured data.

Digital Reasoning builds data analytic solutions based on a distinctive mathematical approach to understanding natural language. The value of Digital Reasoning is not only the ability to leverage an organization’s existing knowledge base, but also to reveal critical hidden information and relationships that may not have been apparent during manual or other automated analytic efforts. Synthesys is a registered trademark of Digital Reasoning Systems, Inc.

Digital Reasoning will be exhibiting at the upcoming Strata Conference on February 28 and March 1, 2011. For more information about Digital Reasoning, navigate to the company’s Web site at www.digitalreasoning.com.

Stephen E Arnold, February 7, 2011

Juru, Watson, I Say, Juru!

February 1, 2011

Quite a heated discussion at lunch today. One of the goslings was raving about Watson. The Jeopardy demo convinced the engineer that IBM had the next big thing in search. A person can ask a question and right away get the answer. Wow. I thought that type of computer system only worked under carefully controlled conditions, in demos, or in motion pictures.

That’s why the goslings were agitated when I said, “It is TV. TV does almost anything—well, anything—for money.” I pointed out that the game shows 21 and the $64,000 Question took some liberties to boost ratings. Have TV times changed that much? I said, “I don’t think so.”

I supported my argument by mentioning Juru. Do you remember that gem from IBM? Here’s what my Overflight system spit out.

Juru is, or was, a full text search “library” that would make short work of “small and mid-sized corpuses.” Of course, “small” and “mid-sized” are rarely defined, either by IBM or by other search researchers. The idea was that Java made it easy to run Juru on any platform. Of course, today I don’t think Juru would work in the Android or iOS environment, but some day, maybe.

IBM asserted that the Juru system would:

  • Support different document types
  • Make use of links, just like our ever-tweakable PageRank-type systems
  • Generate on-the-fly summaries of documents
  • Cluster results
  • Use nifty ways to keep the indexes small and, therefore, zippy.
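
Juru’s internals were never published in detail, so treat the following as a generic illustration rather than IBM’s design: a minimal inverted index in Python, the data structure at the heart of any compact full text search library. The class and method names are hypothetical:

```python
from collections import defaultdict

class TinyIndex:
    """A toy inverted index: one posting list (set of document ids)
    per term. Keeping postings small is what keeps lookups zippy."""

    def __init__(self):
        self.postings = defaultdict(set)  # term -> ids of docs containing it
        self.docs = {}                    # doc id -> original text

    def add(self, doc_id, text):
        self.docs[doc_id] = text
        for term in text.lower().split():
            self.postings[term].add(doc_id)

    def search(self, query):
        """AND semantics: intersect the posting lists of all query terms."""
        terms = query.lower().split()
        if not terms:
            return set()
        result = self.postings[terms[0]].copy()
        for term in terms[1:]:
            result &= self.postings[term]
        return result

index = TinyIndex()
index.add(1, "Juru is a full text search library written in Java")
index.add(2, "Lucene is an open source search library")
print(index.search("search library"))  # {1, 2}
```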

You can get some info at this link. There is some additional color here.

I reminded the goslings that IBM rolls out search solutions as part of its global marketing efforts. More to the point, I asked the goslings which vendors’ search systems IBM resells. I did not hear the magic words Autonomy or Endeca. IBM once loved Fast ESP.

If you want search from IBM, what do you get today? A version of the open source search solution Lucene. Why? It works pretty well. Juru, Watson, Web Fountain, et al.? Well, make up your own mind with some head-to-head testing. I won the argument and still had to pay for lunch. Honk.

Stephen E Arnold, February 1, 2011

Freebie

SharePoint Sharing from Attivio

January 19, 2011

Of interest to businesses overwhelmed with voluminous SharePoint content: “Attivio Announces AIE for SharePoint Integration” tells of the Active Intelligence Engine’s new ability to aggregate information across not only SharePoint but also websites, databases, email, CRM, and other information sources.

According to the announcement,

“The difficulty of discovering and delivering timely insight derived from all of these resources exposes gaps in an organization’s ability to integrate and rapidly update information; providing a single method for users to find the information they need, regardless of its origins. AIE for SharePoint Integration enables secure access to all types of information by unifying diverse datasets, while avoiding the cost and delays of cumbersome legacy integration stages.”

With AIE, companies no longer have to worry about SharePoint silos accessible only within individual departments and can instead maximize insight and collaboration across the entire enterprise. Attivio promises retention of data relationships from text sources, rapid implementation, and tight security, all aimed at maximizing competitive advantage. We believe the company has a good approach to a very tough SharePoint challenge.

Alice Wasielewski, January 19, 2011

Google and Local Search Commitment

January 17, 2011

Google reloads and takes aim, this time at Facebook. “Google’s Mobile Matchmaker” reports on an interview with Marissa Mayer, Google’s executive in charge of “local” products. For Google, “local” includes maps, mobile, and even social activities. “Contextual discovery,” giving automatic location-based information, or “search without search” as she calls it, is the basis of Mayer’s bid to knock Facebook off its pedestal. Google is working on taking location information and adding social contextual information, such as showing a person in a restaurant the menu with annotations from friends or regular customers of the venue.

When asked if Google might work with Facebook on some of these social applications, Mayer demurred, citing Facebook’s closed nature versus Google’s support of the open Web. Instead, Mayer pointed to Google’s social-esque alternatives such as Google Latitude, an application that follows a person’s physical location on a map. The social implications seem obvious: “Once you tell Google who your friends are on Latitude, that same information might eventually be used for other services like socially marked-up menus, if you permitted it. The point is that Google may have more ways to acquire social information than just by building its own competing social network.”

My view is that Facebook may have reached its peak and is ripe for a serious alternative. The idea of friends LoJacking one another on Latitude doesn’t appeal to me, but then I’m not a FourSquare fan either. Facebook, watch out: Google’s war is underway.

Alice Wasielewski, January 17, 2011

Freebie

2007 Semantic Search Info Still Relevant

January 13, 2011

Short honk. We had a long call today (January 12, 2011) about semantic search. In the course of the call, I mentioned a 2007 presentation by Jon Atle Gulla, a professor at the Norwegian University of Science and Technology. I did some poking around and found the link to the presentation. Quite useful in 2007 and still germane today. The presentation puts into context some of the work that must be done to deploy an effective semantic technology system in an organization. The slide deck is on Slideshare at this link. Registration may be required to access the file.

Stephen E Arnold, January 13, 2011

Freebie

Attensity’s New Year’s Resolution

January 11, 2011

Attensity, now a multi-faceted technology management firm, has set a new course for itself this year in “Making It Work in 2011!” In the past, the company’s focus seemed to be increasingly on government contracts, as illustrated by the formation of the subsidiary Attensity Government Systems. Well, oh how “the times they are a-changing.” In a blog post on the company’s website in late December, buried beneath references to both classical music and reality television, the new direction is laid out.

Currently, a massive amount of data is generated by the surging wave of social networking sites and the new breed of citizen journalists. Per Attensity: “These days, competitors often have access to the same source material of customer conversations from Twitter, Facebook, blogs, forums, and review sites. However, where the battle is truly won or lost is in how companies are able to harness and arrange that material, embellishing it with insights from their own internal survey and call center data, and transforming it into a symphony of action.” So Attensity’s new focus for the coming year is to improve its current menu, giving companies the option to act on multi-channel conversations.

It appears that, like many companies, Attensity sees an opportunity in repackaging its services for broader consumption in an effort to cash in on the public’s embrace of these fresh and exciting technologies. The same blog post gives a quick nod to the outgoing year’s poor economy, though one is still left speculating whether the main motive for the shift from government affiliations to private consumers is to have a bountiful 2011. No problem with that.

Sarah Rogers, January 11, 2011

Freebie

Lexalytics and DataSift

December 31, 2010

If sentiment analysis is the key ingredient in the social content technology cocktail, then Lexalytics aims to be the brand of choice for businesses and individual consumers everywhere. MediaSift Ltd., the British company behind the DataSift social media filtering engine, is eager to see a partnership with the Lexalytics text analysis software take root.

We learned in “DataSift Taps Lexalytics to Help ‘Tune Your Data’” that one focus of the alliance is the ever-increasing accumulation of data generated by tweeting. “Lexalytics provides the ability to automatically extract companies, people or product names, without having a list of them ahead of time; the ability to calculate tweet, entity, and ‘linked-content’ sentiment; output lists of positive/negative entities; and more.”
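
Salience, the Lexalytics engine, is proprietary; as a rough illustration of the simplest form of the technique, lexicon-based sentiment scoring of a tweet can be sketched in a few lines of Python. The word lists and function below are hypothetical stand-ins, not Lexalytics code:

```python
# Toy positive/negative word lists; a real engine uses far richer models.
POSITIVE = {"great", "love", "accurate", "fast", "flexible"}
NEGATIVE = {"slow", "hate", "broken", "inaccurate", "poor"}

def tweet_sentiment(tweet):
    """Score a tweet by counting lexicon hits; the sign gives the polarity."""
    words = tweet.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(tweet_sentiment("I love how fast and accurate this engine is"))  # positive
print(tweet_sentiment("the old tool was slow and often broken"))       # negative
```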

Nick Halstead is the founder and CEO of Favorit Ltd., owner of Tweetmeme, a service designed to total all links and ascertain which are the most popular. “An important part of the metrics we provide through DataSift is the sentiment, or tonality, of the data. We needed an engine that could integrate quickly into our environment and start immediately providing accurate sentiment analysis across all our data services,” says Halstead. “Lexalytics Salience gives us a great combination of flexible integration, high performance, and accurate sentiment analysis.”

Another goal of the union is to give users the tools to observe and respond in real time. This is accomplished through the interpretation of massive amounts of data from a variety of online sources. The Lexalytics software can process any English text and is compatible with multiple systems. Looks like another player in social content technology is being added to the shaker.

Sarah Rogers, December 31, 2010

Freebie

Scientists Map the News

December 22, 2010

People have been trying to figure out what makes a good news story, and a team of European computer scientists might have found the answer. Red Orbit points us to the article “Scientists Map What Factors Influence the News Agenda.” The team discovered that if you study a wide range of media outlets for a long period of time, patterns start to emerge. Most of the content that makes it onto the news reflects national biases and the cultural, geographical, and economic ties between countries.

“The analysis the researchers have conducted could not have been done in the past, due to the sheer scale of the data, but is now possible using automated methods from artificial intelligence because of recent advances in machine translation and text analysis.”

An analysis of the news agenda could prove to have many applications, especially in understanding how people view and use information. It also has the potential to allow scientists to study how media affect the entire globe and to discover discernible patterns they normally would never have found.

Whitney Grace, December 22, 2010

Freebie
