Vivisimo Does for Content

March 21, 2012

Vivisimo, the information optimization software provider, recently rolled out an automated blog called the Vivisimo Daily. We find it interesting that the blog’s appearance coincides with the company’s step up in customer support marketing.

This microsite displays content in a variety of media formats, ranging from posts to videos. There is even a CXO mobile contest.

In reference to CXO, the editor’s note states:

Customer eXperience Optimization (CXO) connects customer-facing professionals in your sales, support and customer service organizations with all of the information they need for successful customer, partner and sales prospect interactions.

The service is operated by a third-party platform, a quick and easy way for organizations to publish their own online newspapers. This is an example of a company using such a service to assist with a content play. More original content might be a plus.

Jasmine Ashton, March 21, 2012

Sponsored by

Discovery Engines and Information Overload Management

February 7, 2012

Are discovery engines the cure for information overload? The Darwin Awareness Engine Blog lists “How to Manage Information Overload: 6 Ways Discovery Engines Help.” First, a distinction: a discovery engine goes further than a search engine, offering tools to refine a search, consolidate data, and apply context to the results.

Discovery engines, states writer Romain Goday, help users navigate overwhelming data because they: focus on topics, not people; go straight to the source of information; supply information through a single channel; let users discover what they didn’t know that they didn’t know; often go through a curation process; and reduce anxiety by combining and ranking sources. See the write up for details on each point.
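The "combining and ranking sources" point can be sketched in a few lines of Python. This is purely illustrative, not how any of the engines mentioned actually works; the source names and topics below are invented:

```python
from collections import Counter

# Hypothetical inputs: topics surfaced by three independent sources.
sources = {
    "blog_feed":    ["big data", "cloud", "search"],
    "news_wire":    ["big data", "search"],
    "curated_list": ["big data", "semantics"],
}

def rank_topics(sources):
    """Rank topics by how many independent sources surface them,
    one simple way a discovery engine might consolidate results."""
    counts = Counter(t for topics in sources.values() for t in topics)
    return [topic for topic, _ in counts.most_common()]

print(rank_topics(sources))  # 'big data' ranks first: three sources agree
```

A topic corroborated by several sources outranks one that appears only once, which is the "reduce anxiety by combining and ranking sources" idea in miniature.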

The article asserts:

Managing information overload requires tools that deliver ‘awareness’ of topics and filter out irrelevant information; such tools will become indispensable. The challenge is to do so without losing the ability to make unexpected discoveries. Content discovery engines are addressing this need with a multitude of approaches. The market remains very fragmented but we can expect important players to emerge in the next few years.

I’m sure we can. Our concern is that information may be lost through the auto-selection process. Is it wise to rely on an AI for such an important task? Do we have a choice at this point, or has the big data monster grown too big for the human touch?

Cynthia Murrell, February 7, 2012


FirstRain Gets Some Azure Chip Love

December 18, 2011

According to the October 25 news release, “FirstRain Recognized as ‘Innovative Business Analytics Company under $100M to Watch in 2011’ by Leading Market Research Firm,” the analyst firm IDC has included FirstRain in its 2011 list of “Innovative Business Analytics Companies Under $100M to Watch.”

FirstRain is an analytics software company that uses its Business Monitoring Engine to provide professionals with access to the business Web. The company’s semantic categorization technology instantly cuts through the clutter of consumer Web content, delivering only highly relevant intelligence.

The company was highlighted in the “Cloud-based Analytics” category for its innovative use of semantic analysis to extract and deliver high-value information from the Web.

IDC observed:

The value in using FirstRain is the breadth of its coverage, combined with its depth of selection and filtering, so that it delivers the information that users need to see without cluttering their desktops or their minds with too much that is extraneous. It was easy to integrate into existing information delivery channels because of the high relevance of the information that it delivered.

The fact that IDC even has a list of top business analytics companies shows how important business analytics software is becoming in the business world. Who knew that business intelligence would be the new black?

Jasmine Ashton, December 18, 2011


Digimind 9 Now Available

December 17, 2011

In the current economic climate, businesses are under more pressure than ever to consolidate their resources and invest in products that will minimize cost and maximize efficiency. When choosing information management solutions, it is especially important that companies keep their specific needs in mind.

According to a recent PRWeb news release, “Digimind launches Digimind 9 – Next generation Competitive Intelligence for Smarter Decision Making,” competitive intelligence software provider Digimind has released Digimind 9, an updated release designed to accompany organizations throughout their intelligence workflows.

The article states:

Digimind 9 comes in response to a growing demand from companies willing to complement their CI apparatus with such features included as “advanced semantic analysis”, “social media monitoring”, and “intelligence profile management”. Indeed, beyond the conventional intelligence workflows, more intelligence requirements surface nowadays to leverage on social networks, unstructured data, and related analysis.

As data is created faster than ever, Digimind 9 meets the ever growing need for companies to improve their capacity to react to and anticipate rapid changes.

Jasmine Ashton, December 17, 2011


Protected: Trade Tips and Prices at the SharePoint StackExchange

December 16, 2011

This content is password protected.

Protected: Grasp Your SharePoint Workflows and Evaluate Productivity

December 14, 2011

This content is password protected.

Protected: The Contested Fact: Sharepoint Has Social Media Potential

December 13, 2011

This content is password protected.

DataExplorers and Why Financial Information Vendors Fear a Storm

December 4, 2011

I am still amused that my team predicted the management shift at Thomson Reuters weeks before the news broke. Alas, that 250 page analysis of the Thomson Reuters’ $13 billion a year operation is not public. Shame. However, one can get a sense of the weakening timbers in the publishing and information frigate in the Telegraph’s story “DataExplorers Looks for £300m Buyer.”

DataExplorers is a specialist research company. The firm gathers information about the alleged lending of thousands of institutional funds. I am not familiar with the names of these exotic financial beasties. The aggregated data are subjected to the normal razzle dazzle of the aggregation for big money crowd. The data are collected, normalized, and analyzed. The idea is that an MBA looking to snag an island can use the information to make a better deal. Not surprisingly, the market for these types of information is small; only a fraction of those in the financial services industry focus on this sector.

DataExplorers’ revenues reflect this concentration. According to the write up, the company generated less than £15 million in annual revenues in 2010 with a profit of about £3 million. The margin illustrates what can be accomplished with a niche market, tight cost controls, and managers from outfits like Thomson Reuters. That troubled outfit contributed the management team at DataExplorers.

Now here’s the hook:

The company is for sale for £300 million, according to the Telegraph, which is a “real” journalistic outfit. That works out to a number that makes sense in the wild and crazy world of financial information; that is, 100 times earnings or 20 times revenue. The flaw, which I probably should not peg to just Thomson Reuters, has these facets:

  1. The global financial “challenge” means that there may be some pruning of information services in the financial world. Stated another way, MBAs will be fired and their employers may buy fewer expensive services such as DataExplorers.
  2. If the financial crisis widens, the appeal of “short” information may lose a bit of its shine. Once a market tanks, what’s the incentive for those brutalized by the sector’s collapse to stick around?
  3. Thomson Reuters is pretty good at cost cutting. Innovating is not part of the usual package. This means that DataExplorers may be at the peak of its form and seaworthy for a one day cruise in good weather. Once a deal goes down, the new owners may have a tough time growing the business because marketing and research will require infusions of capital to keep the vessel from listing.
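The multiples quoted above are quick arithmetic on the figures from the write up (£300 million asking price against roughly £15 million in revenue and £3 million in profit):

```python
# Figures from the Telegraph write up, in pounds sterling.
asking_price = 300_000_000
revenue = 15_000_000   # "less than £15 million" in 2010
profit = 3_000_000     # "about £3 million"

earnings_multiple = asking_price / profit   # times earnings
revenue_multiple = asking_price / revenue   # times revenue

print(earnings_multiple, revenue_multiple)  # 100.0 20.0
```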

Net net: DataExplorers is an example of an information property which may be tough to get back into growth mode. The buyer will be confident that it knows how to squeeze more performance from a niche information product. And that assumption is what contributes to the woes of Thomson Reuters, Reed Elsevier, and many other high end professional content producers. Optimism is a great quality. Realism is too.

Stephen E Arnold, December 4, 2011


X1 and Newsgator Venture Into RSS

October 11, 2011

In a new angle for search vendors, X1, a productivity enhancement and information management software tools company, has partnered with NewsGator, a developer of content aggregation solutions, to bring search to aggregated RSS (really simple syndication) news and information. “X1 and NewsGator Partner to Provide Instant Search Capabilities of Aggregated RSS News and Information” tells more.

The fast-as-you-can-type search capabilities of X1™ Search, which lets users find the content of email, files, attachments, and contacts, have been coupled with NewsGator’s ability to deliver news and information directly into Microsoft Outlook to give customers a simple, integrated solution for obtaining and finding information.
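The basic idea of keyword search over aggregated RSS items can be sketched with the Python standard library. This is a toy illustration, not X1’s or NewsGator’s technology; the feed content below is invented:

```python
import xml.etree.ElementTree as ET

# A tiny inline RSS 2.0 feed standing in for aggregated news items.
RSS = """<rss version="2.0"><channel>
  <item><title>Search vendors eye RSS</title>
        <description>X1 partners with NewsGator.</description></item>
  <item><title>Cloud analytics roundup</title>
        <description>IDC lists firms to watch.</description></item>
</channel></rss>"""

def search_feed(xml_text, query):
    """Return titles of items whose title or description match the query."""
    root = ET.fromstring(xml_text)
    q = query.lower()
    hits = []
    for item in root.iter("item"):
        title = item.findtext("title", "")
        desc = item.findtext("description", "")
        if q in title.lower() or q in desc.lower():
            hits.append(title)
    return hits

print(search_feed(RSS, "newsgator"))  # ['Search vendors eye RSS']
```

A production system would, of course, index the items rather than scan them on every query, but the data shape is the same.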

Our experience is that news archives are not particularly deep, so regardless of search engine, much time-sensitive content disappears or gets a “buy this story” link banner. The concept is interesting and might be useful for very short-term access. However, other solutions must be used to ensure long-term archiving.

Emily Rae Aldridge, October 11, 2011

Google Abandons Another No Brainer Database

June 9, 2011

In “Google Kills Google News Archive,” Techspot reports the end of the Internet giant’s newspaper archiving project. We learned:

“Newspapers that have their own digital archives can still add material to Google’s news archive via sitemaps, but the search giant will no longer spend its own money toward the cause.” Users can continue to search digitized newspapers in the archive, but the company isn’t going “to introduce any further features or functionality to the Google News Archive.”

Seems like Google now understands what commercial database publishers have known for some time: searchable newspaper databases are commodity products with thin profit margins.

It’s no surprise that the company has retreated from the market. Google’s threat to commercial online services, seemingly so real several years ago, has yet to materialize.

What does Google’s pull out mean for ProQuest and similar outfits? First, Google is going after bigger fish. Second, consolidation may be the path to stabilizing revenues from what is a shrinking library market.

There are other options, but the goose is not honking.

Stephen E Arnold, June 9, 2011

Sponsored by, the resource for enterprise search information and current news about data fusion
