Content Tagging Costs

December 27, 2010

We read an interesting blog post called “The Search for Machine-Aided Indexing: Why a Rule-Based System is the Cost-Effective Choice.” Information about the costs of indexing content using different methods is often difficult to locate.

The article provides some useful information; however, I always try to verify dollar estimates. Vendors often prepare custom price quotations, which makes it difficult to compare products and services directly.

Here’s the passage that caught my attention:

The database company manager could not give an exact figure for what their final actual costs were for purchasing Nstein; however, she did state that it was “not cheap.” She admitted that it was more expensive than all of the other MAI software products that they considered. (A press release from Nstein reported that the deal was worth approximately $CAN 450,000). When asked about staffing requirements, the manager estimated that it took the time of five full-time indexers and two indexing managers about a “month or so” at first. She added that there is a need for “constant” (she then rephrased that to “annual”) training. The investment company manager preferred not to discuss the actual implementation costs of Nstein, as there was a good deal of negotiation with non-cash assets involved. (A press release from Nstein of March 14th, 2002 reported that the deal was a five-year deal valued at over $CAN 650,000).

I downloaded this write-up and tucked it into my Search 2011 pricing file. One never knows when these types of estimates will come in handy. I noticed in a LinkedIn thread relating to enterprise search that a person posted prices for the Google Search Appliance. I did a bit of clicking around and tracked down the original source of the data: SearchBlox Software. The data on the chart actually reported prices for the Google Mini. When one consults the US government’s price list for Google appliances that can handle 20 million documents, a count encountered in some search applications, the cost estimates were off by quite a bit. Think in terms of $250,000, not $3,000.
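The gap can be sanity-checked with simple per-document arithmetic. A quick sketch, using the rough dollar figures cited in this post (not official vendor pricing):

```python
# Rough cost comparison; the figures below are the rough estimates
# cited in this post, not official vendor price quotes.
google_mini_price = 3_000     # price reported on the SearchBlox chart
gsa_price = 250_000           # rough estimate for a 20-million-document appliance
documents = 20_000_000

cost_per_doc = gsa_price / documents
price_gap = gsa_price / google_mini_price

print(f"Appliance cost per document: ${cost_per_doc:.4f}")
print(f"Price gap versus the Mini figure: {price_gap:.0f}x")
```

Even at pennies per document, the two figures differ by roughly two orders of magnitude, which is why confusing the Mini with the full appliance distorts any budget estimate.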

I use whatever pricing data is available via open source research, and I know that hard data are often difficult to locate. The “appliance” approach is one way to control some costs. The “appliance” is designed to limit, like an iPad, what the user can do. Custom installations, by definition, are more expensive. When rules have to be created for any content processing system, the costs can become interesting.

Stephen E Arnold, December 27, 2010

Freebie, although Access Innovations bought me a keema nan several weeks ago.

Lucid Announces Enterprise Search

December 16, 2010

Short honk: I learned via Marketwire that Lucid Imagination has announced the general availability of its enterprise search system, LucidWorks Enterprise. According to the report, the new search solution scales, features a cost-effective architecture, and delivers “enriched document handling”. The software is available without cost. Lucid Imagination offers a complete range of for-fee consulting and engineering services. For more information, navigate to www.lucidimagination.com.

Stephen E Arnold, December 16, 2010

Freebie

Thetus Embraces Knowledge Modeling

October 28, 2010

Thetus, a privately held corporation in Portland, provides semantic knowledge modeling and discovery solutions for extracting and managing information to support complex analysis and highly informed decision-making. The company’s Web site suggests that the firm’s solutions “create insight into the unique socio-cultural terrain that influences perceptions, decisions and behavior.”

We learned from DirectionsMag.com that Thetus has forged a new partnership with OpenSGI. This firm, Open Solutions Group Inc., is an information technology corporation that offers geospatial products and consulting services to a broad range of Federal, state, local, and commercial clients. The company’s Enterprise GeoCache Appliance is capable of delivering worldwide high resolution geospatial data at a throughput of 6,000 images per second. That’s snappy.

“Thetus Partners With OpenSGI to Deliver Integrated Solution” said:

The [two companies’] solution combines OpenSGI’s Enterprise GeoCache product with the Thetus Savanna Analysis Solution to deliver high speed access to critical geospatial information and background maps — enabling users to more efficiently search, relate, and visualize from a unified view of spatial data and non-spatial knowledge. Government agencies need to get critical intelligence into the hands of their analysts and operational planners as quickly as possible. Thetus and OpenSGI meet this need by integrating Savanna’s ability to support flexible and changing models with GeoCache’s ability to bring lightning fast maps to the analyst desktop as an out-of-the-box integration.

We find this an interesting service. Mapping and geospatial are hot segments. In the back of our mind are such firms as Google, which has the capability of disrupting markets without warning. For more information about Thetus, navigate to www.thetus.com.

Stephen E Arnold, October 28, 2010

Freebie

A Tagged Future

October 7, 2010

“The Myth of the Universal Tag and the Future of Digital Data Collection” provides an interesting view of the challenges of tag-based deployment. The point of the write up by Ensighten is to use a tag management system. Ensighten offers such a system. There are other vendors in this market as well; for example, Access Innovations.

The white paper explains how a tag management system can deliver technologically sound and financially sensible benefits for end users. This will be especially interesting to those who invest in digital measurement, advertising, and marketing solutions. Another sharp focus is clarifying the need for tag management systems and tag management solutions.

For your own copy of the white paper, click here for the Ensighten write-up. You will have to fill out a form for a free copy. It sounds interesting and may be worth your time.

Our view is that tags are now proliferating. In the traditional database world, too many tags can create problems. Users can get confused when a tag generates false drops. The management of tags becomes more complicated. Without control from the get-go, tags have a tendency to become muddled.

Search is tough and indiscriminate tagging makes the job harder for the user. Uncontrolled tags are often one consequence of carelessness. Whether a manual system or automated system is used to generate tags, getting the tagging method under control is Job #1. Getting the tags wrong means significant costs down the road, assuming the organization has the appetite to fix a problem on the right level in an appropriate manner. Cosmetics applied by azurini and former journalists won’t do the job in our experience.
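One way to get a tagging method under control from the get-go is to validate every candidate tag against a controlled vocabulary before it enters the index. A minimal sketch, assuming a simple in-memory approved list (the vocabulary and tag names below are illustrative, not taken from any product mentioned here):

```python
# Minimal controlled-vocabulary gate: tags not on the approved list
# are rejected instead of silently entering the index.
# The vocabulary and the sample tags are illustrative examples only.
APPROVED = {"enterprise search", "taxonomy", "indexing", "geospatial"}

def validate_tags(candidate_tags):
    """Split candidate tags into accepted and rejected sets.

    Tags are normalized (trimmed, lowercased) before checking, so
    'Indexing' and 'indexing' count as the same tag.
    """
    normalized = {t.strip().lower() for t in candidate_tags}
    accepted = normalized & APPROVED
    rejected = normalized - APPROVED
    return accepted, rejected

accepted, rejected = validate_tags(["Indexing", "serch", "Taxonomy"])
# The misspelling "serch" is rejected rather than becoming a new,
# muddled tag that would generate false drops later.
```

A real system would route rejected tags to a human reviewer or a suggestion queue rather than discarding them, but the principle is the same: the vocabulary, not the tagger, decides what is a valid tag.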

Stephen E Arnold, October 7, 2010

Freebie
