Enterprise Search and the Mythical Five Year Replacement Cycle

July 9, 2015

I have been around enterprise search for a number of years. In the research we did in 2002 and 2003 for the Enterprise Search Report, in my subsequent analyses of proprietary and open source enterprise search systems, and in the ad hoc work we have done related to enterprise search, we obviously missed something.

Ah, the addled goose and my hapless goslings. The degrees, the experience, the books, and the knowledge had a giant lacuna, a goose egg, a zero, a void. You get the idea.

We did not know that an enterprise licensing an open source or proprietary enterprise search system replaced that system every 60 months. We did document the following enterprise search behaviors:

  • Users express dissatisfaction with any installed enterprise search system. Regardless of vendor, anywhere from 50 to 75 percent of users find the system a source of dissatisfaction. That suggests that enterprise search is not pulling the hay wagon for quite a few users.
  • Organizations, particularly the Fortune 500 firms we polled in 2003, had more than five enterprise search systems installed and in use. Each system had its ardent supporters, so companies simply grandfathered the incumbent and shopped for another system in the hope of finding one that improved information access. Our conclusion: no one replaced anything.
  • Enterprise search systems did not change much from year to year. In fact, the fancy buzzwords used today to describe open source and proprietary systems have been in use since the early 1980s. Dig out some of Fulcrum’s marketing collateral or the explanation of ISYS Search Software from 1986 and look for words like clustering, automatic indexing, semantics, etc. A shortcut is to read some of the free profiles of enterprise search vendors on my Xenky.com Web site.

I learned about a white paper, which is 21st century jargon for a marketing essay, titled “Best Practices for Enterprise Search: Breaking the Five-Year Replacement Cycle.” The write up comes from a company called Knowledgent. The company describes itself this way on its Who We Are Web page:

Knowledgent [is] a precision-focused data and analytics firm with consistent, field-proven results across industries.

The essay begins with a reference to Lexis, which Don Wilson (may he rest in peace) and a couple of colleagues founded. The problem with the reference is that the Lexis search engine was not an enterprise search and retrieval system. The Lexis OBAR system (Ohio Bar Automated Research, built for the Ohio State Bar Association) was tailored to the needs of legal researchers, not general employees. Note that Lexis’ marketing in 1973 suggested that anyone could use the command line interface. The OBAR system required content in quite specific formats before it could be indexed. The mainframe roots of OBAR influenced the subsequent iterations of the LexisNexis text retrieval system: Think mainframes, folks. The point is that OBAR was not a system that was replaced in five years. The dog was in the kennel for many years. (For more about the history of Lexis search, see Bourne and Hahn, A History of Online Information Services, 1963-1976.) By 2010, LexisNexis had migrated to XML and moved from mainframes to lower cost architectures. But the OBAR system’s methods can still be seen in today’s system. Five years. What are the supporting data?

The white paper leaps from the five year “assertion” to an explanation of the “cycle.” In my experience, what organizations do is react to an information access problem and then begin a procurement cycle. Increasingly, as the research for our CyberOSINT study shows, savvy organizations are looking for systems that deliver more than keyword and taxonomy-centric access. Words just won’t work for many organizations today. More content is available in videos, images, and real-time, almost ephemeral “documents” which can be difficult to capture, parse, and make findable. Organizations need systems which provide usable information, not more work for already overextended employees.

The white paper addresses the subject of the value of search. In our research, search is a commodity. The high value information access systems go “beyond search.” One can get okay search in an open source solution or whatever is baked into a must-have enterprise application. Search vendors have a problem because, after decades of selling search as a high value system, the licensees know that search is a cost sinkhole and not what is needed to deal with real world information challenges.

What “wisdom” does the white paper impart about the “value” of search? Here’s a representative passage:

There are also important qualitative measures you can use to determine the value and ROI of search in your organization. Surveys can quickly help identify fundamental gaps in content or capability. (Be sure to collect enterprise demographics, too. It is important to understand the needs of specific teams.) An even better approach is to ask users to rate the results produced by the search engine. Simply capturing a basic “thumbs up” or “thumbs down” rating can quickly identify weak spots. Ultimately, some combination of qualitative and quantitative methods will yield an estimate of search, and the value it has to the company.
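
For readers who want to picture what the white paper’s “thumbs up” or “thumbs down” tally might look like in practice, here is a minimal sketch. The queries, ratings, and the 50 percent threshold are hypothetical placeholders, not data from the white paper or from our research.

    # Minimal sketch of a thumbs-up / thumbs-down tally per query.
    # All data, field names, and the threshold below are hypothetical.
    from collections import defaultdict

    # Each record: (query, thumbs_up boolean) captured from the search UI.
    ratings = [
        ("quarterly sales report", True),
        ("quarterly sales report", False),
        ("vendor master agreement", False),
        ("vendor master agreement", False),
        ("expense policy", True),
    ]

    tallies = defaultdict(lambda: {"up": 0, "down": 0})
    for query, thumbs_up in ratings:
        tallies[query]["up" if thumbs_up else "down"] += 1

    # Flag "weak spots": queries where fewer than half the ratings are positive.
    for query, t in sorted(tallies.items()):
        total = t["up"] + t["down"]
        score = t["up"] / total
        flag = "WEAK" if score < 0.5 else "ok"
        print(f"{query}: {t['up']}/{total} positive ({flag})")

A tally like this identifies queries users dislike; whether it justifies the cost of the system is another matter, as the next paragraph explains.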

I have zero clue how this set of comments can be used to justify the direct and indirect costs of implementing a keyword enterprise search system. The advice is essentially irrelevant to the acquisition of a more advanced system from a leading edge, next generation information access vendor like BAE Systems (NetReveal), IBM (not the Watson stuff, however), or Palantir. The fact underscored by our research over the last decade is tough to dispute: Connecting an enterprise search system to demonstrable value is a darned difficult thing to accomplish.

It is far easier to focus on a niche like legal search and eDiscovery or the retrieval of scientific and research data for the firm’s engineering units than to boil the ocean. The idea of “boil the ocean” is that a vendor presents a text-centric system (essentially a one-trick pony) as an animal with the best of stallions, dogs, tigers, and grubs. The spam about enterprise search value is less satisfying than the steak of showing that an eDiscovery system helped the legal eagles win a case. That, gentle reader, is value. No court judgment. No fine. No PR hit. A grumpy marketer who cannot find a Web article is not value, no matter how one spins the story.

The white paper veers into build or buy comments. Who wants to build anything today? The Hacking Team exists because law enforcement and intelligence agencies want off-the-shelf applications. People use Microsoft Word because no one wants to cobble together a kitchen sink of text tools from Emacs.

The white paper includes a list of metrics. The list is useful as a starting point for assessing whether a search system works. The item omitted from the list is whether the search system stays within budget guidelines. When an accountant gets around to figuring out the fully loaded, direct and indirect costs of an enterprise search system, that’s when a hard look is taken at information access systems. The accountant probably never uses the system herself, so the numbers are all too frequently a really big surprise. Not good in my experience.
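
To make the point concrete, here is a back-of-the-envelope sketch of a fully loaded cost calculation. Every line item and dollar figure is a hypothetical placeholder; the only point is that the indirect costs, the ones the accountant eventually tallies, can rival or exceed the license fee.

    # Back-of-the-envelope, fully loaded cost sketch for an enterprise search
    # system. Every figure below is a hypothetical placeholder, not data from
    # the white paper or from our research.
    direct_costs = {
        "license_or_subscription": 250_000,   # annual fee
        "hardware_or_cloud": 120_000,
        "integration_consulting": 180_000,
    }
    indirect_costs = {
        "internal_staff_time": 200_000,       # admins, taxonomy work, tuning
        "content_preparation": 90_000,        # connectors, cleanup, security mapping
        "training_and_support": 40_000,
    }

    total = sum(direct_costs.values()) + sum(indirect_costs.values())
    indirect_share = sum(indirect_costs.values()) / total
    print(f"Fully loaded annual cost: ${total:,}")
    print(f"Indirect share: {indirect_share:.0%}")  # often the big surprise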

Martin White and I wrote Successful Enterprise Search Management five or six years ago. We burned through 150 pages with tiny type to capture our experience with managing the she-devil of enterprise applications. Yep, that’s enterprise search. A she-devil. Horrible. The white paper offers some advice which I don’t understand; for example, use the RACI matrix? The what? I circled “Process for service level agreement around adding new sources and front ends.” Another baffler.

How does the white paper end? A bang? A whimper? Neither. The white paper states:

The five-year replacement cycle for search can be frustrating and cost intensive for enterprises. From the implementation of search to the need for a replacement, the cycle undermines both the enterprise’s initial investment of time and resources as well as the buy-in leveraged to implement the solution originally. In Knowledgent’s experience, there are methods by which enterprises can limit the damage caused by this cycle. By quantifying the value of search, obtaining the necessary skills during the search evaluation, tracking metrics to guide priorities, following a process for managing search, and other best practices, enterprises can break out of the cycle and even avoid it entirely.

This means to me: Hire Knowledgent. Make that decision with careful consideration, gentle reader. My approach would be quite different. One of the systems on which we have worked is now entering its 10th year with only minor adjustments. I suppose that means I failed twice because the system, which processes hundreds of millions of documents and images, should have been replaced two times.

The white paper, like the five year assertion and much of the commentary about enterprise search, does not jibe with the real world I inhabit.

Stephen E Arnold, July 9, 2015
