SurfRay Surges

May 24, 2010

SurfRay pinged us on May 21, 2010. We took the opportunity to gather some information about this search and content processing company. We want to break our coverage of SurfRay into two parts. In this first part, we bring you up to date on the company’s product. In the second part, which will run in Beyond Search on May 31, 2010, we take a look at some of the details of the SurfRay products. Here’s the update, which as far as we know is an exclusive look at this company.

SurfRay (www.surfray.com) has released the feature-packed Ontolica 2010, which contains the new Ontolica Search Intelligence module and adds support for Ontolica Preview. This release provides extensive reporting and analytics on search performance and SharePoint content processing. You can get more information about Ontolica here. A free trial is available from this link.

The 2010 release of Ontolica Preview provides native support for about 500 document formats, ranging from Office formats to vector image formats, along with high-fidelity HTML preview. The product also supports in-document highlighting, allows users to browse to best-bet pages inside documents, and is optimized for performance over the Internet, with no client installs needed.

With Ontolica Express, a search extension to Microsoft Search Server and Search Server Express, SurfRay has transformed Microsoft’s free search engine into a richer, more robust solution. With important features such as wildcard and Boolean search as well as drill-down and faceted search, the company can provide effective solutions to customers.

[Image: Ontolica feature matrix]

The feature matrix shows how Ontolica adds important functionality to the SharePoint 2010 environment. Notice that the Fast Search solution lacks important out-of-the-box features such as portal usage reports and hot linked thumbnail previews.

Packaged enterprise search solutions most often equate to long and expensive customization and implementation projects for customers. SurfRay is out to change that. With a new managing director and several new releases of the company’s Ontolica and MondoSearch products, SurfRay has positioned itself for the impending release of SharePoint 2010. Soren Pallesen, the new CEO, believes the firm has a significant opportunity to grow.

Other search vendors add features that are hard to understand and don’t offer real value for customers. SurfRay is committed to delivering value to customers with solutions that are easy to use, work out of the box, and are based on industry-standard technologies.

SurfRay, a Microsoft Certified Partner, can deliver tightly packaged enterprise search solutions that are rich in functionality yet easy to test and install – Ontolica installs in literally five minutes. In so doing, SurfRay is responding to customers’ move toward more packaged search products and away from expensive consulting projects.

Founded in 2000, SurfRay is a global leader in search infrastructure software for enterprises, delivering highly packaged enterprise search solutions that are easy to try, buy, and install. SurfRay has more than 1,000 customers in over 30 countries and is dual-headquartered in Santa Clara, USA, and Copenhagen, Denmark. Its customer base includes some of the best-known brands and largest companies in the world, including AT&T Wireless, Bank of Thailand, Best Buy, BMW, Ernst & Young, Ferrari, H&R Block, Intel Solution Services, John Deere, and Nintendo, and the list goes on.

SurfRay is a trendsetter in packaged enterprise search solutions that take the complexity out of deploying business search. It achieves this by releasing new products and versions continuously and by focusing on geographic expansion. The company has established a dedicated physical presence in local markets – such as SurfRay UK and Ireland, SurfRay Benelux, and SurfRay Nordic – to further build its local customer support and international reseller network. All this seems to be working: SurfRay recently announced over 20 percent quarter-to-quarter revenue growth.

Pallesen believes, “Today most customers are very well educated on search technology and they don’t want to be convinced that they need some fancy new techno-feature. The next new thing that truly will transform the search market and deliver substantial value to customers will be enterprise class search solutions that install and are configured as easily as Microsoft Office.”

SurfRay has a deep heritage in innovation and advanced search technology. The company continues to leverage this heritage, putting valuable enhancements into packaged search solutions that make search functional as well as easy to install and use.

Stephen E Arnold and Melody Smith, May 24, 2010

Sponsored

ZyLAB, SharePoint, and XML Content Archiving

May 9, 2010

ZyLAB has been a frequent visitor to my newsreader in the last week or so. The company is hopping on the rich media bandwagon with podcasts. That’s okay, but I am not a rich media goose. The idea of a serial information intake session is not too appealing to this old waterfowl. I leave the videos to the much smarter, more agile wizards, the new masters of the financially-challenged universe.

What did catch my attention was a news item in German, “Microsoft SharePoint-Paket von ZyLAB unterstützt jetzt auch Wikis und Blogs” (ZyLAB’s Microsoft SharePoint package now also supports wikis and blogs). The idea is that ZyLAB’s technology and Microsoft SharePoint mesh together. Of particular interest to me is that the ZyLAB product now supports wikis and blogs. ZyLAB has nosed into the XML space as well with its storage service. With ZyLAB, an already happy SharePoint customer will be able to extend that goodness with:

  • Search of scanned documents in different languages
  • The benefits of XML storage of SharePoint content
  • Elimination of the need for additional SQL Server licenses

One interesting point is that in eDiscovery some SharePoint documents can go missing. The ZyLAB system can create a SharePoint archive with a comprehensive content set.

More information is available at www.zylab.com.

Stephen E Arnold, May 9, 2010

Unsponsored post.

Fast Search Server 2010 Vision

May 3, 2010

Another vision for Fast Search. You can read “Vision: Fast Search Server 2010 for SharePoint: Brief Discussion” and jump into the fray. The write up explains that Fast Search often has to be defended. And then the article turns to the azure chip outfit Gartner for support and succor. The passage that I tucked in my “future reference” folder was:

We guess Microsoft realized it didn’t have a real solution for the high-end enterprise search market and that that led them to the acquisition of the Norwegian company Fast, which specializes in delivering search services for the high-end market. According to Gartner, Fast is even more: it’s the best enterprise search system in the world. The acquisition was announced in January 2008 and ultimately, in SharePoint 2010, this has led to a fully SharePoint-integrated solution called Fast Search Server 2010 for SharePoint. Fast Search Server 2010 for SharePoint is an add-on, which means you can start using the normal SharePoint Enterprise Search features as long as you want and switch to Fast once you need high-end search features.

The article wraps up with a number of bullet points about the wonders of the Fast Search system.

Vision? Not exactly. If you are a SharePoint fan and need some cheerleading, the “Vision” write up may be your cup of tea. Not much complexity in the write up, but there may be some in the Fast Search system in my opinion.

Stephen E Arnold, May 3, 2010

Unsponsored post.

Search Progress or Stalled on the Information Superhighway?

April 24, 2010

Many professionals think search has improved, but we question whether that improvement has added features or complications. CMSWire recently reported in its article, “How Search Has Improved in SharePoint 2010,” that search has continued to evolve. The author highlighted the levels of involvement in various search features. However, if search has improved, don’t three different “flavors” of search introduce an element of confusion for the user? It is difficult to ascertain which to use under what circumstances and how much each “flavor” costs. The differentiation among the search systems can be confusing. The family-of-products model is interesting, but if we are using that metaphor, which product goes to the expensive university and which goes to trade school? The Beyond Search take: read the write-up and make up your own mind.

Melody K. Smith, April 24, 2010

Post was not sponsored.

SharePoint Taxonomy Fairy Dust

April 21, 2010

First, navigate to “SharePoint 2010: Using Taxonomy & Controlled Vocabulary for Content Enrichment”. Second, read the article. Now ask yourself these questions:

  1. Who sets up the SharePoint taxonomy magic?
  2. From where does the taxonomy come?
  3. Who maintains the taxonomy?
  4. How are inappropriate terms removed from the index and the correct terms applied?

Got your answers? Here are mine:

  1. A specialist in controlled term lists is needed to figure out the list, along with an industrial-strength system like the one available from Access Innovations. Once the system is up and running and the term list generated, you are ready to tackle SharePoint.
  2. The taxonomy comes from a method that involves figuring out the lingo of the organization, available term lists, and then knowledge value work. In short, a taxonomy has to be in touch with the organization and the domain of knowledge to which it is applied. Sound like work? It is, and most taxonomy problems originate with slapdash methods.
  3. The taxonomy must be maintained – note the imperative – by a combination of a human and software. New terms come and old terms go. The indexes and the tagged objects must be kept in sync. Humans with software tools perform this work. A taxonomy left to the devices of automated systems, left unchanged, or tweaked by azure chip experts is essentially useless after a period of time.
  4. Inappropriate terms are removed from the system via a human- and software-intermediated process. Once the term list is updated, the process of retagging and reindexing takes place. Muff this bunny and no one can find anything.
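The retag-and-reindex cycle described above can be sketched in a few lines. This is an illustrative toy, not any vendor’s product: the controlled term list, the replacement table, and the documents are invented for the example.

```python
# Hypothetical sketch of the human-plus-software retagging step.
# The vocabulary and documents below are made up for illustration.

controlled_terms = {"seminar", "webinar", "taxonomy"}
replacements = {"on-line seminar": "webinar"}  # deprecated term -> preferred term

documents = [
    {"id": 1, "tags": ["on-line seminar", "taxonomy"]},
    {"id": 2, "tags": ["seminar"]},
]

def retag(doc):
    """Swap deprecated terms for preferred ones; drop uncontrolled tags."""
    updated = []
    for tag in doc["tags"]:
        tag = replacements.get(tag, tag)
        if tag in controlled_terms:
            updated.append(tag)
    doc["tags"] = sorted(set(updated))
    return doc

for doc in documents:
    retag(doc)
# After this pass the index must be rebuilt so tags and queries stay in sync.
```

The point of the sketch is the order of operations: the term list changes first, the documents are retagged second, and only then is the index rebuilt. Skip any step and tags drift out of sync with the vocabulary.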

Now read the article again. Quite a bit is left out or simply not deemed relevant. My suggestion is to do some thinking about the nature of the user, the specific information retrieval needs, and the expertise required to do the job to avoid wasting time and money.

Like most tasks in search, it is more fun to simplify than to deal from the top of the deck. SharePoint is one of the more interesting systems with which to work. Once the shortcuts and half-baked approaches go south, you will be ready to do the job correctly. I wonder if the CFO knows what questions to ask to figure out why content processing costs have gone through the roof because of rework, fiddling, and bungee jumping without a cord.

Stephen E Arnold, April 21, 2010

Unsponsored post

The Biggest Names in Enterprise Search!

April 10, 2010

I received a link to a “National Press Release.” When I clicked the link, I saw this title: “BA-Insight’s SharePoint Search and FAST Search 2010 Webinar Series Features the Biggest Names in Enterprise Search.” I don’t have too much of a problem with hyperbole. I find it amusing that the “biggest names in enterprise search” did not include individuals from:

  • Autonomy and its chief wizard, Mike Lynch
  • Exalead and the prescient François Bourdoncle
  • Google and co-founders Sergey Brin and Larry Page
  • Lucid Imagination and Eric Gries and Marc Krellenstein

I could extend this list, but I am not sure I would feel comfortable using the bold phrase “biggest names in enterprise search” even if those in my bulleted list were on the program.

Enterprise search is a flawed phrase, but it is one that seems to resonate. The reality is that there are many different types of search, and I am not sure that two firms, despite their stellar reputations, can deliver across the spectrum of chemical structure search in enterprises engaged in drug research, search for specific legal information related to a matter, search for rich media in an enterprise engaged in broadcast television news, etc.

I think the headline would have made me more comfortable if it had said, “A Webinar Focused on Improving Information Access in SharePoint Using Technology Certified by Microsoft.” No superlatives are needed in my opinion. If the “biggest names” can’t make the basic product work, is there not a logical thread to tug?

Stephen E Arnold, April 11, 2010

A freebie.

Infrastructure Ripple from SharePoint

March 22, 2010

Navigate to Thor Projects and read the article “Infrastructure Ripple Effect – The Story of Servers, Racks and Power.” I have about 48 inches of screen real estate and I needed all of it to read the article. The layout is – in a word – interesting. The point of the write up, in my opinion, is summarized in this passage from the article:

I am reminded that any change creates a ton of little ripples.

When an information technology pro runs into problems with a single server, I wonder what the impact of more massive on-premises changes might be.

I thought about Mauro Cardarelli’s “Where Does SharePoint Still Fall Short?” when I thought about adding hardware. He wrote:

Let’s face it; the interface for security management is confusing and cumbersome… even for people who use it every day. What are the consequences? First, you increase the likelihood of security breaches (i.e. showing content to the wrong audience). Second, you increase the likelihood of giving users permissions greater than necessary. Finally, you increase the likelihood of having a security model that is highly diluted and overly complex. This is probably why the 3rd party market for SharePoint administration has been so strong… someone needs to pay attention to what these folks are doing! But I would argue that this is reactive (versus proactive) management… and things need to be taken one step further.

Hardware and security. Hmmm.

Stephen E Arnold, March 22, 2010

No one paid me to write this article. I will report this to the Salvation Army, an outfit that knows about work without pay. Perhaps the cloud access to SharePoint will obviate the problem?

Coveo and GEICO Host Webinar on March 23, 2010

March 21, 2010

Fierce Media has asked Beyond Search to facilitate a discussion about “how GEICO thinks about leveraging its data-rich enterprise systems to generate real-time business value and intelligence.” The participants are GEICO and Coveo as well as Stephen E Arnold.

Topics include how the Coveo system can:

  • Enable improved business intelligence and decision making through dynamic dashboards and information mashups that provide actionable business information
  • Access structured and unstructured data from across enterprise systems and repositories without complex integration or data migration, improving efficiency and cost effectiveness through a unified indexing layer
  • Lower the cost of legacy system integrations and upgrades, and reduce time-consuming data migration
  • Optimize social networks and incorporate the value of collaboration and just-in-time information exchange into the knowledge ecosystem

The audio program will be on Tuesday, March 23, 2010 beginning at 11:00am Eastern/8:00am Pacific. More information about Coveo may be found at http://www.coveo.com. You can register here.

Ben Kent, March 21, 2010, Beyond Search

This is a sponsored post.

WAND and Layer2 Team for SharePoint Taxonomy Functions

March 19, 2010

A happy quack to the reader who sent me a link to “Jump-Start Microsoft SharePoint 2010 Knowledge Management Using Pre-Defined Taxonomy Metadata”. The Microsoft Fast road show is wending its way among the Redmond faithful. In its wake, a number of companies see opportunity in the Microsoft demos. But with Microsoft making some tasty offers to entice those looking for search systems, Microsoft may be doing third-party add-on vendors and Fast ESP consultants a big favor.

The Earth Times’ article said:

In cooperation with WAND, Inc – one of the leading providers of enterprise taxonomies – Layer2 now offers pre-defined Taxonomy Metadata for Microsoft SharePoint Server 2010, a robust and expanding library of taxonomies covering a wide variety of domains to help jumpstart classification projects. Taxonomy Metadata for Microsoft SharePoint 2010 is currently available in 13 languages, e.g. English, French, German, Spanish, Italian, Portuguese, Japanese, Simplified Chinese, Traditional Chinese, Korean, and Vietnamese.

WAND has developed structured multi-lingual vocabularies with related tools and services to power precision search and classification applications. The company asserts that WAND makes search work better. WAND Taxonomies are used in online yellow pages and local search, ad-matching engines, business to business directories, product search, and within enterprise search engines. The firm’s library contains more than 40 domain specific taxonomies. WAND’s taxonomies are available in 13 languages.

Layer 2 GmbH is a specialist for creating custom components and solutions for Microsoft SharePoint Products and Technologies. Based in Germany, Layer2 offers products and solutions that add additional features to portals based on Microsoft SharePoint technology.

My view is that Microsoft may be creating opportunities at the same time it leaves some SharePoint customers wondering why their systems do not work as expected. If taxonomy management was a priority, Microsoft should have included a system to perform this type of work within the SharePoint package. Third party vendors now have an opportunity to sell a “solution,” but customers may have to go through a learning process and then spend additional money to get the functionality required to make SharePoint more useful.

Perhaps another mixed result from SharePoint? Just my opinion.

Stephen E Arnold, March 19, 2010

Freebie. No one paid me to point out that talking about “taxonomies” is much easier than implementing a high value taxonomy and then enforcing consistent tagging across the processed corpus. I know that the IRS is good at indexing by social security number, so I will report non payment to that agency.

Indexing Craziness

March 15, 2010

I read “Folksonomy and Taxonomy – do you have to choose?,” which takes the position that a SharePoint administrator can use a formal controlled term list or just let the users slap their own terms into an index field. The buzzword for allowing users to index documents is part of a larger twenty-something invention: folksonomy. The key segment for me in the SharePoint-centric Jopx blog was:

The way that SharePoint 2010 supports the notion of promoting free tags into a managed taxonomy demonstrates that a folksonomy can be used as a source to define a taxonomy as well.

Let me try to save you a lot of grief. Indexing must be normalized. The idea is to use certain terms to retrieve documents with reasonable reliability. Humans who are not trained indexers do a lousy job of applying terms. Even professional indexers working in production settings fall into some well-known ruts. For example, unless care is exercised in management and in making the term list available, humans will work from memory. The result is indexing that is wrong about 15 percent of the time. Machine indexing, when properly tuned, can hit that rate. The problem is that the person looking for information assumes that indexing is 100 percent accurate. It is not.

The idea behind controlled term lists is that these are logically consistent. When changes are made such as the addition of a term such as “webinar” as a related term to “seminar”, a method exists to keep the terms consistent and a system is in place to update the index terms for the corpus.
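Picking up the “webinar”/“seminar” example, here is a minimal sketch of how a related-terms table keeps retrieval consistent after such an addition. All names, documents, and tables below are hypothetical, invented only to illustrate the mechanism.

```python
# Toy illustration of a controlled term list with related terms.
# The related-terms table mirrors the "webinar"/"seminar" example above.

related_terms = {
    "seminar": {"webinar"},
    "webinar": {"seminar"},
}

def expand_query(term):
    """Return the query term plus any related terms from the controlled list."""
    return {term} | related_terms.get(term, set())

# A tiny index: document id -> controlled index terms applied to it.
index = {
    "doc-a": {"seminar"},
    "doc-b": {"webinar"},
    "doc-c": {"purchase order"},
}

def search(term):
    """Match documents whose terms intersect the expanded query."""
    terms = expand_query(term)
    return sorted(doc for doc, tags in index.items() if tags & terms)
```

With this in place, a search for “seminar” also retrieves the document tagged “webinar”, which is the consistency the controlled list is meant to deliver; without the maintained table, the two documents would be invisible to each other’s queries.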

When there is a mix of indexing methods, the likelihood of having a mess is pretty high. The way around this problem is to throw an array of “related” links in front of the user and invite the user to click around. This approach to discovery entertains the clueless but leads to the potential for rat holes and wasted time.

Most organizations don’t have the appetite to create a controlled term list and keep it current. The result is something I encounter frequently: a mix of these methods:

  1. A controlled term list from someplace (old Oracle or Convera term list, a version of the ABI/INFORM or some other commercial database controlled vocabulary, or something from a specialty vendor)
  2. User assigned terms; that is, uncontrolled terms. (This approach works when you have big data like Google but it is not so good when there are little data, which is how I would characterize most SharePoint installations.)
  3. Indexes based on parsing the content.

A user may enter a term such as “Smith purchase order” and get a bunch of extra work. Users are not too good at searching, and this patchwork of indexing terms ensures that some users will have to do the Easter egg drill; that is, look for the specific information needed. When it is located, some users like me make a note card and keep it handy. No more Easter egg hunts for that item for me.

What about third party SharePoint metadata generators? These generate metadata but they don’t solve the problem of normalizing index terms.

SharePoint’s touting of metadata as the solution to search woes is interesting. In my opinion, the approach implemented within SharePoint will make it more difficult for some users to find data, not easier. And, in my opinion, the resulting index term list will be a mess. When a search engine uses these flawed index terms, the search results force the user to look for information the old-fashioned way.

Stephen E Arnold, March 15, 2010

A free write up. No one paid me to write this article. I will report non payment to the SharePoint fans at the Department of Defense. Metadata works first time every time at the DoD I assume.
