Google Amazon Dust Bunnies

October 13, 2011

The addled goose has a bum eye, more air miles than a 30 something IBM sales engineer, and lousy Internet connectivity. T Mobile’s mobile WiFi sharing gizmo is a door stop. Imagine my surprise when I read “Google Engineer: Google+ Example of Our Complete Failure to Understand Platforms.” In one webby write up, the dust bunnies at Google and Amazon were moved from beneath the bed to the white nylon carpet of a private bed chamber.

I am not sure the information in the article is spot on. Who can be certain about the validity of any information any longer? The goose cannot. But the write up reveals that Amazon is an organization with political “infighting”. What’s new? Nothing. Google, on the other hand, evidences a bit of reflexivity. I will not drag the Motorola Mobility event into this brief write up, but students of business may find that acquisition worth researching.

Here is the snippet which caught my attention:

[A]  high-profile Google engineer … mistakenly posted a long rant about working at Amazon and Google’s own issues with creating platforms on Google+. Apparently, he only wanted to share it internally with everybody at Google, but mistaken shared it publicly. For the most part, [the] post focuses on the horrors of working at Amazon, a company that is notorious for its political infighting. The most interesting part to me, though, is … [the] blunt assessment of what he perceives to be Google’s inability to understand platforms and how this could endanger the company in the long run.

I want to step back. In fact, I want to go into MBA Mbit mode.

First, this apparent management behavior is the norm in many organizations, not just the companies referenced in the post. I worked for many years in the old world of big time consulting. Keep in mind that my experiences date from 1973, but management idiosyncrasies were the rule. The majority of these management gaffes took place in a slower, not digital world. Sure, speed was important. In the physics of information, speed is relative. Today the perceived velocity is great, and the diffusion of information adds a supercharger to routine missteps. Before getting too excited about the insights into one or two companies, remember that most organizations today are perilously close to dysfunction. Nothing special here, but today’s environment gives what is normal some added impact. Consolidation and an absence of competition make the stakes high. Bad decisions add a thrill to the mundane. Big decisions weigh more and can have momentum that does more damage more quickly than a bad decision at International Harvester or NBC in the 1970s.

Second, technology invites bad decisions. Today most technologies are “hidden”, not exposed like the guts of a Model T or my mom’s hot wire toaster which produced one type of bagel—burned. Not surprisingly, even technically sophisticated managers struggle to understand the implications of a particular technical decision. To make matters worse, senior managers have to deal with “soft” issues, and technical training, even if limited, provides few beacons for the course to chart. Need some evidence? Check out the Hewlett Packard activities over the last 18 months. I routinely hear such statements as “we cannot locate the invoice” and “tell us what to do.” Right. When small things go wrong, how can the big things go right? My view is that chance is a big factor today.

Third, the rush to make the world social, collaborative, and open means that leaks, flubs, sunshine, and every other type of exposure are part of the territory. I find it distressing that sophisticated organizations fall into big pot holes. As I write this, I am at an intelligence conference, and the rush to openness has an unexpected upside for some information professionals. With info flowing around without controls, the activities of authorities are influenced by the info bonanza. Good and bad guys have unwittingly created a situation that makes it less difficult to find the footprints of an activity. The post referenced in the source article is just one more example of what happens when information policies just don’t work. Forget trust. Even the technically adept cannot manage individual communications. Quite a lesson, I surmise.

In search and content processing, the management situation is dire. Many companies are uncertain about pricing, features, services, and innovation. Some search vendors describe themselves with nonsense and Latinate constructions. Others flip flop from search to customer support to business intelligence without asking themselves, “Does this stuff actually work?” Many firms throw adjectives in front of jargon and rely on snake charming sales people to close deals. Good management or bad management? Neither. We are in status quo management with dollops of guessing and wild bets.

My take on this dust bunny matter is that we have what may be an unmanageable and ungovernable situation. No SharePoint governance conference is going to put the cat back in the bag. No single email, blog post, or news article will make a difference. Barn burned. Horse gone. Wal-Mart is building on the site. The landscape has changed. Now let the “real” consultants explain the fix. Back to the goose pond for me. Collaborate on that.

Stephen E Arnold, October 13, 2011

Sponsored by Pandia.com


RIM Creates BlackBerry App to Address Needs of Changing Workforce

October 11, 2011

Wireless innovation company Research in Motion (RIM), the creator of the BlackBerry solution in 1999, has now revealed a BlackBerry Client for Microsoft SharePoint.

In addition to being budget friendly, the BlackBerry Client app offers benefits for individuals and businesses, including integration with core BlackBerry features and functions and content searches across multiple SharePoint sites.

The complete solution is also designed to be easy for IT departments running BlackBerry Enterprise Server to deploy and manage. Many Fortune 500 companies plan to use the product in the coming weeks.

According to “RIM Mobilizes Microsoft SharePoint on BlackBerry,” Victor Garcia, chief technology officer at HP Canada, said of the product:

In a world of continuous connectivity, businesses and governments require products, services and information faster and more reliably than ever before. The BlackBerry Client for Microsoft SharePoint helps to address the needs of today’s mobile workforce with new tools and mobile applications to enable collaboration and productivity.

Mobile computing remains a challenge for enterprises. Applications such as this one from RIM offer tantalizing promises. Is the hardware up to delivering the functionality promised by such apps? Searching SharePoint on the go is a necessary function, perhaps; whether a solution has actually been delivered remains to be seen.

Jasmine Ashton, Oct 11, 2011

Funnelback Tunes in Telstra

September 15, 2011

Telstra recently announced its new telecommunications site, www.nowwearetalking.com.au, will be hosted by Funnelback, as reported in the article, Funnelback Launches New Search for Telstra, on Blog Hosting Info. Telstra, Australia’s leading provider of telecommunications and information services, provides basic telephone services, mobile phone services as well as broadband and internet. The company prides itself on its vast geographical coverage of mobile and fixed network infrastructure.

The Funnelback technology, which is a commercial product for sure, will allow users of the new site to search within blog posts, comments, forum posts, and online discussions for information. Funnelback’s promise to clients is that their services “comprehensively tailor the search facility to deliver on your business objectives.”

Funnelback offers commercial search engine services that help companies manage their online activities. As their website explains,

Funnelback can search across a myriad of corporate content repositories including websites, intranets, shared drives, SharePoint, Email systems and databases. For additional flexibility, it can be deployed as a fully managed, SaaS solution, installed within your firewall or hosted in the cloud. No matter how large or small your organization, we can tailor a solution to suit your business needs and information architecture.

As more and more digital data is sent and received within companies, the need for content management grows. Companies such as Funnelback help maximize production and worker effectiveness by letting humans work on higher-level thinking projects while the computers sort through the myriad of data.

Catherine Lamsfuss, September 15, 2011

Sponsored by Pandia.com, publishers of The New Landscape of Enterprise Search

OpenText: Search to Teaching. Is the Deal about Selling Services?

September 8, 2011

Another data management, search, collaboration vendor does the “we are in a new business” quick step. Searching with the Stars could be a TV sensation because there are more twists, dips, and slides in the search and content ballroom than in an Arthur Murray Advanced Beginners’ class.

Navigate to “Open Text acquires Peterborough’s Operitel”. The news is that one Canadian firm snapped up another Canadian outfit. What makes this interesting is that I was able to see some weak-force synergy between Nstein (sort of indexing and sort of data management) and OpenText, owner of lots of search, content processing, and collaboration stuff plus an SGML database and the BASIS system. But the Operitel buy has me doing some speculative thinking.

Here’s the passage which caught my attention:

Operitel’s flagship LearnFlex product is built on Microsoft Corp.’s .NET platform and is a top tier e-learning reseller for the Windows maker. Open Text also has a long standing partnership with Redmond, Wash.-based Microsoft.

I see more Microsoft credibility and a different way to sell services. OpenText strikes me as a company with a loosely or mostly non integrated line up of products. The future looks to be charging into the SharePoint sector, riding a horse called “eLearning.”

In today’s business climate, organic growth seems to be tough to achieve even with RedDot and a fruit basket filled with other technologies. (What happened to OpenText’s collaboration product? What happened to the legal workflow business? I just don’t know.) So how does a company which some Canadians at Industry Canada see as one of the country’s most important software companies grow? Here’s the answer:

Open Text’s growth-by-acquisition strategy has recently won accolades among the analyst community. The company purchased Maryland-based Metastorm Inc. for US$182-million, Texas-based Global 360 Holding Corp. for US$260-million and U.K.-based WeComm Ltd. for an undisclosed amount all in the past six months.

My hunch is that OpenText may want to find a buyer. Acquisitions seem to be a heck of a lot easier to complete than landing a major new account. I am not the only person thinking that the business of OpenText is cashing out. Point your browser at “Amid Takeover Fever, Open Text Looks Like a Bargain.” Here’s a key point in my opinion:

Open Text shares have climbed about 20 per cent this year, an increase that would pale in comparison to what would happen if a potential buyer emerged offering a premium similar to what HP has given Autonomy.

So we see that a big payday for Autonomy has triggered a sympathetic response at the Globe & Mail, among “analysts”, and I am pretty sure among some OpenText stakeholders.

Several observations:

First, bankers think mostly about their commissions and fees. Bankers don’t think so much about other aspects of a deal. If there is a buck to be made from a company with a burlap sack of individual solutions and services, the bankers will go for it. Owning a new Porsche takes the edge off the winter.

Second, competitors have learned that other companies are a far greater threat than OpenText. A services firm can snag some revenue, but other vendors have been winning the big deals. The OpenText strategy has not generated the top line revenue growth and profit that a handful of other companies in search and content processing have achieved. So the roll up and services play looks like a way to add some zip to the burlap bag’s contents.

Third, customers have learned that OpenText does not move with the agility of some other firms. I would not use the word “glacial,” but “stately” seems appropriate. If you know someone with the RedDot system, you may be able to relate to the notion of rapid bug fixes and point releases. By the way, RedDot used to install an Autonomy stub as the default search engine. I find this interesting because OpenText owns BRS search, Fulcrum (yikes!), and the original Tim Bray SGML data management and search system. (Have SGML and XML come and gone?)

I am not willing to go out on a limb about a potential sale of OpenText, but I think that the notion of eLearning is interesting. Will OpenText shift its focus back to collaboration and document management much as Coveo flipped from search to mobile search to customer support and then back to search again? Canadian search and content processing vendors are interesting. Strike up the music. Another fast dance is beginning. Grab your partner. Search to services up next.

Stephen E Arnold, September 9, 2011

Sponsored by Pandia.com, publishers of The New Landscape of Enterprise Search

Simplexo Search

January 3, 2011

Short honk: I learned about Simplexo earlier this year. The company provides “optimized search for your mobile.” The company has a product that makes it possible for a user of Simplexo to search a desktop computer from a mobile device or a Web browser. Yahoo UK reported in “Simplexo Aims to Simplify Remote Desktop Searches”:

Simplexo said that the software could find emails in Outlook and Exchange Server, as well as documents in SharePoint, spreadsheets and database records, and can scour social networking applications such as Facebook, Twitter and LinkedIn.

The service is to go live early in 2011. If you are interested in this type of product, navigate to this link and sign up.

Stephen E Arnold, January 3, 2011

Freebie

Anti Search in 2011

November 1, 2010

In a recent meeting, several of the participants were charged with disinformation from the azurini.

You know. Azurini, the consultants.

Some of these were English majors, others former print journalists, and some unemployed search engine optimization experts smoked by Google Instant.

But mostly the azurini emphasize that their core competency is search, content management, or information governance (whatever the heck that means). In a month or so, there will be a flood of trend write ups. When the Roman god looks to his left and right, the signal for prognostication flashes through the fabric covered cube farms.

To get ahead of the azurini, the addled goose wants to identify the trends in anti search for 2011. Yep, anti search. Remember that in a Searcher article several years ago, I asserted that search was dead. No one believed me, of course. Instead of digging into the problems that ranged from hostile users to the financial meltdown of some high profile enterprise search vendors, search was the big deal.

And why not? No one can do a lick of work today unless that person can locate a document or “find” something to jump start activity. In a restaurant, people talk less and commune with their mobile devices. Search is on a par with food, a situation that Maslow would find interesting.

The idea for this write up emerged from a meeting a couple of weeks ago. The attendees were trying to figure out how to enhance an existing enterprise search system in order to improve the productivity of the business. The goal was admirable, but the company was struggling to generate revenues and reduce costs. The talk was about search but the subtext was survival.

The needs for the next generation search system included (a toy indexing sketch follows the list):

  • A great user experience
  • An iPad app to deliver needed information
  • Seamless access to Web and Intranet information
  • Google-like performance
  • Improved indexing and metatagging
  • Access to database content and unstructured information like email.
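
None of these needs is exotic. Most reduce to one indexing pipeline that normalizes structured and unstructured content into a single metatagged index. Here is a toy sketch of what the last two bullets imply. The index class, field names, and sample records are my own illustration, not anything shown in the meeting or offered by a particular vendor.

```python
# Illustrative sketch only: one index over structured rows and unstructured email.
# All names (InvertedIndex, "source", "author") are hypothetical, not a product's API.
from collections import defaultdict

class InvertedIndex:
    def __init__(self):
        self.postings = defaultdict(set)   # term -> set of document ids
        self.metadata = {}                 # document id -> metadata tags

    def add(self, doc_id, text, **tags):
        for term in text.lower().split():
            self.postings[term].add(doc_id)
        self.metadata[doc_id] = tags

    def search(self, term, **filters):
        hits = self.postings.get(term.lower(), set())
        return [d for d in hits
                if all(self.metadata[d].get(k) == v for k, v in filters.items())]

index = InvertedIndex()
# Structured content: a row pulled from a database table.
index.add("db-42", "Q3 invoice for Acme Corp", source="erp", type="invoice")
# Unstructured content: an email message.
index.add("mail-7", "Re: Acme invoice is overdue", source="exchange", type="email", author="jsmith")

print(index.search("invoice"))                     # both documents, in some order
print(index.search("invoice", source="exchange"))  # ['mail-7']
```

The point of the sketch is that “access to database content and unstructured information” and “improved metatagging” are one design problem, not two.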


Intel Stream Number 2 Available

October 27, 2010

ArnoldIT.com’s Intel Stream podcast for October 27, 2010, is now available. The podcast focuses on the intersection of business intelligence and technology. In this week’s 10 minute program, Stephen E Arnold comments about the proposal for the US government to archive Federal workers’ social media postings and content, T-Mobile’s surprising acquisition of Vamosa, Recommind’s 2010 revenues revealed by a competitor, an online SharePoint 2010 cost estimator, and a free download of sentiment analysis software. You can listen to the audio program by navigating to http://arnoldit.com/podcasts/ and clicking on the October 27, 2010 Intel Stream program.

Stuart Schram IV, October 27, 2010

Freebie

Azurini Lock In Analysis Baffles the Goose

August 3, 2010

I know, I know. Consulting firms have to be “real” and “objective” and “mavenesque.” I accept that. But the write up “Burton Group: Avoid Office 2010 Lock-In, Stick with Office 2007” wowed me. Microsoft buys lots of consulting, research, and advice. As a result, those who want to get jobs with the Redmond fun lovers often find a way to put a honey colored light on almost any product, service, or initiative. How many raves did I read about Vista? How many times have I heard about the wonders of MSN, now Live something? How many times have I heard experts explain the impact of Microsoft’s mobile strategy, its search strategy, its social strategy, its cloud strategy, and other strategies? The addled goose sure does not generate $70 billion a year in revenue, and Microsoft does. So, guess who is really smart? Time’s up. Microsoft.

But a consulting firm criticizing Microsoft, albeit somewhat indirectly? That is amazing, and it means to me that maybe the fondness Microsoft once felt for Burton has faded. Maybe Burton no longer loves Microsoft? Maybe there are other forces in play? Who knows.

What is clear is that Burton suggests an organization that embraces Office 2010 may be a candidate for lock in. Lock in means that a vendor calls the shots, not the client. The only way to get free is to break out. In fact, that’s one of the appeals of open source software. An organization using open source software believes it has more freedom than when chained to a giant SharePoint installation, an even bigger Microsoft Exchange construct, and the 40 other servers that Microsoft has on offer.

My view is that Microsoft is not the only enterprise software vendor looking to get shelf space and then become a monoculture in a client organization. Does IBM seek to monopolize hardware, software, and services? In my experience, you better understand the way Big Blue operates before your local IBM vice president gets a temporary office down the hall from your company’s president. Same with the Google.

So what strikes me as interesting is not the lock in angle. That’s old news. The criticism of a big outfit like Microsoft has caught my attention. Is one of the azurini changing colors?

Stephen E Arnold, August 3, 2010

Freebie

Google and Disruption: Will It Work Tomorrow?

April 15, 2010

Editor’s Note: The text in this article is derived from the notes prepared by Stephen E Arnold’s keynote talk on April 15, 2010. He delivered this speech as part of Slovenian Information Days in Portoroz, Slovenia.

Thank you, Mr. Chairman. I am most grateful for the opportunity to address this group and offer some observations about Google and its disruptive tactics.

I started tracking Google’s technical inventions in 2002. A client, now out of business, asked me to indicate if “Google really had something solid.”

My analysis showed a platform diagram and a list of markets that Google was likely to disrupt. I captured three ideas in my 2005 monograph “The Google Legacy”, which is still timely and available from Infonortics Ltd. in Tetbury, Glos.

The three ideas were:

First, Google had figured out how to add computing capacity, including storage, using mostly commodity hardware. I estimated the cost in 2002 dollars as about one-third of what companies like Excite, Lycos, Microsoft, and Yahoo were paying.

Second, Google had solved the problem of text search for content on Web pages. Google’s engineers were using that infrastructure to deliver other types of services. In 2002, there were rumors that Google was experimenting with services that ranged from email to an online community / messaging system. One person, whose name I have forgotten, pointed out that Google’s internal network MOMA was the test bed for this type of service.

Third, Google was not an invention company. Google was an applied research company. The firm’s engineers, some of whom came from Sun Microsystems and AltaVista.com, were adept at plucking discoveries from university research computing tests and hooking them into systems that were improvements on what most companies used for their applications. The genius was focus and selection and integration.


Google is an information factory, a digital Rouge River construct. Raw materials enter at one end and higher value information products and services come out at the other end of the process.

In my second Google monograph, funded in part by another client, I built upon my research into technology and summarized Google’s patent activities between 2004 and mid 2007. Google Version 2.0: The Calculating Predator, also published by Infonortics Ltd., disclosed several interesting facts about the company.


Quote to Note: Dick Brass on MSFT Innovation

February 6, 2010

I met Dick Brass many years ago. He left Oracle and joined Microsoft to contribute to a confidential initiative. Mr. Brass worked on the ill-fated Microsoft tablet, which Steve Jobs has reinvented as a revolutionary device. I am not a tablet guy, but one thing is certain. Mr. Jobs knows how to work public relations. Mr. Brass published an article in the New York Times, and it captured the attention of Microsoft and millions of readers who enjoyed Mr. Brass’s criticism of his former employer. I have no opinion about Microsoft, its administrative methods, or its ability to innovate. I did find a quote to note in the write up:

Microsoft is no longer considered the cool or cutting edge place to work. There has been a steady exit of its best and brightest. (“Microsoft’s Creative Destruction”, the New York Times, February 4, 2010, Page 25, column 3, National Edition)

Telling because if smart people don’t work at a company, that company is likely to make less informed decisions than an organization with smarter people. This applies in the consulting world. There are blue chip outfits like McKinsey, Bain, and BCG. Then there are lesser outfits which I am sure you can name because these companies “advertise”, have sales people who “sell” listings, and invent crazy phrases to create buzz and sales. I am tempted to differentiate Microsoft with a reference to Apple or Google, but I will not. Oh, why did I not post this item before today? The hard copy of my New York Times was not delivered until today. Speed is important in today’s information world.

The quote nails it.

Stephen E Arnold, February 7, 2010

No one paid me to write this, not a single blue chip consulting firm, not a single savvy company. I will report this lack of compensation to the experts at the IRS, which is gearing up for the big day in April.



Featured
Microsoft and Mikojo Trigger Semantic Winds across Search Landscape

Semantic technology is blowing across the search landscape again. The word “semantic” and its use in phrases like “semantic technology” have a certain trendiness. When I see the word, I think of smart software that understands information in the way a human does. I also think of computationally sluggish processes and the complexity of language, particularly in synthetic languages like English. Google has considerable investment in semantic technology, but the company wisely tucks it away within larger systems and avoids the technical battles that rage among different semantic technology factions. You can see Google’s semantic operations tucked within the Ramanathan Guha inventions disclosed in February 2007. Pay attention to the discussion of the system and method for “context”.


Gale force winds from semantic technology advocates. Image source: http://www.smh.com.au/ffximage/2008/11/08/paloma_wideweb__470×289,0.jpg

Microsoft’s Semantic Puff

Other companies are pushing the semantic shock troops forward. I read yesterday in Network World’s “Microsoft Talks Up Semantic Search Ambitions.” The article reminded me that Fast Search & Transfer SA offered some semantic functionality which I summarized in the 2006 version of the original Enterprise Search Report (the one with real beef, not tofu inside). Microsoft also purchased Powerset, a company that used some of Xerox PARC’s technology and its own wizardry to “understand” queries and create a rich index. The Network World story reported:

With semantic technologies, which also are being referred to as Web 3.0, computers have a greater understanding of relationships between different information, rather than just forwarding links based on keyword searches. The end game for semantic search is “better, faster, cheaper, essentially,” said Prevost, who came over to Microsoft in the company’s 2008 acquisition of search engine vendor Powerset. Prevost is still general manager of Powerset. Semantic capabilities get users more relevant information and help them accomplish tasks and make decisions, said Prevost.

The payoff is that software understands humans. Sounds good, but it does little to alter the startling dominance of Google in general Web search and the rocket like rise of social search systems like Facebook. In a social context humans tell “friends” about meaning or better yet offer an answer or a relevant link. No search required.

I reported about the complexities of configuring the enterprise search system that Microsoft offers for SharePoint in an earlier Web log post. The challenge is complexity and the time and money required to make a “smart” software system perform to an acceptable level in terms of throughput in content processing and for the user. Users often prefer to ask someone or just use what appears in the top of a search results list.

Interviews
Inside Search: Raymond Bentinck of Exalead, Part 2

This is the second part of the interview with Raymond Bentinck of Exalead.

Isn’t this bad marketing?

No. This makes business sense. Traditional search vendors who may claim to have thousands of customers tend to use only a handful of well managed references. This is a direct result of customers choosing technology based on these overblown marketing claims and these claims then driving requirements that the vendor’s consultants struggle to deliver. The customer, who is then far from happy with the results, doesn’t do reference calls and ultimately becomes disillusioned with search in general or with the vendor specifically. Either way, they end up moving to an alternative.

I see this all the time with our clients that have replaced their legacy search solution with Exalead. When we started, we were met with much skepticism from clients that we could answer their information retrieval problems. It was only after doing proofs of concept and delivering the solutions that they became convinced. Now that our reputation has grown, organizations realize that we do not make unsubstantiated claims and do stick by our promises.

What about the shift to hybrid solutions? An appliance or an on premises server, then a cloud component, and maybe some fairy dust thrown in to handle the security issues?

There is a major change that is happening within Information Technology at the moment driven primarily by the demands placed on IT by the business. Businesses want to vastly reduce the operational cost models of IT provision while pushing IT to be far more agile in their support of the business. Against this backdrop, information volumes continue to grow exponentially.

The push towards areas such as virtual servers and cloud computing is an aspect of reducing the operational cost models of information technology provision. It is fundamental that software solutions can operate in these environments. It is surprising, however, to find that many traditional search vendors’ solutions do not even work in a virtual server environment.

Isn’t this approach going to add costs to an Exalead installation?

No, because another aspect of this is that software solutions need to be designed to make the best use of available hardware resources. When Exalead provided a solution to the leading classified ads site Fish4.co.uk, unlike the legacy search solution we replaced, not only were we able to deploy a solution that met and exceeded their requirements but we reduced the cost of search to the business by 250 percent. A large part of this was around the massively reduced hardware costs associated with the solution.

What about making changes and responding quickly? Many search vendors simply impose a six month or nine month cycle on a deployment. The client wants to move quickly, but the vendor cannot work quickly.

Agility is another key factor. In the past, an organization may implement a data warehouse. This would take around 12 to 18 months to deploy and would cost a huge amount in hardware, software and consultancy fees. As part of the deployment the consultants needed to second guess the questions the business would want to ask of the data warehouse and design these into the system. After the 12 to 18 months, the business would start using the data warehouse and then find out they needed to ask different types of questions than were designed into the system. The data warehouse would then go through a phase of redevelopment which would last many more months. The business would evolve… making more changes and the cycle would go on and on.

With Exalead, we are able to deploy the same solution in a couple months but significantly there is no need to second guess the questions that the business would want to ask and design them into the system.

This is the sort of agile solution that businesses have been pushing their IT departments to deliver for years. Businesses that do not provide agile IT solutions will fall behind their competitors and be unable to react quickly enough when the market changes.

One of the large UK search vendors has dozens of niche versions of its product. How can that company keep each of these specialty products up to date and working? Integration is often the big problem, is it not?

The founders of Exalead took two years before starting the company to research what worked in search and why the existing search vendors products were so complex. This research led them to understand that the search products that were on the marketplace at the time all started as quite simple products designed to work on relatively low volumes of information and with very limited functional capabilities. Over the years, new functionality has been added to the solutions to keep abreast of what competitors have offered but because of how the products were originally engineered they have not been clean integrations. They did not start out with this intention but search has evolved in ways never imagined at the time these solutions were originally engineered.

Wasn’t one of the key architects part of the famous AltaVista.com team?

Yes. In fact, both of the founders of Exalead were from this team.

What kind of issues occur with these overly complex products?

As you know, this has caused many issues for both vendors and clients. Changes in one part of the solution can cause unwanted side effects in another part. Trying to track down issues and bugs can take a huge amount of time and expense. This is a major factor as to why we see the legacy search products on the market today that are complex, expensive and take many months if not years to deploy even for simple requirements.

Exalead learned from these lessons when engineering our solution. We have an architecture that is fully object-orientated at the core and follows an SOA architecture. It means that we can swap in and out new modules without messy integrations. We can also take core modules such as connectors to repositories and instead of having to re-write them to meet specific requirements we can override various capabilities in the classes. This means that the majority of the code that has gone through our quality-management systems remains the same. If an issue is identified in the code, it is a simple task to locate the problem and this issue is isolated in one area of the code base. In the past, vendors have had to rewrite core components like connectors to meet customers’ requirements and this has caused huge quality and support issues for both the customer and the vendor.
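
To make the pattern Mr. Bentinck describes concrete, here is a minimal sketch of the “override, don’t rewrite” connector idea: a tested base connector whose behavior is adjusted by overriding a single method in a subclass, so the bulk of the quality-assured code stays untouched. The class and method names below are my own illustration, not Exalead’s actual API.

```python
# Illustrative sketch of the "override, don't rewrite" connector pattern.
# Class and method names are hypothetical; this is not Exalead's actual API.

class RepositoryConnector:
    """Base connector: fetch, authorize, and normalize documents."""

    def fetch(self, since):
        # Shared, quality-assured crawl logic would live here.
        return [{"id": "doc-1", "acl": ["staff"], "body": "Quarterly report"}]

    def authorize(self, doc, user_groups):
        # Default security trimming: any group overlap grants access.
        return bool(set(doc["acl"]) & set(user_groups))

    def normalize(self, doc):
        return {"id": doc["id"], "text": doc["body"].lower()}


class StrictIntranetConnector(RepositoryConnector):
    """Customer-specific variant: only the overridden rule changes."""

    def authorize(self, doc, user_groups):
        # Stricter site policy: the user must hold every ACL group on the document.
        return set(doc["acl"]).issubset(set(user_groups))


connector = StrictIntranetConnector()
for doc in connector.fetch(since="2010-01-01"):
    if connector.authorize(doc, user_groups=["staff", "finance"]):
        print(connector.normalize(doc))
```

The design point is that a customer-specific change touches one small method; the fetch and normalize code that passed quality management is reused as-is.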

What about integration? That’s a killer for many vendors in my experience.

The added advantage of this core engineering work means that for Exalead integration is a simple task. For example, building new secure connectors to new repositories can be performed in weeks rather than months. Our engineers can take this time saved to spend on adding new and innovative capabilities into the solution rather than spending time worrying about how to integrate a new function without affecting the 1001 other overlaying functions.

Without this model, legacy vendors have to continually provide point-solutions to problems that tend to be customer-specific leading to a very expensive support headache as core engineering changes take too long and are too hard to deploy.

I heard about a large firm in the US that has invested significant sums in retooling Lucene. The solution has been described on the firm’s Web site, but I don’t see how that engineering cost is offset by the time to market that the fix required. Do you see open source as a problem or a solution?

I do not wake up in the middle of the night worrying about Lucene if that is what you are thinking! I see Lucene in places that typically have large engineering teams to protect, or pushed by consultants more interested in making lots of fees through its complex integration. Neither of which adds value to the company in, for example, reducing costs or increasing revenue.

Organizations that are interested in providing cost effective richly functional solutions are in increasing numbers choosing solutions like Exalead. For example, The University of Sunderland wanted to replace their Google Search Appliance with a richer, more functional search tool. They looked at the marketplace and chose Exalead for searching their external site, their internal document repositories plus providing business intelligence solutions over their database applications such as student attendance records. The search on their website was developed in a single day including the integration to their existing user interface and the faceted navigation capabilities. This represented not only an exceptionally quick implementation, far in excess of any other solution on the marketplace today but it also delivered for them the lowest total cost of ownership compared to other vendors and of course open-source.

In my opinion, Lucene and other open-source offerings can offer a solution for some organizations but many jump on this bandwagon without fully appreciating the differences between the open source solution and the commercially available solutions either in terms of capability or total cost. It is assumed, wrongly in many instances, that the total cost of ownership for open source must be lower than the commercially available solutions. I would suggest that all too often, open source search is adopted by those who believe the consultants who say that search is a simple commodity problem.

What about the commercial enterprise that has had several search systems and none of them capable of delivering satisfactory solutions? What’s the cause of this? The vendors? The client’s approach?

I think the problem lies more with the vendors of the legacy search solutions than with the clients. Vendors have believed their own marketing messages and, when customers are unsatisfied with the results, have tended to blame the customers for not understanding how to deploy the product correctly or, in some cases, the third party or system integrator responsible for the deployment.

One client of ours told me recently that with our solution they were able to deliver in a couple months what they failed to do with another leading search solution for seven years. This is pretty much the experience of every customer where we have replaced an existing search solution. In fact, every organization that I have worked with that has performed an in-depth analysis and comparison of our technology against any search solution has chosen Exalead.

In many ways, I see our solution as not only delivering on our promises but also delivering on the marketing messages that our competitors have been promoting for years but failing to deliver in reality.

So where does Exalead fit? The last demo I received showed me search working within a very large, global business process. The information just appeared? Is this where search is heading?

In the year 2000, and every year since, a CEO of one of the leading legacy search vendors made a claim that every major organization would be using their brand of meaning based search technology within two years.

I will not be as bold as him but it is my belief that in less than five years time the majority of organizations will be using search based applications in mission critical applications.

For too long software vendors have been trying to convince organizations, for example, that it was not possible to deploy mission critical solutions such as a 360 degree customer view, Master Data Management, Data Warehousing or business intelligence solutions in a couple months, with no user training, with up-to-the-minute information, with user friendly interfaces, with a low cost per query covering millions or billions of records of information.

With Exalead this is possible and we have proven it in some of the world’s largest companies.

How does this change the present understanding of search, which in my opinion is often quite shallow?

Two things are required to change the status quo.

Firstly, a disruptive technology is required that can deliver on these requirements and secondly businesses need to demand new methods of meeting ever greater business requirements on information.

Today I see both these things in place. Exalead has proven that our solutions can meet the most demanding of mission critical requirements in an agile way and now IT departments are realizing that they cannot support their businesses moving forward by using traditional technologies.

What do you see as the trends in enterprise search for 2010?

Last year was a turning point around Search Based Applications. With the world-wide economy in recession, many companies have put projects on hold until things were looking better. With economies still looking rather weak but projects not being able to be left on ice for ever, they are starting to question the value of utilizing expensive, time consuming and rigid technologies to deliver these projects.

Search is a game changing technology that can deliver more innovative, agile and cheaper solutions than using traditional technologies. Exalead is there to deliver on this promise.

Search, a commodity solution? No.

Editor’s note: You can learn more about Exalead’s search based applications technology and method at the Exalead Web site.

Stephen E Arnold, February 4, 2010

I wrote this post without any compensation. However, Mr. Bentinck, who lives in a far off land, offered to buy me haggis, and I refused this tasty bribe. Ah, lungs! I will report the lack of payment to the National Institutes of Health, an outfit concerned about alveoli.
Profiles
Vyre: Software, Services, Search, and More

A happy quack to the reader who sent me a link to Vyre, whose catchphrase is “dissolving complexity.” The last time I looked at the company, I had pigeon holed it as a consulting and content management firm. The news release my reader sent me pointed out that the company has a mid market enterprise search solution that is now at version 4.x. I am getting old, or at least too sluggish to keep pace with content management companies that offer search solutions. My recollection is that Crown Point moved in this direction. I have a rather grim view of CMS because software cannot help organizations create high quality content or at least what I think is high quality content.

The Wikipedia description of Vyre matches up with the information in my archive:

VYRE, now based in the UK, is a software development company. The firm uses the catchphrase “Enterprise 2.0” to describe its enterprise solutions for business. The firm’s core product is Unify. The Web based service allows users to build applications and manage content. The company has technology that manages digital assets. The firm’s clients in 2006 included Diageo, Sony, Virgin, and Lowe and Partners. The company has reinvented itself several times since the late 1990s doing business as NCD (Northern Communication and Design), Salt, and then Vyre.

You can read the Wikipedia summary here. You can read a 2006 Butler Group analysis here. My old link worked this evening (March 5, 2009), but click quickly. In my files I had a link to a Vyre presentation, but it was not about search. Dated 2008, you may find the information useful. The Vyre presentations are here. The link worked for me on March 5, 2009. The only name I have in my archive is Dragan Jotic. Other names of people linked to the company are here. Basic information about the company’s Web site is here. Traffic, if these data are correct, seems to be trending down. I don’t have current interface examples. The wiki for the CMS service is here. (Note: the company does not use its own CMS for the wiki. The wiki system is from MediaWiki. No problem for me, but I was curious about this decision because the company offers its own CMS system.) You can get a taste of the system here.


Administrative Vyre screen.

After a bit of poking around, it appears that Vyre has turned up the heat on its public relations activities. The Seybold Report here presented a news story / news release about the search system here. I scanned the release and noted this passage as interesting for my work:

…version 4.4 introduces powerful new capabilities for performing facetted and federated searching across the enterprise. Facetted search provides immediate feedback on the breakdown of search results and allows users to quickly and accurately drill down within search results. Federated search enables users to eradicate content silos by allowing users to search multiple content repositories.

Vyre includes a taxonomy management function with its search system, if I read the Seybold article correctly. I gravitate to the taxonomy solution available from Access Innovations, a company run by my friends and colleagues Marje Hlava and Jay Ven Eman. Their system generates ANSI standard thesauri and word lists, which is the sort of stuff that revs my engine.

Endeca has been the pioneer in the enterprise sector for “guided navigation”, which is a synonym in my mind for faceted search. Federated search gets into the functions that I associate with Bright Planet, Deep Web Technologies, and Vivisimo, among others. I know that shoving large volumes of data through systems that both facetize content and federate it is computationally intensive. Consequently, some organizations are not able to put the plumbing in place to make these computationally intensive systems hum like my grandmother’s sewing machine.
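
For readers who want the two terms pinned down, the toy sketch below fans a query out to two stand-in repositories (the federation step) and then tallies metadata values across the merged hits (the facet step). The repositories, field names, and sample records are invented for illustration and have nothing to do with Vyre’s actual product; the point is simply that every matching document must be touched to compute facet counts, which is where the computational cost creeps in.

```python
# Toy illustration of federated search plus facet counting.
# The two "repositories" and their fields are invented for this example.
from collections import Counter

INTRANET = [
    {"title": "HR handbook", "dept": "hr", "year": 2008},
    {"title": "Search procurement memo", "dept": "it", "year": 2009},
]
WEBSITE = [
    {"title": "Search product brochure", "dept": "marketing", "year": 2009},
]

def federated_search(query, repositories):
    # Federation: run the query against each repository and merge the hits.
    hits = []
    for repo in repositories:
        hits.extend(doc for doc in repo if query in doc["title"].lower())
    return hits

def facet_counts(hits, field):
    # Faceting: every hit must be examined to build the counts shown to the user.
    return Counter(doc[field] for doc in hits)

results = federated_search("search", [INTRANET, WEBSITE])
print(facet_counts(results, "dept"))  # Counter({'it': 1, 'marketing': 1})
print(facet_counts(results, "year"))  # Counter({2009: 2})
```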

If you are in the market for a CMS and asset management company’s enterprise search solution, give the company’s product a test drive. You can buy a report from UK Data about this company here. I don’t have solid pricing data. My notes to myself record the phrase, “Sensible pricing.” I noted that the typical cost for the system begins at about $25,000. Check with the company for current license fees.

Stephen Arnold, March 6, 2009
