Is Content Management a Digital Titanic?

February 25, 2010

Content management is a moving target. Unlike search, CMS is supposed to generate a Web page or some other type of content product. The “leaders” in content management systems, or CMS, seem to be disappearing into larger organizations. Surprising. If CMS were healthy, why aren’t these technology outfits growing like crazy and spinning off tons of cash?

I am no expert in CMS. In fact, I am not an expert in anything, unlike the azure chip consultants, poobahs, and pundits who profess deep knowledge at the press of a mouse button. In my experience, CMS emerged because people did not have an easy way to produce HTML pages that could be displayed in a browser.

If HTML was too tough for some people, imagine the pickle barrel in which these folks find themselves today. To create a Web site, more than HTML is required. The crowd who relied on Microsoft’s FrontPage now struggles with the need to make Web pages work as applications or bundles of applications, with some static brochureware thrown in for good measure.

To make a Web site today, technical know-how is an absolute must. Even the very good point-and-click services from SquareSpace.com and Weebly.com can baffle some people.

The azure chip consultants, the mavens, and the poobahs want to be in the lifeboats. Women and children to the rear. Source: http://www.ronnestam.com/wp-content/uploads/2009/02/lifeboat_change_advertising_sinking.jpg

Move the need for a dynamic Web site into a big organization that is not good at technology, and you have a recipe for disaster. In fact, the wreckage created by some content management vendors, pundits, and integrators is of significant magnitude. There’s the big hassle in Australia over a blue chip CMS implementation that does not work. The US Senate went after the bluest of the blue chip integrators because a CMS could not generate a single Web page. Sigh.

Quote to Note: Dick Brass on MSFT Innovation

February 6, 2010

I met Dick Brass many years ago. He left Oracle and joined Microsoft to contribute to a confidential initiative. Mr. Brass worked on the ill-fated Microsoft tablet, which Steve Jobs has reinvented as a revolutionary device. I am not a tablet guy, but one thing is certain: Mr. Jobs knows how to work public relations. Mr. Brass published an article in the New York Times, and it captured the attention of Microsoft and millions of readers who enjoyed Mr. Brass’s criticism of his former employer. I have no opinion about Microsoft, its administrative methods, or its ability to innovate. I did find a quote to note in the write up:

Microsoft is no longer considered the cool or cutting edge place to work. There has been a steady exodus of its best and brightest. (“Microsoft’s Creative Destruction”, the New York Times, February 4, 2010, Page 25, column 3, National Edition)

Telling, because if smart people don’t work at a company, that company is likely to make less informed decisions than an organization with smarter people. This applies in the consulting world. There are blue chip outfits like McKinsey, Bain, and BCG. Then there are lesser outfits, which I am sure you can name because these companies “advertise”, have sales people who “sell” listings, and invent crazy phrases to create buzz and sales. I am tempted to differentiate Microsoft with a reference to Apple or Google, but I will not. Oh, why did I not post this item before today? The hard copy of my New York Times was not delivered until today. Speed is important in today’s information world.

The quote nails it.

Stephen E Arnold, February 7, 2010

No one paid me to write this, not a single blue chip consulting firm, not a single savvy company. I will report this lack of compensation to the experts at the IRS, which is gearing up for the big day in April.


Featured
Microsoft and Mikojo Trigger Semantic Winds across Search Landscape

Semantic technology is blowing across the search landscape again. The word “semantic” and its use in phrases like “semantic technology” has a certain trendiness. When I see the word, I think of smart software that understands information in the way a human does. I also think of computationally sluggish processes and the complexity of natural language, English included. Google has considerable investment in semantic technology, but the company wisely tucks it away within larger systems and avoids the technical battles that rage among different semantic technology factions. You can see Google’s semantic operations tucked within the Ramanathan Guha inventions disclosed in February 2007. Pay attention to the discussion of the system and method for “context”.

Gale force winds from semantic technology advocates. Image source: http://www.smh.com.au/ffximage/2008/11/08/paloma_wideweb__470×289,0.jpg

Microsoft’s Semantic Puff

Other companies are pushing the semantic shock troops forward. Yesterday I read Network World’s “Microsoft Talks Up Semantic Search Ambitions.” The article reminded me that Fast Search & Transfer offered some semantic functionality, which I summarized in the 2006 version of the original Enterprise Search Report (the one with real beef, not tofu inside). Microsoft also purchased Powerset, a company that used some of Xerox PARC’s technology and its own wizardry to “understand” queries and create a rich index. The Network World story reported:

With semantic technologies, which are also being referred to as Web 3.0, computers have a greater understanding of relationships between different information, rather than just forwarding links based on keyword searches. The end game for semantic search is “better, faster, cheaper, essentially,” said Prevost, who came over to Microsoft in the company’s 2008 acquisition of search engine vendor Powerset. Prevost is still general manager of Powerset. Semantic capabilities get users more relevant information and help them accomplish tasks and make decisions, said Prevost.

The payoff is that software understands humans. Sounds good, but it does little to alter the startling dominance of Google in general Web search and the rocket-like rise of social search systems like Facebook. In a social context, humans tell “friends” about meaning or, better yet, offer an answer or a relevant link. No search required.

I reported about the complexities of configuring the enterprise search system that Microsoft offers for SharePoint in an earlier Web log post. The challenge is complexity and the time and money required to make a “smart” software system perform to an acceptable level in terms of throughput in content processing and for the user. Users often prefer to ask someone or just use what appears in the top of a search results list.

Interviews
Inside Search: Raymond Bentinck of Exalead, Part 2

This is the second part of the interview with Raymond Bentinck of Exalead.

Isn’t this bad marketing?

No. This makes business sense. Traditional search vendors who may claim to have thousands of customers tend to use only a handful of well managed references. This is a direct result of customers choosing technology based on overblown marketing claims, and these claims then driving requirements that the vendor’s consultants struggle to deliver. The customer, who is then far from happy with the results, doesn’t do reference calls and ultimately becomes disillusioned with search in general or with the vendor specifically. Either way, they end up moving to an alternative.

I see this all the time with our clients that have replaced their legacy search solution with Exalead. When we started, we were met with much skepticism from clients about whether we could answer their information retrieval problems. It was only after doing proofs of concept and delivering the solutions that they became convinced. Now that our reputation has grown, organizations realize that we do not make unsubstantiated claims and do stick by our promises.

What about the shift to hybrid solutions? An appliance or an on premises server, then a cloud component, and maybe some fairy dust thrown in to handle the security issues?

There is a major change that is happening within Information Technology at the moment driven primarily by the demands placed on IT by the business. Businesses want to vastly reduce the operational cost models of IT provision while pushing IT to be far more agile in their support of the business. Against this backdrop, information volumes continue to grow exponentially.

The push towards areas such as virtual servers and cloud computing is one aspect of reducing the operational cost models of information technology provision. It is fundamental that software solutions can operate in these environments. It is surprising, however, to find that many traditional search vendors’ solutions do not even work in a virtual server environment.

Isn’t this approach going to add costs to an Exalead installation?

No, because another aspect of this is that software solutions need to be designed to make the best use of available hardware resources. When Exalead provided a solution to the leading classified ads site Fish4.co.uk, unlike the legacy search solution we replaced, not only were we able to deploy a solution that met and exceeded their requirements, but we also reduced the cost of search to the business by 250 percent. A large part of this was the massively reduced hardware costs associated with the solution.

What about making changes and responding quickly? Many search vendors simply impose a six month or nine month cycle on a deployment. The client wants to move quickly, but the vendor cannot work quickly.

Agility is another key factor. In the past, an organization might implement a data warehouse. This would take around 12 to 18 months to deploy and would cost a huge amount in hardware, software, and consultancy fees. As part of the deployment, the consultants needed to second-guess the questions the business would want to ask of the data warehouse and design these into the system. After the 12 to 18 months, the business would start using the data warehouse and then find out they needed to ask different types of questions than were designed into the system. The data warehouse would then go through a phase of redevelopment which would last many more months. The business would evolve… making more changes, and the cycle would go on and on.

With Exalead, we are able to deploy the same solution in a couple of months, but significantly there is no need to second-guess the questions that the business would want to ask and design them into the system.

This is the sort of agile solution that businesses have been pushing their IT departments to deliver for years. Businesses that do not provide agile IT solutions will fall behind their competitors and be unable to react quickly enough when the market changes.

One of the large UK search vendors has dozens of niche versions of its product. How can that company keep each of these specialty products up to date and working? Integration is often the big problem, is it not?

The founders of Exalead took two years before starting the company to research what worked in search and why existing search vendors’ products were so complex. This research led them to understand that the search products on the marketplace at the time all started as quite simple products designed to work on relatively low volumes of information and with very limited functional capabilities. Over the years, new functionality was added to keep abreast of what competitors offered, but because of how the products were originally engineered, these were not clean integrations. The vendors did not start out with this intention, but search has evolved in ways never imagined at the time these solutions were originally engineered.

Wasn’t one of the key architects part of the famous AltaVista.com team?

Yes. In fact, both of the founders of Exalead were from this team.

What kind of issues occur with these overly complex products?

As you know, this has caused many issues for both vendors and clients. Changes in one part of the solution can cause unwanted side effects in another part. Trying to track down issues and bugs can take a huge amount of time and expense. This is a major factor in why the legacy search products on the market today are complex, expensive, and take many months if not years to deploy, even for simple requirements.

Exalead learned from these lessons when engineering our solution. We have an architecture that is fully object-oriented at the core and follows service-oriented architecture (SOA) principles. It means that we can swap new modules in and out without messy integrations. We can also take core modules such as connectors to repositories and, instead of having to rewrite them to meet specific requirements, override various capabilities in the classes. This means that the majority of the code that has gone through our quality-management systems remains the same. If an issue is identified in the code, it is a simple task to locate the problem, and the issue is isolated in one area of the code base. In the past, vendors have had to rewrite core components like connectors to meet customers’ requirements, and this has caused huge quality and support issues for both the customer and the vendor.
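As an aside, the override approach Mr. Bentinck describes resembles the classic template-method pattern: a QA-tested base connector exposes a few hooks that subclasses replace, and the shared pipeline never changes. Here is a minimal Python sketch of that idea; every class and method name is a hypothetical illustration, not Exalead’s actual API.

```python
# Sketch of the "override, don't rewrite" connector pattern described above.
# All class and method names are hypothetical illustrations, not Exalead APIs.

class RepositoryConnector:
    """Base connector: the QA-tested core pipeline stays untouched."""

    def fetch(self, since):
        raise NotImplementedError  # each repository supplies its own fetch

    def transform(self, record):
        # Default mapping from a raw record to an indexable document.
        return {"id": record["id"], "body": record.get("text", "")}

    def secure(self, record):
        # Default: index everything; subclasses may apply ACL filtering.
        return True

    def run(self, since):
        # Shared core logic: fetch, filter, transform.
        return [self.transform(r) for r in self.fetch(since) if self.secure(r)]


class CrmConnector(RepositoryConnector):
    """Customer-specific behavior lives only in the overridden hooks."""

    def __init__(self, crm_client):
        self.crm = crm_client

    def fetch(self, since):
        return self.crm.changed_records(since)  # hypothetical client call

    def secure(self, record):
        return record.get("visibility") != "private"
```

The point of the design is visible in what is absent: `run` and `transform` ship once through quality management, and a new repository means overriding `fetch` and perhaps `secure`, nothing more.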

What about integration? That’s a killer for many vendors in my experience.

The added advantage of this core engineering work is that for Exalead integration is a simple task. For example, building new secure connectors to new repositories can be done in weeks rather than months. Our engineers can spend the time saved adding new and innovative capabilities to the solution rather than worrying about how to integrate a new function without affecting the 1001 other overlaying functions.

Without this model, legacy vendors have to continually provide point solutions to problems that tend to be customer-specific, leading to a very expensive support headache, as core engineering changes take too long and are too hard to deploy.

I heard about a large firm in the US that has invested significant sums in retooling Lucene. The solution has been described on the firm’s Web site, but I don’t see how that engineering cost is offset by the time to market that the fix required. Do you see open source as a problem or a solution?

I do not wake up in the middle of the night worrying about Lucene, if that is what you are thinking! I see Lucene in places that typically have large engineering teams to protect, or where consultants are more interested in making lots of fees through its complex integration. Neither adds value to the company by, for example, reducing costs or increasing revenue.

Organizations that are interested in providing cost effective, richly functional solutions are in increasing numbers choosing solutions like Exalead. For example, the University of Sunderland wanted to replace their Google Search Appliance with a richer, more functional search tool. They looked at the marketplace and chose Exalead for searching their external site and their internal document repositories, plus providing business intelligence solutions over their database applications such as student attendance records. The search on their Web site was developed in a single day, including the integration with their existing user interface and the faceted navigation capabilities. This represented not only an exceptionally quick implementation, far in excess of any other solution on the marketplace today, but it also delivered for them the lowest total cost of ownership compared to other vendors and, of course, open source.

In my opinion, Lucene and other open source offerings can be a solution for some organizations, but many jump on this bandwagon without fully appreciating the differences between the open source solution and the commercially available solutions, either in terms of capability or total cost. It is assumed, wrongly in many instances, that the total cost of ownership for open source must be lower than for the commercially available solutions. I would suggest that all too often, open source search is adopted by those who believe the consultants who say that search is a simple commodity problem.

What about the commercial enterprise that has had several search systems and none of them capable of delivering satisfactory solutions? What’s the cause of this? The vendors? The client’s approach?

I think the problem lies more with the vendors of the legacy search solutions than with the clients. Vendors have believed their own marketing messages and, when customers are unsatisfied with the results, have tended to blame the customers for not understanding how to deploy the product correctly or, in some cases, the third party or system integrator responsible for the deployment.

One client of ours told me recently that with our solution they were able to deliver in a couple of months what they had failed to do with another leading search solution for seven years. This is pretty much the experience of every customer where we have replaced an existing search solution. In fact, every organization that I have worked with that has performed an in-depth analysis and comparison of our technology against any search solution has chosen Exalead.

In many ways, I see our solution as not only delivering on our promises but also delivering on the marketing messages that our competitors have been promoting for years but failing to deliver in reality.

So where does Exalead fit? The last demo I received showed me search working within a very large, global business process. The information just appeared? Is this where search is heading?

In the year 2000, and every year since, the CEO of one of the leading legacy search vendors has made the claim that every major organization would be using his brand of meaning based search technology within two years.

I will not be as bold as him, but it is my belief that in less than five years’ time the majority of organizations will be using search based applications for mission critical work.

For too long, software vendors have been trying to convince organizations, for example, that it was not possible to deploy mission critical solutions such as a 360 degree customer view, Master Data Management, Data Warehousing, or business intelligence in a couple of months, with no user training, with up-to-the-minute information, with user friendly interfaces, and with a low cost per query covering millions or billions of records of information.

With Exalead this is possible and we have proven it in some of the world’s largest companies.

How does this change the present understanding of search, which in my opinion is often quite shallow?

Two things are required to change the status quo.

Firstly, a disruptive technology is required that can deliver on these requirements; secondly, businesses need to demand new methods of meeting ever greater business requirements on information.

Today I see both these things in place. Exalead has proven that our solutions can meet the most demanding of mission critical requirements in an agile way, and IT departments are now realizing that they cannot support their businesses moving forward by using traditional technologies.

What do you see as the trends in enterprise search for 2010?

Last year was a turning point for Search Based Applications. With the world-wide economy in recession, many companies put projects on hold until things were looking better. With economies still looking rather weak and projects not able to be left on ice forever, companies are starting to question the value of using expensive, time consuming, and rigid technologies to deliver these projects.

Search is a game changing technology that can deliver more innovative, agile, and cheaper solutions than traditional technologies. Exalead is there to deliver on this promise.

Search, a commodity solution? No.

Editor’s note: You can learn more about Exalead’s search-enabled applications technology and method at the Exalead Web site.

Stephen E Arnold, February 4, 2010

I wrote this post without any compensation. However, Mr. Bentinck, who lives in a far off land, offered to buy me haggis, and I refused this tasty bribe. Ah, lungs! I will report the lack of payment to the National Institutes of Health, an outfit concerned about alveoli.
Profiles
Vyre: Software, Services, Search, and More

A happy quack to the reader who sent me a link to Vyre, whose catchphrase is “dissolving complexity.” The last time I looked at the company, I had pigeonholed it as a consulting and content management firm. The news release my reader sent me pointed out that the company has a mid market enterprise search solution that is now at version 4.x. I am getting old, or at least too sluggish to keep pace with content management companies that offer search solutions. My recollection is that Crown Point moved in this direction. I have a rather grim view of CMS because software cannot help organizations create high quality content, or at least what I think is high quality content.

The Wikipedia description of Vyre matches up with the information in my archive:

VYRE, now based in the UK, is a software development company. The firm uses the catchphrase “Enterprise 2.0” to describe its enterprise solutions for business. The firm’s core product is Unify, a Web based service that allows users to build applications and manage content. The company has technology that manages digital assets. The firm’s clients in 2006 included Diageo, Sony, Virgin, and Lowe and Partners. The company has reinvented itself several times since the late 1990s, doing business as NCD (Northern Communication and Design), Salt, and then Vyre.

You can read the Wikipedia summary here. You can read a 2006 Butler Group analysis here. My old link worked this evening (March 5, 2009), but click quickly. In my files I had a link to a Vyre presentation, but it was not about search. Dated 2008, you may find the information useful. The Vyre presentations are here. The link worked for me on March 5, 2009. The only name I have in my archive is Dragan Jotic. Other names of people linked to the company are here. Basic information about the company’s Web site is here. Traffic, if these data are correct, seems to be trending down. I don’t have current interface examples. The wiki for the CMS service is here. (Note: the company does not use its own CMS for the wiki. The wiki system is from MediaWiki. No problem for me, but I was curious about this decision because the company offers its own CMS system.) You can get a taste of the system here.

Administrative Vyre screen.

After a bit of poking around, it appears that Vyre has turned up the heat on its public relations activities. The Seybold Report presented a news story / news release about the search system here. I scanned the release and noted this passage as interesting for my work:

…version 4.4 introduces powerful new capabilities for performing facetted and federated searching across the enterprise. Facetted search provides immediate feedback on the breakdown of search results and allows users to quickly and accurately drill down within search results. Federated search enables users to eradicate content silos by allowing users to search multiple content repositories.

Vyre includes a taxonomy management function with its search system, if I read the Seybold article correctly. I gravitate to the taxonomy solution available from Access Innovations, a company run by my friends and colleagues Marje Hlava and Jay Ven Eman. Their system generates ANSI standard thesauri and word lists, which is the sort of stuff that revs my engine.

Endeca has been the pioneer in the enterprise sector for “guided navigation”, which is a synonym in my mind for faceted search. Federated search gets into the functions that I associate with Bright Planet, Deep Web Technologies, and Vivisimo, among others. I know that shoving large volumes of data through systems that both facetize and federate content is computationally intensive. Consequently, some organizations are not able to put the plumbing in place to make these computationally intensive systems hum like my grandmother’s sewing machine.
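For readers who have not seen faceted search from the inside, the “breakdown of search results” is conceptually simple: per-attribute counts over the current result set, recomputed at each drill-down. Here is a toy Python sketch with invented data; it illustrates the general technique, not Vyre’s or Endeca’s implementation.

```python
from collections import Counter

# Toy document set; real systems compute facets over millions of records.
docs = [
    {"title": "Q3 brand report", "type": "report", "region": "EU"},
    {"title": "Campaign assets",  "type": "asset",  "region": "EU"},
    {"title": "Q3 media plan",    "type": "report", "region": "US"},
]

def facet_counts(results, field):
    """Breakdown by one attribute -- the numbers shown next to each facet."""
    return Counter(d[field] for d in results)

def drill_down(results, field, value):
    """Narrow the result set when the user clicks a facet value."""
    return [d for d in results if d[field] == value]

hits = [d for d in docs if "Q3" in d["title"]]   # keyword step
print(facet_counts(hits, "region"))              # Counter({'EU': 1, 'US': 1})
print(drill_down(hits, "region", "EU"))          # one document remains
```

The computational cost comes from doing this counting over every attribute of every matching document on every query, which is why the plumbing matters.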

If you are in the market for a CMS and asset management company’s enterprise search solution, give the company’s product a test drive. You can buy a report from UK Data about this company here. I don’t have solid pricing data. My notes to myself record the phrase, “Sensible pricing.” I noted that the typical cost for the system begins at about $25,000. Check with the company for current license fees.

Stephen Arnold, March 6, 2009

Lazarus, Azure Chip Consultants, and Search

January 8, 2010

A person called me today to tell me that a consulting firm is not accepting my statement “Search is dead.” Then I received a spam email that said, “Search is back.” I thought, “Yo, Lazarus. There be lots of dead search vendors out there. Example: Convera.”

Who reports that search has risen? An azure chip consultant! Here’s what raced through my addled goose brain as I pondered the call and the “search is back” T-shirt slogan:

In 2006, I was sitting on a pile of research about the search market sector. The data I collected included:

  • Interviews with various procurement officers, search system managers, vendors, and financial analysts
  • My own profiles of about 36 vendors of enterprise search systems plus the automated content files I generate using the Overflight system. A small scale version is available as a demo on ArnoldIT.com
  • Information I had from my work as a systems engineering and technical advisor to several governments and their search system procurement teams
  • My own experience licensing, testing, and evaluating search systems for clients. (I started doing this work after we created The Point (Top 5% of the Internet) in 1993 and sold it to Lycos, a unit of CMGI. I figured I should look into what Lycos was doing so I could speak with authority about its differences from BRS/Search, InQuire, Dialog (RECON), and IBM STAIRS III. I had familiarity with most of these systems through various projects in my pre-Point life.)
  • My Google research funded by the now-defunct Bear Stearns outfit and a couple of other well heeled organizations.

What was clear in 2006 was the following:

First, most of the search system vendors shared quite a bit of similarity. Despite the marketing baloney, the key differentiators among the flagship systems in 2006 were minor. Examples range from their basic architecture to their use of stemming to the methods of updating indexes. There were innovators, and I pointed out these companies in my talks and various writings, including the three editions of the Enterprise Search Report I wrote before I fell ill in February 2007 and quit doing that big encyclopedia-type publication. These similarities made it very clear to me that innovation for enterprise search was shifting from the plain old keyword indexing of structured records available since the advent of RECON and STAIRS to a more freeform approach with generally lousy relevance.
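To make the stemming point concrete, consider how two different suffix-stripping policies index the same vocabulary under different terms, which in turn changes what a query matches. The rules below are deliberate oversimplifications I invented for illustration, not any vendor’s actual stemmer.

```python
# Toy illustration of why stemming choices differentiate search engines.
# These suffix rules are oversimplifications, not any vendor's stemmer.

def aggressive_stem(token):
    for suffix in ("izations", "ization", "izing", "ized", "ing", "ed", "s"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

def conservative_stem(token):
    # Only strips a plural "s"; everything else is left alone.
    return token[:-1] if token.endswith("s") and len(token) > 3 else token

for word in ("organizations", "organized", "organs"):
    print(word, "->", aggressive_stem(word), "/", conservative_stem(word))

# The aggressive rules conflate "organizations", "organized", and even
# "organs" under the stem "organ" (recall up, precision down); the
# conservative rules keep them apart (precision up, recall down).
```

Multiply this one small policy choice across tokenization, stopwords, and index refresh, and two “similar” engines return visibly different result lists for the same query.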

Get information access wrong, and some folks may find a new career. Source: http://www.seeing-stars.com/Images/ScenesFromMovies/AmericanBeautyMrSmiley%28BIG%29.JPG

Second, the more innovative vendors were making an effort in 2006 to take a document and provide some sort of context for it. Without a human indexer to assign a classification code to a document that is about marketing but does not contain the word “marketing”, this was rocket science. But when I examined these systems, there were two basic approaches, which are still around today. The first was to use statistical methods to put documents together and make inferences; the other was a variation on human indexing but without humans doing most of the work. The idea was that a word list would contain synonyms. There were promising demonstrations of software methods that could “read” a document, but these were piggy and of use only where money was no object.
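A compressed sketch of the two approaches may help. The statistical route infers a topic from vocabulary overlap; the word-list route assigns the label when any synonym of the controlled term appears. The term lists, data, and scoring here are invented for illustration, not any vendor’s method.

```python
# Toy contrast of the two 2006-era classification approaches described above.

MARKETING_PROFILE = {"campaign", "brand", "audience", "conversion"}
# ^ in real statistical systems, learned from example documents
MARKETING_SYNONYMS = {"campaign", "branding", "promotion", "advertising"}
# ^ in word-list systems, maintained by human editors

doc = "The spring campaign lifted brand awareness with the target audience."
tokens = set(doc.lower().replace(".", "").split())

# Approach 1 (statistical): score against the topic vocabulary and infer.
score = len(tokens & MARKETING_PROFILE) / len(MARKETING_PROFILE)
print("statistical score for 'marketing':", score)   # 0.75 -> tag as marketing

# Approach 2 (word list): the label fires on any synonym, even though the
# literal word "marketing" never appears in the document.
print("word-list match:", bool(tokens & MARKETING_SYNONYMS))  # True
```

Both routes tag the document “marketing” without the word being present; the expensive part in production is doing this at scale with acceptable accuracy, which is where the “money was no object” systems earned their reputation.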

Third, the Google approach, which used social methods—that is, a human clicking on a link—was evident but not migrating to the enterprise world. Google was new, and to make its 2006 method hum, lots of clicks were needed. In the enterprise, most documents never get clicked, so the 2006 Google method was truly lousy. Google has made improvements, mostly by implementing the older search methods, not by pushing the envelope as it has been doing with its Web search and dataspace efforts.

Fourth, most of the search vendors were trying like the dickens to get out of a “one size fits all” approach to enterprise search. Companies making sales were focusing on a specific niche or problem and selling a package of search and content processing that solved one problem. The failure of the boil-the-ocean approach was evident because user satisfaction data from my research funded by a government agency and other clients revealed that about two thirds of the users of an enterprise search system were dissatisfied or very dissatisfied with that search system. The solution, then, was to focus. My exemplary case was the use of the Endeca technology to allow Fidelity UK sales professionals to increase their productivity with content pushed to them by the Endeca system. The idea was that a broker could click on a link and the search results were displayed. No searching required. ClearForest got in the game by analyzing dealer warranty repair comments. Endeca and ClearForest were harbingers of focus. ClearForest is owned by Thomson Reuters and is in the open source software game too.

When I wrote the article in Online Magazine for Barbara Quint, one of my favorite editors, I explained these points in more detail. But it was clear that the financial pressures on Convera, for example, and the difficulty some of the more promising vendors like Entopia were having made the thin edge of survival glint in my desk lamp’s light. Autonomy by 2006 had shifted from search and organic growth to inorganic growth fueled by acquisitions that were adjacent to search.

Google Pressures eCommerce Search Vendors

November 6, 2009

Companies like Dieselpoint, Endeca, and Omniture Mercado face a new competitor. The Google has, according to Internet News, “launched Commerce Search, a cloud-based enterprise search application for e-tailers that promises to improve sales conversion rates and simplify the online shopping experience for their customers.” For me the most significant passage in the write up was:

Commerce Search not only integrates the data submitted to Google’s Product Center and Merchant Center but also ties into its popular Google Analytics application, giving e-tailers an opportunity to not only track customer behavior but the effectiveness of the customized search application. Once an e-tailer has decided to give Commerce Search a shot, it uploads an API with all its product catalog, descriptions and customization requirements and then Google shoots back an API with those specifications that’s installed on the Web site. Google also offers a marketing and administration consultation to highlight a particular brand of camera or T-shirt that the retailer wants to prominently place on its now customized search results. It also gives e-tailers full control to create their own merchandising rules so that it can, for example, always display Canon cameras at the top of its digital camera search results or list its latest seasonal items by descending price order.

Google’s technical investments in its programmable search engine, context server, and shopping cart service chug along within this new service. Google’s system promises to be fast. Most online shopping services are sluggish. Google knows how to deliver high speed performance. Combining Google’s semantic wizardry with low latency results puts some of the leading eCommerce vendors in a technology arm lock.

Some eCommerce vendors have relied on Intel to provide faster CPUs to add vigor to older eCommerce architectures. There are some speed gains, but Google delivers speed plus important semantic enhancements that offer other performance benefits. One example is content processing. Once changes are pushed to Google or spidered by Google from content exposed to Google, the indexes update quickly. Instead of asking a licensee of a traditional eCommerce system to throw hardware at a performance bottleneck or pay for special system tuning, the Google just delivers speed for structured content processed from the Google platform.
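The speed advantage for structured content boils down to incremental index updates: when one product record changes, only that record’s postings are touched, so no full re-crawl or hardware heroics are required. The toy inverted-index sketch below illustrates the general technique; it is not a description of Google’s actual mechanism.

```python
# Toy incremental index: updating one product touches only its own postings,
# which is why pushed structured data can appear in results quickly.
# Illustrates the general technique, not Google's implementation.

index = {}      # term -> set of product ids
catalog = {}    # product id -> current record

def upsert(product_id, title, price):
    # Remove the old postings for this product, if any.
    old = catalog.get(product_id)
    if old:
        for term in old["title"].lower().split():
            index[term].discard(product_id)
    catalog[product_id] = {"title": title, "price": price}
    # Add fresh postings; no other product's entries are touched.
    for term in title.lower().split():
        index.setdefault(term, set()).add(product_id)

upsert("sku-1", "Canon digital camera", 299)
upsert("sku-2", "Nikon digital camera", 349)
upsert("sku-1", "Canon digital camera kit", 279)  # a change: cheap to apply
print(sorted(index["camera"]))    # ['sku-1', 'sku-2']
print(catalog["sku-1"]["price"])  # 279
```

Contrast this with re-tuning or re-provisioning an entire legacy eCommerce stack to absorb the same catalog change, and the “throw hardware at it” comparison in the paragraph above becomes clearer.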

In my opinion, competitors will point out that Google is inexperienced in eCommerce. Google may appear to be a beginner in this important search sector. Looking more deeply into the engineering resources responsible for Commerce Search one finds that Google has depth. I hate to keep mentioning folks like Ramanathan Guha, but he is one touchstone whose deep commercial experience has influenced this Google product.

How will competitors like Dieselpoint, Endeca, and Omniture Mercado respond? The first step will be to downplay the importance of this Google initiative. Next I expect to learn that Microsoft Fast ESP has a better, faster, and cheaper eCommerce solution that plays well with SharePoint and Microsoft’s own commerce server technology. Finally, search leaders such as Autonomy will find a marketing angle to leave Google in the shadow of clever positioning. But within a year, my hunch is that Google’s Commerce Search will have helped reshape the landscape for eCommerce search. Google may not be perfect, but its products are often good enough, fast, and much loved by those who cannot imagine life without Google.

Stephen Arnold, November 6, 2009

I want to disclose to the Department of the Navy that none of these vendors offered me so much as a how de doo to write this article.

Microsoft Casts Doubt on Google Apps Data

November 4, 2009

I expected the Doubting Thomas approach to Google, and DT arrived. I read “Microsoft Questions Google Apps’ Momentum, Touts 1M Online Business Suite Customers” and learned that Microsoft thinks that the Google is fudging its numbers. Serious stuff in my opinion. As far as I know, Google has not made any comments about the 100 million SharePoint licenses. The passage in the Computerworld article that resonated for me was:

“I [a Microsoft executive] have a really hard time understanding their [Google’s] numbers,” he said. “You simply don’t know what their paying user numbers are. Analysts predict that they are pretty small. It’s hard for us to really know.” Asked what he thought of Google’s high-profile win of the City of Los Angeles for a 5-year, $7.25 million deal to use Google Apps, he said, “I feel like we are winning lots and lots of deals. We can’t spend too much time worrying about what they [Google] are doing. I feel good about how much progress we’ve made in a short period of time.”

Coincident with the Doubting Thomas play, Microsoft chopped its own prices for cloud based services. What lingers is the idea that Google is not telling the truth about its market penetration. The price cut suggests to me that whatever the Google numbers are, those numbers have forced Microsoft to lower its prices. Interesting.

Stephen Arnold, November 4, 2009

I want to disclose to the FHA that no one gave me one red cent to write this blog post.

Autonomy Enters Search Engine Marketing

August 4, 2009

Search has become a commodity. Every SharePoint installation includes search. Content management systems have been struggling to bridge the gaps that exist among Web page output, eDiscovery, and repurposing of information for print. Search vendors have had to scramble to make sales as open source solutions like Lemur Consulting’s FLAX, lower cost solutions such as Gaviri, and marketing-centric solutions such as those introduced by Attensity have flowed into the marketplace.

Autonomy has been among the most agile search vendors I track. The company diversified into rich media before most organizations knew that video could be digitized. Then the company hopped into print-to-digital services with some key acquisitions. Most recently, Autonomy added to its Zantaz services with its acquisition of Interwoven, a content management company.

The most recent Autonomy innovation is a landing page service. The idea is that an organization buys a Google AdWord. The user clicking on the AdWord is sent to the page that provides information referenced or promised in the ad. That page is a landing page, and most content management systems don’t produce these without cartwheels, fireworks, and a brass band. In short, Autonomy’s announcement about a “hosted Web landing page” indicates that Autonomy is moving quickly again. The key for me is the word “hosted”. Autonomy is becoming a cloud-services company. Maybe the word “utility” is appropriate? I think landing pages are a new service for a search vendor. This is not a bad thing, just not a search thing. In my opinion, the market for one-size-fits-all search seems to be softening.

Stephen Arnold, August 4, 2009

Microsoft and Online Spending

May 22, 2009

When you are north of $65 billion a year in revenue, why explain? Mary Jo Foley’s “Microsoft’s Ozzie Defends Microsoft’s Aggressive Online Spending” surprised me for two reasons. First, the old saw “never complain, never explain” seemed to be ignored. And, second, Microsoft has cash, is floating a financial instrument to raise more cash, and the Windows 7 cash dump truck will be arriving in the near future. So, spend what you want seems like a reasonable approach.

Read Ms. Foley’s article here and make up your own mind. She reported:

Microsoft’s growing family of enterprise-focused services — Exchange Online, SharePoint Online, etc. — have taught the company a lot about cloud requirements. Its investments in consumer services  have taught the company important lessons about scale, Ozzie said. The underlying infrastructure Microsoft has built to deploy and run its consumer services is now being extended to support other services throughout the company, he said. Ozzie pointed to “Cosmos,” the high-scale file system that is part of Microsoft’s Azure cloud platform, as ultimately supporting and aiding every consumer, enterprise and developer property at Microsoft. He noted that the management systems for Microsoft’s current and existing cloud services are all derived from the learnings Microsoft has gleaned from managing its consumer online services. Ozzie said he believed one of Microsoft’s main advantages vis-a-vis its cloud competitors is “the fact we build both platforms and applications.”

This sounds reasonable to me. The one thought that struck me was that Microsoft’s spending in the last few years has not brought the financial home run that I had anticipated. For example, the purchase of Fast Search & Transfer and Powerset now seem in retrospect to be interesting ideas but not yet ready to deliver megabucks. In fact, if the chatter I heard in San Francisco last week is anywhere near accurate, Microsoft may be bundling Fast ESP with SharePoint for certain clients to prevent a third party vendor from getting its snoot in the cubicles at a Microsoft-centric organization. Second, the money poured into Vista was not exactly wasted, but the grousing and negative vibes rightly or wrongly flowing through the Web postings did not put money in the bank. I don’t pay much attention to consumer products like the Xbox and the Zune, but so far neither has been the Gold Rush for which I had hoped.

Nevertheless, Microsoft has money, and in my opinion, it can spend it any way it wants to spend it. The “defensive” spin tells me that maybe there is some force field operating when Microsoft gets in front of investors and bankers. What do these audiences know or do that makes a technologist adopt a tone that connotes uncertainty and doubt?

On a related note, a reader groused about my pointing to information about SharePoint that I obtained from Mary Jo Foley’s article about search for the new SharePoint. I point to stories. I don’t create the stories. I encouraged the reader to take his grousing to my source. I enjoyed his commentary, but I can’t do much to infuse accuracy in the stories which I read and upon which I comment. When I checked out his complaint, Ms. Foley seemed to be recycling what Microsoft has said about enterprise search. I find it quite common that “old” news is included in “new” news releases. The reason, I believe, is comprehensiveness, not nefarious behavior.

I like Ms. Foley’s work, and I will continue to point to her write ups, which are often better than the drivel I find in other free Web sources.

Stephen Arnold, May 21, 2009

Content Management: Modern Mastodon in a Tar Pit, Part One

April 17, 2009

Editor’s Note: This is a discussion of the reasons why CMS continues to thrive despite the lousy financial climate. The spark for this essay was the report of strong CMS vendor revenues written by an azure chip consulting firm; that is, a high profile outfit a step or two below the Bains, McKinseys, and BCGs of this world.

Part 1: The Tar Pit and Mastodon Metaphor or You Are Stuck

PCWorld reported “Web Content Management Staying Strong in Recession” here. The author, Chris Kanaracus, wrote:

While IT managers are looking to cut costs during the recession, most aren’t looking for savings in Web content management, according to a recent Forrester Research study. Seventy-two percent of the survey’s 261 respondents said they planned to increase WCM deployments or usage this year, even as many also expressed dissatisfaction with how their projects have turned out. Nineteen percent said their implementations would remain the same, and just 3 percent planned to cut back.

When consulting firms generate data, I try to think about the data in the context of my experience. In general, pondering the boundaries of “statistically valid data from a consulting firm” with the wounds and bruises this addled goose gets in client work is an enjoyable exercise.

These data sort of make sense, but I think there are other factors that make CMS one of the alleged bright spots in the otherwise murky financial heavens.

La Brea, Tar, and Stuck Trapped Creatures

I remember the first time I visited the La Brea tar pits in Los Angeles. I was surprised. I had seen well heads chugging away on the drive to a client meeting in Long Beach in the early 1970s, but I did not know there was a tar pit amidst the choked streets of the crown jewel in America’s golden west. It’s there, and I have an image of a big elephant (Mammut americanum for the detail oriented reader) stuck in the tar. Good news for those who study the bones of extinct animals. Bad news for the elephant.

Is this a CMS vendor snagged in litigation or the hapless CMS licensee after the installation of a CMS system?

I had two separate conversations about CMS, the breezy acronym for content management systems. I can’t recall the first time I discovered that species of mastodon software, but I was familiar with the tar pits of content in organizations. Let’s set the stage, er, prep the tar pit.

Organizational Writing: An Oxymoron

Organizations produce quite a bit of information. The vast majority of this “stuff” (content objects for the detail oriented reader) is in a constant state of churn. Think of the memos, letters, voice mails, etc. like molecules in a fast-flowing river in New Jersey. The environment is fraught with pollutants, regulators, professional garbage collection managers, and the other elements of modern civilization.

The authors of these information payloads are writing with a purpose; that is, instrumental writing. I have not encountered too many sonnets, poems, or novels in the organizational information I have had the pleasure of indexing since 1971. In the studies I worked on first at Halliburton Nuclear Utility Services and then at Booz, Allen & Hamilton, I learned that most organizational writing is not read by very many people. A big fat report on nuclear power plants had many contributors and reviewers, but most of these people focused on a particular technical aspect of a nuclear power generation system, not the big fat book. I edited the proceedings of a nuclear conference in 1972, and discovered that papers often had six or more authors. When I followed up with the “lead author” about a missing figure or an error in a wild and crazy equation, I learned that the “lead author” had zero clue about the information in the particular paragraph to which I referred.

Flash forward. Same situation today, just lots more digital content. Instrumental writing, not much accountability, and general cluelessness about the contents of a particular paragraph, figure, chart, whatever in a document.

Organizational writing is a hotchpotch of individuals with different capabilities and methods of expressing themselves. Consider an engineer or mathematician. Writing is not usually a core competency, but there are exceptions. In technical fields, there will be a large number of people who are terse to the point of being incomprehensible and a couple of folks who crank out reams of information. In an organization, volume may not correlate with “right” or “important”. A variation of this situation crops up in sales. A sales report often is structured, particularly if the company has licensed a product to force each salesperson to provide a name, address, phone number, and comments about a “contact”. The idea is that getting basic information is pretty helpful if the salesperson quits or simply refuses to fill in the blanks. Often the salesperson who won’t play ball is the guy or gal who nails a multi million dollar deal. The salesperson figures, “Someone will chase up the details.” The guy or gal is right. Distinct content challenges arise in the legal department. Customer support has its writing preferences, sometimes compressed to methods that make the customer quit calling.

Why CMS for Text?

The Web’s popularization as cheap marketing created a demand for software that would provide writing training wheels to those in an organization who had to contribute information to a Web site. The Web site has gained importance with each passing year since 1993 when hyperlinking poked its nose from the deep recesses of Standard Generalized Markup Language.

Customer relationship management systems really did not support authoring, editorial review, version control, and the other bits and pieces of content production. Enterprise resource planning systems manage back office and nitty gritty warehouse activities. Web content is not a core competency of these labyrinthine systems. Content systems mandated for regulatory compliance are designed to pinpoint which supplier delivered an Inconel pipe that cracked, what inspector looked at the installation, what quality assurance engineer checked the work, and what tech did the weld when the pipe was installed. Useful for compliance, but not what the Web marketing department ordered. Until recently, enterprise publishing systems were generally confined to the graphics department or the group that churned out proposals and specifications. The Web content was an aberrant content type.

Enter content management.

I recall the first system that I looked at closely was called NCompass. When I got a demo in late 1999, I recall vividly that it crashed in the brightly lit, very cheerful exhibition stand in San Jose. Reboot. Demo another function. Crash. Repeat. Microsoft acquired this puppy and integrated it into SharePoint. SharePoint has grown over time like a snowball. Here’s a diagram of the SharePoint system from www.JoiningDots.net:

image

SharePoint. Simplicity itself. Source: http://www.joiningdots.net/downloads/SharePoint_History.jpg

A Digital Oklahoma Land Rush

By 2001, CMS was a booming industry. In some ways, it reminded me of the case study I wrote for a client about the early days of the automobile industry. There were many small companies which over time would give way to a handful of major players. Today CMS has reached an interesting point. The auto style aggregation has not worked out exactly like the auto industry case I researched. Before the collapse of the US auto industry in 2008, automobile manufacturing had fractured and globalized. There were holding companies making more vehicles than the US population would buy from American firms. There were vast interconnected networks of supplier subsystems, and below these, huge pipelines into more fundamental industrial sectors like chemicals, steel, and rubber.

Microsoft and Proprietary Chips

April 10, 2009

Stacey Higginbotham’s “Is Microsoft Turning Away from Commodity Server?” here reminded me of a client study I did five or six years ago. Sony was working on a proprietary chip for the PS3. IBM was involved, and I documented the graphics method which built upon IBM technology. In short order, Microsoft and Nintendo signed up with IBM to use its generic chip design for their next generation game devices. Sony ran into three problems. First, costs went through the roof. Sony did not have a core competency in chip design and fabrication, and that was evident even in the sketchy technical information my Overflight service dug out.

Second, the yield on chips is a tricky issue. Without getting into why a yield goes wrong, I focused on the two key factors: time and cost overruns. The costs were brutal, eventually forcing Sony to change its fabrication plans. The time is a matter of public record. Microsoft beat the PS3 to market, and Sony is starting to recover now. We’re talking years of lost revenue, not days or weeks or months.

Third, the developers were stuck in limbo. With new chips, new programming tools and procedures were needed. Without a flow of chips, developers were flying blind. The problem became critical and when the PS3 launched, the grousing of developers about the complexity of programming the new chip joined with complaints from fanboys that games were in short supply.

Compatibility, availability, and affordability joined the chorus.

Ms. Higginbotham’s article summarized what is known about Microsoft’s alleged interest in creating its own chips for its own servers. The motivator for Microsoft, if I read Ms. Higginbotham’s article correctly, is related to performance. One way to get performance is to get zippier hardware. With faster CPUs and maybe other custom chips, the performance of Microsoft software would improve more than it would by using Intel or AMD CPUs. (Google uses both.)

For me, the most interesting point in her write up was:

The issue of getting software performance to scale linearly with the addition of more cores has become a vexing problem. Plus, as data center operators look for better application performance without expending as many watts, they are experimenting with different kinds of processors that may be better-suited to a particular task, such as using graphics processors for Monte Carlo simulations.

She did not draw any parallels with the Sony chip play. I will:

  1. The Sony Ken Kutaragi chip play provides a good lesson about the risks of rolling your own chips. Without a core competency across multiple disciplines, I think the chance for a misstep is high. Maybe Microsoft is just researching this topic? That’s prudent. Jumping into a proprietary chip may come, but some ramp up may be needed.
  2. Google does many proprietary things. The performance of Google’s system is not the result of a crash project. Time is of the essence because the GOOG is gaining momentum, not losing it. Therefore, the Sony “time problem” with regard to the Xbox may translate into years of lost opportunity. Chip designs are running into fundamental laws of physics, so software solutions may reduce the development time.
  3. The performance problem will not be resolved by faster hardware. Multiple changes are needed across the computing system. There are programming slow downs because tools have to generate zippy code for high speed devices. Most of the slow downs are not caused by calculations. Moving data is the problem. Inefficient designs and code combine with known bottlenecks to choke high performance systems, including those at Google. As the volume of data increases, the plumbing has to be scalable, stable, and dirt cheap. Performance problems are complex and expensive to resolve. Fixes often don’t work, which makes the system slower. Nice, right? Need more data? Ask a SharePoint administrator about the cost and payoff of her last SharePoint scaling exercise.

My view is that one hire does not a chip fab make. Microsoft’s analysts have ample data to understand the costs of custom chip design and fabrication. Google requires immediate attention and rapid, purposeful progress on the part of Microsoft’s engineers. Time is the real enemy now. Without a meaningful competitor, Google seems to enjoy large degrees of freedom.

Stephen Arnold, April 10, 2009

Fast Forward "Machine to Organism" Baffler

March 31, 2009

I heard that the Fast Forward conference is no more. My understanding is that it will be rolled into a SharePoint conference, which seems to fit the road map for the Fast technology. I did a routine check of the Fast Forward Web log for information (no joy) and read “From Machine To Organism” here.

The article struck me as unrelated to Fast Search & Transfer’s technology. In fact, the piece seemed to describe SAP’s efforts to regain lost momentum. The original SAP world is gone; replacing it is a world populated by “homo zappiens”. The wrap up of the argument is that a different type of enterprise software model is needed.

I think I agree, but here’s the disconnect for me. Microsoft purchased Fast Search & Transfer in April 2008 for about $1.2 billion. The company released some Web parts late in 2008. At the Fast Forward conference, Microsoft suggested that Fast Search’s technology would be integrated with other Microsoft products according to a road map.

In the new world of homo zappiens (great phrase!), the emerging paradigm seems to be more along the Google model than the SharePoint–Fast ESP model. Google, by definition, is an integrated, “as is” infrastructure. The Microsoft cloud play is a work under construction. The road map focused on integrating SharePoint and Fast ESP, both of which are mostly on premises installations.

If SAP is in trouble, Microsoft should be thrilled that its attempt to buy the company flopped. My thought is that the road map for Fast Search seems to be closer to the SAP model than the Google model. If I am right, this “Machine to Organism” analysis is off by about nine degrees which could spell disaster for Microsoft in my opinion.

Stephen Arnold, March 31, 2009
