Google Alert Change

December 14, 2008

My Google Alert changed today. I get an alert for the phrase “enterprise search.” Yesterday I received text only. Today I received text and embedded pictures. Here’s a screenshot of my improved Google Alert. I prefer text in alerts because the BlackBerry I have does a lousy job with HTML mail.

[Screenshot: Google Alert with embedded pictures]

Has anyone else noticed this change? Am I late to the party? Let me know.

Stephen Arnold, December 14, 2008

Autonomy: The Next Big Thing

December 14, 2008

I enjoy the hate mail I get when I write about Autonomy’s news announcements. Some of my three or four readers think that I write these items for Autonomy. Wrong. I am reporting information that my trusty newsreader delivers to me. Here’s a gem that will get the anti-Autonomy crowd revved on a Sunday morning. The article appeared on SmartBrief.com as news. The headline was an attention grabber: “Autonomy at the Cutting Edge of New Multi-Trillion dollar Sector According to Head of Gartner Research.” You can read it here. The URL is one of those wacky jobs that can fail to resolve.

The core of the story for me is that Gartner has identified a “multi trillion dollar sector.” That has to be good news to those who pay Gartner to make forecasts about markets. Search and content processing has been chugging along in the $1.3 to $3.0 billion range if one ignores the aberration that is Google. I find it hard to believe that Gartner’s financial forecasts can be spot on, but who knows? In case you want to know what a trillion is, it is one followed by a dozen zeros. The Gartner fellow with the sharp and optimistic pencil is identified as Peter Sondergaard, Senior Vice President, Gartner Research. The source, according to the news release, is an interview with an outfit called Business Spectator. I wonder if a few extra zeros were added as Mr. Sondergaard’s pronouncement was recorded? So what does this forecast have to do with Autonomy? Autonomy said in its input to SmartBrief:

Autonomy Corporation plc, a global leader in infrastructure software for the enterprise, today announced that its vision of searching and analyzing structured and unstructured data has now been validated as the next big thing in business IT. According to an interview with Business Spectator, Peter Sondergaard, Senior Vice President, Gartner Research, predicts that the next quantum leap in productivity will come from the use of IT systems that analyze structured and unstructured data. Sondergaard says that Autonomy is at the cutting edge of the new search technology, a sector in the IT industry that will ultimately earn multi trillion dollar revenues.

The story appeared on PRNewswire and on one of the Thomson Reuters’ services. With economies tanking, I am delighted to know that the sector in which I work is slated to become a multi-trillion-dollar business. I hope I live long enough. Since laughter is a medicine that extends one’s life, I look forward to more Gartner forecasts and to Autonomy’s riding the crest of this predicted market boom.

Stephen Arnold, December 15, 2008

Smart Folks Can Be Really Dumb, WSJ Says

December 14, 2008

Jason Zweig writes for The Intelligent Investor, a title that struck me as ironic in the context of the current economic downturn. I don’t have an online link to this column because I refuse to use the Wall Street Journal’s online service. The dead tree version is on my desk in front of me, and you can find the full text of “How Bernie Madoff Made Smart Folks Look Dumb” on page B1 at the foot of the page on December 13, 2008. Just look for the drawing of Mad Magazine’s mascot, and you will be ready to read the full text of Mr. Zweig’s column. For me, the column said, “Smart people were not smart.” The fact that wealthy people who are supposed to be more intelligent than an addled goose fell for a Ponzi scheme is amusing. However, as I read the column, I thought about search and content processing, not the self-satisfied millionaires who were snookered.

Let me run down my thoughts:

  1. The scam worked because the alleged Ponzi artist emphasized secrecy. Quite a few vendors take this approach. I can’t name names, but in Washington, DC, last week I heard a vendor emphasize how easy it was to use a certain system. The system, of course, was super secret, but not to worry. It’s easy because of the magic in the code. I think I saw people in the presentation quiver with excitement. Secrecy is a variant of catnip.
  2. Mr. Madoff played the “exclusivity” card. The idea is that not everyone can get the inside track on a deal. In London, at the International Online Show on December 4, 2008, I heard this pitch in a trade show booth. The angle was that this particular content processing tool was available only to customers with the knowledge and insight to use a special approach. I saw folks eagerly crowding around a tiny laptop to get close to the exclusive technology.

Mr. Zweig taps an academic for an explanation of how baloney suckers supposedly intelligent people into a scam. The phrase “sophisticated investor” sums up nicely how one’s perception of one’s own intelligence combines to make a Ponzi scheme work. The black art of Mr. Madoff can be found in the search and content processing sector as well. I suppose claims like “Enterprise 2.0,” “search is simple,” and “we have magic technology” won’t catch the attention of governmental authorities. In a way it is reassuring that some basic sales techniques work better than some enterprise software systems. On the other hand, is it inevitable that the methods of financial charlatans work in the information services business? Thoughts?

Stephen Arnold, December 14, 2008

Expert System’s COGITO Answers

December 12, 2008

Expert System has launched COGITO Answers, which streamlines search and provides customer assistance on Web sites, e-mail and mobile interfaces such as cell phones and PDAs while creating a company knowledge base. The platform allows users to search across multiple resources with a handy twist: it uses semantic analysis to absorb and understand a customer’s lingo, analyzing the meaning of the text to process search results rather than just matching keywords. It interprets word usage in context. The program also tracks customer interactions and stores all requests so the company can anticipate client needs and questions, thus cutting down response time and increasing accuracy. You can get more information by e-mailing answers@expertsystem.net.
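The difference between plain keyword matching and the meaning-aware matching described above can be illustrated with a toy sketch. This is my own illustration, not Expert System’s actual technology; the synonym table and function names are made up for the example.

```python
# Illustrative sketch only -- not Expert System's actual method.
# Contrasts raw keyword matching with matching that expands the
# query using a (hypothetical) synonym table before comparing.

SYNONYMS = {"car": {"car", "auto", "automobile", "vehicle"}}

def keyword_match(query, document):
    """True only if the literal query term appears in the document."""
    return query.lower() in document.lower().split()

def expanded_match(query, document):
    """True if the query term or any of its synonyms appears."""
    terms = SYNONYMS.get(query.lower(), {query.lower()})
    doc_words = set(document.lower().split())
    return bool(terms & doc_words)

doc = "My automobile needs service"
print(keyword_match("car", doc))   # False -- literal keyword misses
print(expanded_match("car", doc))  # True  -- synonym-aware match hits
```

A real semantic system goes far beyond a synonym table, of course, but the sketch shows why meaning-based matching finds documents that keyword matching skips.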

Jessica Bratcher, December 12, 2008

Microsoft Search Head Speaks

December 10, 2008

I enjoyed Microsoft’s interview with Microsoft’s new head of search, Qi Lu. You can find the information here. The questions were not hardballs but gently lofted Nerf balls. I found the answers intriguing. Before I read the interview, I noted with interest this story about Microsoft’s and Google’s cutting back on big capex spending for data centers. You can read that story here. No big surprise, but in Microsoft’s own interview, its own head of search said:

Steve and I first met last September, in a hotel in San Jose, California. We spent almost half a day talking. We talked about the competitive landscape, about the possibility to really innovate and take the user experience [of Microsoft’s search capabilities] to the next level, and about creating a more competitive space, particularly in the search space. We all believe that it’s better for everybody involved when we have a healthy, more competitive environment. Two things he said really stood out. First was the level of commitment on investment. Steve made it very clear how he views that as critical for the long-term future of Microsoft, and his strong commitment to invest in R&D resources is very, very important to me.

I found this somewhat jarring. But that’s a nit. Other points that I noted were:

  • “I think there is a genuine opportunity to take our search products to the next level.”
  • “…we’re here to win, and my view on this is that to win in the search space, fundamentally you build on the strengths of your product.”
  • “We have a clear path from where we are today, to where we need to be, and to reach that next level we need to keep executing and building winning products.”

As I read these comments, several thoughts went through my mind. First, Yahoo was not able to get an ad platform out the door, so Google ran away with the business. Second, Google has what in my opinion is an insurmountable lead in Web search and is now gunning directly for Microsoft’s liver: the enterprise. Third, I am not sure I can accept the assertion of a “clear path.” Microsoft has taken runs at Web search before. Microsoft bought Fast Search & Transfer, which has a decent Web indexing system and a bit of a problem with Norwegian law enforcement. Microsoft bought Powerset and has not done too much with the semantic system yet. I will keep an open mind, but time is running out and demographics are on the side of Googzilla.

Stephen Arnold, December 11, 2008

Video Horserace

December 10, 2008

Many Web log wizards are chasing the story about Google’s adding magazine content to its burgeoning commercial-database-killing service. Old news from my point of view. The GOOG is becoming the go-to service for research with words. In fact, it is game over for Thomson Reuters, Reed Elsevier, Wolters Kluwer, Ebsco, and others in this professional content sector. Management at these companies can whip up Excel models to prove me wrong, but Google has demographics and infrastructure on its side. Besides, I describe the trajectory of word-centric content in my forthcoming Google and Publishing study.

The real action is in video. Anyone under the age of 17 can explain what’s happening in information. Words are okay, but the future is the rich media experience where words are amplified by music. Do you have a soundtrack for your life? My neighbors’ kids do. Do you make pix and vids on your mobile phone and use these to perform communication functions for which I need pencil and paper? Check out the demographics, and you won’t need the flailing New York Times to remind you of trouble with its report that Harcourt will not publish books for a while.

Consider the comScore video results table here. The handicappers look at the data, which are probably generally on track but off the mark in absolute values, and see that Google is at the top of the table. Google appears to have about 40 percent of the online video traffic measured and analyzed by comScore. So 60 percent of the video traffic is “in play”; that is, other companies can enter the video ballgame and have some room to maneuver. Look at the number two player, Fox Interactive Media. If comScore data are reasonably accurate, Fox has a chokehold on four percent of the videos-viewed market. One of the largest media companies in the world has captured four percent of the market and lags Google’s YouTube by 36 percentage points. The rest of the field performs less well. Hulu.com, the darling of the old TV kingpins, is in the race. Maybe Hulu.com, like a marathon runner getting a marvelous second wind, can close the gap between Hulu and Google, but I think I will give Google the advantage for now.
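A quick arithmetic check of the shares cited above (comScore’s figures are approximate, so treat these as round numbers):

```python
# Approximate comScore shares, percent of online videos viewed.
google_share = 40.0   # Google / YouTube, the leader
fox_share = 4.0       # Fox Interactive Media, the number two player

in_play = 100.0 - google_share   # traffic not held by the leader
gap = google_share - fox_share   # how far Fox trails, in percentage points

print(in_play)  # 60.0 -- the slice still "in play"
print(gap)      # 36.0 -- the YouTube-to-Fox gap
```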

Who cares? The action is text, right?

Wrong, wrong, wrong. YouTube.com could be a major cost sinkhole for Google. If video is expensive for the GOOG, how much of a dent in the bank accounts will video make at outfits like Fox, NBC, and others in the comScore table? Google, for now, seems willing to spend to support YouTube.com. As the credit mistral whips through old media, a willingness to spend may winnow the companies in the comScore league table.

Demographics and time, therefore, may give Google an advantage. As pundits gnash their teeth over Google’s overt moves into commercial textual information, Google management is implementing tactics designed to bleed rich media companies, thus weakening them.

Just as the book publishers and other print gurus rolled over into a position of submission to Googzilla, the same fate awaits rich media. Google Books’ growth is old news. The real action is in rich media. The comScore table makes clear to me that the GOOG is poised to destabilize more 20th-century giants with its 21st-century business model. Now tell me why I am incorrect. Facts, please. Catcalls make the geese honk.

Stephen Arnold, December 9, 2008

Enterprise Translation Systems

December 10, 2008

Update: December 14, 2008 I came across Nice Translator at http://www.nicetranslator.com/

Original Post

I received an email from a colleague who wanted to know about translation systems. I fired back an answer, but I thought you might want to have my short list of vendors to peruse. If you run a search on Google for “enterprise translation software”, you get more than 400,000 hits. That’s not too useful. If you want to experiment with free translation services, download this file.

BASIS Technologies licenses its various translation components to a number of search and content processing vendors; for example, Fast Search & Transfer was a customer. BASIS has been a leader in providing machine translation of Arabic and related languages. The Federal government has been a fan of BASIS’s systems. You can get some very specialized translation and language components; for example, a Japanese address analyzer.

Google provides a pretty good translation system. Right now it is free, which is a plus. Some of the translation systems shoot into six figures pretty quickly if you pack on the language packs and custom tuning. You can use the Google system by navigating here: http://translate.google.com. You can fiddle around and automate translation, but I have heard that Google monitors its translation system, so if you push too much through the system, the Googlers follow up. You can feed it a line of text or a URL.
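If you do automate translation, a throttled loop keeps the volume modest and polite. The `translate` function below is a stand-in, not Google’s actual interface; a real script would issue a web request at that point.

```python
import time

def translate(text, target="en"):
    # Hypothetical stand-in for a call to an online translation service
    # such as translate.google.com. Stubbed so the sketch is self-contained;
    # a real version would make an HTTP request here.
    return f"[{target}] {text}"

def translate_batch(lines, target="en", delay_seconds=1.0):
    """Push lines through the translator one at a time, pausing between
    requests so automated use does not hammer the service."""
    results = []
    for line in lines:
        results.append(translate(line, target=target))
        time.sleep(delay_seconds)
    return results

print(translate_batch(["bonjour", "merci"], delay_seconds=0.0))
```

The pause between requests is the point of the sketch: a batch job that respects the service is less likely to draw the attention of the Googlers mentioned above.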

Language Weaver automates language translation. The company serves digital industries and enterprise customers directly and through strategic partnerships. You can hook this system into other enterprise software. Employees can access documents in their native language. The company recently added new language pairs:

  • Bulgarian to/from English
  • Hebrew to/from English
  • Serbian to/from English
  • Thai to/from English
  • Turkish to/from English.

Systran has been a player in translation for years. You have to buy Systran’s software. The desktop version works quite well. The enterprise system involves some fiddling, but you can automate the translation and perform some useful operations on the machine-generated files. You can get more information about Systran here. Systran is used for the Babel Fish online translation function in AltaVista.com and Yahoo.

How good are these systems?

None of the systems is perfect. None of the systems translates as well as a human with deep knowledge of the language pairs being translated. However, the speed of these systems and their “good enough” translations can cope with the volume of data flowing into an organization. I use several of these systems. I can get a sense of the document and then turn to a native speaker to clarify the translation.

I have unsubstantiated information that suggests Google has been making considerable progress with its online translation system. Because the system is available without charge, Google is becoming the default choice. AltaVista.com still offers an online translation system, but Google has surpassed it in speed and language-pair support. When Google integrates its online translation system with its other enterprise services, I think Google will continue to chew away at the established vendors’ market share. The GOOG, however, seems happy to let customers find its online translation service on their own. The economic downturn may shift the Google into higher gear.

Stephen Arnold, December 10, 2008

Overflight Enhancements

December 9, 2008

ArnoldIT.com’s Google monitoring service made some changes over the last few days. You can access the service by clicking here. Overflight Google allows you to look at the most recent Web log posts on more than 70 Google Web logs. The change is the addition of a link that says, “Show Overflight Update Stream”. When you click it, we display the additions to Google Web logs and put the date on each item. The Update Stream function has been added for each of the Google Web log clusters. If you want to scan headlines, you can browse the most recent items for each of the Google Web logs.

The other enhancement is the addition of entity extraction to the Exalead search system’s index of the corpus of Google Web logs. I am not too happy with the phrase “vertical search”, but I must admit, the Exalead index of more than 70 Web logs is a sharply focused vertical search engine. Here’s a screen shot of the Exalead entity extraction. You can use it to learn the name of the Google customer at Genentech and similar interesting ways to learn about the GOOG.

[Screenshot: Exalead entity extraction]

A happy quack to the Exalead team. More enhancements are coming. If you would like an Overflight service on your Web site, write seaky2000 at Yahoo dot com.

Stephen Arnold, December 9, 2008

Google and Salesforce.com: The Plot Thickens

December 8, 2008

For years, I have heard that Google had an interest in Salesforce.com. In my for-fee briefings, I dig into the Salesforce.com technology for multi-tenant applications. I am certainly no wizard in the magical world of patent documents, but I thought some of the Salesforce.com methods were somewhat elaborate. In those briefings, I commented that Google seemed to have another approach that exploited some of its more unusual inventions. One example is the elaborate system to determine the context of a user. I refer to these as the Guha patent documents. There are others, of course. My point is that Google seemed to be building functions into its broader data management and container operations. (Please, don’t write and ask me for these briefings. I don’t release that type of information into the wild nor in these largely recycled Web log musings.)

I read “Force.com + Google App Engine = Cloud Relationship Management” by Steve Gillmor here and thought, “Yep, the GOOG is on the move.” Mr. Gillmor’s write-up’s lead paragraph hit the nail on the head. He wrote on December 7, 2008, “Salesforce and Google have extended their strategic partnership with Force.com for the Google App Engine.” His article provides useful technical background and some observations about Google’s approach to an “operating system.” You will want to read this article and then save it to your GoogleOnTheMove folder.

My take on this expanded use of the Google App Engine reaches outside the boundaries of Mr. Gillmor’s story. My thoughts are:

  • Google gets Salesforce.com to hook into more Google technology without significant risk or cost. If Salesforce.com’s multi-tenant technology is suitably impressive, Google could increase its involvement with Salesforce.com. If the merged clouds don’t work too well, Google has learned possibly significant information about the Salesforce.com approach.
  • Google receives valuable information about such factors as the efficiency of the Salesforce.com system.
  • Google has a reasonably well-controlled lab test for hooking clouds together. The Salesforce.com cloud is more of a wrapper around the data stores at the core of Salesforce.com. Google is more of a next-generation cloud engineered to minimize certain types of bottlenecks associated with traditional database management systems.

Salesforce.com, on the other hand, has more marketing clout. I have heard that the Google relationship makes otherwise dry explanations of multi-tenant technology more interesting. Who knows? Sales presentations are like magic. What you see is often not what allows the magician to entertain and enthrall the audience.

The big loser in the deal is Microsoft. The Google and Salesforce.com relationship comes at a time when Microsoft is making a push for its Dynamics system. Customers will want to hear about the new Google-Salesforce.com deal. That can complicate some procurements and maybe derail some others.

But the best is that Google still retains its freedom with regard to CRM. Google can still buy Salesforce.com or it can pass. Google can sign similar cloud federation deals with other vendors, or at some point, stitch together existing Google services to offer its own cloud-based CRM solution. To sum up, the Google is once again using its mass to distort the enterprise information market. Google’s “dark matter” lets it exert influence in ways that can be difficult to detect.

Stephen Arnold, December 8, 2008

Google: Putting Capex on a Diet

December 8, 2008

The point to keep in mind is that Google has been working for a decade to build out its infrastructure. One of the benefits of the company’s willingness to tackle hard engineering problems is that Google obtains a better return on its hardware dollar. Data included in my 2005 study The Google Legacy suggested that Google can spend a dollar and get as much as five times the performance that a non-Googlized data center would get. The data appeared in Google technical papers. Some of these papers were written by big Googlers; others by small Googlers. What the performance data share is information that provides a glimpse of the computing capability in Google’s data centers.

If we flip the performance data around, a competitor would have to invest as much as five times what Google spends to get comparable performance. Is Google’s engineering that cost effective? Well, a fivefold performance advantage may be optimistic, but when a data center can cost $600 million, the implications are interesting. A competitor would have to spend more than Google to match Google’s performance on data manipulation, disk reads, and queries per second. Let’s assume that Google gets a 25 percent boost. To match Google’s performance, the competitor would have to have the known bottlenecks under control and then spend another $150 million, which makes a $600 million data center hit the books at $750 million. If you assume Google’s hardware dollar goes twice as far, the $600 million data center will require $1.2 billion in capex to match Google’s capacity. Of course, no one would believe that Google wrings such a performance advantage from its commodity hardware. Competitors prefer branded equipment. What’s in the back of my mind is that Google may be keeping its cards close to its chest.
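Under a simple linear-scaling assumption, the capex comparison works out as follows. The $600 million figure and the performance multiples come from the discussion above; the linear scaling of cost with capacity is my assumption.

```python
# Back-of-the-envelope capex arithmetic. The multiple expresses how much
# further Google's hardware dollar is assumed to go than a rival's.
GOOGLE_DATACENTER_COST = 600e6  # dollars, the figure cited in the post

def competitor_capex(performance_multiple, google_cost=GOOGLE_DATACENTER_COST):
    """Capex a rival needs to match Google, assuming cost scales
    linearly with delivered capacity."""
    return google_cost * performance_multiple

print(competitor_capex(1.25) / 1e6)  # 750.0 -- a 25 percent Google edge
print(competitor_capex(2.0) / 1e9)   # 1.2   -- hardware dollar goes twice as far
print(competitor_capex(5.0) / 1e9)   # 3.0   -- the fivefold upper bound
```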

The Washington Post’s “Google Turns Down Some of NC’s Tax Incentives” explains that the economic downturn, among other factors, may be causing Google to trim its capital expenditures. The Washington Post here quotes a letter Google sent to North Carolina officials. For me the key phrase was:

While Google “remains pleased and committed to its Lenoir operations,” economic conditions make it too difficult to be sure the $600 million data center complex will expand as fast as previously thought, the letter said. “Yet the company fully expects to achieve employment and capital investment levels that are consistent with those that the state announced in 2007,” Charlotte attorney John N. Hunter wrote on behalf of Google.

Google’s capital expenditures are going to become more important. The economic downturn is affecting most organizations, and I think the GOOG may be battening down its hatches. Good Morning Silicon Valley takes this position. You can read its take on the capex shift here.

What happens if Google does trim its capex for data centers? Maybe Microsoft’s new data centers will leapfrog Google’s? Google could find itself on the wrong side of high performance if Microsoft builds its own super performance innovations into its data centers. What the Washington Post makes clear is that Google is slowing down, at least in North Carolina. The Google may be trying to trim costs by rethinking certain investments. This is another sign of Google’s increasing maturity and could indicate the opening that Microsoft needs to hobble the search Googzilla.

Stephen Arnold, December 6, 2008
