Smart Folks Can Be Really Dumb, WSJ Says
December 14, 2008
Jason Zweig writes The Intelligent Investor column, a title that struck me as ironic in the context of the current economic downturn. I don't have an online link to this column because I refuse to use the Wall Street Journal's online service. The dead tree version is on my desk in front of me, and you can find the full text of "How Bernie Madoff Made Smart Folks Look Dumb" at the foot of page B1 in the December 13, 2008, edition. Just look for the drawing of Mad Magazine's mascot, and you will be ready to read the full text of Mr. Zweig's column. For me, the column said, "Smart people were not smart." The fact that wealthy people who are supposed to be more intelligent than an addled goose fell for a Ponzi scheme is amusing. However, as I read the column, I thought about search and content processing, not the self-satisfied millionaires who were snookered.
Let me run down my thoughts:
- The scam worked because the alleged Ponzi artist emphasized secrecy. Quite a few vendors take this approach. I can't name names, but in Washington, DC, last week I heard a vendor emphasize how easy it was to use a certain system. The system, of course, was super secret, but not to worry: it's easy because of the magic in the code. I think I saw people in the presentation quiver with excitement. Secrecy is a variant of catnip.
- Mr. Madoff played the "exclusivity" card. The idea is that not everyone can get the inside track on a deal. In London, at the International Online Show on December 4, 2008, I heard this pitch in a trade show booth. The angle was that this particular content processing tool was available only to customers with the knowledge and insight to use a special approach. I saw folks eagerly crowding around a tiny laptop to get close to the exclusive technology.
Mr. Zweig taps an academic for an explanation of how baloney suckers supposedly intelligent people into a scam. The phrase "sophisticated investor" sums up nicely how one's perception of one's own intelligence combines to make a Ponzi scheme work. The black art of Mr. Madoff can be found in the search and content processing sector as well. I suppose pitches like "Enterprise 2.0," "search is simple," and "we have magic technology" won't catch the attention of governmental authorities. In a way it is reassuring that some basic sales techniques work better than some enterprise software systems. On the other hand, is it inevitable that the methods of financial charlatans work in the information services business? Thoughts?
Stephen Arnold, December 14, 2008
New Publication from Hartman and Boiko
December 14, 2008
Erik Hartman and Bob Boiko have a new venture. The two, spark plugs of the Information Management Framework Association, have made a call for industry contributions to "Information Management – Global Best Practices." They are asking information management professionals for articles about their content management strategies, methodology, technology, and tools. Contributions will address a specific issue, problem, or challenge and present a proven solution, giving readers practical advice and experience. The Beyond Search geese think this is a worthwhile project. The collection is meant for analysts, developers, architects, managers, content creators, and other technical and non-technical staff involved in the creation or maintenance of information systems. The first edition of the "Best Practices" series is scheduled for release in October 2009 and annually thereafter. For more information on the book and how to contribute, navigate to the organization's Web site here or e-mail Dr. Hartman here: erik at hartman-communicatie dot nl
Jessica Bratcher, December 14, 2008
HP ROI: Hundred Million Billion
December 13, 2008
I am confident that the phrase “hundred million billion” is a typographic error. I make them all the time. Have you ever tried to type with webbed feet? Navigate to BMighty.com and take a gander at “How Do You Manage Servers in the Big Leagues: HP Shows One Way” by Lamont Wood here. The article contains some return on investment data, which I find useful. One never knows when an example from an ink company will be needed to add color to a presentation. For me, the interesting data were:
- Before rationalizing down to three data centers, HP had "85 official data centers, plus another 400 unofficial locations that were found to harbor servers," totaling about 700 "data marts."
- The way to squeeze down this astounding number of data centers and marts was to use more hardware and virtualization. The payoff, according to the article, was a 40 percent reduction in the number of servers and a 60 percent cut in energy costs. Sorry, there were no actual numbers in the BMighty.com write-up.
- “Hewlett-Packard cut IT spending from 4 percent of revenue three years ago to less than 2 percent today.”
I want to quote the hundred million billion number because you probably doubt that such a number was used by either HP or Mr. Wood. Here is the statement:
With revenues of slightly more than $100 million billion, this should save it an extra couple of billion yearly. But to get to that level they had to undertake a sort of surge, spending 2 percent of one year’s revenue on hardware upgrades.
The hundred million zillion reminds me of Yahoo’s statement that a Web indexing system would cost $300 million. Pulling numbers from outer space is not something this goose does. One fly in the ointment is that Mr. Wood’s data does not match the information here.
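For what it is worth, the numbers only hang together if the "hundred million billion" is really about $100 billion a year. Here is a minimal back-of-the-envelope sketch, assuming that revenue figure (my assumption, not HP's or Mr. Wood's) and the percentages quoted above:

```python
# Back-of-the-envelope check of the BMighty.com figures. The revenue value is
# an assumption: "hundred million billion" is read here as roughly $100 billion.
revenue = 100e9                        # assumed annual revenue, USD

it_before = 0.04 * revenue             # "4 percent of revenue three years ago"
it_after = 0.02 * revenue              # "less than 2 percent today"
annual_savings = it_before - it_after  # roughly the "extra couple of billion"

surge = 0.02 * revenue                 # "2 percent of one year's revenue" on hardware

print(f"Annual savings: ${annual_savings / 1e9:.1f} billion")  # 2.0 billion
print(f"One-time surge: ${surge / 1e9:.1f} billion")           # 2.0 billion
```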
Stephen Arnold, December 13, 2008
Microsoft Architect Heads to Amazon
December 13, 2008
My goodness. First, Microsoft reins in its spending for data centers. Then Redfin dumps Microsoft maps for the speedier Google alternative. Now, according to Todd Bishop's Microsoft Blog, James Hamilton, a key data center architect, will leave Microsoft and head to Amazon. You can read the full text of "Key Data Center Architect Leaves Microsoft, Headed for Amazon" here. My question is, "Will other wizards involved in Microsoft's cloud push seek greener pastures?" If I were a betting goose, I would say, "Yep." Here's why:
First, building big data centers does not mean the data centers can deliver data quickly. In fact, big data centers built on technology with known bottlenecks won't go fast; the bottlenecks come along for the ride. Google created MapReduce to minimize one type of bottleneck about eight years ago. Microsoft is playing catch up, and the bottleneck with SQL Server is a real problem. Second, coordinating work within a data center or across data centers requires a different type of file system; specifically, a smart file system that minimizes message traffic. Microsoft does not have a smart file system, although I have heard the company is working on this problem. Finally, replacing purpose-built equipment with commodity gizmos is a good idea. When the commodity gizmo requires one hot spare and one additional computer to handle overflow conditions, the good idea goes bad. Microsoft is working to resolve these types of problems now. I have heard that the Google uses brand name network devices when these devices make sense. Commodity for commodity's sake may not be the correct approach.
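To make the MapReduce point concrete, here is a toy sketch of the map/shuffle/reduce pattern in Python. It is an illustration of the idea, not Google's implementation: map and reduce work is independent and can be spread across many commodity boxes, while the shuffle step is where a smart file system earns its keep by keeping message traffic down.

```python
from collections import defaultdict

# Toy word count in the map/shuffle/reduce style. Not Google's MapReduce,
# just the pattern: map and reduce tasks run independently, so they can be
# farmed out to commodity machines without funneling every operation
# through one relational database.

def map_phase(text):
    # Each mapper emits (key, value) pairs for its own slice of the input.
    return [(word.lower(), 1) for word in text.split()]

def shuffle(pairs):
    # Group intermediate values by key. In a real cluster this step moves
    # data over the network, which is why minimizing message traffic matters.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    return key, sum(values)

documents = ["big data centers need fast plumbing",
             "fast plumbing beats big iron"]
mapped = [pair for text in documents for pair in map_phase(text)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(mapped).items())
print(counts)  # {'big': 2, 'data': 1, 'centers': 1, 'need': 1, 'fast': 2, ...}
```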
What's your take on Microsoft's data centers? Mine is that Microsoft's data centers are expensive and probably plagued with some bottlenecks, just as old-style data centers have been for decades.
Stephen Arnold, December 13, 2008
Google Chrome EULA
December 13, 2008
You will want to click to ReadWriteWeb.com and read the article "Your New Agreement with Google, Chrome Users" here. We have Chrome on an isolated machine and use it with Tess's Gmail account. Tess doesn't pay much attention to EULAs. So far, she's ignored my demands for her to scrutinize these easily changed, fluid, and maybe unenforceable Webby injunctions. You, however, may be more discriminating than Tess and, therefore, need to know how the GOOG is morphing the Chrome EULA. I don't want to repeat what Marshall Kirkpatrick has dug from the Google prose. I would like to quote the comment that resonated with me: "…Google is now making moves to promote Chrome over Firefox and a Mac version is in the works." Chrome is an important technology for Google. Think containers. Chrome does containers quite well.
Stephen Arnold, December 13, 2008
Google’s Holiday Gift for Developers
December 13, 2008
The Google Code Blog told developers about a pre-holiday present. You can read the story here. What's in the package? Effective immediately, Google developers receive more space; specifically, maximum file sizes grow from 20MB to 40MB, Subversion quotas from 100MB to one gigabyte, and download quotas from 100MB to two gigabytes. Not a Google developer? Sign up now. Navigate to http://code.google.com/ and click on what interests you. You will be asked to get a Google account, so sign up. If you want to kick back, Google has a video channel for you here.
Stephen Arnold, December 13, 2008
Google’s Government Indexing Desire
December 13, 2008
I laughed when I read the Washington Post article "Firms Push for a More Searchable Federal Web" by Peter Whoriskey. I wiped away my tears and pondered the revelation that Google wants to index the US government's information. The leap from Eric Schmidt to the Smithsonian to a search engine optimizer par excellence was almost too much for me. I assume that Google's man in Washington did not recall the procurements in which Google participated. Procurements, I might add, that Google did not win. The company lost out to Inktomi, Fast Search, Microsoft, and Vivisimo. Now it seems Google wants to get back in the game. Google is in the game. The company has an index of some US government information here. The service is called Google US Government Search, and it even has an American flag to remind you that Google is indexing some of the US government's public-facing content. When I compare the coverage of Microsoft Vivisimo's index here with that of Google's US government index, I think Google delivers more on-point information. Furthermore, the clutter-free Google search pages let me concentrate on search. The question that does not occur to Mr. Whoriskey is, "Why doesn't the US government use Google for the USA.gov service?" I don't know the answer to this question, but I have a hunch that Google did not put much emphasis on competing for a government-wide indexing contract. Now I think Google wants that contract, and it is making its interest known. With cheerful yellow Google Search Appliances sprouting like daisies in government agencies, the GOOG wants a bigger piece of the action.
How good is Google's existing index of US government information? I ran a few test queries. Here is a summary of my results comparing Google's service with the USA.gov service provided by Microsoft Vivisimo. The first query was "nuclear eccs". I wanted to see on the first results page pointers to emergency core cooling system (ECCS) information available from the Nuclear Regulatory Commission and a couple of other places that host US government nuclear-related information. Well, Google nailed my preferred source with a direct link to the NRC Revision of Appendix K, which is about ECCS. USA.gov provided a pointer to a GE document, but the second link was to my preferred document. Close enough for horseshoes.
My second query was for “Laura Bush charity”. Google delivered a useful hit at item 3 “$2 Million Grant for Literacy Programs.” USA.gov nailed the result at the number one position in its hit list.
My third query was for “companycommand.com”. Google presented the link to CompanyCommand.com at the top of the results list and displayed a breakdown of the main sections of the Web site. USA.gov delivered the hit at the top of the results list.
My test, as unscientific as it was, revealed that neither Google nor Microsoft Vivisimo clearly outperforms the other. Neither service sucks the content from the depths of the Department of Commerce. Where are those consulting reports prepared for small businesses?
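For readers who want to repeat this sort of spot check, here is a rough sketch of a harness. The endpoint URLs and the page scraping are assumptions on my part (both services change their front ends without warning), and the test is exactly as crude as the one described above: does a preferred marker string show up anywhere on the first results page?

```python
import urllib.parse
import urllib.request

# Hypothetical harness for the kind of spot check described above. The
# endpoint URLs and the crude "is the marker string anywhere on page one"
# test are assumptions; both services change their front ends and markup.

SERVICES = {
    "google_usgov": "https://www.google.com/search?q=",   # assumed endpoint
    "usa_gov": "https://search.usa.gov/search?query=",    # assumed endpoint
}

TEST_CASES = [
    ("nuclear eccs", "nrc.gov"),             # expect the NRC's ECCS material
    ("Laura Bush charity", "literacy"),
    ("companycommand.com", "companycommand"),
]

def preferred_source_on_page_one(base_url, query, marker):
    url = base_url + urllib.parse.quote(query)
    request = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(request, timeout=10) as response:
        page = response.read().decode("utf-8", errors="ignore")
    return marker.lower() in page.lower()

for query, marker in TEST_CASES:
    for name, base_url in SERVICES.items():
        try:
            hit = preferred_source_on_page_one(base_url, query, marker)
        except OSError:
            hit = None  # network trouble, not a relevance verdict
        print(f"{name:14s} {query!r:28s} marker on page one: {hit}")
```

A True here means only that the marker appeared somewhere on the first page of results; it says nothing about whether the hit sat in position one or position ten.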
What’s the difference between Google’s government index and the Microsoft Vivisimo government index?
The answer is simple: buzz.
No one in my circle of contacts in the US government gets jazzed about Microsoft Vivisimo. But mention Google, and you have the Tang of search. Google strikes me as responding to requests from departments to buy Google Maps or the Google Search Appliance.
If Google wants to nail larger sales in the US government, the company will need the support of partners who can deliver the support and understanding government executives expect and warrant. Messages refracted through a newspaper with help from wizards in the business of getting a public Web site to appear at the top of a results list won't do the job.
In my opinion, both Google and the Washington Post have to be more precise in their communications. Search is a complicated beastie even though many parvenu consultants love the words “Easy,” “Simple” and “Awesome performance.” That’s the sizzle, not the steak. Getting content to the user is almost as messy as converting Elsie the cow into hamburgers.
Google's the search system with buzz. The incumbent, Microsoft Vivisimo, has the government contract. If Google wants that deal and others of that scale, Google will want to find partners who can deliver, not partners who are in Google's social circle. Google will also want to leverage its buzz while making clear that search engine optimization is not exactly what the Google Search Appliance needs in order to deliver a solution to a government agency. Finally, the Washington Post may want to dig a bit deeper when writing about search in order to enhance clarity and precision.
Stephen Arnold, December 12, 2008
Microsoft and Scudding Clouds
December 12, 2008
Update: December 12, 2008. A reader sent me a link to this Los Angeles Times story about Mapquest, "Mapquest Tries to Stem Flow of Users to Google Maps." My hunch is that Google is working like a big magnet, pulling customers toward it.
Original Post
A happy quack to the reader who directed my attention to Robert Scoble’s comment about his former employer’s computational performance. The Scobelizer here said:
…when I go to my wife’s blog, which is on Microsoft Spaces, it is TONS slower than WordPress. WordPress doesn’t have the huge data centers available to it that Microsoft has. Same when I use my Hotmail account. Gmail is faster. Same when I go to other things, because I’ve seen lots of people praising Microsoft’s Live.com services lately so I’ve been testing them out. Tonight ReadWrite Web, for instance, talks about the new Microsoft Labs bookmarking service.
The point I carried away is that Microsoft's fancy new data centers aren't the fastest hamsters in the cage. Keep in mind that I chastise Amazon for its outages and I chew into the GOOG's increasingly common glitches. But comparing WordPress with Microsoft Spaces is brutal.
The core of this story appeared on the Redfin Corporate Blog here. The Redfin wizards switched from Microsoft Virtual Earth and some Google to all Google.
Speed wins, but speed is not a data center. Speed depends on engineering solutions to known bottlenecks in distributed, parallelized computing systems. Google has solutions, not perfect, mind you, but pragmatic. Microsoft has solutions as well, but the core remains anchored to my old pal SQL Server and other Microsoft components. According to Redfin's post, Microsoft may have to open more engineering spigots.
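If you want to test the speed point for yourself, here is a rough-and-ready sketch. The URLs are placeholders (swap in the blog or mail front ends you actually care about), and remember that a single fetch measures the whole chain, DNS, network, front end, and back end, not the data center alone.

```python
import time
import urllib.request

# Crude latency check in the spirit of Mr. Scoble's comparison. The URLs
# below are placeholders, not endorsements; swap in the services you want
# to compare. One number per page is not science, but it shows the gap.

PAGES = {
    "wordpress_blog": "https://wordpress.com/",    # placeholder URL
    "spaces_blog": "https://example.com/spaces",   # placeholder URL
}

def best_fetch_time(url, attempts=3):
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(url, timeout=15) as response:
                response.read()
        except OSError:
            continue  # skip failed attempts rather than record a bogus time
        timings.append(time.perf_counter() - start)
    return min(timings) if timings else None

for name, url in PAGES.items():
    best = best_fetch_time(url)
    if best is None:
        print(f"{name}: no response")
    else:
        print(f"{name}: {best:.2f} seconds (best of 3)")
```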
Stephen Arnold, December 12, 2008
Expert System’s COGITO Answers
December 12, 2008
Expert System has launched COGITO Answers, which streamlines search and provides customer assistance on Web sites, e-mail, and mobile interfaces such as cell phones and PDAs while creating a company knowledge base. The platform allows users to search across multiple resources with a handy twist: it uses semantic analysis to absorb and understand a customer's lingo, analyzing the meaning of the text to process search results rather than just matching keywords. It interprets word usage in context. The program also tracks customer interactions and stores all requests so the company can anticipate client needs and questions, thus cutting response time and increasing accuracy. You can get more information by e-mailing answers@expertsystem.net.
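To see why interpreting a customer's lingo matters, here is a toy contrast between literal keyword matching and a crude synonym-aware lookup. This is not Expert System's technology; the FAQ entries and the synonym table are invented stand-ins for a real semantic layer.

```python
# Toy contrast between literal keyword matching and a crude synonym-aware
# lookup. This is NOT Expert System's COGITO technology; the FAQ entries and
# the synonym table are invented stand-ins for a real semantic layer, meant
# only to show why understanding a customer's lingo beats matching keywords.

FAQ = {
    "How do I reset my password?": "Use the 'Forgot password' link on the sign-in page.",
    "How do I cancel my subscription?": "Open Account settings and choose 'Cancel plan'.",
}

SYNONYMS = {"login": "password", "passcode": "password",
            "drop": "cancel", "membership": "subscription"}
STOPWORDS = {"how", "do", "i", "my"}

def tokens(text):
    # Lowercase, strip punctuation, drop filler words.
    return {w.strip("?.,!'").lower() for w in text.split()} - STOPWORDS

def normalize(words):
    # Map the customer's wording onto the vocabulary the knowledge base uses.
    return {SYNONYMS.get(w, w) for w in words}

def keyword_match(query):
    return [answer for question, answer in FAQ.items()
            if tokens(query) & tokens(question)]

def lingo_aware_match(query):
    return [answer for question, answer in FAQ.items()
            if normalize(tokens(query)) & normalize(tokens(question))]

query = "I forgot my login passcode"
print("keyword match:    ", keyword_match(query))      # [] -- no literal overlap
print("lingo-aware match:", lingo_aware_match(query))  # finds the password answer
```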
Jessica Bratcher, December 12, 2008
Concept Search Partners with Microsoft FAST
December 12, 2008
In October, Concept Searching, a company that develops statistical search and classification products, partnered with Microsoft subsidiary FAST Search, which designs search product solutions. They are combining their efforts to offer FAST's search augmented by Concept Searching's "high recall and high precision" automatic classification, advanced meta-tagging, and taxonomy management products. Their goal is to deliver solutions for content asset management. According to Microsoft FAST's Partner Program Charter and Partners mission statement, the program helps partners "develop technologies, turnkey solutions and repeatable service business, reflecting and augmenting the many ways in which customers implement FAST search technology." We're tracking Microsoft FAST because of the police action in October 2008.
Jessica Bratcher, December 12, 2008