The Obvious in Mobile Land

November 16, 2009

I relish consulting firms’ reports about technology. I find that the blue chip firms and the azure chip outfits are becoming more alike. In the early days of consulting, there were a handful of firms, including ur-consultants like the Edwin Booz outfit and the Ivy Lee operation. Today, blues and azures are struggling to make business sense in areas that have left the economic landscape littered with mile markers, billboards, and neon signs blinking their messages in pink and yellow lights.

You may want to read “Windows Mobile Loses Serious Market Share”, an article that summarizes a Gartner Group report about mobile market share. Keep in mind that Gartner is a firm which does not want its information reproduced. I can’t quote from the Gartner report, but you can start your hunt for the information at “Windows Mobiles Loses Nearly a Third of Its Market Share”.

Microsoft is trying to make Windows mobile better. I think that Version 6.5 is in the phones on offer in the Microsoft store on the Redmond campus. You can buy a phone in that shop, and it darn well better run Windows mobile. Microsoft also has a Softie who is making the rounds of consultancies, handset makers, and developers. I believe this person is Jeff Paul, but I may have that mixed up.

The problem Microsoft faces with Windows mobile is, as I pointed out in Google Version 2.0, focus. The company is involved in a great many markets. So is Google, but there is a difference. Google has a relatively homogeneous software platform. Although not perfect, the Google does not have to fool around with legacy software. Microsoft, on the other hand, has silos of technologies. When these have to interact, Microsoft “wraps” or “hooks” systems together. This works when the resources are available to handle the fuzz. And Microsoft is mindful of legacy customers, and some of those folks are running older servers and want to connect those with hot, new mobile services. That’s more work.
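The “wrap” approach described above is the classic adapter pattern. Here is a minimal sketch, with entirely hypothetical class and method names (this is illustrative, not Microsoft code):

```python
class LegacyMailServer:
    """Stand-in for an older server exposing a dated wire format."""
    def fetch_messages_raw(self):
        return ["MSG|alice|hello", "MSG|bob|status?"]


class MobileSyncAdapter:
    """'Wraps' the legacy system so a new mobile service can call it."""
    def __init__(self, legacy):
        self.legacy = legacy

    def messages(self):
        # Translate the legacy pipe-delimited format into the
        # shape a newer mobile API expects.
        out = []
        for raw in self.legacy.fetch_messages_raw():
            _, sender, body = raw.split("|", 2)
            out.append({"from": sender, "body": body})
        return out


adapter = MobileSyncAdapter(LegacyMailServer())
print(adapter.messages())
```

The “fuzz” the post mentions lives in that translation layer: every legacy quirk needs handling code, and every new silo multiplies the number of adapters to maintain.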

The present situation is that Windows mobile is, like Nokia, in a world of hurt. Nokia sells lots of phones but it is not exactly a hot mobile company. Windows mobile is lagging.

The reason for this state of affairs is easy to identify. Just look at what devices people are using. The iPhone is prominent. The BlackBerry still appears in the talons of New York business mavens. One of the geeks in the ArnoldIT.com lab loves his Gphone. You can see him clutching his much loved device in this ArnoldIT.com developers’ video.

The consultant’s study referenced in the articles cited in this write up purports to document the obvious. I am not sure that there’s much mystery about the decline of Windows mobile. The obvious is good. I think it is a useful historical exercise, a bit like writing a research paper in sophomore Ancient Western History. Good practice. Known data.

Stephen Arnold, November 16, 2009

Oyez, oyez, Federal Communications Commission! I was not paid to write this obvious article about the obvious study of mobile operating systems market share. Oyez, oyez. Yikes, I am using a mobile device with a lousy operating system. I am paying for this communication.

Control and the Days of Hot Type

November 16, 2009

Short honk: The electronically adept Guardian (UK newspaper) ran “The Case for Books by Robert Darnton”, in which Dinah Birch praises Robert Darnton as “a passionate defender of the printed word.” This is a book review laced with the Guardian’s nostalgia for a time when newspapers were the curators of intelligent discourse. Now you are reading the thoughts of an addled goose. Quack. At the foot of the review was a passage of interest to me; to wit:

In his final essay, Darnton [book author] remarks that “reading remains mysterious”, despite the burgeoning debates surrounding the production, preservation and interpretation of texts. The practice of reading shifts in every generation. No commercial or political process has yet succeeded in controlling its evolution and nothing suggests that its unruly energies are likely to diminish in a digital world.

Yep, mystery, understanding, and control. Oh, how we long for the good old, analogue days.

Stephen Arnold, November 16, 2009

The Government Printing Office will receive an email that says, “Mr. Arnold was not paid to write this brief article insinuating that the Guardian wants to be a Luddite”, a word that appears frequently in its articles about technology.

A Startling Look at IBM and the Future of Search, IBM?

November 16, 2009

I find it amusing when “the future of search” is invoked. When that phrase occurs when discussing IBM, I enjoy the remark because IBM and search are not synonymous in my experience. Sure, IBM was * the leader * in search with its original STAIRS product. But since that golden era, IBM has floundered in search, buying companies, cutting deals with Endeca and Fast Search & Transfer, among others, and then embracing the Lucene open source search solution. I wrote about IBM’s commerce search recently and then did a search on IBM India’s eCommerce Web site. I reported that IBM’s own search products could not be located. So, that’s the future of search? I hope not.

A youthful looking person, Kas Thomas, who is an “analyst” begs to disagree with my view of IBM’s information retrieval capabilities. Navigate to “IBM, Lucene, and the Future of Search”. Mr. Thomas wrote:

A lot is at stake for IBM, too: The key pieces of IBM’s information-access strategy — including InfoSphere Content Assessment (ICA), InfoSphere Content Collector (ICC), and InfoSphere Classification Module (ICM) — all employ the OmniFind Enterprise Edition search infrastructure in various ways. With Lucene and UIMA occupying center stage, IBM is betting a lot on this technology.

I am not sure IBM is a betting organization. Lucene and other open source products are [a] lower cost and [b] a hedge against Microsoft and Google. IBM is in an information retrieval bind, and I don’t think Lucene is going to do much to release the pressure.
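Lucene and its kin rest on an inverted index: a map from each term to the documents that contain it. A toy sketch of the idea (illustrative only; Lucene’s real implementation adds analyzers, relevance scoring, and on-disk segment files):

```python
from collections import defaultdict

# A tiny corpus; document id -> text.
docs = {
    1: "ibm buys search technology",
    2: "lucene is open source search",
    3: "ibm embraces lucene",
}

# Build the inverted index: term -> set of document ids.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)


def search(*terms):
    """AND query: return ids of documents containing every term."""
    results = [index[t] for t in terms]
    return sorted(set.intersection(*results)) if results else []


print(search("ibm", "lucene"))  # only document 3 mentions both
```

The open source community supplies this core for free, which is precisely why it appeals to the cost cutters; the expensive parts, connectors, security, scaling, and tuning, are what commercial vendors sell on top.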

image

IBM is hunting for its search “ball”. Without a ball, IBM is not in the search game. Source: http://www.desbrophy.com/images/gallery/LostBall.JPG.JPG

Here’s why in my opinion:

  1. IBM does not understand search. The lead it enjoyed in the STAIRS era has been eroded because IBM focused on other types of systems. Since STAIRS version III (now devolved into SearchManager/370) was dumped to IBM Germany for revision, the commitment to search, information retrieval, and sophisticated content processing technologies has been pushed into a secondary position. IBM could have been the leader. Instead it is a partner to any company that supports UIMA. On the path to UIMA, IBM has purchased search technology lock, stock and barrel. Anyone remember iPhrase?
  2. IBM now finds itself struggling with Microsoft’s resurgence in search even if Microsoft’s best bet is the aging Fast ESP technology. IBM also sees that its “partner” Google is pushing into areas that IBM once considered beyond IBM’s core competence. (Think data management and collaboration.) Now IBM is without its own search technology and it has embraced open source as the path forward. My research indicates that this is a “cost based decision”. Open source is a wonderful idea for the IBM MBAs, but when applied to IBM’s own products, the Lucene search implementation is not up to par with offerings from such companies as Coveo or Exalead, for example.
  3. IBM has wizards working at its labs on very sophisticated content processing and information retrieval systems. In fact, Google’s current system owes a tip of the hat to the Clever system, which IBM did little to commercialize in a meaningful way. In addition, Google’s semantic context technology is from none other than former IBM Almaden researcher, Ramanathan Guha. IBM is, in my opinion, on a par with Xerox Parc in the ability to generate continuing revenue from content processing innovations.

To sum up, the phrase “IBM and the future of search” does not flow trippingly on the tongue. IBM is increasingly a consulting company that is still hanging on to its software business. IBM, like SAP, is a company that is search challenged. The notion of “prospective standards”, another phrase used by Mr. Thomas, analyst, baffles me.

IBM, just like Google, Microsoft, and Oracle, is at its core a vendor of proprietary products and services. Search is a placeholder at IBM. If it were more, why didn’t IBM do something “clever” with Dr. Guha’s inventions? What’s happened to Web Fountain? Where’s the SearchManager/370 technology capability in Lucene? Answer: Lucene is a toy compared to SearchManager/370. IBM has dropped its ball in search. Now it is hunting for that ball in a dark, large, hot IBM conference room in White Plains. The future of search at IBM is on hold until that ball is found again.

Stephen Arnold, November 16, 2009

I want to disclose to the Treasury Department that I was not paid to point out that IBM uses software in order to generate vendor lock-in to its proprietary software and systems. Why would a market driven company pay me to point out that Lucene is a means to an IBM end, not an example of the success of open source software?

SharePoint Fast Search Docs at Last

November 16, 2009

Microsoft is supposed to roll out the Fast Search ESP system for SharePoint in the next few days. I found Cornelius J. van Dyk’s “Fast Search Docs for SharePoint 2010 Beta Released” and concluded that the Redmond giant is moving forward. You can download a range of documents from Mr. Van Dyk’s Web page. For me, the most interesting document was Planning and Architecture for Fast Search Server 2010 for SharePoint (Beta). (You will need the Microsoft XPS viewer to access this document. Also, it is not possible to export the XPS document to Word, although the text can be pasted into Wordpad. Formatting is lost, however.)

I read through the write up and found that no major changes were made to the basic plumbing on the Fast Search & Transfer SA system. The description of the system is straightforward but there is no hint about the complexity of the system configuration and tuning. There is a clustering diagram that provides one clue about the type of infrastructure the system may require.

The search system block diagram is not significantly different from the block diagram I included in my discussion of Fast ESP in my 2004 Enterprise Search Report. As with Autonomy and Endeca, making fundamental changes to established information retrieval systems is a difficult undertaking. (Autonomy, Endeca, and Fast Search are systems developed roughly at the same time in the 1990s. Each of these systems has been improved via internal modifications and “snap in” extensions. Each of the systems is a sophisticated, complex software construct composed of subsystems which interact continuously.) Here’s the Microsoft diagram from the Planning and Architecture Document referenced above. Microsoft owns the copyright to this document:

fast 2010 architecture

I will have to wait until more information becomes available about the forthcoming search system. Interesting times ahead for those working with SharePoint’s native search functions.

Stephen Arnold, November 16, 2009

I want to report to the US government’s Chief Financial Officers Council that I was not paid for this write up. I have a hunch that those who embrace this Fast Search system will have an opportunity to discuss money with your group. For that you will have paid, not me.

Google and Speed, Which Kills

November 16, 2009

Google’s focus on speed is one of those isolated Google dots that invite connection with other Google dots. Connecting the dots is easy when you are in grade school. The dots are big and the images used in grade school have parts filled in to help the easily bored student. Check out the image from Natural Environment Club for Kids. Looks like a flower and a bee, doesn’t it?

image

Connecting Google dots is a bit more complicated. The Google dots look more like this type of puzzle:

image

So where does speed fit into the Google dots? You will want to read “Google: Page Speed May Become a Ranking Factor in 2010: Algorithm Change Would Make Slow Sites Rank Lower”. Chris Crum wrote:

Google has generally been pretty good at providing webmasters with tools they can use to help optimize their sites and potentially boost rankings and conversions. Google recently announced a Site Speed site, which provides webmasters with even more resources specifically aimed at speeding up their pages. Some of these, such as Page Speed and Closure tools come from Google itself. But there are a number of tools Google points you to from other developers as well.  If you’re serious about wanting your site to perform better in search engines, and you haven’t given much thought to load times and such, it’s time to readjust your way of thinking. Caffeine increases the speed at which Google can index content. Wouldn’t it make sense if your site helped the process along?

No push back on this from me. Let me shift the discussion from a dot connected to PageRank to a dot that has a sharper angle.

Speed is a big deal. Google itself wants stuff to run quickly. However, in my research speed is * the * Achilles’ heel for its principal competitors in Web search and in the enterprise. In fact, speed and scale are the Scylla and Charybdis through which most companies have to navigate. If you have had to figure out how much it costs to scale a system like SharePoint or make Oracle response times improve, you know exactly what the challenges are.
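Speed claims are easy to put to the test. A minimal sketch of measuring a system’s response-time percentiles; the timed function here is a stand-in for whatever search or database round trip you actually care about:

```python
import statistics
import time


def timed_call(fn, *args):
    """Return the wall-clock duration of one call, in milliseconds."""
    start = time.perf_counter()
    fn(*args)
    return (time.perf_counter() - start) * 1000.0


def pretend_query():
    # Stand-in for a real search or RDBMS round trip.
    sum(range(10_000))


samples = sorted(timed_call(pretend_query) for _ in range(200))
median = statistics.median(samples)
p95 = samples[int(len(samples) * 0.95)]
print(f"median: {median:.3f} ms, p95: {p95:.3f} ms")
```

The tail (p95, p99) is what users feel, and it is exactly where scaled-up SharePoint or Oracle installations tend to hurt, because the slowest one-in-twenty query sets the perceived speed of the whole system.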

Speed will be a competitive wedge that Google uses to put significant pressure on its major competitors in late 2009 and throughout 2010. When the dots are connected, here’s the image that the competitors Google targets will see when the picture is complete:

image

Speed is a killer for IBM, Microsoft, Oracle, and Yahoo. Speed makes systems fluid. Users may not know an n-space from a horse race, but speed is addictive. Cheap speed is a competitive angle that could spell trouble for companies that mock Google’s spending for lots of its dots.

Stephen Arnold, November 15, 2009

I wish to report to the Superfund Basic Research Program that the research upon which these comments rest was funded by some big outfits who have gone out of business in the financial meltdown. This short article is based on recycled material of minimal commercial value. I wonder if I can apply for superfund support?

SQL Server and Total Economic Impact

November 15, 2009

Wow, “total economic impact”. That sounds like Cold War rhetoric from the Herman Kahn school of strategic thinking. You can download the paper, written by Forrester (consulting firm) for Microsoft (company in the gun sight of Google), from the ZDNet France Web site. (I have a ZDNet France account because I have a tough time locating information across different ZDNet Web sites.) The paper, prepared for Microsoft Corporation and dated September 19, 2008, is called “Total Economic Impact of SQL Server 2008 Upgrade.” First, I was curious about this “old paper” turning up in my RSS feed. That’s a mystery for sure. Second, the paper is 27 pages long and explains how the calculations for ROI were performed. A first year business school student may find the detail interesting. I was more concerned about the assumptions and the weighting, which were fascinating. I don’t want to quote much from this write up because the present economic vise is squeezing azure chip consulting firms. Some of these outfits have prohibited addled geese from quoting from their thought pieces. To be on the safe side, you can download the paper, read it, and take its conclusions with your regular azure chip consulting firm medicine.

Several observations:

  • The fallacy is the assumption that SQL Server and its RDBMS technology are the right solution for a data management job in an organization. This is a pretty big assumption, and in my opinion, it is the core weakness of the paper. Traditional RDBMS methods struggle with some of today’s data flow issues; namely, scaling and updating indexes. Sure, the problems can be “fixed” but the remedial steps are expensive, time consuming, and * essential * to the profitability of the software vendor. In short, the client does not get well. The client gets to keep on spending. Bad approach in today’s economic environment.
  • The detail is self serving. Although components are explained – see page five for a fundamental item of detail that is disconnected from the interoperation of the model – the overall flow of the model is one way. The reader is led to a conclusion that says, “Upgrade.” The cost is the same even with adjustments for risk. This is another fundamental problem. The hockey stick costs come about from the unanticipated failures or problems in a database centric application. In short, the RDBMS looks great on paper, but it doesn’t look so great when the CFO tries to figure out how the IT costs went off the rails… again.
  • The sponsorship angle guarantees that the data are shaped. An objective analysis would look at different data management options and their costs. With those omitted from the 27 page paper, the result is a big marketing brochure. That’s okay, but one can’t run a company for its stakeholders on marketing collateral. Well, maybe some companies can.
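The ROI arithmetic in studies like this is trivial; the assumptions fed into it are where the shaping happens. A sketch with invented figures (the numbers and the risk weighting are illustrative, not taken from the Forrester paper):

```python
def roi(benefits, costs, risk_adjustment=1.0):
    """Simple ROI: (risk-adjusted benefits - costs) / costs."""
    adjusted = benefits * risk_adjustment
    return (adjusted - costs) / costs


# Invented figures for illustration only.
upgrade_costs = 250_000.0
claimed_benefits = 400_000.0

print(f"headline ROI: {roi(claimed_benefits, upgrade_costs):.0%}")
print(f"risk-adjusted ROI: {roi(claimed_benefits, upgrade_costs, 0.8):.0%}")
```

Nudge the claimed benefits or the risk factor a little and the headline percentage swings wildly, which is why the weighting, not the formula, deserves the scrutiny.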

Read the paper. Make up your own mind. I am waiting for the upgrade cost analysis of SQL Server 10. Hopefully that paper will be available in the next couple of months. Two year old thinking may not fit today’s budget cycle.

Stephen Arnold, November 15, 2009

No one paid me, but I want to make this fact clear to the Federal Financing Bank, an entity with keen interest in transparency, fiscal accountability, and appropriate business methods. I wonder if I can get a loan?

Google Books, The Nov 14 Edition

November 15, 2009

If you were awake at 11:54 pm Eastern time, you would have seen Google’s “Modifications to the Google Books Settlement.” Prime time for low profile information distribution. I find it interesting that national libraries provided Google an opportunity to do their jobs. Furthermore, despite the revisionism in Sergey Brin’s New York Times editorial, the Google has been chugging away at Google Books for a decade. With many folks up in arms about Google’s pumping its knowledge base and becoming the de facto world library, the Google continues to move forward. Frankly I am surprised that it has taken those Google users so long to connect Google dots. Google Books embraces more than publishing. Google Books is a small cog in a much larger information system, but the publishing and writing angles have center stage. In my opinion, looking at what the spotlight illuminates may be the least useful place toward which to direct attention. Maybe there’s a knowledge value angle to the Google Books project? You can catch up with Google’s late Friday announcement and enjoy this type of comment:

The changes we’ve made in our amended agreement address many of the concerns we’ve heard (particularly in limiting its international scope), while at the same time preserving the core benefits of the original agreement: opening access to millions of books while providing rights holders with ways to sell and control their work online. You can read a summary of the changes we made here, or by reading our FAQ.

Yep, more opportunities for you, gentle reader, to connect Google dots. What is the knowledge value to Google of book information? Maybe one of the search engine optimization experts will illuminate this dark corner for me? Maybe one of the speakers at an information conference will peek into the wings of the Google Information Theatre?

Stephen Arnold, November 15, 2009

I wish to report to the Advisory Council on Historic Preservation that I was not paid to point out that national libraries abrogated their responsibilities to their nations’ citizens. For this comment, I have received no compensation, either recent or historic. Historical revisionism is an art, not a science. That’s a free editorial comment.

The Guardian on Email Surveillance

November 15, 2009

I think this article “Email Surveillance: Ditch It for Good” is an opinion piece. The Guardian is not exactly number one with a bullet in the online world, but it does have a penchant for writing articles that catch my attention. The idea is that the UK government should not “snoop on all our communication and Internet activity.” I disagree. My view is that governments have little choice but to move toward surveillance and increasingly proactive actions with regard to information. There are lots of bad folks out there, and the legal and political consequences of not taking appropriate actions are significant. Islands are pretty good for surveillance too. The UK and Australian enforcement entities are case examples of how electronic nets can be used to catch some interesting fish. The Guardian does not agree with me. So here’s a hypothetical: the UK government does not perform surveillance and a bad event occurs. Many are killed and injured in London. Subsequent investigation reveals that the event was described in emails and other common information channels. What are the legal and political consequences of this turn of events? Surveillance cannot be “ditched for good.” Surveillance is a fact of today’s information world in my opinion. Autonomy and i2.co.uk are two outfits with useful monitoring technology. These companies’ tools were developed to meet a need, even though the Guardian finds the need difficult to accept. An information reality is just like the financial reality many firms face in today’s business climate: unpleasant to some but a fact nevertheless.

Stephen Arnold, November 16, 2009

I want to report to the Institute of Peace that I was not paid to point out that the Guardian is complaining about an information shift that says, “You can’t go home again.” There’s not enough cash in the goose’s coffers for that journey.

SAP and Its Pricing: A Sign of Deeper Challenges?

November 15, 2009

SAP is an outfit that provides me with some clues about what will happen to over-large enterprise software vendors. The company grew via acquisition. The company followed IBM’s approach to generating revenue from services. The company made shifts in its services pricing. The company has done just about every trick in the MBA handbook, yet revenues continued to soften. The most recent MBA play at SAP is disclosed in a news report from Reuters called “SAP Plans to Raise Licensing Fees”. The notion of releasing interesting news when most people are eating donuts and thinking about their dwindling retirement accounts is catching on among big companies. Fortunately for us in Harrod’s Creek, Reuters never sleeps. The story revealed:

Germany’s largest software company, SAP AG (SAPG.DE), plans to raise licensing fees for thousands of clients who use older versions of its software, German weekly Wirtschaftswoche reported on Saturday. “SAP’s older customers will be especially affected — that means the most loyal,” Andreas Oczko, deputy head of the German SAP client advocacy group DSAG told the magazine. The magazine said older clients who do not switch to newer versions of software applications or have not switched to a new incremental price structure will see the largest cost changes.

There you go. Upgrade or pay more. Upgrade and pay more for engineering support. That’s the MBA play of the week in my opinion.

What about customers who do nothing? Maybe some of these people will take a close look at their options. In a year, Google will have most of the SAP functionalities latent within the expanding Google Apps ecosystem. Then what? In my opinion, SAP may find that its business challenges have been made more problematic by the Google.

I am eagerly awaiting the unfolding of events in 2010.

Stephen Arnold, November 15, 2009

The Veterans Day Committee has to be aware that this opinion is uncompensated. I might add that canny veterans may want to check out their holdings in SAP to avoid the Wal*Mart greeter syndrome.

Less Traffic, Less Revenue: A Google Lesson

November 14, 2009

The confidence of some people is remarkable. There are some high profile rich people who think that Web site traffic is easy to get. Most pipe dreams begin with an assumption that users will visit a Web site. The reality is that getting and keeping Web traffic is tough, even for outfits with a big name brand.

A good example of the “no traffic, no revenue” challenge appears in “AOL’s Google Revenue Is Crashing”. The point of the article is that AOL’s Google revenue is declining. The reason for the decline is irrelevant. Focus on this passage:

Why the decline in Google revenue? Almost half of the decline last quarter is because of lower search query volume — in part because of the decline of AOL’s access business, which drives people to AOL search by default — and the rest of the blame goes to lower revenues per search query, AOL says in its filing. AOL’s Google deal runs through December 19, 2010, so there is some time to figure out what’s next. But as Google (and Microsoft, another potential search partner) see how much leverage they have over AOL in these arrangements, it’s hard to see AOL’s next search deal working out even as favorably as the current one.

With some organizations confident their content and name recognition will generate cash, I wonder if the AOL situation will make much of an impression?

Stephen Arnold, November 14, 2009

I want to report to the Employment Standards Administration that I am unemployed; therefore, this article is a freebie. Disclosure is so satisfying.
