Passport Canada: Caught with Its Tech in a Time Warp

April 21, 2009

CTV posted an item from “The Canadian Press” called “Online Form Poses Problem for Passport Canada.” You can read the story here. Passport Canada does what its name says: it handles passports and related travel documents. Another unit of the Canadian government is the Canadian Foreign Affairs Department. That department put some blank forms on its Web site. The form, PPTC 132, is useful for getting a passport without a person who can verify that John Smith is “really” John Smith. The CTV.com story said:

Canadians who are overseas and need a passport require the form, which allows them to make their own declaration under oath.

Passport Canada keeps the form under tight control. The Foreign Affairs Department put the form under loose control. Excitement ensues.

Some thoughts:

  • Coordination within and among government agencies is not too good, not just in Canada, but in most countries. Parkinson’s Law and political budget walls are two good reasons
  • Managing online content remains a mystery to many agencies, even in the Internet age
  • The notion of a form repository is a tricky one. The US initiative that I bumped into years ago seems to be ineffective
  • Removing information once it is “out there” is tough.

Are there lessons in this example? Lots. Easy, quick, and cheap fixes: pick two.

Stephen Arnold, April 20, 2009

Fast Search ESP 2009: Some Soft Information about a Hard Problem

April 20, 2009

A happy quack to the reader who sent me a link to this interesting and suggestive article “One Year with Microsoft – A Fast Perspective”. The write up appeared on April 17, 2009, on the Microsoft Enterprise Search Blog. You must read the posting here. The author is Nate, which does not tell me much. In such circumstances, I remind myself that the posting may be a spoof. For my purposes, the snippets of text below are intended as aides-mémoire for myself. I have added some preliminary, informal comments to capture the excitement I experienced when I read the post. (The original Microsoft intent to buy Fast Search announcement is here.)

Now to work:

Nate, the author, has worked in “the industry” for 13 years, with six years in the Fast Search & Transfer business. Keep in mind that search has been around for more than 40 years, and six years is a good start. I assume the author’s experience in search has been shaped by what I call “the Google era”. The sale of Fast Search was precipitated by the company’s financial struggles. Fast Search and Google came out of the gate at about the same time in the late 1990s. Google ended up with $20 billion in revenue in the same interval that Fast Search & Transfer approached $80 million (estimated after the 2007 revenues were revisited), a police action, and a hugely complicated search system that was tough to install. I heard that Google’s “search is simple” campaign was partly in response to the complexity of the installation process for Fast Search ESP and similar old style systems.

The author explained where Fast Search fits in the giant Microsoft Corporation. I did not understand the acronyms, but there were enough units involved to tell me that search is not at the top of the tech pyramid at Microsoft.

Nate presented the acquisition as a pat on the back for a job well done. I respectfully suggest that a financial restatement and a police action are not meritorious.

Nate referenced the Fastforward 2009 conference (which I believe I heard will be merged into another Microsoft conference) as the place where the vision for Fast Search was set forth. He provided a link to a SharePoint unit manager, Kirk Koenigsbauer. The bulk of the Web log write up is a restatement of information presented at the Fast conference earlier in 2009. The key points in my opinion were:

First, commitment. Nate reminded me that Oslo is where the search action is. The discussion of commitment puzzled me. A passage that I noted was:

To be honest, search is such a generally valued concept and the possibilities are so compelling when it’s combined with other Microsoft products and technology that it’s all we can do to stay focused on our main priorities. It’s a good problem.

The word “honest” snagged me. Was the earlier part of the write up “not honest”? The statement “it’s all we can do to stay focused on our main priorities” underscored the likelihood that Microsoft is still not sure what’s important in behind-the-firewall search.

Second, vision. Nate asserted that Microsoft’s vision for search and Fast Search’s vision for search “matched”. I stopped and got out my yellow highlighter and worked through this statement. Microsoft’s vision has been to catch Google and deliver “findability” that keeps SharePoint users and administrators happy. In my descriptions of Fast Search, I use the word “complex” quite a bit. Nate’s vision, if I read the “vision” paragraph correctly, is to think about Microsoft Surface, which is a touch screen interface. The idea, I surmise, is to change the interface to search, not the plumbing of search. I received an award from a government agency that included a picture of “lipstick on a pig”. The idea was to remind me that the work I had done would not change the outcome of a government policy, just make it pretty. I thought of that award’s snapshot of a pig when I read about the emphasis on interface.

Third, product plans. Nate referenced the roadmap. I love roadmaps. I asked myself, “When will there be specific product details?”

Nate concluded by revealing that:

There you have it, my first post for the Microsoft Enterprise Search Blog. Look for more posts from me in this general category of enterprise search vision and strategy. I welcome all comments on this and future entries. Next up – Search plus Natural User Interfaces.

I crave write ups about:

  • Information about ongoing support for Linux and Solaris installations of Fast Search
  • Detail about the migration plans to Windows servers
  • Return on investment analyses comparing Linux versus Windows servers
  • Documentation about the interaction of Fast ESP subsystems with one another
  • Index updating cycles
  • Scaling best practices
  • A reference architecture for processing terabyte flows of unstructured information in a 24 hour refresh cycle
  • Version upgrade rollback methods for when a point upgrade goes off the rails.

Those topics would hold my interest more than comments about commitment, vision, and plans. The addled goose honks, “Detail, please.”

Stephen Arnold, April 20, 2009

World Digital Library: More than Google Books and Europeana

April 20, 2009

On April 21, 2009, the World Digital Library becomes officially available. You can access the site here. The objectives of the WDL are:

  • Promote international and intercultural understanding
  • Expand the volume and variety of cultural content on the Internet
  • Provide resources for educators, scholars, and general audiences
  • Build capacity in partner institutions to narrow the digital divide within and between countries.

You can run a key word query or narrow by place, time, topic, or type of item.

According to Australia’s The Age here,

Bringing together priceless material, from ancient Chinese or Persian calligraphy to early Latin American photography, it is the world’s third major digital library, after Google Book Search and the EU’s new project, Europeana. Drawing on content from libraries and archives worldwide, it aims to reduce the rich-poor digital divide, expand “non-Western” content on the web, promote better understanding between cultures and provide a global teaching resource.

What is the future of the many virtual library initiatives? What about the neighborhood library? What about federating the catalogs of Google Books, Europeana, and the World Digital Library? I don’t want to run three separate queries. Do you?
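If such federation ever happens, the mechanics are not mysterious: fan the same query out to each catalog and merge the answers. Here is a minimal sketch, assuming each library exposed a simple JSON search endpoint; the URLs and response shapes below are hypothetical placeholders, not real APIs offered by these projects:

```python
import concurrent.futures
import json
import urllib.parse
import urllib.request

# Hypothetical endpoints for illustration only; none of these libraries
# published such an API.
CATALOGS = {
    "Google Books": "https://books.example.com/search?q=",
    "Europeana": "https://europeana.example.eu/search?q=",
    "World Digital Library": "https://wdl.example.org/search?q=",
}

def query_catalog(name, base_url, terms):
    """Send one query to one catalog; return (catalog name, parsed results)."""
    url = base_url + urllib.parse.quote(terms)
    with urllib.request.urlopen(url, timeout=10) as response:
        return name, json.load(response)

def federated_search(terms):
    """Run the same query against every catalog in parallel and merge answers."""
    results = {}
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(query_catalog, n, u, terms)
                   for n, u in CATALOGS.items()]
        for future in concurrent.futures.as_completed(futures):
            name, hits = future.result()
            results[name] = hits
    return results

if __name__ == "__main__":
    print(federated_search("Persian calligraphy"))
```

One query in, three catalogs consulted, one merged result set back. The hard parts in practice are deduplication and relevance ranking across sources, not the fan-out itself.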

Stephen Arnold, April 20, 2009

Google in Trouble

April 20, 2009

Ad Age tackles the thorny question of Google’s brand with the analysis billed under the headline “Is Brand Google in Trouble?” here. To make the point, Ad Age’s art director spelled “trouble” using the Google logo colors. In my mind, I was expecting the page to play a riff from The Gladiator from Robert Meredith Willson’s days with the Sousa band. Instead I heard “But He Doesn’t Know the Territory” from The Music Man. Trouble right here in River City.

The Ad Age article asserted that Google’s economics “have come back to earth.” Google has a strong consumer brand. The company has to beware of its “third rail”, which is privacy. The idea, I think, is that Google can electrocute itself. The Ad Age article said:

PR and branding experts advise Google to stay focused on building the next excellent product and not distract itself with what competitors are doing. And, they say, it’s important to listen to consumers; it’s impossible to please 100% of people, and the conversations in the trade and media world can seep out to the public. What’s important is to be transparent and human about addressing criticism.

As I said, “But He Doesn’t Know the Territory”.

Stephen Arnold, April 20, 2009

Exclusive Interview: Donna Spencer, Enterprise Systems Expert

April 20, 2009

Editor’s Note: Another speaker for what looks like a stellar conference agreed to an interview with Janus Boye. As you know, the Boye 09 Conference in Philadelphia takes place the first week in May 2009, May 5 to May 7, to be precise. Attendees can choose from a number of special interest tracks covering a range of topics, including strategy and governance, Intranet, Web content management, SharePoint, user experience, and eHealth. Click here for more conference information. Janus Boye spoke with Donna Spencer on April 16, 2009.

Ms. Spencer is a freelance information architect, interaction designer and writer. She plans how to present the things you see on your computer screen, so that they’re easy to understand, engaging and compelling: things like the navigation, forms, categories and words on intranets, websites, web applications and business systems.

The full text of the interview appears below.

Why is it so hard for organizations to get a grip on user experience design?

I don’t know that this is necessarily true. There are lots of organizations creating awesome user experiences. Of course, there are a lot who aren’t creating great experiences, but it isn’t because they can’t get a grip on user experience, it is because they care more about themselves than about their customers. If they really cared about their customers they’d do stuff to make their experiences great – and that’s possible without even knowing anything formal about user experience. But because they don’t care about their customers, they will fail, as they should…

Is content or visual design most important to the user experience?

Content (or functionality) is ultimately what people visit a website, intranet or application for. So it’s really, really important to get that right. If the core of the product is bad, it isn’t going to work.

But the visual design is often the part that helps people to get to the content. If the layout is poor, the colours and contrast awful and the site looks like it was designed in 1995, that’s going to stop people from even trying.

So both are important, though if I ever had to choose, I’d go for great content.

Is your book on card sorting really going to be released in 2009?

Yes, by the time the conference is on, there should be real, printed books. 150-odd pages of card sorting goodness. I hear that it should be out around 28 April. Really. I promise.

Does Facebook actually offer a better user experience after the redesign?

That’s a really interesting question. I can only speak for myself, but the thing that struck me about the redesign is that all of a sudden Facebook feels like a different beast. It used to be a site where friends were, but also where there were events, and groups and silly apps. Now it just feels like Twitter that you can reply to. It feels like they have done a complete turnaround on who they actually are.

So for me the experience is worse. I can get a better idea of what my friends are doing, but I do that via Twitter. Now it’s much harder for me to experience groups, events and all the other things we used to do there. I’m definitely using it less.

Why are you speaking at a Philadelphia web conference organized by a company based in Denmark?

Because they rock! But really, their core business overlaps a lot with what I do. I’m interested in the content the conference offers and I think my experience offers a lot to the attendees. Plus I’ve never been to Philly, and travelling to new places is a wonderful learning experience.

Content Management: Modern Mastodon in a Tar Pit, Part Two

April 20, 2009

Part 2: Challenges and the Upside… Sort of an Upside

The CMS Tar Pits

Today, search and content management systems are a really big problem. There’s no easy solution for several reasons:

  1. CMS in many cases has become larger and more complex over time. At the same time, the notion of an information object has raced along about a half mile ahead of the vendors’ software. In an organization, folks want to do podcasts, create fancy reports with embedded videos, and sometimes add animated charts and graphs. “Search” is an overburdened word, and it is unlikely that this wide range of content objects can be indexed without considerable resources.
  2. CMS has morphed into more than Web content. As a result, the often primitive workflow and repository functions choke when asked to support a special purpose retrieval; for example, eDiscovery. The solution to this problem is not to upgrade the CMS search system but to license another solution, pump the content into the specialized system, and run the eDiscovery with spoliation features from that system.
  3. CMS has not solved the problem of Web content. The reason goes back to the relationship between a human writing something and a system that sort of keeps that “something” organized and mostly eliminates the writer’s interaction with a programmer. CMS shifts the focus from setting up a method for creating useful, substantive content to the mechanics of keeping track of content objects and components. As a result, after the hoo haa of the CMS, Web sites have a content problem. The problem is that the information is often out of phase with the needs of the Web site user and the people who want the Web site to generate sales.
  4. CMS increases inefficiencies associated with writing. Organizations are committee writing machines. One or more individuals may write something. Then that “something” gets routed around, changes are made, a version is created, that version is shuffled around, and then an output occurs. Most document decisions are made at the 11th hour under an artificial “crisis”. This method absolves the “author” and the reviewers of real responsibility. The result is a lot of versions of the “something” and a final document that is mostly impenetrable. The “author” is like the guy or gal who sent me the engineering paper with a bunch of names on it. That person does not know what’s in the document and does not understand some parts of it. To see this type of writing in action, read the instructions for a 1099 or a patent application.
  5. CMS costs only go up. Because CMS systems have to handle the content generated by their licensees, the costs for these puppies go one way: through the roof. Here’s why: CMS infrastructure has to be expanded to handle more documents and ever larger content objects. An email may be 4 Kb of XML. Stuff in a video and you get a bit of an extra load. Stuff in 20,000 documents with rich content and you get to buy lots of hardware, storage, bandwidth, and engineers to keep the Rube Goldberg machine running. (A back-of-envelope sketch of this growth appears after this list.) The CMS has to be rebuilt on the fly, which is like plugging a leak in a speedboat towing a skier on Lake Huron. The fix is at best temporary.
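To make the cost trajectory concrete, here is a minimal back-of-envelope sketch. The 4 Kb email and the 20,000 rich documents come from the point above; the other sizes and volumes are assumptions for illustration, not measurements:

```python
# Back-of-envelope CMS storage growth. Every figure is an assumption except
# the 4 Kb XML email and the 20,000 rich documents mentioned in the text.
KB = 1024
GB = 1024 ** 3

OBJECT_SIZES = {                    # assumed average size per content object
    "email_as_xml": 4 * KB,         # the 4 Kb XML email from the text
    "rich_document": 250 * KB,      # report with embedded images (assumed)
    "video_clip": 50 * 1024 * KB,   # short embedded video, ~50 MB (assumed)
}

def yearly_storage_gb(counts_per_year):
    """Primary storage added per year for the given object counts."""
    total_bytes = sum(OBJECT_SIZES[kind] * n
                      for kind, n in counts_per_year.items())
    return total_bytes / GB

# Assumed annual volumes for a mid-sized organization.
profile = {"email_as_xml": 500_000, "rich_document": 20_000, "video_clip": 1_000}
print(f"{yearly_storage_gb(profile):.1f} GB of new primary storage per year")
# Roughly 55 GB per year, before indexes, replicas, and backups, which
# typically multiply the figure several times over.
```

The point of the arithmetic: a handful of video clips swamps half a million emails, which is why “stuff in a video” quietly rewrites the hardware budget.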

In this environment, customers want facets, real time indexing, context sensitive queries, personalization, and access to structured data. No problem, but it won’t be cheap, easy, or doable with most of the existing budgets with which I am familiar.

Do marketers say these features can be delivered? You bet your life. Once the sale is made, the marketer goes to the next account. The vendor’s technical team is left to explain the reality and limitations of what search and content processing can do within the CMS environment.

Who’s the Lucky Mastodon?

So what’s the mastodon? The CMS that companies struggle to make work. What’s the tar pit? The chair in front of the CFO’s desk. The owner of the CMS has to sit down and explain the cost overruns. The CFO may not care that the system is generating massive indirect costs, but she will certainly want to know about the hardware, software, license fees, consulting services, and programming expenditures.

Where do CMS consultants fit in?

There are good consultants (blue chippers) and not so good consultants (azure chippers). The “blue” connotes proven professionals from established services firms; for example, some units of IBM and some McKinsey and Boston Consulting Group folks. The azure chippers come from companies with a modest track record and probably some wonderful marketing lingo. The Regis McKenna school of marketing is a model for the azure chippers.

Consultants are usually a mirror of their clients. So clients get what they purchase and what emerges as “needs”. The result is that clients with a dearth of expertise in writing, content production, and enterprise publishing don’t get the problem fixed.

What exists now is a feedback loop that leads from the edge of the tar pit to the bottom of the tar pit. After a few million years, a preserved system is dug up, dissected, and compared to whatever tools are available. Because of the turnover among some enterprise technology professionals, the corporate memory is often shallow and the folks responsible for the mastodon have moved on.

The Upside of the CMS Tar Pit

What’s the positive view of this situation?

I see three positives.

First, the disasters of today’s CMS mean that a number of individuals have attended the School of Hard Knocks and learned about some of the demands of content creation, production, and distribution.

Second, the newer systems have advanced beyond training wheels. You get air bags. You get seat belts. You get safety glass. You might be injured, but you probably won’t be killed. The US Senate’s CMS, after several years of effort with two high profile vendors, was shelved and a different approach pursued.

Third, some of today’s systems work and can be used by normal humans with so-so writing skills. I know that it is great fun to whack on the Google, but I know that Adhere Solutions (a Google partner) has implemented some nifty systems that use the GOOG as plumbing. I referenced the newer cloud based services from a Web log vendor elsewhere in this essay. I also pointed out that the XQuery outfit MarkLogic may warrant a look.

What should you do if you have a CMS with lousy search? My first thought was to ask you to call me. My second thought was to tell you to buy a copy of Successful Enterprise Search Management. You can get information about this 2009 study by Martin White (European guru) and me (American addled goose) here. My third thought was to suggest a Google search. My fourth thought is to start over.

You will have to choose an appropriate path. My suggestion is to avoid the azure chip consulting firm crowd, newly minted experts, and anyone who sounds like a TV game show announcer.

Stephen Arnold, April 20, 2009

Amsterdam Breathes New Life into Old Information Institution

April 19, 2009

A happy quack to the reader who sent me the link to Andrew Keen’s “Digital Dutch Masterpieces” here. The article points out that libraries can be both old and new media. He wrote:

at the Amsterdam public library. Instead of the dustiness and crustiness of the typical 20th century library, visitors to Amsterdam’s central public library will find not only books, but a restaurant as well as a children’s theatre and a public radio and television studio. The library, which is open every day from 10.00 am to 10.00 pm, also holds a series of cultural festivals – such as the upcoming week of poetry – which it then broadcasts on the Internet. Amsterdam library’s website epitomizes its innovative approach to the 21st curation of knowledge. The website features its own customized search engine, the “aquabrowser”, which has integrated the library’s books, CDs and DVDs as well as a rich archive of Amsterdam’s history and culture. Equally innovatively, the website provides those who use it within the walls of the library itself open access to all its digital content.

I did not resonate with the assertion that the library has a “return on investment”. That phrase has a specific meaning in financial circles. I think that the Amsterdam effort returns significant social value. One hopes other libraries absorb the lessons of this case.

Stephen Arnold, April 19, 2009

Twitter: New Whipping Boy

April 18, 2009

I never watched the whipping scenes in pirate movies when I was a kid in central Illinois. The whole pirate shtick scared me. Pirate life looked awful. Small ships. Scurvy. Rats. I saw a cat-o’-nine tails in a museum when I was in college and I shuddered. The nine tails referred to what looked like leather strips with metal tips or claws. The idea of whipping is bad. Whipping with nine claws buries the needle on the badness scale. Here’s what one of these corrective devices used by the British Royal Navy in the 17th century looked like.

[Image: a cat-o’-nine tails]

Poor Twitter, the steroid-charged child of SMS, is now a whipping boy, and the pundits and mavens are using the cat-o’-nine tails to make their point. Coverage of Twitter has morphed from “What use is it anyway?” to “Twitter is evil.” The Tyra Banks incident in New York, a casting call for models less than five feet seven inches tall, allegedly made use of Tweets. For more information on this remarkable tea party or flash mob, click here.

I loved the headline “Twitter Sucks” in the New York Observer, where nothing is “sacred but the truth”. You must read the story here. In a nutshell, Twitter is over exposed. The “trough of disillusionment” argument is that Twitter is lots of short messages. Most of the messages are banal. But some of them contain surprisingly useful information. Aggregated, the Twitter stream can make interesting ideas assume a form that can be prodded and examined.
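What might such aggregation look like? A minimal sketch, counting surging terms over a rolling window; the window size and surge threshold are assumptions for illustration, and the tweet source is whatever feed you can get your hands on:

```python
import time
from collections import Counter, deque

WINDOW_SECONDS = 15 * 60   # assumed 15-minute rolling window
SURGE_THRESHOLD = 50       # assumed count at which a term becomes "interesting"

window = deque()           # (timestamp, term) pairs currently in the window
counts = Counter()         # live term frequencies within the window

def ingest(tweet_text):
    """Fold one tweet into the rolling window; return currently surging terms."""
    now = time.time()
    for term in tweet_text.lower().split():
        window.append((now, term))
        counts[term] += 1
    # Expire terms that have aged out of the window.
    while window and window[0][0] < now - WINDOW_SECONDS:
        _, old_term = window.popleft()
        counts[old_term] -= 1
    return [term for term, c in counts.most_common(10) if c >= SURGE_THRESHOLD]
```

Individually banal messages, counted in bulk, surface the spikes worth prodding. That, in miniature, is the opportunity the real time search crowd is chasing.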

Should Twitter be whipped with a cat-o’-nine tails? Sure. That’s the way the world works today. But my view is that Twitter is an example of how real time messaging broadcast to others on the network can trigger unanticipated opportunities or challenges. Twitter may suck. Twitter may be trivial. But one thing is clear. Twitter is going to spawn quite a few real time search innovations. Twitter, like PointCast, may end up the big loser in a month or a year, who knows? But push technology did not die with PointCast and BackWeb. Twitter is an example of a service that neither telcos nor the likes of Google were able to put in a box and control.

Stephen Arnold, April 18, 2009

Microsoft Web Search Market Share

April 18, 2009

Joe Wilcox’s “Microsoft US Search Queries Rise in March” surprised me. You can read his article here. Estimates of online usage are fraught with wackiness. If you have been asked to review actual log files for a high traffic site for a month, you know that the counts are only reasonably accurate. There are those exciting anomalies such as losing a 48 hour chunk of data for no apparent reason. Most of the big outfits have fancy math to smooth out the hiccups in their statistical analyses. Smoothing is good. So is calculating an acceptable margin of error. Not surprisingly, the major usage reports are indicative, not definitive. The question should be, “How far off are these data?” In my experience, somewhere in the 15 to 20 percent range.

Now if we look at the data about Microsoft’s usage, we notice that Mr. Wilcox focuses on an “increase” in Microsoft traffic of 0.1 percentage point, that is, from 8.2 percent in February 2009 to 8.3 percent in March 2009. Apply the margin of error value that makes you comfortable, and there’s not much of a change between Google and Microsoft.
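A quick worked example makes the point. Assume the low end of the 15 to 20 percent relative error suggested above:

```python
# Is a 0.1-point move meaningful under a 15% relative error? (Illustrative.)
feb_share, mar_share = 8.2, 8.3   # percent, the figures Mr. Wilcox cites
relative_error = 0.15             # assumed 15% margin, low end of the range

for label, share in (("Feb", feb_share), ("Mar", mar_share)):
    low, high = share * (1 - relative_error), share * (1 + relative_error)
    print(f"{label}: reported {share}%, plausible range {low:.1f}% to {high:.1f}%")

# Feb: 7.0% to 9.4%. Mar: 7.1% to 9.5%. The 0.1-point "increase" sits well
# inside the overlap of the two ranges; it is noise, not news.
```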

The interesting point for me was that Google increased its share, despite the lousy economy, by a more robust 0.4 percentage points. But this too is irrelevant. Google’s share of the US search market, according to my goslings’ number crunching, is getting near 80 percent. I like the idea that Google may be getting some competition after a decade of clear sailing, but the competitors need to hurry. Unless Microsoft can leapfrog Google, I don’t see much hope of making its many search investments pay off.

Stephen Arnold, April 18, 2009

End Game for Microsoft Yahoo

April 18, 2009

What a week for Microsoft search. I heard from three different sources that the Fast ESP technology will run on Windows, not Linux or the other forbidden operating systems. Then I read a Reuters “analysis” of the Microsoft Yahoo Web search chit chat. The article, written by Thomson Reuters’ Alexei Oreskovic, ran under the headline “Yahoo and Microsoft Approach Endgame on Search.” With Google’s search share north of 60 percent, I wondered whose game it was. Mr. Oreskovic wrote:

For that reason, running ads with Google is generally considered a “no-brainer.” But a combined Microsoft-Yahoo with nearly 30 percent search market share could provide a large enough audience to also be worthwhile.

My thoughts were:

  • What if Google’s share were higher? Closing the gap becomes more expensive and may be less attractive to advertisers
  • What if the costs of mashing up multiple search services skyrocket so that the anticipated financial upside becomes a ski jump into unexpected cost overruns?
  • What if the technology does not deliver what users want?

I love analyses that evoke more questions than the mavens’ explication answers.

Stephen Arnold, April 19, 2009
