Yahoo and User Experience

August 25, 2009

Lots of posts from gurus, azure chip consultants, and real journalists about Yahoo and search. I have plowed through about 15 of the online write ups. A couple jutted above the plain; most were in the low lands. A good example of the thinking that is not quite up the mountain is the write up “Yahoo: We’re Still in the Search Business”. The main point for me was this passage:

“I fully anticipate that our front-end experience will evolve differently from Bing,” said Prabhakar Raghavan, senior vice president of Yahoo labs and search strategy, during a presentation to journalists at Yahoo’s headquarters in Sunnyvale, Calif. “We collaborate on the back end, but we are competitors on the front end.”

So the plumbing is the plumbing. The differentiator is the user experience. To me that means interface. A search box is a part of the interface. Ergo Yahoo cannot do much with the white rectangle into which people type 2.3 words. Yahoo must add links, facets, and any other gizmo that allows a person to find information without typing 2.3 words.

I just looked at the Yahoo splash page for me:

[Screenshot: Yahoo search splash page]

I find this page unhelpful. I can personalize the page, but I see the Excite-type clutter that annoys me when I have to hunt for the specific link I want. Examples: NASCAR news. Three clicks. Email. Log on and extra clicks to get past the news headlines. My account for for-fee email? Good luck finding this page.

I look forward to user experience changes, but I don’t think interface alone will address the issues I have encountered with Yahoo Shopping, locating news stories that have been removed even though links in the wild to the story are available, and finding specific discussion group content quickly.

I want more than punditry and user experience. I want a system that provides information access. Right now, Yahoo has many opportunities to improve, but the key will be the plumbing. If I understand the posts I have examined, Microsoft and Yahoo will collaborate on plumbing. I had a house once with two plumbing contractors. I recall some exciting discussions with the two plumbers. No one had responsibility for the leaky pipes.

Stephen Arnold, August 25, 2009

Sci Tech Content as Marketing Collateral

August 25, 2009

The notion of running a query across a collection of documents is part of research. Most users assume that the information indexed is going to be like mixed nuts. In my experience, a more general query against an index of Web logs is likely to contain more of the lower grade nuts. A query passed against a corpus of electrical engineering or medical research reports will return a hit list with higher quality morsels. Those who teach information science often remind the students to understand the source, the bias of the author, and the method of the indexing system. Many people perceive online information as more accurate than other types of research material. Go figure.

When I read “McGill Prof Caught in Ghostwriting Scandal”, I thought about a rather heated exchange at lunch on Friday, August 21. The topic was the perceived accuracy of online information. With some of the new Twitter tools, it is possible for a person to create a topical thread, invite comments, and create a mini conversation on a subject. These conversations can be directed. The person starting the thread defines the terms and the subject. Those adding comments follow the thread. The originator of the thread can add comments of his or her own, steering the presentation of information, suggesting links, and managing the information. Powerful stuff. Threads are becoming a big deal, and if you are not familiar with them, you may want to poke around to locate a thread service.

The McGill professor’s story triggered several ideas which may have some interesting implications for marketing and research. For example:

A scholarly paper may look more objective than a comment in a Web log. The Montreal Gazette reported:

Barbara Sherwin – a psychology professor whose expertise in researching how hormones influence memory and mood in humans – was listed as the sole author of an April 2000 article in the Journal of the American Geriatrics Society arguing that estrogen could help treat memory loss in older patients. In fact, the article was written by a freelance author hired by DesignWrite, a ghostwriting firm based in New Jersey. The company was paid by Wyeth to produce ghostwritten articles, which were then submitted to reputable scholars.

I would not have known that this ghostwritten article was a marketing piece. In fact, I don’t think I would have been able to figure it out by myself. That’s important. If I were a student or a researcher, I would see the marketing collateral as objective research. A search system would index the marketing document and possibly the Tweets about the document. Using Twitter hashtags, a concept space can be crafted. Run a query for McGill on Collecta, and you can see how the real time content picked up this ghostwriting story. How many hot topics are marketing plays? My hunch is that there will be more of this content shaping, not less, in the months ahead. Controlling the information flow is getting easier, not harder. More important, the method is low cost. When undiscovered, the use of disinformation may have more impact than other types of advertising.
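To make the idea of a hashtag “concept space” concrete, here is a minimal sketch of my own (illustrative only; the tweets and tags are invented, and no Twitter or Collecta API is involved) that counts which hashtags co-occur. Tags that appear together frequently suggest how a topical thread can be shaped:

```python
from collections import Counter
from itertools import combinations

def hashtags(tweet):
    """Extract lowercase hashtags from a tweet's text."""
    return [w.lstrip("#").lower() for w in tweet.split() if w.startswith("#")]

def concept_space(tweets):
    """Count hashtag co-occurrences across tweets; pairs that appear
    together often hint at related concepts in the stream."""
    pairs = Counter()
    for t in tweets:
        for a, b in combinations(sorted(set(hashtags(t))), 2):
            pairs[(a, b)] += 1
    return pairs

# Invented sample tweets (illustration only):
tweets = [
    "#mcgill ghostwriting story is everywhere #ethics",
    "another take on the #mcgill scandal #ethics #pharma",
    "#pharma funding of research #ethics",
]
print(concept_space(tweets).most_common(2))
```

Applied to thousands of real time messages, the same counting would surface the clusters a marketer could steer.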

What happens if a marketer combines a sci tech marketing piece like the Sherwin write up with a conversation directing Twitter tool? My initial thought is that a marketer can control and shape how information is positioned. With a little bit of tinkering, the information in the marketing piece can be disseminated widely, used to start a conversation, and with some nudges in a Twitter thread directed.

I am going to do some more thinking about the manipulation of perception possible with marketing materials and Twitter threads. What can an information consumer do to identify these types of disinformation tactics? I don’t have an answer.

Stephen Arnold, August 25, 2009

Grokker Status

August 24, 2009

This news stuff ruffles the addled goose’s feathers. My observation that Grokker was not answering its telephone prompted a response on the Beyond Search Web log. The author is Randy Marcinko. Here is his response in full:

Let me clarify the purported mystery…. As many of you know, Groxis had gone through tumultuous times following the dot.com days. Having survived the dot.com generation as a company able to create glowing expenses, it needed to learn how to come to terms with revenue generation. My predecessor (Brian Chadbourne) and I attempted to right the ship and seek out the best path forward.

I took over as Groxis’ CEO in September of 2007 and it became almost immediately apparent that Groxis’ sweet spot was and is with content creators and aggregators–publishers large and small, traditional aggregators, syndicators and others of the content world. This is a group of clients who have a need, for whom Groxis is compelling and a “need-to-have,” not “nice-to-have” purchase. They are also a group of prospects with sales cycles that are manageable for a small company. So we moved down that path. With a great team we were able to make quick changes to the product, making it more vital and current. We jettisoned many old product lines in favor of a short list to whom we had the resources to sell. The results were great and we were on track to a cash flow positive Q4 of 2009.

Unfortunately, in Q2 of 2008, we were also on track to close a Series D round of funding, necessary to allow Groxis to move quickly enough to succeed. The round was all but completed in Q3 along with the onset of the economic downturn. With the change in the economy our Series D investors decided that it was not feasible to continue with that financial plan. This was a reality, despite a rich pipeline and refurbished products.

Thanks to a diligent and hardworking team at Groxis, we did our best through 2008, but by the end of Q1 of 2009 the only feasible next step was to close down the current operation. We closed down the day-to-day operation in March 2009. Since that time I have been negotiating with possible acquirers and investors. We have had a great response; only time will tell whether a long term solution will emerge.

As information becomes available, I will post it.

Stephen Arnold, August 24, 2009

Microsoft and SEO Optimization

August 23, 2009

Whilst poking around for the latest Microsoft search information, I came across a Web site called Internet Information Services at www.iis.net. I was curious because the write up on the Web site said:

The Site Analysis module allows users to analyze local and external Web sites with the purpose of optimizing the site’s content, structure, and URLs for search engine crawlers. In addition, the Site Analysis module can be used to discover common problems in the site content that negatively affects the site visitor experience. The Site Analysis module includes a large set of pre-built reports to analyze the sites compliance with SEO recommendations and to discover problems on the site, such as broken links, duplicate resources, or performance issues. The Site Analysis module also supports building custom queries against the data gathered during crawling.
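The broken-link check mentioned in that description is easy to picture. The following is a hypothetical, minimal sketch (my illustration, not Microsoft’s implementation) that scans a site, represented here as an in-memory map of URL to HTML, for links pointing at pages that do not exist:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def find_broken_links(site):
    """Walk an in-memory 'site' (url -> html) and report links
    that point at pages the site does not contain."""
    broken = []
    for url, html in site.items():
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            if link not in site:
                broken.append((url, link))
    return broken

# Hypothetical two-page site with one dead link:
site = {
    "/index.html": '<a href="/about.html">About</a> <a href="/missing.html">?</a>',
    "/about.html": '<a href="/index.html">Home</a>',
}
print(find_broken_links(site))  # -> [('/index.html', '/missing.html')]
```

A real site-analysis tool crawls over HTTP and layers on duplicate-content and performance checks, but the core bookkeeping is this simple.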

The word “experience” did it. I zipped to whois and learned that the site is Microsoft’s. The registrar is an outfit called CSC Protect-a-Brand. Microsoft does not want to let this URL slip through its hands, I assume. You can download the tool here.

What interested me was that Microsoft has written the description of the tool without reference to its own Web indexing system. Furthermore, the language is generic, which leads me to believe that this extension and the other nine in the category “Search Engine Optimization Toolkit” apply to Google as well.

If you are an SEO wizard and love the Microsoft way, you will want to download and experiment with these tools. Testing might be a good idea. If the tools work equally well for Bing.com and Google.com, has Microsoft emulated the Google Webmaster guidelines? If not, what will be the impact on a highly ranked Google site? With 75 to 85 percent of Web search traffic flowing to Google.com, an untoward tweak might yield interesting effects.

Stephen Arnold, August 23, 2009

Convera and the Bureau of National Affairs

August 22, 2009

A happy quack to the reader who sent me a Tweet that pointed to the International HR Decision Support Network: The Global Solution for HR Professionals. You can locate the Web site at ihrsearch.bna.com. The Web site identifies the search system for the site as Convera’s. Convera has morphed or been absorbed into another company. This “absorption” struck me as somewhat ironic because the Convera Web site carries a 2008 white paper by a consulting outfit called Outsell. You can read that Convera was named by Outsell as a rising star for 2008. Wow! I ran a query for executive compensation in “the Americas” and these results appeared:

[Screenshot: BNA search results from the Convera-powered system]

The most recent result was dated August 14, 2009. Today is August 21, 2009. It appears to me that the Convera Web indexing service continues to operate. I was curious about the traffic to this site. I pulled this Alexa report, which suggests that the daily “reach” of the site is almost zero percent.

[Screenshot: Alexa traffic report for ihrsearch.bna.com]

Compete.com had no profile for the site.

I think that the human resources field is one of considerable interest. My recollection is that BNA has had an online HR service for many years. I could not locate much information about the Human Resource Information Network that originally was based in Indianapolis.

Convera appears to be providing search results to BNA, and BNA has an appetite for an online HR information service. The combination, however, seems to be a weak magnet for traffic. Vertical search may have some opportunities. Will Convera and BNA be able to capitalize on them?

But with such modest traffic I wonder why the service is still online. Anyone have any insights?

Stephen Arnold, August 21, 2009

More Local Loco Action: Guardian UK Gets the Bug

August 21, 2009

Short honk: I don’t want to dig too deeply into the efforts of a traditional newspaper company to get more traction in the Webby world. You will want to read Online Journalism Blog’s “The Guardian Kicks Off the Local Data Land Grab” and ponder the implications of the write up. The idea is that a newspaper wants to hop on the hyper local toboggan before the run dumps the sledder into the snow at the base of the mountain. Mr. Bradshaw, the author of the article, wrote:

Now The Guardian is about to prove just why it is so important, and in the process take first-mover advantage in an area the regionals – and maybe even the BBC – assumed was theirs. This shouldn’t be a surprise to anyone: The Guardian has long led the way in the UK on database journalism, particularly with its Data Blog and this year’s Open Platform. But this initial move into regional data journalism is a wise one indeed: data becomes more relevant the more personal it is, and local data just tends to be more personal.

For whatever reason, hyper local information is getting as much attention as real time search and Twitter. I wish the Guardian good luck with scaling, monetizing, and marketing. My thought is that the hyper local crowd will want to move quickly before Googzilla wanders through this information neighborhood. Finding is a big part of the local information challenge. The deal breaker will be monetizing. The Guardian may well have the most efficient monetization method known to man. I hope so. The Google’s a good monetizer too.

Stephen Arnold, August 21, 2009

Twitter Stream Value

August 19, 2009

Short honk: I want to document the write up in Slashdot “Measuring Real Time Public Opinion With Twitter.” The key point for me was that University of Vermont academics are investigating nuggets that may be extracted from the fast flowing Twitter stream of up to 140 character messages. No gold bricks yet, but the potential for high value information seems to warrant investigation.

Stephen Arnold, August 19, 2009

Readers Digest Enters Intensive Care

August 18, 2009

The Readers Digest bankruptcy, reported in the Baltimore Sun’s “Reader’s Digest Bankruptcy Report”, did not surprise me. This outfit had a clever business model and some very confident executives. I interacted with the Readers Digest when it bought the Source, one of the early online services.

For me, the Readers Digest had a money machine with its “association” model when my grandmother subscribed to the chubby little paperback-book-sized monthly stuffed with recycled content. I liked the “digest” angle. The notion that my busy grandmother could not read the original article amused me. She was indeed really busy when she was in her 70s. What she liked was that the content was sanitized. The jokes were clean and usually not subject to double entendre.

The Readers Digest recognized that the Source was a harbinger and made the leap into electronic information with the now moribund Control Data Corporation. The step was similar to an uncoordinated person’s jump off the high dive. The Readers Digest knocked its head on the Source deal and dropped off my online radar.

Now the Readers Digest is blazing a new trail for magazine publishers: chopping the number of issues published per year, cutting its circulation guarantee, and learning to love bankruptcy attorneys. Which magazine will be next? Oh, I know the leadership of the dominant magazine companies will chase crafts, home decoration, and Future Publishing’s book-a-zine model. New thinking and new methods are needed to save the traditional magazine, a group eager to turn back the clock to the glory days of the Saturday Evening Post. Like Babylonian clay tablets morphing into Home Sweet Home ceramic wall hangings, magazines will survive. But the market is moving beyond the old information delivery vehicles, and the 1938 Fords are struggling to keep pace with Twitter “tweets”.

Here is a comment by Charlie appended to the Baltimore Sun article: “Still interesting to thumb through, but reprinting articles that were already published – how long ago? – is not a good model for those who make regular use of the Internet.” Well said.

Stephen Arnold, August 18, 2009

Bing Cherries Ripen Slowly

August 18, 2009

Short honk: Dan Frommer (Silicon Valley Insider) reported that “Bing Search Share Rises Modestly in July”. He said, “Bing’s share was 8.9 percent, up from 8.4 percent in June”. Because online usage is seasonal, any growth in the summer months is a positive. Mr. Frommer points out that Yahoo’s search share is heading south and that Yahoo has to grow that share because “Yahoo will only get revenue from Bing searches performed on Yahoo.” Three quick observations:

  1. Yahoo continues to struggle to make its services visible. I have to do a lot of clicking to see current email messages.
  2. Yahoo’s search technologies may have been also-rans to Google’s, but I find the different search interfaces and the unpredictable results when searching for computer gear annoying. I had to write one SSD vendor, Memory Suppliers, to locate the product on the vendor’s Yahoo store. When I located the product on Yahoo, it was priced at more than $1,000. Error, or a merchant trying to skim the unknowing?
  3. I ran the query “iss weather photos” for an article I had to write yesterday for the International Online Show’s Web log. I did not get International Space Station snaps of weather systems. I got hits to backyard weather stations. Google delivered what I needed. I settled on using images from USA.gov, which uses the Bing.com system. Yahoo’s image search was less useful than Bing’s and Google’s.

I have made this statement before: Yahoo is a floundering AOL. Instead of Yahoo buying AOL, maybe AOL should buy Yahoo. AOL is trying to become a content generation company. I still am not sure what the Yahooligans are doing. I don’t think it is search, and that is going to prove to be a misstep.

Stephen Arnold, August 18, 2009

Google in Jeopardy

August 18, 2009

Two heavyweights in the search expertise department have concluded that Google may be vulnerable to Microsoft and Yahoo once their search systems are combined. And if Googzilla is not actually in trouble, these experts think it could at least be cornered and its market share reduced.

Let’s look at what the search experts assert, using new data from online monitoring and analytics vendors.

Search Engine Land, a publication focused on search engine optimization or SEO, ran “Report: MicroHoo Penetration Near Google’s, Google Users Most Loyal”. Greg Sterling analyzed the comScore data. He noted that Google had “65 percent of the search volume in the US”. He added that “[the data] shows that 84 percent of search users are on Google.” Then he inserted the killer comment for those who want Google neutered:

However 73.3 percent of the search user population are on Yahoo and Microsoft, when the two are combined.

He pointed out that Google is a habit. He closed the analysis with the comment:

The new conventional wisdom is that people simply use Google because they’re familiar with it and have become habituated to using it. But I suspect that explanation doesn’t really capture what’s going on.

My question is, “What is going on?” You can look at his presentation of the comScore data and draw your own conclusions.

Along the same line of reasoning, the New York Times weighed in with “The Gap between Google and Rivals May Be Smaller than You Think”. Miguel Helft wrote:

ComScore found that for the combined Yahoo-Microsoft, “searcher penetration,” or the percentage of the online population in the United States that uses one of those search engines, is 73 percent. Google’s searcher penetration is higher, but not by that much: at 84 percent.

Again the foundation of the argument is comScore. Mr. Helft concluded with what I found to be a surprising direction:

“The challenge will be to create a search experience compelling enough to convert lighter searchers into regular searchers which is generally easier than converting new users,” Eli Goodman, comScore Search Evangelist, said in a press release. “Though clearly easier said than done, if they were to equalize the number of searches per searcher with Google they would command more than 40 percent market share.” That suggests Microsoft may want to spend more of its money improving Bing, rather than on marketing Bing. Spending on both, of course, can’t hurt.

So, two search experts see the comScore data as evidence of an important development in Web searchers’ behavior. And the New York Times is offering Microsoft marketing advice. I recognize the marketing savvy of the New York Times and I think that the New York Times should tell Microsoft what to think and do.

Three questions flapped through the addled goose’s mind:

  1. Are the comScore data accurate? I did not see any information from other Web traffic research firms, nor did I see any data from either Google or Microsoft. My recollection is that any data about Web traffic and user behavior has to be approached with some care. Wasn’t there that notion of “margin of error”?
  2. What is the projected “catch up” time for Microsoft and Yahoo, once the firms’ search businesses are combined? My analyses suggest that the challenge for Microsoft is “to get there from here”. The “there” is a significant chunk of Google market share. The “here” is the fact that the Bing.com traffic data are in early days, influenced by “pay for search schemes”, and research methods that have not been verified.
  3. Are the search experts working hard to create a story based on a consulting firm’s attempt to generate buzz in the midst of the August doldrums?
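On the margin-of-error point: for a survey-style share estimate, a rough 95 percent margin of error can be computed from the reported proportion and the panel size. The panel size below is hypothetical (comScore does not state one in these write ups); the point is only that a reported share carries sampling uncertainty:

```python
import math

def margin_of_error(share, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion
    (z = 1.96 corresponds to a 95% confidence level)."""
    return z * math.sqrt(share * (1 - share) / n)

# Hypothetical panel of 2,000 users reporting a 65 percent share:
moe = margin_of_error(0.65, 2000)
print(f"65% +/- {moe * 100:.2f} points")  # prints "65% +/- 2.09 points"
```

With a small panel, a two point swing in reported share can sit entirely inside the noise.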

Experts like Search Engine Land and the New York Times know their material and are eminently qualified as consultants in information retrieval. What keeps nagging me is that pesky 80 percent Google market share and the 11 years of effort necessary for Google to achieve that share. I am no expert. I see a lot of work ahead for Microsoft.

Stephen Arnold, August 18, 2009
