Vignette: Web Content Management Wackiness
May 9, 2009
Years ago I was hired to poke into a content management system that was costing more money than the bean counters had projected. I recall reading documents, looking at the contents of servers, and talking with in-house information technology professionals as well as various content management experts. I came away from the task with a sense that a CMS was less a product and more a work in progress. Because the client had written vague requirements, the vendor was not responsible for the cost overruns.
Then on a similar job in another country I learned that the same vendor was getting push back due to unexpected costs. Since the chatter was the type of baloney that gets tossed around the watercooler, I filed the information in the uncertain cabinet in my mind.
Different vendors were involved in these two content management projects but in both cases, the cart carrying apples fell over. Over the last few years, I have watched as content management vendors raced to position themselves as more than a lightweight service to keep a Web site ship shape.
Vignette 2005
One of the companies that moved quickly from hot niche to hot niche was Vignette. I noted that the company was named a “leader” by a research firm that prepares the equivalent of a horse race tip sheet. I wondered how a company could master information domains described in this way here.
Vignette’s software and expertise help organizations harness the power of information and the Web for measurable improvements in business efficiency. As the efficiency experts, Vignette (Nasdaq:VIGN) helps organizations increase productivity, reduce costs, improve user experiences and manage risk. Vignette’s intranet, extranet and Internet solutions incorporate portal, integration, enterprise content management and collaboration capabilities that can rapidly deliver unique advantages through an open, scalable and adaptable architecture that integrates with legacy systems. Vignette is headquartered in Austin, Texas, with local operations worldwide. Visit http://www.vignette.com to see how Vignette customers achieve measurable improvements in business efficiency and to find out why more companies prefer Vignette.
The description carries a strong message that Vignette was into efficiency, various nets (Intra, Extra, and Inter), enterprise content, collaboration, portals, etc.
Vignette 2007
By 2007, Vignette described itself in this way here:
Vignette helps organizations improve interactions with customers and partners by delivering highly personalized, interactive online experiences. Vignette’s early content management and delivery tools laid the groundwork for some of the Web’s most popular sites. Today, Vignette’s award-winning Next-Generation Web solution powers some of the world’s most recognizable online brands and enables organizations to have more meaningful interactions with their customers and associates. Vignette’s Imaging and Workflow solution adds the ability to deliver and manage online and offline document-driven customer transactions. Vignette is headquartered in Austin, Texas with operations worldwide. Visit www.vignette.com
The company had added workflow and imaging and tweaked its description to emphasize user experience and meaningful interactions.
Vignette 2009
By 2009, Vignette had morphed again here:
Vignette provides software and services that deliver the Web’s most dynamic user experiences. The Vignette Web Experience brings rich media and engaging content to life for the world’s greatest brands. Our Web Experience Management solutions help improve interactions by delivering highly personal, social, and multi-channel online experiences. We help organizations win and retain customers by delivering custom content anytime, anywhere, to any Internet-enabled device.
These descriptions omit Vignette’s push into eDiscovery, one of the hottest sectors for some companies in the content and text processing sector.
In the news release announcing the deal, Open Text specifically refers to Vignette in this way here:
Vignette has an enviable customer base, deep expertise in Web Content Management (WCM) and global distribution capabilities. Vignette customers will benefit from Open Text’s expanded ECM solutions portfolio as well as their Vignette products being supported by the world’s largest independent ECM solutions provider…
Performance
Here’s what Vignette’s financial performance looks like from 2005 to 2008:
- Revenue down from $191 million in FY2005 to $170 million in 2008 here
- Income down from $22 million pretax in 2005 to a loss of about $4.8 million here
- Cash down from $197 million in 2005 to $132 million in 2008 here.
The downward drift continued in 2009. More detail is here.
Observations
My thoughts on the deal are subject to change, of course. As of May 8, 2009, my view is:
- Vignette had a good day because the trend for the company was downwards, not upwards
- Open Text has duplicative products and a tough job ahead to deal with revenue and customer issues
- The repositioning of the company since 2005 underscores the thin ice on which content management companies are skating.
Will Open Text convert its $300 million buy into a home run? Looks like a long shot from where I paddle in my duck pond for these reasons:
- Duplicative products make it tough to control costs. Open Text may find itself facing a cost ramp that its financial team may not be able to manage
- The bloom is off the content management rose. Licensees realize the cost and complexity of creating and making effective use of CMS. Consultants make a great living running self-help courses and providing tips on damage control, but the problems are increasing as the financial climate remains gloomy.
- More than Web sites require digital information. Systems to cope with rich media, real time messaging, and specific demands of litigation highlight the immaturity and crankiness of some of the highest profile CMS brands.
My call: a horse with a penchant for balking is now in the Open Text stable. The fancy wordsmithing has not worked for Vignette, perhaps due to deeper technical and managerial weaknesses. Search remains a challenge for Vignette licensees in my experience as well. Wow.
Stephen Arnold, May 9, 2009
Microsoft and Search: Interface Makes Search Disappear
May 5, 2009
The Microsoft Enterprise Search Blog here published the second part of an NUI (natural user interface) essay. The article, when I reviewed it on May 4, had three comments. I found one comment as interesting as the main body of the write up. The author of the remark that caught my attention was Carl Lambrecht, Lexalytics, who commented:
The interface, and method of interaction, in searching for something which can be geographically represented could be quite different from searching for newspaper articles on a particular topic or looking up a phone number. As the user of a NUI, where is the starting point for your search? Should that differ depending on and be relevant to the ultimate object of your search? I think you make a very good point about not reverting to browser methods. That would be the easy way out and seem to defeat the point of having a fresh opportunity to consider a new user experience environment.
Microsoft enterprise search Web log’s NUI series focuses on interface. The focus is Microsoft Surface, which allows a user to interact with information by touching and pointing. A keyboard is optional, I assume. The idea is that a person can walk up to a display and obtain information. A map of a shopping center is the example that came to my mind. I want to “see” where a store is, tap the screen, and get additional information.
This blog post referenced the Fast Forward 2009 conference and its themes. There’s a reference to EMC’s interest in the technology. The article wraps up with a statement that a different phrase may be needed to describe the NUI (natural user interface), which I mistakenly pronounced like the word ennui.
Microsoft Surface. Image Source: http://psyne.net/blog4/wp-content/uploads/2007/09/microsoftsurface.jpg
Several thoughts:
First, I think that interface is important, but the interface depends upon the underlying plumbing. A great interface sitting on top of lousy plumbing may not be able to deliver information quickly or, in some cases, present the information the user needs. I see this frequently when ad servers cannot deliver information. The user experience (UX) is degraded. I often give up and navigate elsewhere.
Demographics and Their Search Implications: Breathing Room for Online Dinosaurs
April 25, 2009
ReadWriteWeb.com’s “The Technology Generation Gap at Work is Oh So Wide” pointed to a study that I had heard about but not seen. A happy quack to RW2 for the link to the LexisNexis results here. RW2 does a good job of summarizing the highlights of the research, conducted for this unit of Reed Elsevier, the Anglo Dutch giant that provides access to US legal content in its for fee service. You can read Sarah Perez’s summary here.
I wanted to add three observations that diverge from the RW2 report and are indirectly referenced in the WorldOne Research 47 page distillation of the survey data and accompanying analysis. Keep in mind that the research is now about nine months old and aimed at a sample of those involved in the world’s most honorable profession, lawyering.
First, the demographics are bad news for the for fee vendors of online information. As each cohort makes its way from the Wii to the iPhone, the monetization methods, the expectations of the users, and the content forms themselves must be set up to morph without paying humans to fiddle.
Second, as I zoomed through the data, I came away convinced that lawyers’ perception of technology and mine are different. As a result, I think the level of sophistication in this sample is low compared to that of the goslings swimming in my pond filled with mine run-off water. The notion that younger lawyers are more technologically adept may reflect little more than an awareness of the iPhone, not next generation text and content processing systems.
Third, the overall direction of the survey and the results themselves make it clear that it will be a while before the traditional legal information sources are replaced by a gussied up Google Uncle Sam, but it will happen.
My conclusion is that LexisNexis got the reassurance it wanted from these data. Is that confidence warranted as law firms furlough or rationalize staff, face clients who put caps on certain expenses, and look at the lower cost legal services available in the land of outsourcing, India?
Stephen Arnold, April 25, 2009
OpenText and Endeca Tie Up: Digital Asset Management Play
April 17, 2009
OpenText has a six pack of search systems. There’s the original Tim Bray SGML search system (either the first or one of the first), the Information Dimensions BASIS (structure plus analytics, which we used for a Bellcore project eons ago), BRS Search (a rewrite of STAIRS III, in which I’m sure the newly minted search consultant who distributed a search methodology built on a taxonomy will have in-depth expertise), the Fulcrum engine (sort of Windows centric with some interesting performance metrics), and a couple of others that may or may not be related to the ones I’ve named. Endeca is a privately held vendor of search and content processing technology. I like the Endeca system for ecommerce sites where the “guided navigation” can display related products. Endeca has been working overtime to develop a business intelligence revenue stream and probe new markets such as traditional library search. The company received an infusion of cash last year, and I heard that the company had made strides in addressing both scaling and performance challenges. One reseller allegedly told a government procurement officer that Endeca had no significant limit on the volume of content that it could index and make findable.
So what are these two powerhouses doing?
According to Newsfactor here, the two companies are teaming up for digital asset reuse. Most organizations have an increasing amount of podcasts, videos, images, and other rich media. If you read my link tasty essay about content management (the mastodon) and the complexities of dealing with content objects in containers (tar pit), you know that there is an opportunity to go beyond search.
The Newsfactor story is called “Open Text, Endeca to Deliver Digital Asset Reuse”. My understanding of the Newsfactor version of the deal is that OpenText will integrate Endeca’s asset management system into OpenText content management systems. There are a number of product names in the write up, and I must confess I confuse them with one another. I am an old and addled goose.
What’s the implication of the tie up? I think that Autonomy’s push into asset management with its IDOL server and the Virage software has demonstrated that there’s money in those rich media objects that are proliferating like gerbils. The world of ediscovery has an asset twist as well. Videos and podcasts have to be located and analyzed either by software or a semi alert paralegal, maybe a junior lawyer. OpenText has a solid ediscovery practice, so there’s some opportunity there. In short, I think this tie up helps two established companies deal with a competitor who is aggressive and quicker to seize enterprise opportunities. Autonomy is a serious competitor.
What will Autonomy and other vendors do? I think that in this economic climate there will be several reactions to monitor. First, Autonomy and probably Adobe will be quick to respond aggressively. Second, other vendors of search and content processing systems will shift their marketing messages. A number of search systems have this capability, and some, like Exalead, can make videos searchable with markers where particular passages can be viewed in the video object. This is quite useful. You can see a demo here. Third, I think that eDiscovery companies already adept at handling complex matters and content objects will become more price competitive. Stratify comes to mind as one outfit that may use price as a counter to the OpenText and Endeca tie up. I can point to start ups, aging me-too outfits like IBM, and a fair number of little known specialists in rich media who may step up their marketing.
This will be interesting to watch. OpenText is a bit like the old Ling Temco Vought type of roll up. Endeca is a solid vendor of search and content processing technology that was unable to pull off an initial public offering and a recipient of cash infusions from Intel and SAP’s venture arm. The expectation is that one plus one will equal three. In today’s market, there’s a risk that a different outcome may result.
Stephen Arnold, April 17, 2009
Ferris Research Content Processing Gossip
April 12, 2009
A happy quack to the reader who sent me a link to this thread about Autonomy. I did some url shaving and located a pointer page here that presents comments about eDiscovery vendors. I have no opinion about the validity of the comments, but I find the tone and themes interesting. Autonomy gets dinged for management. Clearwell merits a post that points out that license fees rose 25 percent in January 2009. Interesting service provided by Ferris Research here.
Stephen Arnold, April 12, 2009
Composite Software
April 12, 2009
I was asked about data virtualization last week. As I worked on a short report for the client, I reminded myself about Composite Software, a company with “data virtualization” as a tagline on its Web site. You can read about the company here. Quick take: the firm’s technology performs federation. Instead of duplicating data in a repository, Composite Software “uses data where it lives.” If you are a Cognos or BMS customer, you may have some Composite technology chugging away within those business intelligence systems. The company opened for business in 2002 and has found a customer base in financial services, military systems, and pharmaceuticals.
The angle that Composite Software takes is “four times faster and one quarter the cost.” The “faster” refers to getting data where it resides and as those data are refreshed. Repository approaches introduce latency. Keep in mind that no system is latency free, but Composite’s approach minimizes latency associated with more traditional approaches. The “cost” refers to the money saved by eliminating the administrative and storage costs of a replication approach.
The technology makes use of a server that handles querying and federating. The user interacts with the Composite server and sees a single-view of the available data. The system can operate as an enabling process for other enterprise applications, or it can be used as a business intelligence system. In my files, I located this diagram that shows a high level view of Composite’s technology acting as a data services layer:
A more detailed system schematic appears in the company’s datasheet, “Composite Information Server 4.6,” available here. A 2009 explanation of the Composite virtualization process is also available from the same page as the information server document.
The system includes a visual programming tool. The interface makes it easy to point and click through SQL query build up. I found the graphic touch for joins useful but a bit small for my aging eyeballs.
If you are a fan of mashups, Composite makes it possible to juxtapose analyzed data from diverse sources. The company makes available a white paper, written by Bloor Research, that provides a useful round up of some of the key players in the data discovery and data federation sector. You have to register before you can download the document. Start the registration process here.
Keep in mind that this sector does not include search and content processing companies. Nevertheless, Composite offers a proven method for pulling scattered, structured data together into one view.
Stephen Arnold, April 12, 2009
LexisNexis Sues Data Mining Wizard
April 6, 2009
LexisNexis is a unit of Reed Elsevier, the Anglo Dutch professional publishing outfit. In case you have been in a more remote area than Harrod’s Creek, you know that professional publishing companies, like their newspaper counterparts, are not in tall cotton. Publishing and its 16th century business model are out of step with the iPod equipped trophy generation cohorts. In short, there’s financial trouble in what once was a cash carnival.
I found the article “LexisNexis Sues Data-Mining Pioneer Henry Asher” here quite fascinating. Henry Asher sold a company called Seisint Technologies to the LexisNexis business mavens. The price? A reasonable pre crash, MBA crazed $775 million.
Flash forward from 2004 to 2009, and the Anglo Dutch giant is allegedly taking legal action against the data mining wizard for violating a non compete agreement which seems to have been part of the deal terms.
The Law.com write up digs into some details that make lawyers salivate, but the addled goose finds the tidbits tasteless. What the addled goose does find worth pondering are these questions:
- Hasn’t technology in data mining advanced in four or five years? If not, what’s the difference? If it has, why the fuss about old technology?
- Why hasn’t a company with Reed Elsevier’s resources dominated the data mining market? Perhaps implementation and sales ability are an issue, so customers want newer methods from those individuals who have been able to deliver in the past?
- What’s the benefit to Henry Asher, a pretty savvy innovator, to take actions that would anger a multi billion dollar outfit that seems unable to generate sufficient new revenue to offset losses in the dead tree side of the firm’s business?
I wonder if there is more to this story than a spat over a non compete. I recall a number of conversations about non compete agreements. I am confused about some aspects of such agreements. In a lousy economy, a person must have sufficient latitude to innovate, commercialize, and build a successful business. The entity demanding a non compete probably sees financial trouble ahead and fears the consequences of a wizard who can deliver. Who is right? Who is wrong?
The addled goose will have to wait and see what the legal process spits out. What this matter tells me is that tension and pressure are increasing at Reed Elsevier. Will some other problem burst to the surface? If I had more interest in the problems of professional publishing, I would poke into it. But in my view, these are ships modeled after the Titanics of the newspaper business. I will watch from the dock, thank you very much.
Stephen Arnold, April 6, 2009
ODNI Data Mining Report Available
March 8, 2009
If you want to keep a scorecard for data mining projects in some US government agencies, you may find the “Data Mining Report” (unclassified) interesting. You can download a copy here. You will need an acronym knowledgebase to make sense of some of the jargon.
For me, there were two interesting points:
- Video is a sticky wicket: lots of data and the tools are still evolving
- Coordination remains a challenge.
Enjoy.
Stephen Arnold, March 8, 2009
Metadata Perp Walk
March 5, 2009
I mentioned the problems of eDiscovery in a briefing I did last year for a content processing company. I have not published that information. Maybe some day. The point that drew a chuckle from the client was my mentioning the legal risk associated with metadata. I was reporting what I learned in one of my expert witness projects. Short take: bad metadata could mean a perp walk. Mike Fernandes’ “Think You’re Compliant? Corrupt Metadata Could Land You in Jail” here tackled this subject in a more informed way than my anecdote. He does a good job of explaining why metadata are important. Then he hits the marrow of this info bone:
Data recovery cannot be treated as the ugly stepsister of enterprise backup, and the special needs that ECM systems place on backup must not be ignored. Regulatory authorities and industry experts are beginning to demand more ECM- and compliance-savvy recovery management strategies, thereby setting new industry-wide legal precedents. One misstep can lead to disaster; however, there are approaches and ECM solutions that help avoid noncompliance, downtime and other incidents.
If you are floating through life assuming that your metadata are shipshape, you will want to make a copy of Mr. Fernandes’ excellent write up. Oh, and why the perp walk? Bad metadata can annoy a judge. More to the point, bad metadata in the hands of the attorney from the other side can land you in jail. You might not have an Enron problem, but from the inside of a cell, the view is the same.
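One way to avoid floating through life on that assumption is to fingerprint metadata at backup time and re-check it at recovery time, so silent corruption surfaces before opposing counsel finds it. The sketch below is an illustration under my own assumptions — the field names are invented and no ECM product works exactly this way:

```python
import hashlib
import json

def metadata_fingerprint(metadata: dict) -> str:
    """Canonical SHA-256 hash of a record's metadata."""
    # sort_keys makes the JSON serialization deterministic,
    # so the same metadata always yields the same fingerprint.
    canonical = json.dumps(metadata, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# At backup time: store the fingerprint alongside the content.
original = {"custodian": "jsmith", "created": "2009-01-15", "retention": "7y"}
stored_fingerprint = metadata_fingerprint(original)

# At recovery time: recompute and compare to detect corruption.
recovered = {"custodian": "jsmith", "created": "2009-01-15", "retention": "7y"}
assert metadata_fingerprint(recovered) == stored_fingerprint  # intact

# A single altered field -- the kind of change that annoys a judge --
# produces a different fingerprint and is caught immediately.
tampered = dict(recovered, retention="1y")
print(metadata_fingerprint(tampered) == stored_fingerprint)  # False
```

The design choice worth noting is the canonical serialization: hashing the raw bytes of a metadata store would flag harmless reordering as corruption, while hashing a sorted JSON form flags only genuine changes in the values.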
Stephen Arnold, March 5, 2009
Storage Rages
March 5, 2009
ComputerWorld’s “Virtualization the Top Trend over the Next 5 Years” here underscores a potential opportunity that most traditional search and content processing vendors won’t be able to handle with their here-and-now solutions.
“Storage technology is similar to insurance in the financial services industry. In times of a recession, you have to manage your risk. Storage protects what you have and reduces risk,” said Steve Ingledew, managing director of Millward Brown Research’s Technology Practice.
What is interesting about this quote from the ComputerWorld article is that storage itself becomes a risk. Are most search and content processing systems up to the task of managing massive repositories of digital information? The answer, in my opinion, is, “Sort of.” Autonomy moved to buy Interwoven to bolster its enterprise information and eDiscovery footprint. Specialists such as Clearwell Systems and Stratify (Iron Mountain) are farther along than most search and content processing companies. But when the volume of data gets into the tera and peta range, the here-and-now systems may not be up to the task.
With storage booming, there are some major opportunities for companies such as Aster Data, InfoBright, and Perfect Search. Unfamiliar with these companies? One may become the next big thing in data management. Google was on my short list, but the company seems to have lost some zip in the last 12 months. Amazon? At its core it is still an ecommerce vendor and not set up to handle the rigors of spoliation. Storage rages forward.
Stephen Arnold, March 5, 2009