Forrester Lets the Social Cat Out of The Bag

May 18, 2012

Enterprise Communications recently reported on social media monitoring in the article “Forrester’s Evaluation on Social Media Monitoring.”

According to the article, the Forrester report evaluated nine vendors against criteria grouped into three categories: current offering, strategy, and market presence. The research revealed some potential changes and considerations businesses should take into account when implementing a social media monitoring strategy.

The results revealed:

According to Forrester (whose rankings, it cautions, should be used as a guide only), Radian6 and Visible Technologies lead the market, principally on the strength of two aspects: dashboards that are broad in functionality and road maps that are innovative in design. Forrester’s evaluation also rated Attensity, NM Incite, Converseon, SDL, Networked Insights, and Synthesio.

Unfortunately these “listening tools” only gather data. The companies are then forced to decide what to do with that data. I wonder if the coverage is free or pay to play? Real consulting firms and real analysts do everything for intellectual rewards.

Jasmine Ashton, May 18, 2012

Sponsored by PolySpot

New Survey Asserts Google+ Has Weak Social Engagement

May 18, 2012

Fast Company recently published a story on some disconcerting results from a new RJ Metrics report on the “social spine” of Google. In the article, entitled “Exclusive: New Google+ Study Reveals Minimal Social Activity, Weak User Engagement,” writer Austin Carr reveals that, after examining 40,000 randomly selected Google+ users, the researchers found the search giant’s social network to have minimal social activity and weak user engagement.

According to the article, the report found that the average post on Google+ receives less than one +1, less than one reply, and less than one re-share. Also, roughly 30% of users who make a public post never make a second one, and even after making five public posts, there is a 15% chance that a user will not post publicly again.
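
For readers curious how figures like these are arrived at, here is a minimal, hypothetical sketch of the arithmetic: given a sample of public posts, it computes the average +1s, replies, and re-shares per post, plus the share of users whose first public post was also their last. The data and field names are illustrative assumptions, not RJ Metrics’ actual data or methodology.

```python
from collections import defaultdict

# Made-up sample records: (user_id, post_id, plus_ones, replies, reshares)
public_posts = [
    ("u1", "p1", 0, 1, 0),
    ("u1", "p2", 2, 0, 0),
    ("u2", "p3", 0, 0, 0),
    ("u3", "p4", 1, 0, 1),
]

def engagement_summary(posts):
    n = len(posts)
    avg_plus_ones = sum(p[2] for p in posts) / n
    avg_replies = sum(p[3] for p in posts) / n
    avg_reshares = sum(p[4] for p in posts) / n

    # Share of public posters whose first public post is also their last.
    posts_per_user = defaultdict(int)
    for user_id, *_ in posts:
        posts_per_user[user_id] += 1
    one_and_done = sum(1 for count in posts_per_user.values() if count == 1)
    share_never_post_again = one_and_done / len(posts_per_user)

    return {
        "avg_plus_ones": avg_plus_ones,
        "avg_replies": avg_replies,
        "avg_reshares": avg_reshares,
        "share_never_post_again": share_never_post_again,
    }

print(engagement_summary(public_posts))
```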

A Google spokesperson countered these results with a statement saying:

“By only tracking engagement on public posts, this study is flawed and not an accurate representation of all the sharing and activity taking place on Google plus. As we’ve said before, more sharing occurs privately to circles and individuals than publicly on Google+. The beauty of Google+ is that it allows you to share privately–you don’t have to publicly share your thoughts, photos or videos with the world.”

Since the survey only looked at users’ public profiles, we can’t be completely sure of the accuracy of its results. However, what we can be sure of is that Google needs to step up its social game because there are some skeptics out there.

Jasmine Ashton, May 18, 2012

Sponsored by PolySpot

Attivio Offers a Spin on the Big Data Bandwagon

May 18, 2012

Attivio, a software company specializing in enterprise search solutions and unified information access, recently published an informative blog post called “How We Handle Open Source.”

According to the post, many open source products, particularly those put out by the Java community, are responsible for some of today’s hottest technology trends. Unfortunately, they also come with plenty of bugs and functional gaps, so companies need a system in place to handle these issues.

When Attivio encounters an issue with open source code, the company follows a series of steps to check for possible solutions and then:

“Lastly, we strive to create a formal ticket, patch and test for contribution back to the open source community. In all fairness, this is our weakest part of the process, but one we are striving to improve. Many of our changes are small in nature and fix either esoteric edge cases or general code cleanliness like the thread example I mentioned above, but most changes are still useful for the wider community.”

Every company is different, but what they all have in common is the need for a troubleshooting plan to address the issues that can arise from using open source products. Attivio’s approach is an excellent example of this.

Jasmine Ashton, May 18, 2012

Sponsored by PolySpot

MarkLogic: The Door Revolves

May 17, 2012

MarkLogic hit $55 or $60 million. Not good enough. Exit one CEO; enter an Autodesk exec. Hit $100 million. Not good enough. Enter a new CEO. Navigate to “Former senior Oracle exec Gary Bloom named CEO of Mark Logic.” If I understand the write up, the new CEO is either going to grow the outfit or get it sold. Here’s a passage which caught my attention:

Gary Bloom has been named CEO of Mark Logic, which returns him to his database roots.

According to MarkLogic’s Web page, the company is:

an enterprise software company powering over 500 of the world’s most critical Big Data Applications with the first operational database technology capable of handling any data, at any volume, in any structure.

However, I can download a search road map. Hmmm. I thought search was dead. Well, big data search is where the action is. MarkLogic is pushing forward with its XML data management system.

Stephen E Arnold, May 17, 2012

Sponsored by HighGainBlog

Google and Going Beyond Search

May 17, 2012

The idea for this blog began when I worked through selected Ramanathan Guha patent documents. I analyzed these in my 2007 study Google Version 2.0. If you are not familiar with them, you may want to take a moment, download these items, and read the “background” and “claims” sections of each. Here are several filings I found interesting:

  • US 2007/0038600
  • US 2007/0038601
  • US 2007/0038603
  • US 2007/0038614
  • US 2007/0038616

The utility of Dr. Guha’s invention is roughly similar to the type of question answering supported by Wolfram Alpha. However, there are a number of significant differences. I explored these in the chapter “Google and the Programmable Search Engine” in The Google Legacy.

I read with interest the different explanations of Google’s most recent enhancement to its search results page. I am not too eager to highlight “Introducing the Knowledge Graph: Things, Not Strings” because it introduces terminology which is more poetic and metaphorical than descriptive. Nevertheless, you will want to take a look at how Google explains its “new” approach. Keep in mind that some of the functions appear in patent documents and technical papers which date from 2006 or earlier. The question this raises is, “Why the delay?” Is the roll out strategic, timed to have an impact on Facebook at a critical point in that company’s timeline, or is it evidence that Google experiences “big company friction” when it attempts to move from demonstration to production implementation of a mash up variant?

In the various analyses by experts, “real” journalists, and folks who are fascinated with how Google search is evolving, I am concerned that some experts describe the additional content as “junk” while others view the new approach as “firing back at Bing.”

You must reach your own conclusion. However, I want to capture my observations before they slip from my increasingly frail short term memory.

First, Google operates its own way and in a “Google bubble.” Because the engineers and managers are quite intelligent, clever actions and economy are highly prized. Therefore, the roll out of the new interface tackles several issues at one time. I think of the new interface and its timing as a Google multiple warhead weapon. The interface takes a swipe at Facebook, Bing, and Wolfram Alpha. And it captures linkage, wordage, and puffage from the experts, pundits, and wizards. So far, all good for Google.

A MIRV deployment. A single delivery method releases a number of explosive payloads. One or more may hit a target.

Second, the action reveals that Google *had* fallen behind in relevancy, inclusion of new content types, and generating outputs which match the “I have no time or patience for research” user community. If someone types Lady Gaga, the new interface delivers Lady Gaga by golly. Even the most attention deprived Web or mobile user can find information about Lady Gaga, click, explore, and surf within a Guha walled garden. The new approach, in my view, delivers more time on Google outputs and increases the number of opportunities to display ads. Google needs to pump those ads for many reasons, not the least of which is maintaining revenue growth in the harsh reality of rising costs.

Third, the approach allows Google to weave in social features, or at least make a case to advertisers that it is getting on its social pony, collecting more fine grained user data, and offering a “better search experience.” The sales pitch side of the new interface is part of Google’s effort to win and retain advertisers. I have to remind myself that some advertisers are starting to realize that “old fashioned” advertising still works for some products and concepts; for example, space advertising in certain publications, direct mail, and causing mostly anonymous Web surfers to visit a Web site and spit out a request for more information or, better yet, buy something.

The new interfaces, however, are dense. I point out in the Information Today column which runs next month that the density is a throwback to the portal approaches of the mid 1990s. There are three columns, dozens of links, and many things with which to entice the clueless user.

In short, we are now in the midst of the portalization of search. When I look for information, I want a list of relevant documents. I want to access those documents, read them, and in some cases, summarize or extract factoids from them. I do not want answers generated by someone else, even if that someone is tapping into the formidable intelligence of Ramanathan Guha.


So Google has gone beyond search. The problem is that I don’t want to go there via the Google, Bing, or any other intermediary’s intellectual training wheels. I want to read, think, decide, and formulate my view. In short, I like the dirty, painful research process.

Stephen E Arnold, May 17, 2012

Sponsored by PolySpot

IBM Stakes Territory in Big Data Field

May 17, 2012

IBM is collecting big data vendors in order to improve its position in that arena, Forbes tells us in “Big Blue Makes Big Moves in Big Data.” Varicent and Vivisimo have already been acquired, and Tealeaf Technology is next. Deal details have not been disclosed.

Sales data analysis company Varicent will be used to boost IBM’s Smarter Analytics Signature Solutions division. The multi-format data capture experts at Vivisimo will contribute about 120 workers to the Software Group. Tealeaf Technology specializes in customer experience analytics software; IBM hopes its skill set will reveal patterns with which to streamline online customer experiences.

Forbes contributor Trefis Team writes:

“With these acquisitions, IBM hopes to extend its analytics and Smarter Commerce initiative by providing marketing executives, and e-commerce and customer service professionals with real-time insights into customer behavior.

“These acquisitions will also help expand IBM’s data analytics offering which is already the broadest in the industry spanning software, hardware and services. The Watson system developed by IBM Research is a key innovation in this space, and over a hundred partners have adopted its Big Data platform already. By adding to its portfolio the targeted search and aggregation capabilities of these acquisitions, IBM is well on its way to become the standard in Big Data analytics.”

I wouldn’t put it past them, and Watson is indeed a huge set of points on their scoreboard.

We notice that IBM now describes Vivisimo as “information optimization” rather than with the somewhat confusing “big data vendor” label. But what is information optimization? We don’t know. We don’t think Forbes does either.

Cynthia Murrell, May 17, 2012

Sponsored by PolySpot

IBM Big Data Initiative Coming in Focus with Cloudera, Hadoop Partnerships

May 17, 2012

Big data management and analytics is becoming a key basis of competition as organizations look to turn their complex and large data sets into business assets. In “Analyst Commentary: IBM Adds Search and Broadens Hadoop Strategy with Big Data,” Stuart Lauchlan comments on IBM’s Vivisimo acquisition. Lauchlan says that the acquisition puts to rest the ambiguity of IBM’s Hadoop partnership strategy. He also has this to add about handling big data:

By definition, one of the major problems in discovering the information “nuggets” in Big Data environments is that the volume of data is large and consequently difficult to traverse or search using traditional enterprise search and retrieval (ESR) tools that require the creation and maintenance of indexes before a query can be made. Vivisimo’s offering indexes and clusters results in real time, and its scalability enables dynamic navigation across results delivered, as well as the automation of discovery, reducing the burden/time of analysis.
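
To make Lauchlan’s distinction concrete, here is a minimal, hypothetical sketch of the general idea: query several repositories at once and group the merged hits at query time, rather than relying on a single pre-built index. This is not Vivisimo’s implementation; the source names, documents, and functions are invented for illustration.

```python
from collections import defaultdict

def search_source(source, query):
    # Stand-in for a per-repository query (intranet, wiki, CRM, ...); purely illustrative.
    return [doc for doc in source["docs"] if query.lower() in doc.lower()]

def federated_search(sources, query):
    # Query each repository and merge the hits rather than consulting one master index.
    results = []
    for source in sources:
        results.extend((source["name"], doc) for doc in search_source(source, query))
    return results

def cluster_results(results, stopwords=frozenset({"the", "a", "of", "for", "and"})):
    # Group merged hits by shared title terms at query time; terms shared by more
    # than one hit become the navigable clusters.
    clusters = defaultdict(list)
    for source_name, doc in results:
        for term in set(doc.lower().split()):
            if term not in stopwords:
                clusters[term].append((source_name, doc))
    return {term: hits for term, hits in clusters.items() if len(hits) > 1}

sources = [
    {"name": "intranet", "docs": ["Hadoop cluster sizing guide", "Hadoop security checklist"]},
    {"name": "wiki", "docs": ["Hadoop onboarding notes", "Vacation policy overview"]},
]
print(cluster_results(federated_search(sources, "hadoop")))
```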

Even though the value of the acquisition has not been disclosed, we do know that IBM has spent $14 billion in the last seven years on analytics-related products and companies. And while IBM has already acquired similar services in the past, it seems that IBM saw value in the search software’s newer capabilities, such as federated discovery and navigation. IBM is no doubt trying to take on SharePoint, the major player in enterprise search.

Lauchlan’s article is a comprehensive overview of the IBM strategy and may be worth a read to stay in the loop on enterprise search news. But while IBM seeks to develop a comprehensive search solution through big acquisitions, organizations can also turn to expert third party solutions to get the power of efficient, federated search now.

The search experts at Fabasoft Mindbreeze offer a cost-effective suite of solutions to tame big data sprawl and connect your users to the right information at the right time. And with Folio connectors, organizations can access on-premise and cloud data with one easy search. Here you can read about the enterprise search solution:

The data often lies distributed across numerous sources. Fabasoft Mindbreeze Enterprise gains each employee two weeks per year through focused finding of data (IDC Studies). An invaluable competitive advantage in business as well as providing employee satisfaction…But an all-inclusive search is not everything. Creating relevant knowledge means processing data in a comprehensible form and utilizing relations between information objects. Data is sorted according to type and relevance. The enterprise search for professionals.

Navigate to http://www.mindbreeze.com/ to learn more.

Philip West, May 17, 2012

Sponsored by Pandia.com

New Book Directs Enterprise Management in Product Lifecycle Implementation

May 17, 2012

In theory, implementing a successful product lifecycle management (PLM) program within an enterprise is logical and easy; it just makes sense. Unfortunately, executives often find themselves with an overwhelmed workforce of upper and middle level managers unsure how to best use the new PLM solution. Leslie O. Magsalay, a specialist in PLM implementation, has recently created a two-part book series aimed at helping professionals understand how to best utilize PLM solutions, as explained in the San Francisco Gate article, “The PMO Practice Executive Leslie O. Magsalay Aims to Improve Product Life Cycle Management In Her Latest Book”.

The article describes the book and its purpose as:

“The author guides executives through effective strategies used to manage the resources and products in their portfolio and the roles and responsibilities that turn organizations into idea factories. For Program Management professionals and product teams, this playbook highlights the roles, responsibilities and best practices in the product life cycle that assure successful product launch and portfolio management. The much broader audience of organizations outside of the Program Management Office will read this playbook to understand and integrate their activities in the common activities associated with product delivery.”

Magsalay’s books highlight the need for effective and prompt training of all employees within an enterprise who will be interacting with PLM software. Respected data management providers across the globe understand this need and work tirelessly with their clients to ensure that whatever training is required comes from professionals, not books. Before any corporation, small or large, decides on a PLM solution for its needs, a thorough look into the ongoing customer support and training practices of the PLM provider is absolutely necessary to avoid having to rely on a book for help in a crisis.

Catherine Lamsfuss, May 17, 2012

Are Big Data Firms Really the Next Big Thing?

May 17, 2012

If these wild and crazy numbers are to be believed, the rise of open source big data solutions is about to accelerate dramatically. Computerworld’s piece, “Explosive Growth Expected for Hadoop, MapReduce-Related Revenues: IDC,” shares conclusions of a recent report from analyst firm IDC. The paper projects that this market will grow from $77 million last year to nearly $813 million in 2016. Writer Chris Kanaracus notes:

“Hadoop has enjoyed a steady stream of interest from commercial analytics and database vendors in recent years, who have begun offering commercial products and services for it.”

That is true. Given IDC’s enthusiasm, one might be tempted to hurry and snap up some stock in Hadoop-related companies like Lucid Imagination. However, there are some caveats to this “next big thing.” The Economic Times cautions us about “Big Data, Big Opportunities, Big Myths.” The write up notes that, often, hot trends within the technology and investment communities:

“. . . misjudge their own importance and collapse dramatically. Big data is the latest of these movements. Tech firms created a buzzword and then a global movement that comprise hardware, software and services companies. Investors started jumping in recently and now the universities have started creating special courses. Yet the big data market is small compared to several other sectors.”

So, big data solutions will either soar and make investors rich, rich, rich, or . . . not so much.

Cynthia Murrell, May 17, 2012

Sponsored by PolySpot

Funnelback Demo Video

May 17, 2012

The Cloud Harvester is hosting a new demo video for Funnelback‘s flagship product. The video is short but sweet—in less than a minute and a half, it clearly conveys how to create a new faceted navigation filter in Funnelback Enterprise Search.
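
For readers who have not worked with faceted navigation, here is a minimal, generic sketch of what such a filter does conceptually: count the values of a metadata field across the result set, then narrow the results when a value is selected. It is an illustration only, not how Funnelback itself is configured; the fields and documents are invented.

```python
from collections import Counter

# Illustrative result set; the fields and values are invented for this example.
results = [
    {"title": "Annual report 2011", "format": "pdf", "department": "finance"},
    {"title": "Intranet style guide", "format": "html", "department": "comms"},
    {"title": "Budget summary", "format": "pdf", "department": "finance"},
]

def facet_counts(results, facet):
    # The counts shown next to each value in a faceted navigation pane.
    return Counter(doc[facet] for doc in results)

def apply_facet(results, facet, value):
    # Narrow the result list when a user selects a facet value.
    return [doc for doc in results if doc[facet] == value]

print(facet_counts(results, "format"))                # Counter({'pdf': 2, 'html': 1})
print(apply_facet(results, "department", "finance"))  # the two finance documents
```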

Funnelback grew from technology developed by CSIRO, Australia’s premier scientific research agency. The company was established in 2005 and was bought by UK content management outfit Squiz in 2009. It offers Enterprise and Website Search, both of which include customizable features. Both local and SaaS deployment options are available.

Regarding its Enterprise Search product, Funnelback’s Web site promises a comprehensive offering:

“Funnelback Enterprise makes information available via a single web interface in a timely, consistent and convenient manner, leading to faster, more informed decision-making. Funnelback Enterprise can search across a myriad of corporate content repositories including websites, intranets, shared drives, document management systems, email systems, SharePoint and databases.”

Curious about the name? Their About page shares this:

“The name Funnelback is a play on the name of two spiders; the funnel-web and the redback, both native to Australia. The name was also chosen because of Funnelback’s ability to rapidly ‘funnel’ relevant information ‘back’ to the user.” [links added]

Quirky, clever, and memorable. And quintessentially Australian, just like Funnelback.

Cynthia Murrell, May 17, 2012

Sponsored by PolySpot
