A Breakdown of E-Discovery and Hackgate
June 25, 2012
The E-Discovery 2.0 blog recently reported on more scandals surrounding the Murdoch organization and how they play into the larger history of E-Discovery crimes, in the article “The Demise of the News of the World: An Analysis of Hackgate Through an E-Discovery Lens.”
According to the report, 60 civil claims derived from Hackgate, the monumental phone hacking scandal that unfolded in July of last year, have been brought in the UK. In most cases, these have uncovered both conspiracy and the willful destruction of evidence.
The article states:
“The News Corporation has both the U.S. and U.K. to contend with regarding the defensibility of their information management systems and potential sanctions. However, in either scenario, the intentional deletion of relevant evidence is an obstruction of justice (in a criminal sense). News Corporation is a prime example of a multinational corporation that is not only suffering from the repercussions of bad behavior, but one that could not mitigate these risks at the highest level due to poor information management.”
I completely agree with the author’s assertion, particularly where legal exposure is concerned. To protect your organization from legal attacks, proper content management is a must.
Jasmine Ashton, June 25, 2012
Sponsored by PolySpot
New Version of Funnelback
June 25, 2012
Funnelback’s latest version boasts a number of new features, we learned at Regina’s List in “Funnelback 11 Launched with Automated Tuning and SEO Assistant.” The press release describes the new Automated Tuning component:
“Brett Matson, Managing Director of Funnelback, said Funnelback 11 has the ability to continually and automatically optimize its ranking using a correct answer set determined by the customer. This enables customers to intuitively adjust the search engine ranking algorithm to ensure it continuously adapts and is optimized to the ever-changing characteristics of their own information environment. A related benefit is that it exposes how effectively the search engine is ranking, said Mr. Matson.”
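To make the idea of tuning against a “correct answer set” concrete, here is a minimal sketch in Python of how ranking quality can be scored once a customer has named the document each test query should return first. This illustrates the general technique (mean reciprocal rank), not Funnelback’s own implementation; the function names and data shapes are assumptions.

```python
# Hypothetical sketch: scoring a search engine against a customer-defined
# "correct answer set" -- not Funnelback's actual tuning code.

def mean_reciprocal_rank(correct_answers, search):
    """correct_answers maps each test query to the document id that should
    rank first; search(query) returns an ordered list of document ids."""
    total = 0.0
    for query, expected_doc in correct_answers.items():
        results = search(query)
        if expected_doc in results:
            rank = results.index(expected_doc) + 1  # 1-based position of the right answer
            total += 1.0 / rank
        # if the right answer is not retrieved at all, the query contributes 0
    return total / len(correct_answers)

# An automated tuner could then try candidate ranking settings and keep the
# configuration that scores best against the answer set, for example:
#   best = max(candidates, key=lambda cfg: mean_reciprocal_rank(answers, make_search(cfg)))
```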
Other new features include an integrated SEO assistant, updatable indexes, efficient crawling, 64-bit indexing, a new high-performance search interface, a broken links report, and a People Search feature for users’ customers. The software is available on Windows, on Linux, and as a cloud service.
Based in Australia, Funnelback grew from technology developed by premier scientific research agency CSIRO. The company was established in 2005 and was bought by UK content management outfit Squiz in 2009. They offer Enterprise and Website Search, both of which include customizable features. Their memorable name derives from two Australian spiders, the funnel-web and the redback.
Cynthia Murrell, June 25, 2012
Sponsored by PolySpot
A Repositioning of Autonomy: A News Advisory from HP
June 25, 2012
Since acquiring Autonomy, Hewlett-Packard Company (HP) has made a great many changes to the products it sells. Rather than solely selling hardware, it now also has its hands in meaning-based computing. Most recently, HP released the four-page news advisory “Autonomy Announces Big Data Solutions in the Cloud.”
According to the report, HP and Autonomy are uniquely positioned to help businesses capitalize on the big data revolution because of solutions like Autonomy IDOL 10, which make it possible for companies to turn disparate data into actionable, profit-generating assets.
The report states:
“As the volume and variety of customer data continues to grow exponentially, marketers are increasingly focused on harnessing the value of this information to accelerate revenue growth. In fact, Gartner predicts that by 2017 the chief marketing officer (CMO) will spend more on IT than the chief information officer (CIO). Autonomy is extending its industry-leading digital marketing platform by delivering Autonomy Optimost Clickstream Analytics, which provides marketers with a single, consistent view of visits, conversions and customer engagement.”
By utilizing products like Autonomy IDOL, marketers are able to get a better understanding of their clients’ interests because they can analyze both structured and unstructured data.
Jasmine Ashton, June 25, 2012
Sponsored by PolySpot
How Librarians Play an Integral Role When Searching for Historical Documents
June 25, 2012
The Center Square Journal recently published “Meet Julie Lynch, Sulzer Library’s Historical Search Engine,” an article that introduces readers to the librarian who oversees the archive of manuscripts, maps and photographs donated by residents of Chicago’s neighborhoods north of North Avenue.
According to the article, the Northside Neighborhood History Collection encompasses more than 30 collections that document the history of schools, religious institutions, neighborhoods, homeowners’ associations, local businesses, community leaders, parks, the Chicago River, and the streets and transportation in communities located north of North Avenue to the city limits on the east, west and north sides of Chicago.
Due to the nature of her work, Lynch is the human equivalent of a search engine. However, she differs in one key aspect:
“Unlike Google, Lynch delivers more than search results, she provides context. That sepia-tinged photograph of the woman in funny-looking clothes on a funny-looking bicycle actually offers a window into the impact bicycles had on women’s independence. An advertisement touting “can build frame houses” demonstrates construction restrictions following the Great Chicago Fire. Surprisingly, high school yearbooks — the collection features past editions from Lane Tech, Amundsen and Lake View High Schools — serve as more than a cautionary tale in the evolution of hairstyles.”
Despite technology that makes searching for information as easy as tapping a touch screen, this article reiterates the importance of having real people to contextualize these documents.
Jasmine Ashton, June 25, 2012
Sponsored by PolySpot
Quote to Note: Management Wisdom
June 24, 2012
I love management wisdom published by the New York Times, an outfit working to inject technologists into its management structure. Yes.
Navigate to your local vendor of newspapers (good luck with that). Purchase the June 24, 2012, New York Times, which contains news as fresh as two-day-old tuna, and read “Who Made That Cubicle?” in the New York Times Magazine. (Fees may apply, but the Sunday newspaper is just $6 in rural Kentucky.) You will find the quote below on page 19, as the last paragraph of a Dilbert-type story: “Not all organizations are intelligent and progressive,” said Propst [the father of the Dilbert cubicle] two years before he died in 2000. Now the keeper:
“Lots are run by crass people. They make little, bitty cubicles and stuff people in them. Barren rat hole places.” He spent his last years apologizing for his utopia.
Ah, irony. Utopia. Are any search and content processing vendors relying on cubicles? Probably not. Enlightened management. Don’t trip over the scooters, volleyball, or crate of organic protein bars.
Stephen E Arnold, June 24, 2012
Sponsored by PolySpot
Government Generated Open Source Software
June 24, 2012
The push for transparent government has resulted in an interesting side effect. Data.gov announces, “Welcome to the Open Government Platform: Data.gov Releases Open Source Software.” This software was built by government for government. Data.gov partnered with India’s National Informatics Centre to create the platform (interesting), and any national, state, or local governing entity can download the Open Government Platform (OGPL) in the pursuit of providing data to its citizens. Non-government developers are able to play with it, too. The announcement states:
“Based on Drupal, the core software includes a data management system, web site, and social networking community support. This full package, in early release, is now available for public download, comments, and open source development. . . . In using an open source method of development, the OGPL community will provide future technology enhancements, open government solutions, and community-based technical support. OGPL exemplifies a new era of diplomatic collaboration that benefits the global community by promoting government transparency and increasing citizen engagement.”
Sounds like a move in the right direction. This transparent government thing is picking up speed; as of the posting of the write-up, thirty other national governments and a number of US states and localities have launched their own open government sites.
Cynthia Murrell, June 24, 2012
Sponsored by PolySpot
Switches Cure Some Google Search Migraines
June 24, 2012
Many of us have suffered a GSM (Google Search Migraine) due to changing search parameters, and now there is a way to ease the pain. Tuck away your aspirin; the article “Google Search Parameters in 2012” gives us a useful rundown of the Google search ‘switches’ that can be set by hand in a query URL.
The author stated the following:
“Knowing the parameters Google uses in its search is not only important for SEO geeks. It allows you to use shortcuts and play with the Google filters.”
“Google keeps adding new parameters to its URL to keep pace with the increasing complexity of the search product, the Google interface and the integration of verticals. While most parameters are known, some aren’t.”
This was followed up with a list of search parameter switches broken down into the following categories: Normal, Advanced, Area (Language, Country), Advanced Tools, Other Factors, and Unclear Parameters.
Each parameter offers more detail, ensuring that the user can generate more productive results. The Other Factors category is useful because it tells us how to turn off filters that Google automatically adds. You see, if you don’t set the stage yourself, then Google will do it according to information it has gathered from your history.
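As an illustration, here is a short Python sketch that assembles a query URL by hand using a few of the commonly cited switches, including the ones that turn off Google’s automatic filters. The parameter names reflect behavior commonly documented around 2012; Google can change or ignore them at any time, so treat this as a sketch rather than a guaranteed interface.

```python
# Sketch: building a Google query URL with explicit parameter "switches".
# Parameter meanings reflect commonly documented behavior circa 2012.
from urllib.parse import urlencode

params = {
    "q": "enterprise search",  # the query terms
    "num": 50,                 # number of results per page
    "hl": "en",                # interface language
    "filter": 0,               # turn off the "omitted similar results" filter
    "pws": 0,                  # turn off personalized (history-based) results
}

url = "https://www.google.com/search?" + urlencode(params)
print(url)
# https://www.google.com/search?q=enterprise+search&num=50&hl=en&filter=0&pws=0
```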
Google is a very versatile search engine with a vast number of ways to shape results. Knowing the advanced search operators enhances both productivity and personal use. The availability of these search parameter switches should make Google search migraines a thing of the past.
Jennifer Shockley, June 24, 2012
Rumor or Revisionism: Google to the Cloud!
June 23, 2012
I read with some amusement “Google to launch Amazon, Microsoft cloud rival at Google I/O.” The main idea is that Google is going to roll out cloud services to compete with Amazon, but really the purpose is to compete with Microsoft. Read the GigaOM “real” journalist story and decide what is being reported.
My view is that Google has been a cloud vendor from its earliest days. In The Google Legacy I described some Google research which made Google the “cloud,” with everything residing within Google. I did a briefing for some wild and crazy telecommunications folks in which a diagram showed that a telecommunications partner offering Internet service would have content and services delivered from the Google cloud. Sharply reduced latency was part of the plan. Google would serve the digital goods from its own servers, with the telco partner a party to the plan.
The date? 2004. My sources included information dating back to 2001.
Google in the cloud? Yep. What seems to be mesmerizing folks is that Google may make a public announcement. Be still my heart.
My question remains: Why has Google delayed bundling its cloud services for years? The foot-dragging allowed Amazon to deploy most of the services with which Google engineers were fiddling.
A second question: Why has Google not moved enterprise search to the cloud? Google touts that it is a hardware company, but the Google Search Appliance is out of phase with the shift some firms are making to hosted or cloud services.
My hunch as a non-journalist is that Google does not have the ability to execute and, thus, finds itself tagging along after others who are already in the market and enjoying a modicum of success.
Why? What about management? What about the ability to do something about market trends before those trends ossify? Honk. (If you want to receive our free, registration-required newsletter Honk!, write thehonk at yandex dot com. I am more blunt in the original essay, which is distributed every Tuesday at 7 am Eastern.)
Stephen E Arnold, June 23, 2012
Sponsored by PolySpot
Google Strolls a Local Path. Does the Firm Know Its Way?
June 23, 2012
Google is prancing down the catwalk with yet another design in hopes of rivaling competitors. Google Places got too thin, so now the risqué Google+ has taken over to strut its stuff with local search. According to the article “Google Places Is Over, Company Makes Google+ the Center Of Gravity for Local Search,” this new approach will rival the social interaction of Facebook and Twitter by allowing merchants to develop followers and message them.
Google has returned to a two-search-box approach for Google+ Local search in order to allow users to narrow their searches:
“If you click the new “Local” tab in Google+ you’re taken to a personalized local home (discovery) page, which offers a mix of popular, social and recommended content. There are several variables that go into the content that appears on this page. The same two people in Seattle won’t see the same page, though aspects of it may be the same.”
This new design is still a work in progress, so more frills may be added later. Right now results are based on overall Zagat point-system reviews and posts, along with places touched by your circles. Google has tried to make its model more attractive to users, as Facebook and Twitter have done. Local search is a tough challenge. Can Google do a Project Runway and “make it work, people”?
Jennifer Shockley, June 23, 2012
Could Sepaton Have Duped the Deduping Competition?
June 23, 2012
Sepaton just called ‘game over’ on de-duplication competitors. Their newly released software will open doors for database de-duping the likes of which have never been seen, according to “Sepaton Update Tackles Large-Enterprise Database Deduplication.”
Additional storage options are always welcomed by customers, so Symantec’s clients should be content. The DeltaStor DbeXtreme should provide the flexibility to make some interesting waves in the industry.
Jason Buffington of Enterprise Strategy Group stated:
“If you ask a DBA how to best back up large data sets, they will tell you to ‘turn ON multi-streaming.’ Customers don’t have to choose between multi-streaming, multiplexing and capacity reduction through higher de-dupe. Sepaton’s customers can set data reduction ratios and storage utilization by client and backup job.”
This is a software-only release for now, but storage and servers will become available within the next six months to a year. At that point customers will see an extreme boost in performance and security. They have been testing for a while, and based on initial trials the software performance increases by a factor of 2 and throughput by 20%, so there is room for improvement.
The DeltaStor DbeXtreme software is unique because it eliminates tradeoffs between backup performance and the de-duplication process. Their database de-duplication doesn’t use hashing; instead it analyzes the data after receipt, while it’s gathered in the storage pool. Thus it eliminates redundant elements in a way many other solutions simply can’t. If this software functions up to expectations, then Sepaton has duped the competition.
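For contrast, here is a minimal Python sketch of the conventional hash-based chunk de-duplication that Sepaton says it avoids: each chunk is fingerprinted, and any chunk whose fingerprint has been seen before is stored only once. A byte-comparison approach like the one described above examines the data itself after it lands in the storage pool rather than trusting a fingerprint. The chunk size and function below are illustrative assumptions, not anything from Sepaton.

```python
# For contrast only: conventional hash-based chunk de-duplication,
# the approach the article says Sepaton's DeltaStor DbeXtreme does NOT use.
import hashlib

CHUNK_SIZE = 4096  # bytes; illustrative -- real systems often use variable-size chunks

def dedupe(stream):
    store = {}    # fingerprint -> the single stored copy of that chunk
    recipe = []   # ordered fingerprints needed to reconstruct the original stream
    while True:
        chunk = stream.read(CHUNK_SIZE)
        if not chunk:
            break
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:
            store[digest] = chunk    # new data: keep one copy
        recipe.append(digest)        # duplicate data: costs only a reference
    return store, recipe
```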
Jennifer Shockley, June 23, 2012