December 11, 2012
This week, the Text Radar blog covered several excellent articles discussing big data’s impact on a range of industries that deal with content intelligence and compliance.
One interesting story, “Big Data Information Governance Essential to Mitigate Risk with Unstructured Data,” explains how, because content storage has become relatively inexpensive, organizations are storing huge volumes of data that is often unstructured and difficult to analyze.
The article concludes:
“Effective information governance not only helps make business operations more efficient, but also mitigates risk. Most organizations are so busy just trying to manage structured information that they haven’t yet addressed unstructured content, much less given enough attention to litigation risk associated with information. Now is the time.”
On a more uplifting note, “Big Data Process for Winning Marketing Strategies” recommends using big data as the driving force behind marketing decisions rather than intuition.
The article explains:
“Remember, it’s important to lead with data, not hunches. When it comes to your customers, are you relying on data or do you still rely on intuition when making decisions? If it’s the latter, you’re not alone. Hunches, however, are not a reliable source for decision making—no matter how long you’ve used them. Using Big Data for decision making not only improves the customer experience, it gives you a clear, more complete view of each customer, and allows you to communicate with enhanced relevance.”
The old ways are no longer cutting it in any industry. Even disaster relief is using big data’s cutting-edge technology. “Big Data and its Role in Hurricane Sandy Response by Steven Connolly” explains big data’s role in predicting and responding to Hurricane Sandy. Some of the hurricane’s devastation was lessened by big data’s ability to allow state and local governments to take safety precautions up and down the coast.
In addition to this, the article states:
“Connolly also points out that weather professionals were able to let people know about the potential significance of the storm so they could be more prepared. These predictions were “getting more detailed and more accurate as more data points were available.” In addition, organizations, like Google, were able to develop applications to help. Google utilized its Superstorm Sandy application to overlap power outage information along with weather details on a map that also included shelter locations and traffic conditions. Social good and practical uses for Big Data analysis are exciting.”
It is important to remember that, regardless of your company’s field, you should choose a data management system that analyzes big data and takes the workload off your employees by automating the process. Smartlogic’s Semaphore Content Intelligence Platform does this and so much more.
Jasmine Ashton, December 11, 2012
November 14, 2012
Respect data decentralization. That is the key to “The Challenge of Defensible Deletion of Distributed Legacy Data,” according to the eDiscovery Law & Tech Blog at X1 Discovery.
Blogger John Patzakis submits that, for large enterprises with data subject to governance requirements, centralization can make it hard to create a defensible retention schedule. Most archiving systems require that bits of data be pulled from their cozy homes on group and departmental silos and dumped into a central system before any retention and management process can even begin. He writes:
“Forcing centralization on these many pockets of productivity is highly disruptive and rarely effective due to scalability, network bandwidth and other logistical challenges. So what this leaves is the reality that for any information remediation process to be effective, it must be executed within these departmentalized information silos.”
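The in-place remediation idea can be illustrated with a short sketch. This is not X1’s technology, just a minimal, hypothetical Python example of evaluating a retention schedule inside one departmental silo without first copying anything to a central archive; the seven-year retention period and the silo path are assumptions.

```python
import os
import time

RETENTION_DAYS = 7 * 365  # hypothetical seven-year retention schedule

def expired_files(silo_path, now=None):
    """Walk one departmental silo and yield files whose last-modified
    time falls outside the retention window -- evaluated in place,
    without migrating anything to a central repository."""
    now = now or time.time()
    cutoff = now - RETENTION_DAYS * 86400
    for dirpath, _dirs, files in os.walk(silo_path):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                yield path

# Usage (review before deleting; defensibility depends on a documented policy):
# for path in expired_files("/mnt/dept_share"):
#     print("candidate for remediation:", path)
```

The point of the sketch is only that the assessment step runs where the data lives; an actual defensible-deletion process would also log decisions and honor legal holds.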
Not surprisingly, Patzakis recommends one of X1’s own products, X1 Rapid Discovery, to do just that. The company has produced an hour-long webinar outlining its method. According to the article:
“X1 Rapid Discovery represents game-changing technology to effectuate the remediation of distributed legacy data due to its ability to install on demand virtually anywhere in the enterprise, including remote data silos, its light footprint web browser access, and intuitive interface. X1 Rapid Discovery enables for effective assessment, reporting, categorization and migration and remediation of distributed information assets by accessing, searching and managing the subject data in place without the need for migration to the appliance or a central repository.”
Sounds good. It may well be that X1 Rapid Discovery is the best solution for this process, or maybe not. Either way, the webinar could be worth a gander. X1 Discovery makes e-discovery tools and enterprise search solutions for IT and legal professionals. Founded by Idealab, the company is located in Pasadena, CA.
Cynthia Murrell, November 14, 2012
November 6, 2012
This past week, the Text Radar blog posted some interesting articles on content intelligence, compliance, and big data that are highly pertinent to the needs of modern companies.
“Using Location Based Data to Deliver Better Products and Services” provides readers with the example of a successful company, in this case Starbucks, using location-based data to send customers special offers when they were physically close to a Starbucks store. The article also highlights other interesting examples of what is currently being developed in the field of location-based data services.
When explaining the potential of location-based data, the article states:
“Future location-based services applications will only be strengthened by integrating in other data streams (existing customer data or information triggered by an event such as a product scan or entering a store) and then using data analysis to quickly determine the next best action that can be taken. As evidenced by some of the emerging applications touched on here, we’ve only begun to scratch the surface for the possibilities that exist between data analysis and location-based data.”
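As a toy illustration of the location trigger behind such offers, here is a minimal Python sketch of a geofence proximity check. It is not how Starbucks’ system works; the store names, coordinates, and the half-kilometer radius are hypothetical.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearby_stores(customer, stores, radius_km=0.5):
    """Return the stores within radius_km of the customer's position."""
    return [s for s in stores
            if haversine_km(customer[0], customer[1],
                            s["lat"], s["lon"]) <= radius_km]

# Hypothetical store list; coordinates are illustrative only.
stores = [
    {"name": "Store A", "lat": 40.7580, "lon": -73.9855},
    {"name": "Store B", "lat": 40.7061, "lon": -74.0087},
]
print([s["name"] for s in nearby_stores((40.7585, -73.9850), stores)])
```

A real service would layer the other data streams the article mentions, such as purchase history, on top of this simple distance test before deciding which offer to send.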
Successful companies like Amazon are using big data to improve customer relationships. “Quick and Accurate Access to Big Data Lets Amazon Build Customer Relationships” explains how Amazon is harnessing big data in a way that builds relationships.
Text Radar writer Alice Wilson concludes:
“The article acknowledges that Amazon has an easier time of synchronizing data because it has grown larger consistently rather than by amassing other companies, but that many organizations are not taking advantage of opportunities because they treat customer service as an effort that should be minimized rather than an opportunity to be taken advantage of. Madden emphasizes that giving your employees access to the right information when interacting with a customer is critical. The article points out that good data support doesn’t require increasing the workforce or even expanding training because most people know how to handle this type of data if they are just given access to it.”
It’s common knowledge that there is going to be a shortage of people with data analysis skills. Colleges and universities are coming out with programs, but should high schools as well? “How Early in Life Data Scientists Should be Prepared for Big Data Work Debated” makes the claim that there is a need to start educating our youth on data analysis before they reach college.
“The article mentions those working to promote data analysis in high school, including Alex Philp, founder and CTO of TerraEchos and Eric Tangedahl, Technology Director of the University of Montana School of Business Administration. The article also explains the participation of many students from a Dallas, Texas Science and Engineering Magnet School in IBM’s annual Master the Mainframe Contest and also IBM’s Innovation summer camp. The article also notes, however, that Big Data analysis is more than just raw technology and requires expertise and maturity beyond a secondary school student’s capabilities and that some experts prefer students concentrate on areas such as data interpretation, data extraction, and data modeling.”
When working with big data it is important that we look to experts in the field. The Smartlogic Semaphore Content Intelligence Platform enhances search capabilities by classifying metadata. This technology can be useful in any field.
Jasmine Ashton, November 6, 2012
June 26, 2012
Google is working overtime to keep attention focused on technical issues. You can wallow in the smart software encomium in the New York Times. (See “How Many Computers to Identify a Cat? 16,000” in the June 27, 2012, environmentally unfriendly newspaper, or you can try the newspaper’s maybe-here, maybe-gone link at http://goo.gl/Twl9I.) The Google I/O Conference fast approaches, so there are the concomitant write-ups about Google hardware and news in “Google’s I/O Conference: New Operating System, Tablet”.
But there are two personnel stories which seem to haunt the company at what is the apex of the Google techno-promo machine: Larry Page’s minor voice problem and a person few people outside of Google have heard about. Both of these are potential “information minefields.” Google does not, as far as I know, have an effective demining system in place.
I have avoided commenting directly on the health thing. You can get the story, or what passes for a story, in “Google CEO Larry Page and the Healthy Way to Answer, ‘What’s Wrong?’” But I do have an opinion about the wizard responsible for Local Search, Maps, Earth, Travel, Payments, Wallet, Offers, and Shopping. I read more about one Google executive than I expected in “This Exec May Have The Hardest Job At Google, And His Colleagues Are Tired Of Seeing Him Get Trashed In The Press.”
The basic idea, as I understand it, is:
Last week, we [Business Insider] published a story headlined: “Depending On Whom You Ask, This Google Exec Is Either ‘Weak’ Or He Just Drew The Short Straw?”
The publication did some digging and learned from “senior sources”:
Their view is that Huber is a top-notch Google executive who asked for the hardest challenge his boss could give him and he got it – in the form of nascent, unproven products and an executive reporting to him that ended up being vastly under-qualified for her job.
The weak link in the Google brain mesh, then, was an executive hired from PayPal. Yikes. According to the sources, she was under-qualified and goofed with some PayPal-type projects. That is where the story wraps up.
May 4, 2012
Short honk: I don’t pay much, if any, attention to Yahoo. My last big analysis of Yahoo was shortly after its then Chief Technology Officer tried to explain to a financial services client of mine that Yahoo was ahead of Google in search. Crazy assertion from a crazy outfit. In my report, I included an image of Terry Semel as the captain of the Titanic. Got a laugh. Yahoo got zero positives from me. (By the way, is that Wikipedia profile of Mr. Semel accurate? Check it out between conference calls and SMS texts.)
Navigate to “Scott Thompson Resume Scandal Is Not an Inadvertent Mistake—He Also Claimed Comp Sci Degree as CTO of PayPal.” I want to comment on the spelling of résumé, but who cares? That’s my attitude toward the coverage of an executive fudging a biography. Furthermore, in my analysis, Yahoo is the type of outfit which lives in a world of illusion, silos, and confusion.
The fact that a senior executive would not take the time to do a little digging is absolutely no surprise to me. I hear the phrase “I’m too busy” from people who I know are not too busy. Some of these people ask me for work and then tell me, “We have a spring vacation.” I heard this phrase from a company president who is guiding a company which is losing millions of dollars each quarter. Right. Vacation. Spring break.
I think we have plenty of solid evidence of a core governance problem at Yahoo, but the same issue exists in many US organizations. Whether it is the confusion about the actions of US government employees or the unfortunate Google Street View incident, governance is not a core competency in many US organizations. Enron, Lehman Brothers, Tyco—remember these executive edifices?
Furthermore I don’t think governance can be fixed quickly, if at all.
When an individual professional does not do the basics like checking key facts, the egregious mistakes will continue and most likely increase.
Governance problems are not black swans.
Governance problems are a direct outcome of people who do not focus, gather information, analyze, and reflect.
Rushing to meetings, asking for others to collect information, and staring at mobile devices—these are flashing signals of trouble at Yahoo and elsewhere.
Fiddling with a biography is either effective public relations, impactful marketing, or the shortest distance between a person and the top of Maslow’s hierarchy. For me, Yahoo and fake credentials are no big deal.
Baloney is the business of many businesses.
Desperation marketing is the new normal marketing.
Stephen E Arnold, May 4, 2012
Sponsored by no one but me.
March 1, 2012
According to the Nextgov article “Solyndra Investigation Led to New Search Tool at Energy,” Google Search has joined the fight against crime. A “congressional investigation into alleged mishandling of a $528 million Energy Department loan guarantee” to Solyndra caused a data overload for Energy’s Loan Department. Energy workers were trying to provide congressional investigators with the information they needed, but their existing document search tools simply created even bigger problems. According to Energy Chief Technology Officer Peter Tseronis, the searches resulted in “GS-15s standing at printers hitting print, print, print, copy, copy, copy for emails, attachments, PDFs — information that was just voluminous.” In response to this mounting problem, the CTO’s office joined forces with an outside vendor to modify the department’s existing Google search system. Users throughout the department then had the ability to index as well as sort through emails, Word documents, and other files. Looks like Google Search was ready and saved the day.
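The kind of search capability the department gained can be sketched with a toy inverted index. This is not the Energy Department’s actual Google-based system; the document IDs and contents below are invented for illustration.

```python
from collections import defaultdict

def build_index(documents):
    """Map each lowercase token to the set of document IDs containing it."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for token in text.lower().split():
            index[token.strip(".,")].add(doc_id)
    return index

def search(index, *terms):
    """Documents containing every query term (simple AND semantics)."""
    sets = [index.get(t.lower(), set()) for t in terms]
    return set.intersection(*sets) if sets else set()

# Hypothetical document collection for illustration.
docs = {
    "email_001": "Loan guarantee terms for Solyndra attached.",
    "memo_014": "Quarterly loan portfolio review.",
    "email_022": "Lunch on Friday?",
}
index = build_index(docs)
print(sorted(search(index, "loan")))
```

An index like this is what replaces “print, print, print”: one query returns exactly the documents that match, instead of forcing staff to open and copy files one at a time.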
April Holmes, March 1, 2012
Sponsored by Pandia.com
February 2, 2012
It seems our nation is finally getting its records in digital order. The Sacramento Bee reports, “Autonomy Empowers U.S. to Meet President Obama’s New Memorandum on Government Records.” According to the Memorandum, government agencies must standardize their content policies and transfer relevant files to the U.S. National Archives and Records Administration. The press release presents Autonomy as the tool for the job:
Autonomy delivers a comprehensive suite of information governance solutions that addresses the broad and varying needs of enterprises and government agencies. With support for over 150 languages and access to over 1,000 file types through 400 pre-built connectors to disparate content sources, repositories and legacy systems, the Autonomy solutions can apply policy consistently to every information source in the enterprise simultaneously, while managing content in place and reducing duplicates across all enterprise repositories.
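One piece of the press release, reducing duplicates across repositories, is easy to illustrate. The sketch below is not Autonomy’s method, just a common content-hashing approach; the file names and contents are hypothetical.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Content fingerprint: identical bytes always hash identically."""
    return hashlib.sha256(data).hexdigest()

def find_duplicates(docs):
    """Group documents by content hash; any group with more than one
    member is a duplicate set that could be collapsed to one copy."""
    by_hash = {}
    for name, data in docs.items():
        by_hash.setdefault(sha256_of(data), []).append(name)
    return [names for names in by_hash.values() if len(names) > 1]

# Hypothetical files from different repositories.
docs = {
    "hr/policy.txt": b"All staff must complete training.",
    "legal/policy_copy.txt": b"All staff must complete training.",
    "it/readme.txt": b"Server maintenance window is Sunday.",
}
print(find_duplicates(docs))
```

Hashing only catches byte-identical copies; near-duplicate detection, which commercial platforms also advertise, requires fuzzier techniques.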
Autonomy, owned by HP, is a leader in the field of unstructured data management and serves prominent public and private organizations around the globe. The company was founded in 1996, and has made its fortune on the fruit of research originally performed at Cambridge University.
Cynthia Murrell, February 2, 2012
Sponsored by Pandia.com
December 20, 2011
From the “Why Am I Not Surprised” Department. News Flash.
UN Computer System Failure
A flub at the United Nations, estimated at nearly $400 million, has been made public as UN officials scramble to get the botched project back on track. Perhaps “flub” is too strong? Maybe in UN speak, the error was an administrative concern. Yes, that’s it. Administrative concern.
The United Nations’ project, known as Umoja, is a computer and software system that promised to reform the organization but has been at a standstill since June. Umoja, which was intended to be an administrative system to cut down on waste and fraud, was led by Secretary General Ban Ki-moon. Fox News’ article, “UN’s Botched Computer-System Overhaul: A Major ‘Failure’ of Ban Ki-Moon’s Management” tells us more:
“Ban’s officials are scrambling to get the jinxed project known as Umoja (Swahili for unity) back on track after a key UN budget committee heard from Ban’s office last week that the sweeping information technology overhaul, already a year behind schedule, won’t be finished until 2015, three years beyond the original target date. The committee also said it was ‘deeply disturbed and dismayed’ by the UN’s ‘apparent lack of awareness and foreknowledge’ about the sputtering status of the project.”
This is entropy from top to bottom. Is this the UN’s approach to information management? It appears that guessing about technology may not work and the organization should probably make more solidified plans before pushing such a large and costly project forward. From peacekeeping to computing, the UN is rowing against the current of competence in my opinion.
Andrea Hayden, December 20, 2011
Sponsored by Pandia.com
November 15, 2011
We continue to run across some interesting stories about Enterprise Data. This one from Catherine Lamsfuss caused quite a debate at lunch. Here’s what we read:
As the amount of data within a business or industry grows, the question of what to do with it arises. The article “Business Process Management and Mastering Data in the Enterprise,” on Capgemini’s website, explains how Business Process Management (BPM) is not the ideal means for managing data.
According to the article, as more and more operations are used to store data, the process of synchronizing the data becomes increasingly difficult.
As for using BPM to do the job, the article explains,
While BPM tools have the infrastructure to hold a data model and integrate to multiple core systems, the process of mastering the data can become complex and, as the program expands across ever more systems, the challenges can become unmanageable. In my view, BPMS solutions with a few exceptions are not the right place to be managing core data. At the enterprise level, MDM solutions are far more elegant solutions designed specifically for this purpose.
The answer to this ever-growing problem was arrived at by combining knowledge from both a data perspective and a process perspective. The article suggests that a Target Operating Model (TOM) would act as a rudder for projects aimed at synchronizing data. Once that was in place, a common information model would be created with enterprise definitions of the data entities, which would then be populated by general attributes fed by a single process project.
While this is just one man’s answer to the problem of data, it is a start. Regardless of how businesses approach the problem, one thing remains constant: process management alone is not efficient enough to meet the demands of data management.
“It’s not the process, it’s the people that implement and use the process that matter,” stated Jasmine Ashton in a final summary of the lunch debate. We had to agree. However, as we looked through the Polyspot data management description that Ms. Lamsfuss’ article pointed us to, it seemed clear that starting with a good technology implementation could go a long way toward helping the people follow the processes.
Constance Ard, November 15, 2011
November 1, 2011
This week, SharePoint Semantics provided its followers with several informative articles to help end users navigate the minefield that the SharePoint experience often is.
While we often highlight best practices when it comes to SharePoint, it’s also important that we learn from our own and others’ mistakes. In the post “SharePoint Dan Helps You Avoid SharePoint Worst Practices” (http://sharepointsemantics.com/2011/10/sharepoint-dan-helps-you-avoid-sharepoint-worst-practices/), we learn that the root of poor SharePoint experiences lies in a lack of planning in development, architecture, implementation, and change management. Without appropriate governance or documentation, chaos reigns. The article also points out some helpful links that can help readers avoid these negative experiences.
To complement SharePoint worst practices, I also wanted to recommend five SharePoint best practices from “Change Your SharePoint Thinking to Maximize Your SharePoint Productivity” (http://sharepointsemantics.com/2011/10/change-your-sharepoint-thinking-to-maximize-your-sharepoint-productivity). One helpful tip that writer Ken Toth highlights is:
“Don’t Think Documents, Think Lists and Wikis. Lists with a few columns allows you to use views on the list, which is much better than having to open documents to see the contents. Wikis are easy to link together or generate a document index.”
“The Official Word on Governing SharePoint Services From Microsoft Technet” (http://sharepointsemantics.com/2011/10/the-official-word-on-governing-sharepoint-services-from-microsoft-technet/) shares a Technet guide that outlines creating and managing your SharePoint services. Toth advises using the Semaphore Content Intelligence Platform from Smartlogic to make sure that your SharePoint implementation best meets the business needs of your organization.
The new iPad has received a lot of fanfare and, with many new features, it’s one of Apple’s most in-demand products. It is important that we show how SharePoint can be used on this device. “How to Make Your SharePoint Work Like iGoogle — Part 2” (http://sharepointsemantics.com/2011/10/how-to-make-sharepoint-work-like-igoogle-part-2/) points end users to an article that teaches you how to create an iGoogle-style interface for SharePoint 2010.
While these articles provide helpful tips for users to efficiently overcome the lack of out-of-the-box help that SharePoint provides, it is important that users recognize the web application platform’s limitations and utilize other products like Smartlogic’s Semaphore Content Intelligence Platform. Smartlogic fills in the gaps by using semantic technology to deliver information quickly and in context.
Jasmine Ashton, November 1, 2011
Sponsored by Pandia.com