Analytics Improves SharePoint Experience

March 13, 2014

Microsoft partners are responsible for SharePoint add-ons that increase usability and efficiency for users. Webtrends is one such partner that offers an Analytics for SharePoint solution. Broadway World covers their latest announcement in the article, “Employee Adoption for SharePoint Soars With Webtrends Analytics.”

The article begins:

“Webtrends, a Microsoft-preferred partner for SharePoint analytics, today announced a 64% year-over-year increase in customer bookings for its Analytics for SharePoint business . . . Leveraging deep analytics expertise and use cases from customers like BrightStarr and Siemens, Webtrends highlights key insights and successes, including a preview of an analytics for Yammer solution, during the SharePoint Conference in Las Vegas, NV on March 3-6.”

Stephen E. Arnold has a lot to say about SharePoint from his platform, ArnoldIT.com. As a longtime search expert, Arnold knows that SharePoint’s success hinges on customization and add-ons, which allow organizations to take this overwhelming platform and make it work for them.

Emily Rae Aldridge, March 13, 2014

Thiess and IBM Work Together to Improve Thiess’s Efficiency

March 12, 2014

The article titled “IBM and Thiess Collaborate on Predictive Analytics and Modeling Technologies” on Mining-Technology.com explores the partnership of IBM and Thiess, an Australian construction, mining, and services provider. The collaboration centers on predictive analytics for maintenance and replacement planning as well as early detection of malfunctions. The article states:

“Thiess Australian mining executive general manager Michael Wright said the analytics and modeling can offer great opportunities to improve business of the company. “Working with IBM to build a platform that feeds the models with the data we collect and then presents decision support information to our team in the field will allow us to increase machine reliability, lower energy costs and emissions, and improve the overall efficiency and effectiveness of our business,” Wright said.”

This is another big IBM bet. The collaboration will start with Thiess’s mining haul trucks and excavators. Models will be constructed around such information as the equipment’s inspection history, weather conditions, and payload size. These models will then be used to make more informed decisions about operational performance, to detect anomalies early, and to predict when a piece of equipment will require a replacement part. That, in turn, will allow Thiess to plan production more accurately around the predicted health of a given machine.
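The article does not describe the models in detail, so what follows is only a rough sketch of the general pattern: turn inspection gaps, weather readings, and payload figures into features, then score each machine for anomaly risk. The library (scikit-learn), the feature choices, and every number below are assumptions for illustration, not anything from the Thiess project.

import numpy as np
from sklearn.ensemble import IsolationForest  # assumes scikit-learn is installed

rng = np.random.default_rng(42)

# Hypothetical training data: one row per haul-truck shift.
# Columns: days since last inspection, ambient temperature (C), payload (tonnes).
normal_shifts = np.column_stack([
    rng.integers(1, 30, 500),    # inspection gap
    rng.normal(28, 5, 500),      # weather
    rng.normal(220, 15, 500),    # payload
])

model = IsolationForest(contamination=0.02, random_state=0).fit(normal_shifts)

# Score a new shift: an unusually long inspection gap, hot day, heavy payload.
suspect_shift = np.array([[75, 41.0, 290.0]])
print(model.predict(suspect_shift))        # -1 flags an anomaly, 1 looks normal
print(model.score_samples(suspect_shift))  # lower scores mean more anomalous

The production system presumably fuses far richer telemetry, but the shape of the exercise is the same: features in, risk scores and maintenance forecasts out.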

Chelsea Kerwin, March 12, 2014

Sponsored by ArnoldIT.com, developer of Augmentext

Tableau Finds Success Since Going Public Last Year

March 12, 2014

Investment site TheStreet is very enthused about Tableau Software, which went public less than a year ago. In fact, the site goes so far as to announce that “Tableau’s Building the ‘Google for Data’.” In this piece, writer Andrea Tse interviews Tableau CEO Christian Chabot. In her introduction, Tse notes that nearly a third of the company’s staff is in R&D—a good sign for future growth. She also sees the direction of Tableau’s research as wise. The article explains:

“The research and development team has been heavily focused on developing technology that’s free of skillset constraints, utilizable by everyone. This direction has been driven by the broad, corporate cultural shift to employee-centric, online-accessible data analytics, from the more traditional, hierarchical or top-down approach toward data analysis and dissemination.

“Tableau 9 and Tableau 10 that are in the product pipeline and soon-to-be-shipped Tableau 8.2 are designed to highlight ‘storytelling’ or visually striking data presentation.

“Well-positioned to ride the big data wave, Tableau shares, as of Tuesday’s [February 11] intraday high of $95, are now trading over 206% above its initial public offering price of $31 set on May 16.”

In the interview, Chabot shares his company’s research philosophy, touches on some recent large deals, and takes a gander at what’s ahead. For example, his developers are currently working hard on a user-friendly mobile platform. See the article for details. Founded in 2003 and located in Seattle, Tableau Software grew from a project begun at Stanford University. Their priority is to help ordinary people use data to solve problems quickly and easily.

Cynthia Murrell, March 12, 2014

Sponsored by ArnoldIT.com, developer of Augmentext

Attensity Analyze 6.3: Signs of Life Evident

March 11, 2014

Attensity has been a quiet sentiment, analytics, and text processing vendor for some months. The company has now released version 6.3 of its flagship product, Analyze. The headline feature is “enhanced analytics.”

According to a company news release, Attensity is “the leading provider of integrated, real-time solutions that blend multi-channel Voice of the Customer analytics and social engagement for enterprise listening needs.” Okay.

The new version of Analyze delivers real-time information to licensees about what is trending. The system provides “multi dimensional visualization that immediately identifies performance outliers in the business that can impact the brand both positively and negatively.” Okay.

The system processes content from more than 150 million blogs and forums, plus Facebook and Twitter. Okay.

As memorable as these features are, here’s the passage that I noted:

Attensity 6.3 is powered by the Attensity Semantic Annotation Server (ASAS) and patented natural language processing (NLP) technology. Attensity’s unique ASAS platform provides unmatched deep sentiment analysis, entity identification, statistical assignment and exhaustive extraction, enabling organizations to define relationships between people, places and things without using pre-defined keywords or queries. It’s this proprietary technology that allows Attensity to make the unknown known.

“To make the unknown known” is a bold assertion. Okay.
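Attensity’s ASAS platform is proprietary, so there is no peeking inside. For readers who want a feel for what keyword-free entity extraction looks like, here is a minimal sketch using the open source spaCy library; the model name and the sample text are assumptions for illustration, not anything Attensity ships.

import spacy  # assumes spaCy plus its small English model:
              # pip install spacy && python -m spacy download en_core_web_sm

nlp = spacy.load("en_core_web_sm")

text = ("Customers in Las Vegas complained on Twitter that Acme Corp's "
        "new router drops connections during video calls.")

doc = nlp(text)

# Entities come out of the statistical model -- no pre-defined keyword list.
for ent in doc.ents:
    print(ent.text, ent.label_)  # e.g. "Las Vegas" GPE, "Acme Corp" ORG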

I have heard that sentiment analysis companies are running into some friction. The expectations of some licensees have been a bit high. Perhaps Analyze 6.3 will suck up customers who are dissatisfied with their current sentiment, semantic, and analytics systems. Making the “unknown known” should cause the world to beat a path to Attensity’s door. Okay.

Stephen E Arnold, March 11, 2014

Twenty Electric Text Analytics Platforms

March 11, 2014

Butler Analytics has collected a list of “20+ Text Analytics Platforms” that delves into the variety of text analytics platforms available and their capabilities. According to the list, text analytics has not yet reached full maturity. There are three main divisions in the area: natural language processing, text mining, and machine learning. Each is distinct, and each company has its own approach to using these processes:

“Some suppliers have applied text analytics to very specific business problems, usually centering on customer data and sentiment analysis. This is an evolving field and the next few years should see significant progress. Other suppliers provide NLP based technologies so that documents can be categorized and meaning extracted from them. Text mining platforms are a more recent phenomenon and provide a mechanism to discover patterns that might be used in operational activities. Text is used to generate extra features which might be added to structured data for more accurate pattern discovery. There is of course overlap and most suppliers provide a mixture of capabilities. Finally we should not forget information retrieval, more often branded as enterprise search technology, where the aim is simply to provide a means of discovering and accessing data that are relevant to a particular query. This is a separate topic to a large extent, although again there is overlap.”
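The “extra features” idea in the passage above is simple enough to demonstrate: vectorize the free text and bolt the resulting columns onto whatever structured fields already exist. Below is a minimal sketch; the library (scikit-learn) and the toy customer comments are assumptions for illustration.

import numpy as np
from scipy.sparse import csr_matrix, hstack
from sklearn.feature_extraction.text import TfidfVectorizer  # assumes scikit-learn

# Toy records: a free-text comment plus one structured field (purchase amount).
comments = ["shipping was late and the box was damaged",
            "great product, arrived early",
            "refund requested, item never arrived"]
amounts = np.array([[59.0], [120.0], [35.0]])

# Turn the text into TF-IDF features...
text_features = TfidfVectorizer(stop_words="english").fit_transform(comments)

# ...and append them to the structured column for downstream pattern discovery.
combined = hstack([csr_matrix(amounts), text_features])
print(combined.shape)  # (3, 1 + vocabulary size)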

Reading through the list shows the variety of options users have when it comes to text analytics. There does not appear to be a right or wrong way, but will the diverse offerings eventually funnel down to a few fully capable platforms?

Whitney Grace, March 11, 2014
Sponsored by ArnoldIT.com, developer of Augmentext

Infogistics Starts Daxtra

March 7, 2014

Infogistics calls itself a leading company in text analysis, document retrieval, and text extraction for various industries. One would not guess that after visiting its Web site, which has not been updated since 2005. The company does, however, have a vested interest in DaXtra Technologies, a new endeavor to provide content processing solutions for personnel and human resources applications.

Here is an official description from the Web site:

“For almost a decade we’ve been at the forefront of technology and solutions within our marketplace, giving our customers the competitive edge in their challenge to source the best available jobseekers, and find them quickly. Over 500 organizations, spanning all continents, use our resume analysis, matching and search products – from the world’s largest staffing companies to boutique recruiters, corporate recruitment departments, job boards and software vendors. This global reach is made possible via our multilingual CV technology which can automatically parse in over 25 different languages.”

DaXtra’s products include DaXtra Capture, recruitment management software; DaXtra Search; DaXtra Parser, which turns raw resume data into structured XML; DaXtra Components, for managing Web services; and DaXtra Analytics, due later in 2014. The company appears to make top-of-the-line personnel software that cuts through the confusion in HR departments. What is even better is that this Web site is actually up to date.
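How the parser does this is not described, so the following is strictly a hypothetical illustration of what “raw resume text in, structured XML out” means; the field names and extraction rules are invented for the example and bear no resemblance to a production CV parser.

import re
import xml.etree.ElementTree as ET

raw = """Jane Doe
jane.doe@example.com
Skills: Python, SQL, SharePoint"""

# Naive extraction rules -- a real multilingual CV parser uses far richer models.
name = raw.splitlines()[0].strip()
email = re.search(r"[\w.+-]+@[\w.-]+", raw).group(0)
skills = re.search(r"Skills:\s*(.+)", raw).group(1).split(", ")

candidate = ET.Element("candidate")
ET.SubElement(candidate, "name").text = name
ET.SubElement(candidate, "email").text = email
skills_element = ET.SubElement(candidate, "skills")
for skill in skills:
    ET.SubElement(skills_element, "skill").text = skill

print(ET.tostring(candidate, encoding="unicode"))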

Whitney Grace, March 07, 2014
Sponsored by ArnoldIT.com, developer of Augmentext

Still Explaining Bayes

March 4, 2014

Bayes’s Theorem is the foundation of predictive analytics. Gigaom’s article “How the Solution To the Monty Hall Problem Is Also The Key To Predictive Analytics” tries to explain how the theorem is used in predictive analytics by way of a famous game show puzzle.

The Monty Hall Problem is named after the Let’s Make a Deal host. Here is how it works:

“The show used what came to be known as the Monty Hall Problem, a probability puzzle named after the original host. It works like this: You choose between three doors. Behind one is a car and the other two are Zonks. You pick a door – say, door number one – and the host, who knows where the prize is, opens another door – say, door number three – which has a goat. He then asks if you want to switch doors. Most contestants assume that since they have two equivalent options, they have a 50/50 shot of winning, and it doesn’t matter whether or not they switch doors. Makes sense, right?”

If a data scientist had been on the show, he would have used Bayes’s Theorem to win the prize. The solution is to switch doors: the host’s reveal eliminates one losing door and shifts the remaining probability onto the door you did not pick, so switching wins two times out of three.
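To see why switching pays, here is a quick simulation in plain Python (no dependencies; the three-door setup is the standard one) that plays the game many times under both strategies.

import random

def play(switch, trials=100_000):
    """Play the Monty Hall game repeatedly; return the fraction of wins."""
    wins = 0
    for _ in range(trials):
        doors = [0, 1, 2]
        prize = random.choice(doors)
        pick = random.choice(doors)
        # The host opens a door that is neither the pick nor the prize.
        opened = random.choice([d for d in doors if d not in (pick, prize)])
        if switch:
            # Switch to the one door that is neither picked nor opened.
            pick = [d for d in doors if d != pick and d != opened][0]
        wins += (pick == prize)
    return wins / trials

print("stay:  ", play(switch=False))  # about 0.33
print("switch:", play(switch=True))   # about 0.67

Stay wins about one third of the time; switch wins about two thirds, exactly what Bayes’s Theorem says it should.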

The Monty Hall Problem is used in business settings, but Bayes’s Theorem is becoming more widespread. It is being used to link big data and cloud computing, the combination that also powers predictive analytics. The article goes on to explain the theorem’s importance and impact on business, which is not new, and ends by encouraging readers to rely on Bayes rather than Monty Hall intuition.

What will the next metaphor comparison be?

Whitney Grace, March 04, 2014
Sponsored by ArnoldIT.com, developer of Augmentext

Predictive Analysis Forms Its Own Vocal Reality Search

March 4, 2014

Did you ever think that predictive analytics would be used to determine the next singing sensation? I did not think so. “SAPVoice: How To Predict A Future Pop Star” from Forbes details how music labels are using data to find star power. This flavor of predictive analytics is called predictive business. Despite being an intangible product, music generates plenty of data points:

“Her record label, Universal Music Group taps thousands of data points generated daily for the artists it manages that reveal how particular customer segments are responding to them. Managers search a database of a million interview subjects, containing data on everything from where a consumer shops to the new music she prefers. With such tools at hand, YouTube won’t be the only way to find the next stars; scouts will also dig through the data.”

It is not just the music industry tapping into this new resource. Consumer goods, healthcare, technology, and manufacturing companies are using it to raise red flags early and increase efficiency.

SAP steps in with its own predictive business model that focuses on predicting accurately, determining the best actions to take based on the data, and acting quickly on the results. This approach has paid off for many companies.

Will the singing capitals of the world embrace SAP’s methodology? Don’t some recording moguls shoot handguns when disaffected? If the software does not deliver value, will there be gunplay at a Las Vegas intersection, or maybe on Wall Street if the approach does not pay off in the finance sector?

Whitney Grace, March 04, 2014
Sponsored by ArnoldIT.com, developer of Augmentext

Log Files: Search, Short Cuts, and Low Costs

February 26, 2014

I read “Splunk Feels the Heat from Stronger, Cheaper Open Source Rivals.” InfoWorld is up to its old tricks again. Log files have been around for decades. Many organizations allow more recent entries to overwrite previous log files. I know that some people believe that this practice has gone the way of the dodo. Well, would you like to buy a bridge?

For those who keep log files and want to figure out what treasures nestle therein, an outfit has marketed an expensive “search” system. Splunk is the darling of many information technology gurus. In Washington, DC, I am surprised when laborers in the Federal vineyard do not sport a Splunk tattoo.

IDC’s view is that there is change rolling down the road. The write up points out that Splunk is no longer limited to log search. Like most information access systems, the company has expanded. In fact, the wizards at IDC parrot the jargon: Analytics. Here’s the passage I noted:

Splunk started strong and has only grown stronger as it’s branched out to become a wide-ranging analytics platform. But the free version of Splunk is quite limited, and the enterprise version’s pricing is based on the amount of data indexed, which adds up to prohibitive costs for some.

The important factoid is, in my opinion, cost. Most organizations want to reduce costs for some little-understood information tasks. Making heads or tails of ever-burgeoning and frequently overwritten log files may be at the top of the budget-tightening list.

IDC, truly an expert in open source software, points out that “open source competition has been emerging in the background.” I suppose that’s why IDC is selling, at $3,500 a whack, analyses of open source software such as this gem produced in part by IDC’s wizards. See Report 237410. Who wrote that? Worth a look, I suppose.

The angle is that Graylog2 and Elasticsearch are chasing after Splunk. I am not sure if this is old news, good news, or silly news. What’s clear is that InfoWorld is covering open source and not emphasizing its deep research.
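For readers wondering what the open source route actually involves, here is a minimal sketch of indexing and then searching a log entry through Elasticsearch’s REST API with plain Python. The index name, field names, and a local node on the default port are assumptions, and the endpoint paths vary somewhat between Elasticsearch versions.

import requests  # assumes the requests library and a local Elasticsearch node

ES = "http://localhost:9200"
INDEX = "app-logs"  # hypothetical index name

# Index one log entry; refresh=true makes it searchable immediately.
entry = {"timestamp": "2014-02-26T10:15:00", "level": "ERROR",
         "message": "disk quota exceeded on node7"}
requests.post(ES + "/" + INDEX + "/_doc", json=entry,
              params={"refresh": "true"}).raise_for_status()

# Search for error-level entries that mention "quota".
query = {"query": {"bool": {"must": [{"match": {"level": "ERROR"}},
                                     {"match": {"message": "quota"}}]}}}
hits = requests.post(ES + "/" + INDEX + "/_search", json=query).json()["hits"]["hits"]
for hit in hits:
    print(hit["_source"]["timestamp"], hit["_source"]["message"])

No license fee, true, but the install-and-tune work does not disappear.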

Cost control is a subtle point. I am delighted that the write up creeps up on one of the central attributes of open source software: No license fees. But what of the costs of installing, tuning, and maintaining the open source solution? Ah, not included in the write up. If you pony up $3,500 for an IDC open source report, I assume more substance is provided. Who wrote those IDC open source reports like 237410? Was it an IDC analyst, marketer, or reporter? Did the information come from another source?

Anyway, good PR for Elasticsearch. Bad PR for Splunk.

Stephen E Arnold, February 26, 2014

Online Accuracy: Not What It Seems

February 25, 2014

I read “Publishers Withdraw More than 120 Gibberish Papers.” The article reports that Springer and IEEE have begun the process of removing “computer generated nonsense.” The article explains how to create a fake paper in case you are curious. What about the papers in online services and commercial databases that contain bogus data? Do researchers discern false information?

PLOS, an open access scientific publisher, said that it would ask authors to make their data more available. You can read about this long overdue action in “PLOS’ New Data Policy: Public Access to Data.”

I wonder why the much vaunted text analysis software does not flag suspect information. Perhaps marketing is more important than accuracy?

Stephen E Arnold, February 25, 2014
