HP Healthcare Analytics Aids in Reducing Waste
March 17, 2014
The article titled “HP Autonomy Unlocks Value of Clinical Data with HP Healthcare Analytics” from MarketWatch explores HP’s announcement of a new analytics platform that healthcare providers can use to make sense of clinical data, both structured and unstructured. The platform was created in a partnership between HP and Stanford Children’s Health and Lucile Packard Children’s Hospital, and it is powered by HP IDOL. The article states,
“The initial results have already yielded valuable insights, and have the potential to improve quality of care and reduce waste and inefficiency.
Though the core mission of the Information Services Analytics team at Lucile Packard Children’s Hospital Stanford is to enable operational insights from structured clinical and administrative data, innovation projects are also a key strategic initiative of the group… The healthcare industry faces the enormous challenges of reducing cost, increasing operational efficiency and elevating the quality of patient care.”
Costs have gotten out of control, and it is the hope of this collaboration that analytics might be the key. A huge part of the problem is the unstructured data that is overlooked: text in a patient’s records, notes from the doctor, or emails between the doctor and patient. HP IDOL’s ability to understand and categorize such information should make early diagnosis and early detection much more feasible. For more information visit www.autonomy.com/healthcare.
Chelsea Kerwin, March 17, 2014
Sponsored by ArnoldIT.com, developer of Augmentext
Improving SharePoint Search Efficiency
March 17, 2014
For many users, search is pretty much the main point of SharePoint, yet many complain of the inefficiency and inaccuracy of the search function. Search Windows Server addresses the issue in a great article that highlights search features from SharePoint 2007 to SharePoint 2013. Read the details in “Five Ways to Make SharePoint Search More Efficient.”
The article begins:
“Admins and end users alike find that using the search feature in SharePoint is helpful, but it can be frustrating . . . We compiled the five best tips to help SharePoint users work through common questions and situations with SharePoint search. Covering multiple versions of SharePoint, these tips highlight how to make searching in SharePoint more efficient, how to improve search functionality and more.”
Stephen E. Arnold has an interest in search; in fact he has made a career of it. His Web site, ArnoldIT.com, highlights the latest in search – the good and the bad. SharePoint gets a lot of coverage.
Emily Rae Aldridge, March 17, 2014
Bing: A Quote to Note and a Search
March 16, 2014
First, navigate to Bing and run the query “Bing Market Share.” The first hit is to “The Bing Dilemma: What To Do With The Little Search Engine That Can.” The write up contains a chart showing Bing market share. Bing is the orange line. The line way at the top is Google.
In “Bing’s Harry Shum Bags The 2014 Outstanding Technical Leadership Award At Microsoft,” there is, in my opinion, a quote to note:
“I am proud that we have built a very high-quality search engine comparable to Google and with differentiating features. We have provided to society, even to humanity, a different voice than Google.”
On a philosophical note: If a search engine retrieves in the forest, are its results relevant? Your essay response is 20 percent of your grade.
Stephen E Arnold, March 16, 2014
Fetopolis Finds Fault with Facebook Marketing
March 16, 2014
The article titled “This $US600,000 Facebook Ad Disaster Is A Warning For Small Business Owners” on Business Insider Australia tells the story of Kapur Brar, CEO of the small business Fetopolis. Fetopolis is a compendium of online fashion magazines with a healthy online following. Until recently, Brar relied heavily on marketing through Facebook, spending $100,000 a day. The article explains why Brar has “fallen out of love with Facebook,”
“He discovered…that his Facebook fanbase was becoming polluted with thousands of fake likes from bogus accounts. He can no longer tell the difference between his real fans and the fake ones. Many appear fake because the users have so few friends, are based in developing countries, or have generic profile pictures. At one point, he had a budget of more than $US600,000 for Facebook ad campaigns, he tells us. Now he believes those ads were a waste of time.”
Strangely, this story is not being widely told, in spite of the fact that 25 million small businesses use Facebook for marketing at varying levels of sophistication.
Did the purchase of WhatsApp cause this interesting story to slip into oblivion? The article offers some defense of Facebook: the majority of customers are happy, and the payment of Brar’s bill is disputed. Yet it is also true that Facebook does not allow the third party “click audits” that are standard practice elsewhere.
Chelsea Kerwin, March 16, 2014
Sponsored by ArnoldIT.com, developer of Augmentext
Google Flu Trends: How Algorithms Get Lost
March 15, 2014
Run a query for Google Flu Trends on Google. The results point to the Google Flu Trends Web site at http://bit.ly/1ny9j58. The graphs and charts seem authoritative. I find the colors and legends difficult to figure out, but Google knows best. Or does it?
A spate of stories has appeared in New Scientist, Smithsonian, and Time picking up the thread that Google Flu Trends does not work particularly well. The Science Magazine podcast presents a quite interesting interview with David Lazer, one of the authors of “The Parable of Google Flu: Traps in Big Data Analysis.”
The point of the Lazer article and the greedy recycling of the analysis is that algorithms can be incorrect. What is interesting is the surprise that creeps into the reports of Google’s “infallible” system being dead wrong.
For example, Smithsonian Magazine’s “Why Google Flu Trends Can’t Track the Flu (Yet)” states, “The vaunted big data project falls victim to periodic tweaks in Google’s own search algorithms.” The write up continues:
A huge proportion of the search terms that correlate with CDC data on flu rates, it turns out, are caused not by people getting the flu, but by a third factor that affects both searching patterns and flu transmission: winter. In fact, the developers of Google Flu Trends reported coming across particular terms—those related to high school basketball, for instance—that were correlated with flu rates over time but clearly had nothing to do with the virus. Over time, Google engineers manually removed many terms that correlate with flu searches but have nothing to do with flu, but their model was clearly still too dependent on non-flu seasonal search trends—part of the reason why Google Flu Trends failed to reflect the 2009 epidemic of H1N1, which happened during summer. Especially in its earlier versions, Google Flu Trends was “part flu detector, part winter detector.”
Oh, oh. Feedback loops, thresholds, human bias. Quite a surprise, apparently.
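The “winter detector” problem described above is ordinary spurious correlation: two series driven by a shared seasonal factor will correlate strongly even when there is no causal link between them. Here is a minimal sketch using entirely made-up data; the series names, coefficients, and noise levels are illustrative assumptions, not anything from Google’s actual model.

```python
import math
import random

random.seed(42)

# Hypothetical weekly data over three years. Both series are driven by
# the same seasonal factor ("winter"), not by each other.
weeks = range(156)
winter = [math.cos(2 * math.pi * w / 52) for w in weeks]  # peaks mid-winter

# Flu rates and basketball-related searches both rise in winter.
flu_rate = [max(0.0, 0.8 * s + random.gauss(0, 0.1)) for s in winter]
basketball_searches = [max(0.0, 0.7 * s + random.gauss(0, 0.1)) for s in winter]

def pearson(x, y):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(flu_rate, basketball_searches)
print(f"correlation: {r:.2f}")  # strongly positive despite no causal link
```

A model trained only to maximize correlation with CDC flu counts would happily pick up the basketball series, which is why the engineers had to prune such terms by hand.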
Time Magazine’s “Google’s Flu Project Shows the Failings of Big Data” realizes:
GFT and other big data methods can be useful, but only if they’re paired with what the Science researchers call “small data”—traditional forms of information collection. Put the two together, and you can get an excellent model of the world as it actually is. Of course, if big data is really just one tool of many, not an all-purpose path to omniscience, that would puncture the hype just a bit. You won’t get a SXSW panel with that kind of modesty.
Scientific American’s “Why Big Data Isn’t Necessarily Better Data” points out:
Google itself concluded in a study last October that its algorithm for flu (as well as for its more recently launched Google Dengue Trends) were “susceptible to heightened media coverage” during the 2012-2013 U.S. flu season. “We review the Flu Trends model each year to determine how we can improve—our last update was made in October 2013 in advance of the 2013-2014 flu season,” according to a Google spokesperson. “We welcome feedback on how we can continue to refine Flu Trends to help estimate flu levels.”
The word “hubris” turns up in a number of articles about this “surprising” suggestion that algorithms drift.
Forget Google and its innocuous and possibly ineffectual flu data. The coverage of the problems with the Google Big Data demonstration has significance for those who bet big money that predictive systems can tame big data. For companies licensing Autonomy- or Recommind-type search and retrieval systems, the flap over flu trends makes clear that algorithmic methods require babysitting; that is, humans have to be involved, and that involvement may itself introduce outputs that wander off track. If you have used a predictive search system, you probably have encountered off center, irrelevant results. The question “Why did the system display this document?” is one indication that predictive search may deliver a load of fresh bagels when you wanted a load of mulch.
For systems that do “pre crime” or predictive analyses related to sensitive matters, uninformed “end users” can accept what a system outputs and take action. This is the modern version of “Ready, Fire, Aim.” Some of these actions are not quite as innocuous as over-estimating flu outbreaks. Humans without knowledge of the context and biases in the data and numerical recipes can find themselves mired in a swamp, not parked at the local Starbucks.
And what about Google? The flu analyses illustrate one thing: Google can fool itself in its effort to sell ads. Accuracy is not the point of Google or many other online information retrieval services.
Painful? Well, taking two aspirins won’t cure this particular problem. My suggestion? Come to grips with rigorous data analysis, algorithm behaviors, and old fashioned fact checking. Big Data and fancy graphics are not, by themselves, solutions to the clouds of unknowing that swirl through marketing hyperbole. There is a free lunch if one wants to eat from trash bins.
Stephen E Arnold, March 15, 2014
Best-Practices for Big Data Report
March 15, 2014
The article titled “Report: Best Practices for Big Data Projects” on GCN explores the forty-four page IBM report called “Realizing the Promise of Big Data.” The report includes a history of big data and explanations of its different applications in the public and private sectors. It further breaks down the usage of big data by the federal government and local governments. The article provides some tips from the report, such as maintaining strong oversight:
“A staff with expertise in the technology, business and policy aspects of the project can help prevent any major surprises and ensure everything goes as planned. The development of key performance indicators is critical to big data projects. Both process and outcome measures are essential to the project’s success. Performance measures are centered on improving efficiency, such as lowering the cost of operations. Outcome measures focus on how the customers perceive the service being delivered.”
As might be clear from the quote, the report focuses on how best to design and implement a project for gaining insight from big data. At this point, most of us are still not sure what big data means, but somehow there are best practices for this fuzzy field of government effort.
Chelsea Kerwin, March 15, 2014
Sponsored by ArnoldIT.com, developer of Augmentext
Inflation or Desperation: Pricing Free Online Services
March 14, 2014
Yep, it’s illogical. How can a free online service get a price tag? Easily: consider Amazon’s boosting the fee for Prime and Facebook’s cooking up whizzy new types of advertising. But the big news is tucked between the lines of “Desktop Search to Decline $1.4 Billion as Google Users Shift to Mobile.”
Here’s a tasty factoid:
In the scope of Google’s overall ad revenues, mobile search is gaining significant share. Up from 19.4% in 2013, mobile search will comprise an estimated 26.7% of the company’s total ad revenues this year. Desktop search declined to 63.0% of Google’s ad revenues in 2013, having already fallen from 72.7% in 2012.
You may have noticed how lousy the search results are from Bing, Google, and Yahoo. Even the metasearch engines are struggling. Just run some queries on Ixquick.com or DuckDuckGo.com and do some results comparisons.
Because most of the world’s Internet users rely on Google to deliver comprehensive and accurate results, users are unaware of the information that is not easily findable. Investigators and professional researchers are increasingly aware that finding information is getting harder, a lot harder if our research is on the beam.
As users shift from desktops to mobile, the GoTo/Overture advertising model loses efficiency. There are a number of reasons, ranging from the difficulty of entering queries while riding a crowded bus, to the small screens, to the dorky big-type interfaces that are gaining popularity, to the need to provide a brain dead single or limited function app to help a person locate pizza.
For Google and other desktop centric companies, the shift has implications for advertising revenue. Smaller screens and changing behavior mean the old GoTo/Overture model won’t work. The impact on traditional Web sites is not good. Here’s a report for a company that did the search engine optimization thing, the redesign thing, and the new marketing “experts” thing. Looks grim, doesn’t it?
I won’t name the owner of this set of red arrows, but you can check out your own Web site and blog usage stats and compare your “performance” to this outfit’s.
Layers: A Search Engine for Social Media
March 14, 2014
The article on wlfi.com titled “Frankfort Teen Creates Idea for New Search Engine” discusses the work of fifteen-year-old Spencer Jordan. His idea for a new search engine is to focus search on one’s own social media networks. He got the idea while switching from one social media app to another, when he noticed that it might be possible to streamline that process. Layers, Spencer’s search engine, is still in the “dream” phase,
“For now, “Layers” is just a dream, but to make it reality, Jordan has to pay a programmer to create the site. In order to raise the $10,000 needed, he began fundraising through an online donation website. “I’ve been trying to get my friends, and family and the public to support me, and to back me and to help me accomplish this,” said Jordan. As of Sunday, Jordan hasn’t raised any of his $10,000 goal, but he said failure is not an option.”
In spite of the lack of funding, Spencer is not ready to quit. (As of Thursday, February 27, he has raised $80.) Should Google be nervous or just open its checkbook to buy this idea? The ability to search through YouTube, Facebook, Twitter, and Instagram is appealing; in Spencer’s words, it “declutters” social media.
Chelsea Kerwin, March 14, 2014
Sponsored by ArnoldIT.com, developer of Augmentext
IBM Leads in Report on Perceived Capability
March 14, 2014
The article on Bloomberg BusinessWeek titled “IBM Named a Leader in IDC MarketScape for Business Consulting Services” discusses the IDC MarketScape Worldwide Business Consulting Services 2014 Vendor Assessment, a report based on client feedback. Using its vendor analysis model, IDC studies data to show where a business stands in relation to other businesses in the same field. The report studied the perceptions of 600 clients worldwide. The article explains,
“IBM is “seen as the most capable of all firms at delivering value-creating innovation and driving innovation through an organization” when compared to other firms worldwide….”We are seeing that new models of engagement are reshaping the front-office agenda. Soon there will be no distinction between business strategy and the use of data,” said Sarah Diamond, general manager, IBM Global Business Services. “The IDC MarketScape report reinforces the need for a first-of-a-kind consulting practice such as IBM Interactive Experience to innovate client experience.”
Does pay to play deliver what IBM wants and needs? IBM credits its focused office approach and its use of data and analytics for clients’ positive responses. Clearly the survey indicates that IBM enjoys an excellent reputation for customer service. This praise for IBM’s consulting services comes as no huge surprise.
Chelsea Kerwin, March 14, 2014
Sponsored by ArnoldIT.com, developer of Augmentext
Fast Web, Slow Web
March 14, 2014
The article titled “How the Fast Web is Impairing How You Think” on LifeHacker introduces the Slow Web movement, a response to the dangerous habits being formed around the Fast Web. The Fast Web is the web most of us are accustomed to; it is dictated by an overwhelming amount of information that we don’t set out to take in but breeze through while checking Facebook statuses or clicking through pages on StumbleUpon. The article explains the problem this creates,
“The life of the Fast Web is one of constant distraction. Going slow leads to sharper focus on fewer things and, in the case of Automatic Believing theory, less risk of falling for something that’s just not true…When the rush of digital life eases with Slow Web, you have an opportunity to take notice of the world around you and open your eyes to something you could be missing out on that’s right under your nose.”
The article finds support for its theory in Inattentional Blindness, which stipulates that people focused on one task will miss other things happening around them. Scheduling the time we spend on the Internet can prevent us from falling into the time-wasting distractions that are so appealing and yet ultimately useless.
Chelsea Kerwin, March 14, 2014
Sponsored by ArnoldIT.com, developer of Augmentext