October 6, 2015
An article at the SmartDataCollective, “The Difference Between Business Intelligence and Real Data Science,” aims to help companies avoid a common pitfall. Writer Brigg Patton explains:
“To gain a competitive business advantage, companies have started combining and transforming data, which forms part of the real data science. At the same time, they are also carrying out Business Intelligence (BI) activities, such as creating charts, reports or graphs and using the data. Although there are great differences between the two sets of activities, they are equally important and complement each other well.
“For executing the BI functions and data science activities, most companies have professionally dedicated BI analysts as well as data scientists. However, it is here that companies often confuse the two without realizing that these two roles require different expertise. It is unfair to expect a BI analyst to be able to make accurate forecasts for the business. It could even spell disaster for any business. By studying the major differences between BI and real data science, you can choose the right candidate for the right tasks in your enterprise.”
So fund both, gentle reader. Patton distinguishes each position’s area of focus, the different ways each uses and looks at data, and their sources, migration needs, and job processes. If you need to hire someone to perform these jobs, check out this handy clarification before you write up those job descriptions.
Cynthia Murrell, October 6, 2015
October 1, 2015
Gentle reader, I know you remember that beloved Alta Vista once was a Hewlett Packard property. Ah, what might have been.
I thought about HP’s muffed bunnies when I read “Carly Fiorina’s Legacy as CEO of Hewlett Packard.” The write up summarizes an academic approach to figuring out what happened at HP before Carly Fiorina was replaced by Mark Hurd, who was replaced by Leo Apotheker, who was replaced by Meg Whitman.
What type of manager was Ms. Fiorina? Here’s the objective assessment in the article:
When Fiorina came to HP, the culture that she walked into was very much “aim, aim, aim and fire” — a slow culture, during a time when companies were moving very fast. In that context, she was what we want our change leaders to be — bold and disruptive. One of her moves was to buy Compaq, which had a fast moving “Internet” culture — “aim, fire, fire, re-aim, fire.”
I assume that approach contributed to the slow, steady decline of the Alta Vista search system.
Who benefited from HP’s handling of Alta Vista? I would suggest the Alphabet Google thing.
Stephen E Arnold, October 1, 2015
September 29, 2015
The article on Computer World titled “Technology that Predicts Your Next Security Fail” covers the current explosion in predictive analytics, the application of past occurrences to predict future ones. The article cites the example of the Kentucky Department of Revenue (DOR), which used predictive analytics to catch fraud. By providing SAS with six years of data, the DOR received a batch of new insights into fraud indicators, such as similar filings from the same IP address. The article imparts words of wisdom from SANS Institute instructor Phil Hagen:
“Even the most sophisticated predictive analytics software requires human talent, though. For instance, once the Kentucky DOR tools (either the existing checklist or the SAS tool) suspect fraud, the tax return is forwarded to a human examiner for review. “Predictive analytics is only as good as the forethought you put into it and the questions you ask of it,” Hagen warns…. Also, it’s imperative that data scientists, not security teams, drive the predictive analytics project.”
In addition to helping the IRS avoid major fails like the 2013 fraudulent refunds totaling $5.8 billion, predictive analytics has other applications. Perhaps most interesting is its use protecting human assets in regions where kidnappings are common by detecting unrest and alerting organizations to lock up their doors. But it is hard to see limitations for technology that so accurately reads the future.
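The “similar filings from the same IP address” indicator mentioned above is simple enough to sketch. The snippet below is a toy illustration, not the DOR’s or SAS’s actual method; the record fields (`ip`, `filer_id`) and the threshold are hypothetical.

```python
from collections import defaultdict

def flag_suspicious_ips(filings, threshold=3):
    """Group filings by originating IP and flag any IP used by at least
    `threshold` distinct filers -- a toy version of the 'similar filings
    from the same IP address' fraud indicator."""
    by_ip = defaultdict(list)
    for filing in filings:
        by_ip[filing["ip"]].append(filing["filer_id"])
    # An IP shared by many distinct filers is worth a human examiner's review.
    return {ip: filers for ip, filers in by_ip.items()
            if len(set(filers)) >= threshold}

filings = [
    {"filer_id": "A1", "ip": "10.0.0.5"},
    {"filer_id": "B2", "ip": "10.0.0.5"},
    {"filer_id": "C3", "ip": "10.0.0.5"},
    {"filer_id": "D4", "ip": "192.168.1.9"},
]
print(flag_suspicious_ips(filings))  # {'10.0.0.5': ['A1', 'B2', 'C3']}
```

Note that, as Hagen stresses, the output is only a queue for a human examiner, not a verdict.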
Chelsea Kerwin, September 29, 2015
September 29, 2015
NTENT, a leading natural language processing and semantic search company that owns the Convera technology, has hired Dan Stickel as its new CEO, according to the Business Wire release “NTENT Appoints Dan Stickel As New CEO.” NTENT is focused on expanding the company. Drawing on Stickel’s experience, NTENT has big plans and is sure that Stickel will lead the company to success.
“CEO, Stickel’s first objective will be to prioritize NTENT’s planned expansion along geographic, market and technology dimensions. ‘After spending significant time with NTENT’s Board, management team and front-line employees, I’m excited by the company’s opportunities and by the foundation that’s already been laid in both traditional web and newer mobile capabilities. NTENT has clearly built some world-class technology, and is now scaling that out with customers and partners.’”
In his past positions as CEO at Metaforic and Webtrends, as well as head of the enterprise business at AltaVista and the software business at Macrovision, Stickel has transitioned companies into leaders in their respective industries.
Demand for natural language processing software, and for incorporating it into semantic search, is one of the biggest IT trends at the moment. The field demands innovation, and NTENT believes Stickel will guide it.
Whitney Grace, September 29, 2015
September 28, 2015
An indispensable position for companies is the chief technology officer or the chief information officer, whose primary responsibilities are to manage the IT department, implement new ways to manage information, and/or develop software as needed. There is a new position that companies will be creating in the future: the chief marketing technology officer, says Live Mint in “Make Way CIOs, CMOs: Here Comes The CMTO.”
Formerly, the marketing and IT departments never mixed, except for the occasional social media collaboration. Marketers are increasing their reliance on technology to understand their customers, and it goes far beyond social media. Marketers need to be aware of the growing trends in mobile shopping and search, digital analytics, gamification, online communities, and the power of user-generated content.
“The CMO’s role will graduate to CMTO, a marketer with considerable knowledge of technology. The CMTO, according to Nasscom, will not only conceptualize but also build solutions and lay down the technical and commercial specifications while working alongside the IT team on vendor selection.”
It is not enough to know how to market a product or promote an organization. Marketers need to be able to engage with technology and understand how to implement it to attract modern customers and increase sales. In other words, the current marketing position is evolving under a new buzzword.
Whitney Grace, September 28, 2015
September 25, 2015
With a title like “AML-A Challenge Of Titanic Proportions” posted on Attivio, metaphoric comparisons between the “ship of dreams” and icebergs are inevitable. Anti-money laundering compliance costs grew an unprecedented 53% between 2011 and 2014, says KPMG’s Global Anti-Money Laundering (AML) Survey, and are predicted to increase by more than 25% over the next three years. The areas requiring the most additional money include transaction monitoring systems, Know Your Customer systems, and recruitment/retention of AML staff.
The Titanic metaphor plays in because White Star Line director Bruce Ismay, builder Thomas Andrews, and nearly all of the 3327 passengers believed the ship was unsinkable and the pinnacle of modern technology. The belief that humanity’s efforts would conquer Mother Nature was its downfall. The White Star Line did not prepare the Titanic for disaster, but AML companies are trying to prevent their ships from sinking. Except they cannot account for all the ways thieves can work around their systems, just as the Titanic could not avoid the iceberg.
“Systems need to be smarter – even capable of learning patterns of transaction and ownership. Staff needs more productive ways of investigating and positively concluding their caseload. Alerting methods need to generate fewer ‘false positives’ – reducing the need for costly human investigation. New sources of information that can provide evidence need to come online faster and quickly correlate with existing data sources.”
The Titanic crew accidentally left the binoculars for the crow’s nest in England, which did not help the lookouts. The current AML solutions are like the forgotten binoculars, and proactive action needs to be taken to avoid the AML iceberg.
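The quoted call for systems that “learn patterns of transaction” and generate fewer false positives can be illustrated with a toy sketch: instead of one flat alert threshold for every account, learn each account’s own baseline and flag only transactions far outside it. This is a hypothetical illustration of the idea, not any vendor’s actual method; the account names, amounts, and the three-sigma cutoff are invented.

```python
import statistics

def build_baselines(history):
    """Learn a per-account mean and standard deviation from past
    transaction amounts -- a toy stand-in for systems that 'learn
    patterns of transaction'."""
    return {
        acct: (statistics.mean(amts), statistics.pstdev(amts))
        for acct, amts in history.items()
    }

def alert(acct, amount, baselines, sigmas=3.0):
    """Alert only when a transaction is far outside the account's own
    history, rather than applying one flat threshold to everyone --
    the intuition behind cutting false positives."""
    mean, stdev = baselines[acct]
    return abs(amount - mean) > sigmas * max(stdev, 1.0)

history = {"acct-1": [100, 120, 90, 110], "acct-2": [5000, 5200, 4800]}
baselines = build_baselines(history)
print(alert("acct-1", 5000, baselines))  # True: wildly out of pattern
print(alert("acct-2", 5000, baselines))  # False: normal for this account
```

The same $5,000 transaction triggers an alert for one account and not the other, which is exactly how a learned baseline spares human investigators from reviewing routine activity.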
Whitney Grace, September 25, 2015
September 24, 2015
One of the new legal buzzwords is knowledge management: not just old-fashioned knowledge management, but the quick, efficient, and effective kind. Time is an expensive commodity for legal professionals, especially with the amount of data they have to sift through for cases. Mondaq explains the importance of knowledge management for law professionals in the article, “United States: A Brief Overview Of Legal Knowledge Management.”
Knowledge management started with creating an effective process for managing, locating, and searching relevant files, but it quickly evolved into implementing a document management system. While knowledge management companies offered law practices decent document management software to tackle the data hill, an even bigger problem arose. The law practices needed dedicated people to be software experts:
“Consequently, KM emphasis had to shift from finding documents to finding experts. The expert could both identify useful documents and explain their context and use. Early expertise location efforts relied primarily on self-rating. These attempts almost always failed because lawyers would not participate and, if they did, they typically under- or over-rated themselves.”
The biggest problem law professionals face is that they might invest a small fortune in a document management license, but they do not know how to use the software or do not have the time to learn. It is a reminder that someone might have all the knowledge and best tools at their fingertips, but unless people know how to use and access them, the knowledge is useless.
September 21, 2015
I read “How Healthcare.gov Botched $600 Million worth of Contracts.” My initial reaction was that the $600 million figure understated the fully loaded costs of the Web site. I have zero evidence about my view that $600 million was the incorrect total. I do have a tiny bit of experience in US government project work, including assignments to look into accounting methods in procurements.
The write up explains that an audit by the Department of Health and Human Services’ Office of Inspector General identified the root causes of the problems with the allegedly $600 million Healthcare.gov Web site. The source document was online when I checked on September 21, 2015, at this link. If you want this document, I suggest you download it. Some US government links become broken when maintenance, interns, new contractors, or site redesigns are implemented.
The news story, which is the hook for this blog post, does a good job of pulling out some of the data from the IG’s report; for example, a list of “big contractors behind Healthcare.gov.” The list contains few surprises. Many of the names of companies were familiar to me, including that of Booz, Allen, where I once labored on a range of projects. There are references to additional fees from scope changes. I am confident, gentle reader, that you are familiar with scope creep. The idea is that the client, in the case of Healthcare.gov, needed to modify the tasks in the statement of work which underpins the contracts issued to the firms which perform the work. The government method is to rely on contractors for heavy lifting. The government professionals handle oversight, make certain the acquisition guidelines are observed, and plug assorted types of data into various US government back office systems.
The news story repeated the conclusion of the IG’s report that better training was needed to make the Healthcare.gov type of project work better in the future.
My thoughts are that the news story ignored several important factors which, in my experience, provided the laboratory in which this online commerce experiment evolved.
First, the notion of a person in charge is not one that I encountered too often in my brushes with the US government. Many individuals change jobs, rotating from assignment to assignment, so newcomers are often involved after a train has left the station. In this type of staffing environment, the enthusiasm for digging deep and re-rigging the ship is modest or secondary to other tasks such as working on budgets for the next fiscal year, getting involved in new projects, or keeping up with the meetings which comprise the bulk of a professional’s work time. In short, decisions are not informed by a single individual with a desire to accept responsibility for a project. The ship sails on, moved by the winds of decisions by those with different views of the project. The direction emerges.
Second, the budget mechanisms are darned interesting. Money cannot be spent until the project is approved and the estimated funds are actually transferred to an account which can be used to pay a contractor. The process requires that individuals who may have never worked on a similar project create a team which involves various consultants, White House fellows, newly appointed administrators, procurement specialists with law degrees, or other professionals to figure out what is going to be done, how, what time will be allocated and converted to estimates of cost, and the other arcana of a statement of work. The firms which make a living converting statements of work into proposals then bid to do the actual work. At this point, the disconnect between the group which defined the SOW and the firms bidding on the work becomes the vendor selection process. I will not explore vendor selection, an interesting topic outside the scope of this blog post. Vendors are selected and contracts written. Remember that the estimates, the timelines, and the functionality now have to be converted into the Healthcare.gov site or the F-35 aircraft or some other deliverable. What happens if the SOW does not match reality? The answer is a non-functioning version of Healthcare.gov. The cause, gentle reader, is not training.
Third, the vendors, bless their billable hearts, now have to take the contract which spells out exactly what the particular vendor is to do and then actually do it. What happens if the SOW gets the order of tasks wrong in terms of timing? The vendors do the best they can. Vendors document what they do, submit invoices, and attend meetings. When multiple vendors are involved, the meetings with oversight professionals are not the places to speak in plain English about the craziness of the requirements or the tasks specified in the contract. The vendors do their work to the best of their ability. When the time comes for different components to be hooked together, the parts usually require some tweaking. Think rework. Scope change required. When the go live date arrives, the vendors flip the switches for their part of the project and individuals try to use the system. When these systems do not work, the problem is a severe one. Once again: training is not the problem. The root cause is that the fundamental assumptions about the project were flawed from the get-go.
Is there a fix? In the case of Healthcare.gov, there was. The problem was solved by creating the equivalent of a technical SWAT team, working in a very flexible manner with procurement requirements, and allocating money without the often uninformed assumptions baked into a routine procurement.
Did the fix cost money? Yes. Do I know how much? No. My hunch is that there is zero appetite in the US government, at a “real” news service, a watchdog entity, or an in house accountant to figure out the total spent for Healthcare.gov. Why do I know this? The accounting systems in use by most government entities are not designed to roll up direct and indirect costs with a mouse click. Costs are scattered and methods of payment pretty darned crazy.
Net net: Folks can train all day long. If that training focuses on systems and methods which are disconnected from the deliverable, the result is inefficiency, a lack of accountability, and misdirection from the root cause of a problem.
I have been involved in various ways with government work in the US since the early 1970s. One thing remains consistent: The foundational activities are uneven. Will the procurement process change? Forty years ago I used to think that the system would evolve. I was wrong.
Stephen E Arnold, September 21, 2015
September 17, 2015
The TechCrunch article titled “IBM Watson Health Unit Begins to Take Shape” investigates the work being done to launch the new healthcare unit in Boston and the surrounding community, which IBM hopes to use to address major issues in healthcare. Already this year IBM has purchased and partnered with numerous companies in the field. Recently, Boston Children’s Hospital joined the list, as well as Apple and Johnson & Johnson. The article states,
“As part of today’s broad announcement, IBM indicated that it would be working with Sage Bionetworks’ Open Biomedical Research Platform around the first Apple projects. Sage will be collecting information from Apple Devices using ResearchKit developer tools, initially with breast cancer and Parkinson’s patients. It will be aggregating, storing, curating and analyzing the information coming in from the Apple Devices. IBM will be providing the underlying technology with its IBM Watson Health Cloud platform.”
Additionally, IBM Watson Health Cloud for Life Science Compliance was announced, the cherry built on top of IBM SoftLayer. It is designed to aid companies in the life science industry with a fully compliant cloud solution capable of meeting the demands of the heavily regulated field. Not mentioned in the article are the revenues for this Health Unit initiative, as if they were entirely irrelevant.
Chelsea Kerwin, September 17, 2015
September 14, 2015
I read “How to Balance the Five Analytic Dimensions.” The article presents information which reminded me of a college professor’s introductory lecture about data analysis.
The basics are definitely important. As the economy slips toward 2016, the notion of trade-offs is an important one to keep in mind. According to the article, making sense of data via analytics involves understanding and balancing:
- The complexity of the data. Yep, data are often complex.
- Speed. Yep, getting results when the outputs are needed is important.
- The complexity of the analytics. Yep, adding a column of numbers and calculating the mean may be easy but not what the data doctor ordered.
- Accuracy and precision. The idea is that some outputs may be inappropriate for the task at hand. In theory, results should be accurate, or at least accurate enough.
- Data size. Yep, crunching lots of data can entail a number of “big” and “complex” tasks.
I agree with these points.
The problem is that the success of a big or small data project with simple or complex analytics is different from a laundry list of points to keep in mind. Knowing the five points is helpful if one is taking a test in a junior college information management class.
The write up does not address the rock upon which many analytics projects crash; that is:
What are the systems and methods for balancing resources across these five dimensions?
Without addressing this fundamental question, how can good decisions be made when the foundation is assumed to be level and stable?
Most analytics work just like the textbook said they would. The outputs are often baloney because the underlying assumptions were assumed to be spot on.
Why not just guess and skip the lecture? I know. Is this an acceptable answer: “That’s too time consuming and above our pay grade”?
The professional who offers this answer may get an A in class but an F in decision making.
Stephen E Arnold, September 14, 2015