December 5, 2013
The article titled IBM Introduces Watson to the Public Sector Cloud on GCN explores the potential for Watson now that IBM has opened it up to developers. IBM Watson Solutions recently won the 2013 North America New Product Innovation award for its combination of communication skills and evaluation abilities. Even more recently, IBM conceded its fight with Amazon Web Services over a ten-year, $600 million CIA cloud contract. But the loss has not rained on the parade, as the article explains:
“The initial target market for IBM Watson Developers Cloud is the private sector, with IBM touting third-party applications in such areas as retail and health care. But analysts say the offering will impact big data problems in the public sector, too. McCarthy sees potential for Watson-powered apps in such areas as fraud analysis, which the White House is ramping up due to worries about scammers taking advantage of consumers signing up for its new health care plans. “
Sounds like there is a job for Watson at Healthcare.gov, given the massive potential for fraud. Another possibility is putting Watson to work on entity analytics for Homeland Security, looking for patterns in data. Entity analytics is mainly about comparing huge amounts of data, and who could be better at that than IBM’s supercomputer?
Chelsea Kerwin, December 05, 2013
December 3, 2013
The explosion of big data continues to put pressure on IT departments. GCN examines how government agencies are approaching the challenge in, “As Big Data Grows, Technologies Evolve Into Ecosystems.” Writer Rutrell Yasin frames the issue of deploying big-data platforms and analytics:
“But what is the best way to accomplish this: By cobbling together various ‘point products’ that address all of the big data processes, or by building a ‘big data platform’ that integrates all of the capabilities organizations need to apply deep analytics?”
The article goes on to examine the most prominent solutions competing for institutional big-data dollars. Not surprisingly, IBM‘s Eric Sall advocates a comprehensive platform, like his company’s InfoSphere. It looks like many organizations, though, are responding to the lure of open source. Though it is often the cheaper approach, the disparate nature of open-source solutions can pose its own problems. The article looks at efforts from outfits like Red Hat and Cisco that aim to consolidate apps and systems from different sources (both open source and paid). It is worth a look if your organization is at or approaching the big-data-solution crossroads.
The article concludes:
“The bottom line is that organizations need these massively parallel processing systems and other big data tools that can scale out to address the volume, velocity and variety of big data, whether they come from a proprietary vendor’s platform or a platform based on open technologies. It makes life simpler for organizations if their workforce can unlock the value of their data via an ecosystem of integrated tools, industry experts said.”
Indeed, simpler is usually better. Even if saving money is your main goal, do not dismiss paid solutions that help manage open source resources; the savings in time and frustration often more than make up for the added cost.
Cynthia Murrell, December 03, 2013
November 28, 2013
The article titled DOD Says “No Mas” On Commercial Cloud, Puts Brakes on $450M Contract on Ars Technica has some concerned that the government is rethinking its commitment to the cloud. Scott Stewart, a contracting officer for the Defense Information Systems Agency (DISA), explained that the decision was driven by a lack of demand from the Defense Department.
The article explains:
“The contract, for which the DISA began drafting a request for proposals this summer, would have picked up to 10 cloud providers to supply Internet-accessible file storage, database hosting, Web hosting, and virtual servers—allowing the military to offload public, non-sensitive systems from its own infrastructure. As it turns out, the various military services and other DOD agencies that the DISA serves aren’t terribly interested in doing that. The federal government… has been trying to reduce the number of public-facing websites it maintains.”
It remains to be seen whether the contract will be adjusted to meet more modest requirements or scrapped entirely. As mentioned in the quote, this is not the only instance of concern about overspending; in 2011 the White House froze the creation of new federal websites. In the meantime, the military has been dealing with security issues that have pushed it to rely on DISA’s data centers.
Chelsea Kerwin, November 28, 2013
November 27, 2013
An article posted on Tech Eye titled US Spying is Killing the Internet Claims Google explains the outrage Google expressed when it was revealed that the NSA had tapped into its systems to obtain user information. Google security director Richard Salgado warns that the US government’s snooping could eventually lead to a “splinter net” in which governments put up barriers and restrict the market.
The article explains:
“Salgado warned that the NSA operations led to “a real concern” inside and outside the United States about the role of government and the Foreign Intelligence Surveillance Court, which decides in secret on legal problems about electronic surveillance efforts.”
But is the lady protesting too much? Google has been accused of its own plans to take over the Internet, as an article on Worldcrunch titled Google’s Latest Scheme to Control the Internet May Surprise You investigates. Google Plus in particular might warrant extra attention. Though considered a failure when likened to Facebook, the article suggests that comparison is faulty. The number of Google Plus members may be small, but more important is Google’s ability to track and store the information we input.
And the money talks:
“Perhaps the proof is in the numbers: Google generated $50 billion in 2012 revenue, $40 billion of it from advertising. And though 2.7 billion Facebook “likes” are being registered every day, its revenue during the same period was just $4 billion.”
So let Google worry about the NSA all it wants. Some of us are preoccupied with our paranoia about another company, which the article sums up as a Keanu Reeves-style Matrix in which we will all stay happily ignorant of our dependence.
Chelsea Kerwin, November 27, 2013
November 23, 2013
I read “Tension and Flaws Before Health Website Crash.” The good news is that the story focuses on what is now old news: management challenges at the agency responsible for Healthcare.gov. The bad news—at least for champions of XML repositories, XML normalization, and XML as the “answer” to a wide range of information management woes—is that XML (extensible markup language) is not the slam dunk, whiz bang solution some true believers hope it to be.
Here’s the passage that caught my attention:
Another sore point was the Medicare agency’s decision to use database software, from a company called MarkLogic, that managed the data differently from systems by companies like IBM, Microsoft and Oracle. CGI officials argued that it would slow work because it was too unfamiliar. Government officials disagreed, and its configuration remains a serious problem.
MarkLogic has not been identified as a vendor creating some headaches until now. MarkLogic has a system that can store information and data in an XML data management system. The trick is that content not in XML must be normalized; that is, converted to XML. MarkLogic has developed some proprietary methods to perform its data management operations. A person familiar with XML may not be conversant with the MarkLogic conventions. The upside of this approach is that MarkLogic has experts who are able to address most customer requests. The downside is that a person familiar with XML but not MarkLogic can introduce some problems into an otherwise spiffy system.
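To make “normalized; that is, converted to XML” concrete, here is a minimal, generic sketch in Python of turning a flat record into an XML document. The element names and the sample record are hypothetical; MarkLogic’s own ingestion conventions are proprietary and not shown here.

```python
# Generic sketch of normalizing a non-XML record into XML before
# loading it into an XML repository. Tag names below are hypothetical,
# not MarkLogic's actual conventions.
import xml.etree.ElementTree as ET

def record_to_xml(record: dict, root_tag: str = "document") -> str:
    """Wrap each field of a flat record in an element under one root."""
    root = ET.Element(root_tag)
    for key, value in record.items():
        child = ET.SubElement(root, key)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

# A hypothetical row from some non-XML source (CSV, database, etc.)
row = {"title": "Annual Report", "year": 2013, "author": "J. Smith"}
print(record_to_xml(row))
```

The hard part in practice is not this mechanical wrapping but agreeing on the element names, hierarchy, and conventions—exactly the knowledge a person familiar with generic XML but not a vendor’s system would lack.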
In the last few years, MarkLogic has had a number of senior management changes. I track the company via my Overflight system and have noted that the firm has gone from a company that does a good job of publicizing itself to an outfit that has trimmed back on its public presence. You can check out the MarkLogic Overflight on the ArnoldIT.com Web site. The minimal news flow, the absence of tweets, and the termination of public blog content can be verified by visiting the page every few days.
One interesting aspect of MarkLogic is that the company has positioned itself as a publishing platform. Once content is in the repository, it is possible to slice and dice information and data. Publishers can use this feature to whip out books with little or no involvement of human editors. But the company has, like Verity, grafted on other features and services. These range from enterprise search to text mining to electronic mail management.
I heard that the company was to have been a $200 or $300 million a year operation by now. The firm may be the best kept secret in terms of its revenues and profits. If so, kudos. But if the company has not been able to demonstrate strong growth and healthy net profits, the firm may need to ramp up its publicity and marketing activities.
The New York Times’s comment may be hogwash. Even if a stretch, getting a paragraph that strikes me as less than favorable raises some questions; for example:
- Are proprietary extensions a good idea for an XML system that must be used by folks who are not into XML?
- Will the transformations between and among content from disparate systems remain bottlenecks during periods of high content flow and usage?
- Will Oracle seize on the MarkLogic system and revive its flow of information about the weaknesses of XML as compared with content stored in an Oracle data management system?
MarkLogic has rolled through three or four presidents in the last few years. Dave Kellogg departed, and I mostly lost track of who followed him. At the time of his departure, MarkLogic was in the $60 million range in estimated revenues. Will the management turmoil kick in again? Will the company continue to expand its features and functions as Verity did prior to its initial public offering? Are there parallels between the trajectories of Convera, Delphes, Entopia, and Verity and that of MarkLogic? For some case analyses, check out www.xenky.com/vendor-profiles.
Stephen E Arnold, November 23, 2013
November 20, 2013
Healthcare.gov has a blog. You can find it at this link. There is a link for October posts. There is a link for September posts. I was not able to access the full set of posts for either month. Here’s what I saw:
I thought the content would be at this link.
Oversight, content management problem, content removal, or my error? Interesting. It is tough to search when content is not available for indexing.
I wanted to read the posts to the blog before and after the launch. No joy. Should I be suspicious?
Stephen E Arnold, November 20, 2013
November 1, 2013
The shocking and yet unsurprising article on The Washington Examiner titled Troubled Obamacare Website Wasn’t Tested Until A Week Before Launch provides the periodic table of healthcare.gov vendors and an immediate glimpse into the problems. A plethora of brave anonymous sources reveal that management was practically non-existent, with the central CMS unwilling to claim leadership, and other vendors unable to state who was in charge or overseeing the compatibility of the complex system. One such source quoted in the article explains,
“The challenge with this project was that the decisions were made very, very late in the project, and no one organization … seemed to know how this complex ecosystem of applications, interfaces, user processes and hardware should all work together.”… Another former CMS contract employee who also requested anonymity said, “CMS was not capable of being the integrator. … An integrator used to be someone like an IBM. That is how this business used to be run. CMS is not an integrator. CMS operates as numerous disparate organizations.”
The article also states that the government typically does use external contractors as system integrators because it knows it is not good at the job. Even the Air Force, which has some experience integrating weapons systems, would not attempt IT integration. The problem here is that the government refused to relinquish control, yet was unable to do the work required to make the website usable.
Chelsea Kerwin, November 01, 2013
October 31, 2013
The article Obama Administration Promises ‘Tech Surge’ to Fix Ailing Healthcare.gov Website on The Verge discusses President Obama’s proposed path to fixing the new healthcare initiative. Overshadowed by the government shutdown, the difficulties users have encountered on the site include confusion, long waits, the inability to create an account, and even, as the article mentions, wrong information being sent to insurance providers. The article explains that the Department of Health and Human Services (HHS) will launch a “tech surge,” which comprises bringing in more consultants and “aggressive” monitoring of problem areas. Some of this has already gone into effect, as the article states,
“Apparently, the website has already seen some improvements. For a short time, as HHS explains, the department created a “virtual waiting room” for those attempting to create an account — but this only caused more confusion. That ad hoc solution is gone now, but we still haven’t received a firm number of how many people have successfully created accounts and received insurance. Instead, the HHS has only said that the site has had 19 million unique visits.”
This embarrassment for the Obama administration is due in part to the complexity of the “technical hurdles” faced, especially the likelihood that contractors needed Federal Information Security Management Act certifications. President Obama recently addressed the nation to acknowledge the issues with the site and to reassure Americans that those issues are being worked on. The promised “tech surge” cannot hurt, but it remains to be seen whether adding more consultants will solve the myriad problems with the website.
Chelsea Kerwin, October 31, 2013
October 26, 2013
I read “Senator Intensifies Probe of Data Brokers.” It seems that the leaders in Washington, DC have discovered data aggregation. Let me think. Right. Data aggregation has been around for more than a half century. I remember Ian Sharp (anyone remember him?) telling me about his discovery of data aggregation when he was a lad and before he created his business in the 1970s.
The point of the write up I noted was:
“However, if these recent news accounts are accurate, they raise serious questions about whether Experian as a company has appropriate practices in place for vetting its customers and sharing sensitive consumer data with them, regardless of the particular line of business.” Mr. Rockefeller’s letter is part of a larger effort by the Commerce Committee to understand how companies collect, share and sell intimate details about the shopping habits, health concerns, family circumstances and financial status of consumers at a time when Americans are increasingly sharing personal information online.
I have no comment about Experian or any similar firm.
My reaction is that if the leaders in DC are willing to name a particular company, that’s interesting. More intriguing is the question, “Will the various committees start taking a closer look at outfits like Thomson Reuters, McGraw Hill, and (hold your breath) the New York Times?”
There are many ways to deliver a solution to the problem of certain organizations disseminating information.
Stephen E Arnold, October 26, 2013
October 21, 2013
I will be giving my last public talk of 2013 at the upcoming Search Summit. I am revealing some data about the trajectory of commercial search versus free and open source search. My focus is not just on costs. I will address the elephant in the room that most of the sleek search poobahs elect to ignore: management.
As part of my preparation, I read an interesting public relations and positioning white paper from Oracle. The essay is “The Department of Defense (DoD) and Open Source Software.” You should be able to locate a copy at the Oracle Middleware Web page. But maybe not. Well, take that up with Oracle, Google, and whoever indexes public Web pages.
The argument in the white paper is that open source is useful within the context of commercial software. The premise is that a commercial company develops robust products like Oracle’s database and then rigorously engineers that product to meet the tough standards imposed by the US government. Then, canny engineers will integrate some open source software into that commercial solution. The client—in this case the Microsoft loving Department of Defense—will be able to get the support it needs to handle the demands of global war fighting.
There are three fascinating rhetorical flourishes in the white paper. These are directly germane to the direction some of the discussions of commercial and proprietary versus free and open source software have been moving. I will give a couple of case examples in my talk in early November 2013, and I assume that the slide deck for my talk will find its way into one or more indexing services. I won’t plow that ground again. Below are some new thoughts.
First, the notion that commercial and proprietary software is better than open source software is amusing. I think that any enterprise software is rife with bugs and problems that can never be fixed because there is neither the time, the money, nor the appetite to ameliorate them. I was at a meeting at the world’s largest software company when one executive said, “There are a couple thousand bugs in Word. Numbering is one issue. We will maybe get around to fixing the problem.” That was six years ago. Guess what? Numbering is still an interesting challenge in a long document. Is Oracle like the world’s largest software company? Oracle has some interesting features in its products. Check out this sample page. Make your own decision. Software has been, is, and will be complicated stuff. The fact that people correlate clicking a hot link with “simple” just adds impetus to the “this is easy” view of modern systems. No software is better. Some works within specific parameters. Push outside the parameters and you find darned exciting things.
Second, the idea that a large bureaucracy can make decisions based on cost benefits is crazy. Worldwide, bean counters and lawyers work to nail down assumptions and statements of work that are designed to minimize costs and deliver specific functionality. How is that working out? If I read one more after-the-fact analysis of the flawed health insurance Web site, I may unplug my computer and revert to paper and printed books. I did a major study of a government site in 2007. Guess what? The system did not work and still does not work. Are there analyses, reports, and Web pages explaining the issue? Sure. What’s the fix? People either go to a government office and talk to a human or make a phone call in the hope that the human on the other end of the line can address the issue. The computer system? Unchanged. My report? Probably still in a drawer somewhere.
Third, the idea that a publicly traded company cares about open source is amusing. Open source is simply a vehicle to reduce costs to the publicly traded company and generate consulting revenue. The fact is that most of the folks who embrace open source need some help from firms specializing in that open source product. I can name two companies, each with more than $30 million in venture funding, that have a business model built on selling proprietary software, consulting, and engineering services. Open source sure looks like a Trojan horse to me. Why does IBM embrace Lucene yet sell branded products and services? Maybe to eliminate some software acquisition costs and sell consulting.
A happy quack to http://goo.gl/lxKb6I
On one hand, Oracle is correct in pointing out that free and open source software looks cheaper than commercial and proprietary software in terms of licensing fees. Oracle is also correct that the major cost of software has little to do with the license fee.
On the other hand, Oracle adds some mist to the fog surrounding open source. When open source vendors have to generate revenue to pay back investors or build out their commercial business, the costs are likely to be high.
Open source software begins as a public spirited effort, a way to demonstrate programming skills, and a marketing effort. There are other reasons as well. But in today’s world, software is the weak link in most businesses. Systems are getting less reliable, despite the long string of nines that some companies use to prove their systems are wonderful. But like the optical character recognition program that is 99 percent accurate, the more content pushed through these systems, the more the errors mount. Xerox continues to struggle with error rates in a technology that was supposed to be a slam dunk.
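The arithmetic behind that OCR point is easy to sketch. The figures below are hypothetical (99 percent per-character accuracy, 2,000-character pages, a one-million-character corpus); the point is how fast “one percent wrong” compounds as volume grows:

```python
# Back-of-the-envelope arithmetic for error accumulation in a
# "99 percent accurate" system. All figures are hypothetical.
accuracy = 0.99          # per-character recognition accuracy
chars_per_page = 2000    # characters on a typical page
corpus_chars = 1_000_000 # size of the corpus pushed through the system

# Errors grow linearly with volume...
expected_errors = corpus_chars * (1 - accuracy)

# ...while the chance of a completely clean page shrinks geometrically.
p_clean_page = accuracy ** chars_per_page

print(f"Expected errors in corpus: {expected_errors:,.0f}")
print(f"Probability a page is error-free: {p_clean_page:.2e}")
```

At these assumed figures, a million characters yield roughly ten thousand expected errors, and the odds of any single page coming through untouched are vanishingly small. That is the sense in which errors “mount” with volume.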
Net net: Read the Oracle white paper. Then when you work out a budget, focus less on the sizzle of open source and more on the basic management skills it takes to make something work on time and on budget. Remember. Publicly traded companies and open source companies that have taken money from venture capitalists have to generate a profit or they disappear.
The basics are important. The Oracle white paper skips over some of these in its effort to put open source in perspective. Any software project requires attention to detail, pragmatism, technical expertise, and money.
Stephen E Arnold, October 21, 2013