March 8, 2014
I am all for slipshod work, particularly when delivered by government contractors. Hey, the emphasis is on scope changes and engineering change orders, not on delivering what the wild and crazy statement of work requires.
I was delighted to read the Hacker News thread at http://bit.ly/MW4epC about broken links and missing data sets on Data.gov (www.data.gov). The thread contains a number of interesting comments, which may be evidence that substandard attention to detail is spreading like digital eczema. Just Bing it.
In the early days of www.firstgov.gov, some effort was expended to minimize the number of dead links on US government servers. In the present incarnation as www.usa.gov, there are some interesting changes.
My view is that the dead links are a lesser problem than content that is no longer available and to which the links have been removed. If I were younger, I would suggest that you, gentle reader, look for information about MIC, RAC, and ZPIC contract awards. But I will not.
Stephen E Arnold, March 8, 2014
February 11, 2014
One of my two or three readers sent me a link to “DARPA-BAA-14-21: Memex.” The item is interesting because it reaches back to the idea of Vannevar Bush, sidesteps the use of the word “Memex” by a search vendor once operating in the United Kingdom, and provides pretty clear proof that DARPA is not happy with search. You can dig into the details at https://www.fbo.gov/utils/view?id=32c351ba7850360e140a29f363819052.
US government content has some interesting characteristics. One of the most interesting is that items like DARPA-BAA-14-21 appear without context. For example, there is not a hint, nary a whisper of In-Q-Tel’s investments in search and content processing. Years ago, I heard at an intel conference that In-Q-Tel funds promising companies but few of these deliver operational payoffs. You can see a list of In-Q-Tel investments at https://www.iqt.org/portfolio/. Some of these companies deliver darned interesting demonstration systems. Others have offered solutions that were eventually abandoned. Others are like Fourth of July fireworks; that is, the financial support and walk arounds provide the type of show that some decision makers perceive as progress and purposeful action.
The net net is that this DARPA item underscores that current information retrieval systems are not appropriate for the future needs of DARPA. For me, this is one more indication that my assertion about the troubled state of information retrieval is on target.
Perhaps the funding, the TREC tests, and the DARPA solicitation will yield a payoff for operational personnel. “Perhaps” is a bit soft even if the devalued dollars are real. Our research suggests that finding information today is more difficult than it was five years ago.
Stephen E Arnold, February 11, 2014
February 7, 2014
What do you make of this headline from All Analytics: “Text And The City: Municipalities Discover Text Analytics”? Businesses have been using text mining software for a while and understand the insights it can deliver to business decisions. The same goes for law firms that must wade through piles of litigation. Are governments really only catching on to text mining software now?
The article reports on several examples where municipal governments have employed text mining and analytics. Law enforcement agencies are using it to identify key concepts and deliver quick information to officials. The 311 system, known as the source of local information and immediate contact with services, is another system that can benefit from text analytics, because the software can organize and process incoming information faster and more consistently.
There are many ways text analytics can be helpful to local governments:
“Identifying root causes is a unique value proposition for text analytics in government. It’s one thing to know something happened — a crime, a missed garbage collection, a school expulsion — and another to understand where the problem started. Conventional data often lacks clues about causes, but text reveals a lot.”
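To make the root-cause idea concrete, here is a toy sketch of the basic move behind text analytics on citizen reports. This is not any vendor's product; the 311 complaint records and the stopword list are invented for illustration:

```python
from collections import Counter

# Hypothetical 311 complaint records (invented for illustration).
complaints = [
    "missed garbage collection on Elm Street, truck never arrived",
    "garbage collection missed again, same truck route",
    "pothole on Elm Street near the school",
    "missed garbage pickup, route seems understaffed",
]

STOPWORDS = {"on", "the", "near", "same", "again", "never", "seems"}

def top_terms(texts, n=3):
    """Count non-stopword terms across free-text records."""
    counts = Counter()
    for text in texts:
        for word in text.lower().replace(",", "").split():
            if word not in STOPWORDS:
                counts[word] += 1
    return counts.most_common(n)

print(top_terms(complaints))
```

Even this naive frequency count surfaces “missed garbage” as the recurring theme; commercial text analytics layers entity extraction, clustering, and sentiment analysis on top of the same basic move.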
The bigger question is whether local governments will spend the money on these systems. Perhaps, but analytics software is expensive, and governments are under pressure to find low-cost solutions. Expertise and money are both in short supply.
Whitney Grace, February 07, 2014
December 5, 2013
The article titled IBM Introduces Watson to the Public Sector Cloud on GCN explores the potential for Watson now that IBM has opened it up to developers. IBM Watson Solutions recently won the 2013 North America New Product Innovation award for its combination of communication skills and evaluation abilities. Even more recently, IBM gave up on its competition with Amazon Web Services for a CIA contract for 10 years and $600M. But the loss has not rained out the parade, as the article explains:
“The initial target market for IBM Watson Developers Cloud is the private sector, with IBM touting third-party applications in such areas as retail and health care. But analysts say the offering will impact big data problems in the public sector, too. McCarthy sees potential for Watson-powered apps in such areas as fraud analysis, which the White House is ramping up due to worries about scammers taking advantage of consumers signing up for its new health care plans. “
Sounds like there is a job for Watson at Healthcare.gov, what with the massive potential for fraud issues. Another possibility is putting Watson to work on entity analytics for Homeland Security, looking for patterns in data. Entity analytics is mainly about comparing huge amounts of data and who could be better at that than IBM’s supercomputer?
Chelsea Kerwin, December 05, 2013
December 3, 2013
The explosion of big data continues to put pressure on IT departments. GCN examines how government agencies are approaching the challenge in, “As Big Data Grows, Technologies Evolve Into Ecosystems.” Writer Rutrell Yasin frames the issue of deploying big-data platforms and analytics:
“But what is the best way to accomplish this: By cobbling together various ‘point products’ that address all of the big data processes, or by building a ‘big data platform’ that integrates all of the capabilities organizations need to apply deep analytics?”
The article goes on to examine the most prominent solutions competing for institutional big-data dollars. Not surprisingly, IBM‘s Eric Sall advocates a comprehensive platform, like his company’s InfoSphere. It looks like many organizations, though, are responding to the lure of open source. Though it is often the cheaper approach, the disparate nature of open-source solutions can pose its own problems. The article looks at efforts from outfits like Red Hat and Cisco that aim to consolidate apps and systems from different sources (both open source and paid). It is worth a look if your organization is at or approaching the big-data-solution crossroads.
The article concludes:
“The bottom line is that organizations need these massively parallel processing systems and other big data tools that can scale out to address the volume, velocity and variety of big data, whether they come from a proprietary vendor’s platform or a platform based on open technologies. It makes life simpler for organizations if their workforce can unlock the value of their data via an ecosystem of integrated tools, industry experts said.”
Indeed, simpler is usually better. Even if saving money is your main goal, do not dismiss paid solutions that help manage open source resources; the savings in time and frustration often more than make up for the added cost.
Cynthia Murrell, December 03, 2013
November 28, 2013
The article titled DOD Says “No Mas” On Commercial Cloud, Puts Brakes on $450M Contract on Ars Technica has some concerned that the government is rethinking its commitment to the cloud. Scott Stewart, contracting officer for the Defense Information Systems Agency (DISA), explained that the decision was caused by a lack of demand from the Defense Department.
The article explains:
“The contract, for which the DISA began drafting a request for proposals this summer, would have picked up to 10 cloud providers to supply Internet-accessible file storage, database hosting, Web hosting, and virtual servers—allowing the military to offload public, non-sensitive systems from its own infrastructure. As it turns out, the various military services and other DOD agencies that the DISA serves aren’t terribly interested in doing that. The federal government… has been trying to reduce the number of public-facing websites it maintains.”
It is yet to be determined whether the contract will be adjusted to meet the more modest requirements or scrapped entirely. As the quote suggests, this is not the only instance of concern about overspending. In 2011 the White House froze the creation of new government websites. In the meantime, the military has been dealing with security issues that have caused it to rely on DISA’s data centers.
Chelsea Kerwin, November 28, 2013
November 27, 2013
An article posted on Tech Eye titled US Spying is Killing the Internet Claims Google explains the outrage Google expressed when it was revealed that the NSA had tapped into its systems to obtain user information. Google’s security director Richard Salgado warns that the US government’s snooping could eventually lead to a “splinter net” in which governments put up barriers and restrict the market.
The article explains:
“Salgado warned that the NSA operations led to “a real concern” inside and outside the United States about the role of government and the Foreign Intelligence Surveillance Court, which decides in secret on legal problems about electronic surveillance efforts.”
But is the lady protesting too much? Google has been accused of its own plans to take over the Internet, as a Worldcrunch article titled Google’s Latest Scheme to Control the Internet May Surprise You investigates. Google Plus in particular might warrant extra attention. Though it is considered a failure when likened to Facebook, the article suggests that comparison is faulty. The number of Google Plus members may be small, but more important is Google’s ability to track and store the information we input.
And the money talks:
“Perhaps the proof is in the numbers: Google generated $50 billion in 2012 revenue, $40 billion of it from advertising. And though 2.7 billion Facebook “likes” are being registered every day, its revenue during the same period was just $4 billion.”
So let Google worry about the NSA all it wants. Some of us are preoccupied with our paranoia about another company, whose ambitions the article sums up as a Keanu Reeves-style Matrix in which we will all stay happily ignorant of our dependence.
Chelsea Kerwin, November 27, 2013
November 23, 2013
I read “Tension and Flaws Before Health Website Crash.” The good news is that the story focuses on what is now old news: Management challenges at the agency responsible for Healthcare.gov. The bad news—at least for champions of XML repositories, XML normalization, and XML as the “answer” to a wide range of information management woes—is that XML (extensible markup language) is not the slam dunk, whiz bang solution some true believers hope.
Here’s the passage that caught my attention:
“Another sore point was the Medicare agency’s decision to use database software, from a company called MarkLogic, that managed the data differently from systems by companies like IBM, Microsoft and Oracle. CGI officials argued that it would slow work because it was too unfamiliar. Government officials disagreed, and its configuration remains a serious problem.”
Until now, MarkLogic had not been identified as a vendor creating headaches. MarkLogic offers a system that stores information and data in an XML data management system. The trick is that content not in XML must be normalized; that is, converted to XML. MarkLogic has developed some proprietary methods to perform its data management operations, so a person familiar with XML may not be conversant with the MarkLogic conventions. The upside of this approach is that MarkLogic has experts who are able to address most customer requests. The downside is that a person familiar with XML but not MarkLogic can introduce problems into an otherwise spiffy system.
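The normalization step can be sketched generically. The snippet below uses only the Python standard library; the record fields are invented, and this is an illustration of the general technique, not MarkLogic's actual ingestion API:

```python
import xml.etree.ElementTree as ET

def dict_to_xml(tag, record):
    """Convert a flat dict to an XML element -- a toy stand-in
    for the normalization step an XML repository performs."""
    root = ET.Element(tag)
    for key, value in record.items():
        child = ET.SubElement(root, key)
        child.text = str(value)
    return root

# A hypothetical non-XML record awaiting normalization.
record = {"applicant": "J. Smith", "state": "KY", "plan": "silver"}
print(ET.tostring(dict_to_xml("enrollment", record)).decode())
```

In a real repository the conversion must also handle nesting, attributes, encodings, and schema validation, which is exactly where vendor-specific conventions can trip up a person who knows only generic XML.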
In the last few years, MarkLogic has had a number of senior management changes. I track the company via my Overflight system and have noted that the firm has gone from a company that does a good job of publicizing itself to an outfit that has trimmed back on its public presence. You can check out the MarkLogic Overflight on the ArnoldIT.com Web site. The minimal news flow, the absence of tweets, and the termination of public blog content can be verified by visiting the page every few days.
One interesting aspect of MarkLogic is that the company has positioned itself as a publishing platform. Once content is in the repository, it is possible to slice and dice information and data. Publishers can use this feature to whip out books with little or no involvement of human editors. But the company has, like Verity, grafted on other features and services. These range from enterprise search to text mining to electronic mail management.
I heard a few years ago that the company was to have become a $200 or $300 million a year operation. The firm may be the best kept secret in terms of its revenues and profits. If so, kudos. But if the company has not been able to demonstrate strong growth and healthy net profits, the firm may need to ramp up its publicity and marketing activities.
The New York Times’s comment may be hogwash. Even if a stretch, getting a paragraph that strikes me as less than favorable raises some questions; for example:
- Are proprietary extensions a good idea for an XML system that must be used by folks who are not into XML?
- Will the transformations between and among content from disparate systems remain bottlenecks during periods of high content flow and usage?
- Will Oracle seize on the MarkLogic system and revive its flow of information about the weaknesses of XML as compared with content stored in an Oracle data management system?
MarkLogic has rolled through three or four presidents in the last few years. Dave Kellogg departed, and I mostly lost track of who followed him. At the time of his departure, MarkLogic was in the $60 million range in estimated revenues. Will the management turmoil kick in again? Will the company continue to expand its features and functions as Verity did prior to its initial public offering? Are there parallels between the trajectories of Convera, Delphes, Entopia, and Verity on one hand and MarkLogic on the other? For some case analyses, check out www.xenky.com/vendor-profiles.
Stephen E Arnold, November 23, 2013
November 20, 2013
Healthcare.gov has a blog. You can find it at this link. There is a link for October posts. There is a link for September posts. I was not able to access the full set of posts for either month. Here’s what I saw:
I thought the content would be at this link.
Oversight, content management problem, content removal, or my error? Interesting. It is tough to search when content is not available for indexing.
I wanted to read the posts to the blog before and after the launch. No joy. Should I be suspicious?
Stephen E Arnold, November 20, 2013
November 1, 2013
The shocking and yet unsurprising article on The Washington Examiner titled Troubled Obamacare Website Wasn’t Tested Until A Week Before Launch provides the periodic table of healthcare.gov vendors and an immediate glimpse into the problems. A plethora of brave anonymous sources reveal that management was practically non-existent, with the central CMS unwilling to claim leadership, and other vendors unable to state who was in charge or overseeing the compatibility of the complex system. One such source quoted in the article explains,
“The challenge with this project was that the decisions were made very, very late in the project, and no one organization … seemed to know how this complex ecosystem of applications, interfaces, user processes and hardware should all work together.”… Another former CMS contract employee who also requested anonymity said, “CMS was not capable of being the integrator. … An integrator used to be someone like an IBM. That is how this business used to be run. CMS is not an integrator. CMS operates as numerous disparate organizations.”
The article also states that the government typically uses external contractors as system integrators because it knows it is not good at integration. Even the Air Force, which has some experience integrating weapons systems, would not attempt IT integration. The problem here is that the government refused to relinquish control, yet was unable to do the work required to make the website usable.
Chelsea Kerwin, November 01, 2013