December 1, 2015
Electronic health records (EHRs) were to bring us reductions in cost and, just as importantly, seamless record-sharing between health-care providers. “Epic Fail” at Mother Jones explains why that has yet to happen. The short answer: despite the government’s intentions, federation is simply not part of the Epic plan; vendor lock-in is too profitable to relinquish so easily.
Reporter Patrick Caldwell spends a lot of pixels discussing Epic Systems, the leading EHR vendor whose CEO sat on the Obama administration’s 2009 Health IT Policy Committee, where many EHR-related decisions were made. Epic, along with other EHR vendors, has received billions from the federal government to expand EHR systems. Caldwell writes:
“But instead of ushering in a new age of secure and easily accessible medical files, Epic has helped create a fragmented system that leaves doctors unable to trade information across practices or hospitals. That hurts patients who can’t be assured that their records—drug allergies, test results, X-rays—will be available to the doctors who need to see them. This is especially important for patients with lengthy and complicated health histories. But it also means we’re all missing out on the kind of system-wide savings that President Barack Obama predicted nearly seven years ago, when the federal government poured billions of dollars into digitizing the country’s medical records. ‘Within five years, all of America’s medical records are computerized,’ he announced in January 2009, when visiting Virginia’s George Mason University to unveil his stimulus plan. ‘This will cut waste, eliminate red tape, and reduce the need to repeat expensive medical tests.’ Unfortunately, in some ways, our medical records aren’t in any better shape today than they were before.”
Caldwell taps into his own medical saga to effectively illustrate how important interoperability is to patients with complicated medical histories. Epic seems to be experiencing push-back, both from the government and from the EHR industry. Though the company was widely expected to score the massive contract to modernize the Department of Defense’s health records, that contract went instead to competitor Cerner. Meanwhile, some of Epic’s competitors have formed the nonprofit CommonWell Health Alliance, tasked with setting standards for records exchange. Epic has not joined that alliance, choosing instead to facilitate interoperability between hospitals that use its own software. For a hefty fee, of course.
Perhaps this will all be straightened out down the line, and we will finally receive both our savings and our medical peace of mind. In the meantime, many patients and providers struggle with changes that appear to have only complicated the issue.
Cynthia Murrell, December 1, 2015
December 1, 2015
What is a researcher’s dream? A researcher’s dream is to be able to easily locate and access viable, full-text resources without having to deal with any copyright issues. One might think that all information is accessible via the Internet and a Google search, but if you believe that is how professional research happens, you are sadly mistaken. Most professional articles and journals are locked behind corporate academic walls with barbed wire made from copyright laws.
PR Newswire reports in “Linguamatics Expands Cloud Text Mining Platform To Include Full-Text Articles” on a way for life science researchers to legally sidestep copyright hurdles. Linguamatics is a natural language processing text-mining platform, and it will now incorporate the Copyright Clearance Center’s new text mining solution, RightFind XML. This will give researchers access to over 4,000 peer-reviewed journals from over twenty-five scientific, technical, and medical publishers.
“The solution enables researchers to make discoveries and connections that can only be found in full-text articles. All of the content is stored securely by CCC and is pre-authorized by publishers for commercial text mining. Users access the content using Linguamatics’ unique federated text mining architecture which allows researchers to find the key information to support business-critical decisions. The integrated solution is available now, and enables users to save time, reduce costs and help mitigate an organization’s copyright infringement risk.”
I can only hope that other academic databases and publishers will adopt a similar, and ideally more affordable, way to access full-text, viable resources. One of the biggest drawbacks to Internet research is having to rely on questionable source information, because it is free and readily available. Easier access to more accurate information from viable resources will not only improve research, but also help expand access to it.
Whitney Grace, December 1, 2015
November 25, 2015
The article titled “Business Intelligence Vendor Yellowfin Signs Global Reseller Agreement with Zinnovate” on Sys-Con Media provides an overview of the recent partnership between the two companies. Zinnovate will be able to offer Yellowfin’s Business Intelligence solutions and services, and better fulfill the needs that small and mid-size businesses have involving enterprise quality BI. The article quotes Zinnovate CEO Hakan Nilsson on the exciting capabilities of Yellowfin’s technology:
“Flexible deployment options were also important… As a completely Web-based application, Yellowfin has been designed with SaaS hosting in mind from the beginning, making it simple to deploy on-premise or as a cloud-based solution. Yellowfin’s licensing model is simple. Clients can automatically access Yellowfin’s full range of features, including its intuitive data visualization options, excellent Mobile BI support and collaborative capabilities. Yellowfin provides a robust enterprise BI platform at a very competitive price point.”
As for the perks to Yellowfin, the Managing Director Peter Baxter explained that Zinnovate was positioned to help grow the presence of the brand in Sweden and in the global transport and logistics market. In the last few years, Zinnovate has developed its service portfolio to include customers in banking and finance. Both companies share a dedication to customer-friendly, intuitive solutions.
Chelsea Kerwin, November 25, 2015
November 9, 2015
I clipped an item to read on the fabulous flight from America to shouting distance of Antarctica. Yep, it’s getting smaller.
The write up was “So Far, Tepid Responses to Growing Cloud Integration Hairball.” I think the words “hair” and “ball” convinced me to add this gem to my in-flight reading list.
The article is based on a survey (nope, I don’t have the utmost confidence in vendor surveys). Apparently the 300 IT “leaders” experience
pain around application and data integration between on premises and cloud based systems.
I had to take a couple of deep breaths to calm down. I thought the marketing voodoo from vendors embracing utility services (Lexmark/Kapow), metasearch (Vivisimo, et al), unified services (Attivio, Coveo, et al), and licensees of conversion routines from outfits ranging from Oracle to “search consulting” in the “search technology” business had this problem solved.
If the vendors can’t do it, why not just dump everything in a data lake and let an open source software system figure everything out? Failing that, why not convert the data into XML and use the magic of well formed XML objects to deal with these issues?
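The rhetorical question has a real answer: well-formedness guarantees syntax, not semantics. A minimal sketch (with hypothetical field names and records) of why two perfectly well-formed XML documents still demand hand-built mapping:

```python
import xml.etree.ElementTree as ET

# Two well-formed XML records for the same person, from systems A and B.
# Both parse cleanly, yet the element names and value formats disagree.
record_a = ET.fromstring(
    "<customer><name>Ann Smith</name><dob>1980-02-01</dob></customer>")
record_b = ET.fromstring(
    "<client><full_name>Ann Smith</full_name>"
    "<birth_date>02/01/1980</birth_date></client>")

# Integration still requires a hand-built mapping between the two schemas.
FIELD_MAP = {"name": "full_name", "dob": "birth_date"}

def matches(a, b, field_map):
    """Compare records field by field; format mismatches still surface."""
    return {a_field: a.findtext(a_field) == b.findtext(b_field)
            for a_field, b_field in field_map.items()}

print(matches(record_a, record_b, FIELD_MAP))
# prints {'name': True, 'dob': False}
```

Both documents are valid XML, yet the names match and the birth dates do not, because the date formats differ. Reconciling that is exactly the grunt work no markup standard does for you.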
It seems that the solutions don’t work with the slam dunk regularity of a 23 year old Michael Jordan.
The write up explains:
The old methods may not cut it when it comes to pulling things together. Two in three respondents, 59%, indicate they are not satisfied with their ability to synch data between cloud and on-premise systems — a clear barrier for businesses that seek to move beyond integration fundamentals like enabling reporting and basic analytics. Still, and quite surprisingly, there isn’t a great deal of support for applying more resources to cloud application integration. Premise-to-cloud integration, cloud-to-cloud integration, and cloud data replication are top priorities for only 16%, 10% and 10% of enterprises, respectively. Instead, IT shops make do with custom coding, which remains the leading approach to integration, the survey finds.
My hunch is that the survey finds that hoo-hah is not the same as the grunt work required to take data from A, integrate it with data from B, and then do something productive with the data unless humans get involved.
I noted this point:
As the survey’s authors observe, “companies consistently underestimate the cost associated with custom code, as often there are hidden costs not readily visible to IT and business leaders.”
Reality just won’t go away when it comes to integrating disparate digital content. Neither will the costs.
Stephen E Arnold, November 9, 2015
October 28, 2015
Data such as financial information and medical files are supposed to be protected behind secure firewalls and barriers that ensure people’s information does not fall into the wrong hands. While digital security is the best it has ever been, sometimes a hacker does not need to rely on his or her skills to get sensitive information. Sometimes all it takes is an idiotic mistake, such as the one Gizmodo describes in “Error Exposes 1.5 Million People’s Private Records On Amazon Web Services.”
Tech junkie Chris Vickery heard a rumor that “strange data dumps” could appear on Amazon Web Services, so he decided to go looking for some. He hunted through AWS and found one such dump. It was a huge haul, or it would have been had Vickery been a hacker. Vickery discovered medical information belonging to 1.5 million people, from these organizations: Kansas’ State Self Insurance Fund, CSAC Excess Insurance Authority, and the Salt Lake County Database.
“The data came from Systema Software, a small company that manages insurance claims. It still isn’t clear how the data ended up on the site, but the company did confirm to Vickery that it happened. Shortly after Vickery made contact with the affected organizations, the database disappeared from the Amazon subdomain.”
The 1.5 million people should be thanking Vickery, because he alerted these organizations and the data was immediately removed from the Amazon cloud. It turns out that Vickery was the only one to access the data, but it raises the question: what would have happened if a malicious hacker had gotten hold of it? You can bet the medical information would have been sold to the highest bidder.
Vickery’s discovery is not an isolated case. Other organizations are bound to be similarly negligent with data, and your personal information could end up posted in an unsecured area. How can you get organizations to better protect your information? Good question.
Whitney Grace, October 28, 2015
October 15, 2015
Google is never afraid to brag about its services and how much better they are than its competitors’. Google brandishes its supposed superiority in cloud computing on its Google Cloud Platform Blog with the post “Google Cloud Platform Delivers The Industry’s Best Technical And Differentiated Features.” The first line of the post even comes across as a blanket statement of how Google feels about its cloud platform: “I’ll come right out and say it: Google Cloud Platform is a better cloud.”
One must always treat assertions on a company’s Web site as part of its advertising campaign to peddle the service. Google products and services, however, usually have quality built into their engineering, and Google defends the above claim by pointing to core advantages and technical differentiators in the compute, storage, network, and distributed software tiers. Google gives two reasons:
“1. Cloud Platform offers features that are very valuable for customers, and very difficult for competitors to emulate.
2. The underlying technologies, created and honed by Google over the last 15 years, enable us to offer our services at a much lower price point.”
Next the post explains the features that make the cloud platform superior: live migration, scaling load balancers, forty-five-second boot times, three-second archive restore, and a 680,000 IOPS sustained local SSD read rate. Google can offer these features because, it claims, it has the best technology and software engineers. It does not stop there: Google also offers its cloud platform at forty percent less than other cloud platforms, and it delves into details about why it can offer a better, cheaper service. While the argument is compelling, it is still Google cheerleading for itself.
Google is one of the best technology companies, but it is wiser to test and review other cloud platforms than to blindly follow a blog post.
October 8, 2015
What has Paul Doscher been up to? We used to follow him when he was a senior executive over at LucidWorks, but he has changed companies and is now riding on clouds. PRWeb published the press release “Restlet Appoints Paul Doscher As New CEO To Accelerate Deployment Of Most Comprehensive Cloud-Based API Platform.” Doscher is the brand new president, CEO, and board member at Restlet, a leading creator of API frameworks. Along with LucidWorks, Doscher has held executive roles at VMware, Oracle, Exalead, and BusinessObjects.
Restlet got its start as an open source project by Jerome Louvel. Doscher will be replacing Louvel as CEO, and Louvel seems quite pleased to hand over the reins to his successor:
“ ‘I’m extremely pleased that we have someone with Paul’s experience to grow Restlet’s leadership position in API platforms,’ said Louvel. ‘Restlet has the most complete API cloud platform in the industry and our ease of use makes it the best choice for businesses of any size to publish and consume data and services as APIs. Paul will help Restlet to scale so we can help more businesses use APIs to handle the exploding number of devices, applications and use cases that need to be supported in today’s digital economy.’ ”
Doscher wants to break down the barriers to cloud adoption and take it to the next level. His first task as the new CEO will be implementing DHC, an API testing tool, to enhance Restlet’s API platform.
Restlet is ecstatic to have Doscher on board and Louvel is probably heading off to a happy retirement.
October 2, 2015
Enterprise content management (ECM) systems were supposed to provide an end-all solution for storing and organizing digital data. Data needs to be stored for several purposes: taxes, historical record, research, and audits. Government agencies deployed ECM solutions to manage their huge data loads, but the old information silos are not performing up to modern standards. GCN discusses the upgrade challenge agencies face in “Migrating Your Legacy ECM Solution.”
When ECM systems first came online, information was stored in silos programmed to support even older legacy solutions with niche applications. The repositories are so convoluted that users cannot find information, and do not even mention upgrading the beasts:
“Aging ECM systems are incapable of fitting into the new world of consumer-friendly software that both employees and citizens expect. Yet, modernizing legacy systems raises issues of security, cost, governance and complexity of business rules — all obstacles to a smooth transition. Further, legacy systems simply cannot keep up with the demands of today’s dynamic workforce.”
Two solutions present themselves: migrate the data from the old legacy system to a new one, or simply move the content out of the silo. The barriers are cost and time, but users will reap the benefits of upgrades, especially connectivity, cloud, mobile, and social features. There is also the option of leaving the content in place and using interoperability standards or cloud-based management to make the data searchable and accessible.
The biggest problem is actually convincing people to upgrade. Why fix what is not broken? Then there is the justification of using taxpayers’ money for the upgrade when the money can be used elsewhere. Round and round the argument goes.
September 23, 2015
It is here at last! After several years, Microsoft has finally upgraded SharePoint, and the new version comes with an exciting list of brand new features. That is not all: Microsoft’s new cloud hybrid search is also in beta. PC World examines the new Microsoft betas in the article “Microsoft Tests SharePoint 2016 And Enterprise Cloud Hybrid Search.”
SharePoint, the popular collaborative content platform, is getting a well-deserved upgrade that will finally allow users to upload files up to ten gigabytes, along with a new App Launcher for easier access to applications, simplified file-sharing controls, and better accessibility on mobile devices. As with all Microsoft betas, however, it is recommended that SharePoint 2016 not be deployed in a production environment.
The new cloud hybrid search will make it easier for users to locate files across various Office 365 programs:
“On top of the SharePoint beta, Microsoft’s new cloud hybrid search feature will allow Office 365 users who also run on-premises SharePoint servers to easily access both the files stored in their company’s servers as well as those stored in Microsoft’s cloud. This means that Microsoft Delve, which gives users an at-a-glance view of their team members’ work, can show files that are stored in a company’s servers and in Microsoft’s servers side by side.”
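The merged view Microsoft describes can be pictured as a federated query: one search fans out to an on-premises index and a cloud index, and the hits come back as a single ranked list. A toy sketch, with made-up documents and scores (not Microsoft’s implementation; real systems must normalize scores across sources):

```python
# Hypothetical mini-indexes: query term -> list of (document, relevance score).
ON_PREM_INDEX = {"budget": [("server:/finance/budget.xlsx", 0.9)],
                 "roadmap": [("server:/plans/roadmap.docx", 0.7)]}
CLOUD_INDEX = {"budget": [("cloud:/shared/budget-draft.docx", 0.8)],
               "roadmap": [("cloud:/team/roadmap-v2.docx", 0.95)]}

def hybrid_search(query):
    """Query both indexes and merge the hits into one list, best score first."""
    hits = ON_PREM_INDEX.get(query, []) + CLOUD_INDEX.get(query, [])
    return [doc for doc, score in sorted(hits, key=lambda h: h[1], reverse=True)]

print(hybrid_search("roadmap"))
# prints ['cloud:/team/roadmap-v2.docx', 'server:/plans/roadmap.docx']
```

The user sees one result list; where each document physically lives becomes an implementation detail rather than a separate search.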
The new search feature will also ease servers’ workloads for creating and maintaining search indexes. Microsoft is encouraging organizations to switch to its cloud services, but it still offers products and support for on-site deployments.
While the cloud offers many conveniences, such as quick access to files and the ability for users to work from any location, the search function will improve ease of use. However, security remains a concern for many organizations that prefer to maintain on-site servers.
September 22, 2015
Exalead is Dassault Systèmes’ big data software targeted specifically at businesses. Exalead offers innovative data discovery and analytics solutions to manage information in real time across various servers and generate insightful reports for better, faster decisions. It is the big data solution of choice for many businesses across various industries. The Exalead blog shares that “PricewaterhouseCoopers Is Launching Its Information Management Application, Based on Exalead CloudView.”
PricewaterhouseCoopers (PwC) analyzed the amount of time users spent trying to locate, organize, and disseminate information. When users spend time on information management, they lose two valuable resources: time and money. PwC designed Pulse, a search and information tool, as a solution to the problem.
“The EXALEAD CloudView software solution from Dassault Systèmes facilitates the rapid search and use of sources of structured and unstructured information. In existence since 2007, this enterprise information management concept was integrated initially in other software applications. Since it was reworked as EXALEAD CloudView, the configuration of the queries has become easier and they are processed much faster. Furthermore, the results of the searches are more precise, significantly reducing the number of duplicates and the time wasted managing them. PwC has deliberately decided to roll out Pulse on an international scale gradually, in order to generate plenty of enthusiasm amongst users. A business case is prepared for each country on the basis of its needs, the benefits and the potential savings. PwC also intends to make the content in Pulse accessible by other internal systems (e.g., the project workspaces), to integrate the sources, and to make the search function even smarter.”
Pulse is supposed to cut costs so the resources can be reinvested in more fruitful ventures. One interesting aspect to note: PwC did not build Pulse’s underpinnings itself; Exalead provided the plumbing.