Defending Against Java Deserialization Ransomware
July 13, 2016
What is different about the recent rash of ransomware attacks against hospitals (besides the level of callousness it takes to hold the well-being of hospital patients for ransom)? CyberWatch brings us up to date in “My Layman’s Terms: The Java Deserialization Vulnerability in Current Ransomware.” Writer Cheryl Biswas begins by assuring us it is practicality, not sheer cruelty, that has hackers aiming at hospitals. Other entities that rely on uninterrupted access to their systems to keep people safe, like law enforcement agencies, are also being attacked. Oh, goody.
The problem begins with a vulnerability at the very heart of any Java-based system, the server. And here we thought open source was more secure than proprietary software. Biswas informs us:
“This [ransomware] goes after servers, so it can bring down entire networks, and doesn’t rely on the social engineering tactics to gain access. It’s so bad US-CERT has issued this recent advisory. I’ve laid out what’s been made available on just how this new strain of ransomware works. And I’ve done it in terms to help anybody take a closer look at the middleware running in their systems currently. Because a little knowledge could be dangerous thing used to our advantage this time.”
The article goes on to cover what this strain of ransomware can do, who could be affected, and how. One key point—anything that accepts serialized Java objects could be a target, and many Java-based middleware products do not validate untrusted objects before deserialization. See the article for more technical details, and for Biswas’ list of sources. She concludes with these recommendations:
“Needs to Happen:
“Enterprises must find all the places they use deserialized or untrusted data. Searching code alone will not be enough. Frameworks and libraries can also be exposed.
“Need to harden it against the threat.
“Removing commons collections from app servers will not be enough. Other libraries can be affected.
“Contrast Sec has a free tool for addressing issue. Runtime Application Self-Protection RASP. Adds code to deserialization engine to prevent exploitation.”
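For readers who want to see what “hardening” the deserialization path can look like in practice, here is a minimal sketch of look-ahead deserialization in Java: an ObjectInputStream subclass that refuses to resolve any class not on an explicit allow list. The class names on the list are hypothetical placeholders; RASP tools such as the Contrast offering mentioned above instrument the deserialization engine itself rather than relying on per-application code like this.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.InvalidClassException;
import java.io.ObjectInputStream;
import java.io.ObjectStreamClass;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Look-ahead deserialization: reject any class that is not explicitly allowed
// before the incoming object graph is ever instantiated.
public class SafeObjectInputStream extends ObjectInputStream {

    // Hypothetical allow list; a real application would list only the types it
    // legitimately expects to receive over the wire.
    private static final Set<String> ALLOWED = new HashSet<>(Arrays.asList(
            "java.lang.String",
            "java.util.ArrayList",
            "com.example.dto.Invoice"));

    public SafeObjectInputStream(InputStream in) throws IOException {
        super(in);
    }

    @Override
    protected Class<?> resolveClass(ObjectStreamClass desc)
            throws IOException, ClassNotFoundException {
        if (!ALLOWED.contains(desc.getName())) {
            // Gadget chains (for example the Commons Collections payloads) die here.
            throw new InvalidClassException(desc.getName(),
                    "class not on deserialization allow list");
        }
        return super.resolveClass(desc);
    }
}
```

Wherever an application currently builds an ObjectInputStream over untrusted input, a hardened subclass like this can be dropped in instead. The quoted advice about frameworks and libraries still applies: gadget classes can ride in through dependencies the application never calls directly.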
Organizations the world over must not put off addressing these vulnerabilities, especially those in charge of health and safety.
Cynthia Murrell, July 13, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Watson Weekly: IBM Watson Service for Use in the IBM Cloud: Bluemix PaaS, IBM SPSS, Watson Analytics
July 5, 2016
The article on ComputerWorld titled “Review: IBM Watson Strikes Again” relates the recent expansion of the cloud service portfolio of Watson, which is still most famous for winning on Jeopardy. The article begins by evoking that 2011 event, which actually reveals only a small corner of Watson’s functions. The article mentions that to win Jeopardy, Watson basically only needed to absorb Wikipedia, since 95% of the answers are article titles. New services for use in the IBM Cloud include the Bluemix PaaS, IBM SPSS, and Predictive Analytics. Among the Bluemix services is this gem,
“Personality Insights derives insights from transactional and social media data…to identify psychological traits, which it returns as a tree of characteristics in JSON format. Relationship Extraction parses sentences into their components and detects relationships between the components (parts of speech and functions) through contextual analysis. The Personality Insights API is documented for Curl, Node, and Java; the demo for the API analyzes the tweets of Oprah, Lady Gaga, and King James as well as several textual passages.”
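As a rough sketch of how a client might exercise an HTTP/JSON service of this kind from Java, the snippet below posts a block of text and prints the JSON response. The endpoint URL, the credentials, and the payload shape are placeholders, not the documented Personality Insights API; consult IBM’s Bluemix documentation for the actual paths and parameters.

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.Scanner;

// Generic sketch of calling a text-analysis REST service; URL, credentials,
// and request format are placeholders, not IBM's documented API.
public class PersonalitySketch {
    public static void main(String[] args) throws Exception {
        String endpoint = "https://example.invalid/personality/v0/profile";   // placeholder URL
        String credentials = Base64.getEncoder()
                .encodeToString("username:password".getBytes(StandardCharsets.UTF_8)); // placeholder

        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Authorization", "Basic " + credentials);
        conn.setRequestProperty("Content-Type", "text/plain");
        conn.setDoOutput(true);

        String text = "Several paragraphs of tweets or other writing to analyze...";
        try (OutputStream out = conn.getOutputStream()) {
            out.write(text.getBytes(StandardCharsets.UTF_8));
        }

        // Per the article, the service returns a tree of personality characteristics as JSON.
        try (InputStream in = conn.getInputStream();
             Scanner scanner = new Scanner(in, StandardCharsets.UTF_8.name()).useDelimiter("\\A")) {
            System.out.println(scanner.hasNext() ? scanner.next() : "");
        }
    }
}
```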
Bluemix also includes AlchemyAPI for reading text and image content, plus Concept Expansion and Concept Insights, which offer text analysis and the linking of concepts to Wikipedia topics. The article is less kind to Watson Analytics, a Web app for data analysis with machine learning, which it claims “tries too hard” and is too distracting for data scientists.
Chelsea Kerwin, July 5, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
What Is in a Name? Procedures Remain Indifferent
July 2, 2016
I read “Brexit: “Bayesian” Statistics Renamed “Laplacian” Statistics.” Years ago I worked briefly with a person who was, I later learned, a failed webmaster and a would-be wizard. One of that individual’s distinguishing characteristics was an outright rejection of Bayesian methods. The person thought “la place” was a non-American English way of speaking French. I wonder where that self-appointed wizard is now. Oh, well. I hope the thought leader is scrutinizing the end of Bayes.
According to the write up:
With the U.K. leaving the E.U., it’s time for “Bayesian” to exit its titular role and be replaced by “Laplacian”.
I support this completely. I assume that the ever-efficient EU bureaucracy in the nifty building in Strasbourg will hop on this intellectual bandwagon.
Stephen E Arnold, July 2, 2016
Bad News for Instant Analytics Sharpies
June 28, 2016
I read “Leading Statisticians Establish Steps to Convey Statistics a Science Not Toolbox.” I think “steps” are helpful. The challenge will be to corral the escaped ponies who are making fancy analytics a point-and-click, drop-down punch list. Who needs to understand anything? Hit the button and generate visualizations until something looks really super. Does anyone know a general who engages in analytic one-upmanship? Content and clarity sit in the backseat of the JLTV.
The write up is similar to teens who convince their less well liked “pals” to go on a snipe hunt. I noted this passage:
To this point, Meng [real statistics person] notes “sound statistical practices require a bit of science, engineering, and arts, and hence some general guidelines for helping practitioners to develop statistical insights and acumen are in order. No rules, simple or not, can be 100% applicable or foolproof, but that’s the very essence that I find this is a useful exercise. It reminds practitioners that good statistical practices require far more than running software or an algorithm.”
Many vendors emphasize how easy smart analytics systems are to use. The outputs are presentation ready. Checks and balances are mostly pushed to the margins of the interface.
Here are the 10 rules; a small illustrative sketch follows the list.
- Statistical Methods Should Enable Data to Answer Scientific Questions
- Signals Always Come with Noise
- Plan Ahead, Really Ahead
- Worry about Data Quality
- Statistical Analysis Is More Than a Set of Computations
- Keep it Simple
- Provide Assessments of Variability
- Check Your Assumptions
- When Possible, Replicate!
- Make Your Analysis Reproducible
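As a toy illustration of rules 7 and 10, the sketch below reports a point estimate together with its standard error and fixes the random seed so the run can be reproduced exactly. It is a minimal, made-up example under those two rules, not anything from the cited paper.

```java
import java.util.Random;

// Toy illustration of "Provide Assessments of Variability" and "Make Your
// Analysis Reproducible": report uncertainty with the estimate, fix the seed.
public class VariabilityDemo {
    public static void main(String[] args) {
        Random rng = new Random(42L);          // fixed seed -> the run can be repeated exactly
        int n = 200;
        double[] sample = new double[n];
        for (int i = 0; i < n; i++) {
            sample[i] = 50.0 + 10.0 * rng.nextGaussian();  // simulated measurements
        }

        double mean = 0.0;
        for (double x : sample) mean += x;
        mean /= n;

        double ss = 0.0;
        for (double x : sample) ss += (x - mean) * (x - mean);
        double sd = Math.sqrt(ss / (n - 1));
        double se = sd / Math.sqrt(n);

        // A point estimate without its uncertainty is half an answer.
        System.out.printf("mean = %.2f, standard error = %.2f%n", mean, se);
        System.out.printf("approx. 95%% interval: [%.2f, %.2f]%n",
                mean - 1.96 * se, mean + 1.96 * se);
    }
}
```

Nothing about the arithmetic is difficult; the point of the rules is that the interval gets reported and the seed gets written down.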
I think I can hear the guffaws from the analytics vendors now. I have tears in my eyes when I think about “statistical methods should enable data to answer scientific questions.” I could have sold that line to Jack Benny if he were still alive and doing comedy. Scientific questions from data which no human has checked for validity. Oh, my goodness. Then reproducibility. That’s a good one too.
Stephen E Arnold, June 28, 2016
Savanna 4.7 for External Content Links
June 22, 2016
The latest version of Savanna, the collaborative data-visualization platform from Thetus Corporation, has an important new feature—it can now link to external content. The press release at PR Newswire, “Savanna 4.7 Introduces Plugins, Opening ‘A World of New Content’ to Visual Analysis Software,” tells us:
“With Savanna, users can visualize data to document insights mined from complexity and analyze relationships. New in this release are Savanna Plugins. Plugins do more than allow users to import data. The game changer is in the ability to link to external content, leaving the data in its original source. Data lives in many places. Analyzing data from many sources often means full data transformation and migration into a new program. This process is daunting and exactly what Savanna 4.7 Plugins address. Whether on databases or on the web, users can search all of their sources from one application to enrich a living knowledge base. Plugins also enable Savanna to receive streams of information from sources like RSS, Twitter, geolocators, and others.”
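To make the “link, don’t migrate” idea concrete, here is a hypothetical sketch of what a source plugin contract could look like. This is not Savanna’s actual plugin API, just an illustration of the design principle described above: the plugin answers queries and streams updates from the external source while the data stays where it lives.

```java
import java.util.List;
import java.util.stream.Stream;

// Hypothetical sketch (not Savanna's actual API): a source plugin answers
// queries against external content in place instead of migrating the data
// into the analysis platform.
interface SourcePlugin {

    // Human-readable name of the external source, e.g. "RSS" or "Twitter".
    String sourceName();

    // Run a federated query against the external source; return links to matching items.
    List<ExternalItem> search(String query);

    // Optionally push a live stream of new items (RSS entries, tweets, geolocated events).
    Stream<ExternalItem> stream();
}

// Minimal record of an externally hosted item: the data itself stays at the source.
class ExternalItem {
    final String title;
    final String uri;      // pointer back to the original location
    final String snippet;  // small preview used for linking and enrichment

    ExternalItem(String title, String uri, String snippet) {
        this.title = title;
        this.uri = uri;
        this.snippet = snippet;
    }
}
```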
Thetus’ CTO is excited about this release, calling the new feature “truly transformative.” The write-up notes that Plugins open new opportunities for Thetus to partner with other organizations. For example, the company is working with the natural language processing firm Basis Technology to boost translation and text mining capabilities. Founded in 2003, Thetus is based in Portland, Oregon.
Cynthia Murrell, June 22, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Enterprise Search Vendor Sinequa Partners with MapR
June 8, 2016
In the world of enterprise search and analytics, everyone wants in on the clients who have flocked to Hadoop for data storage. Virtual Strategy shared an article announcing “Sinequa Collaborates With MapR to Power Real-Time Big Data Search and Analytics on Hadoop.” Sinequa, a firm specializing in big data, has become certified with the MapR Converged Data Platform. The interoperation of Sinequa’s solutions with MapR will enable actionable information to be gleaned from data stored in Hadoop. We learned,
“By leveraging advanced natural language processing along with universal structured and unstructured data indexing, Sinequa’s platform enables customers to embark on ambitious Big Data projects, achieve critical in-depth content analytics and establish an extremely agile development environment for Search Based Applications (SBA). Global enterprises, including Airbus, AstraZeneca, Atos, Biogen, ENGIE, Total and Siemens have all trusted Sinequa for the guidance and collaboration to harness Big Data to find relevant insight to move business forward.”
Beyond all the enterprise search jargon in this article, the collaboration between Sinequa and MapR appears to offer an upgraded service to customers. As we all know at this point, unstructured data indexing is key to data intake. When it comes to output, however, the real differentiator will be solutions that can support informed business decisions.
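“Unstructured data indexing” boils down to something like the toy inverted index below: tokenize each document and map every term to the documents that contain it, so free text becomes searchable alongside structured fields. This is a generic illustration of the idea, not Sinequa’s implementation.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.TreeSet;

// Toy inverted index: the core of making unstructured text searchable.
public class TinyIndex {
    private final Map<String, Set<Integer>> postings = new HashMap<>();
    private final List<String> documents = new ArrayList<>();

    public void add(String document) {
        int docId = documents.size();
        documents.add(document);
        for (String token : document.toLowerCase().split("\\W+")) {
            if (token.isEmpty()) continue;
            postings.computeIfAbsent(token, t -> new TreeSet<>()).add(docId);
        }
    }

    public Set<Integer> search(String term) {
        return postings.getOrDefault(term.toLowerCase(), new TreeSet<>());
    }

    public static void main(String[] args) {
        TinyIndex index = new TinyIndex();
        index.add("Quarterly maintenance report for turbine A");        // free-text document
        index.add("Customer email complaining about turbine noise");    // another one
        System.out.println(index.search("turbine"));  // prints the matching document ids: [0, 1]
    }
}
```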
Megan Feil, June 8, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
GAO DCGS Letter B-412746
June 1, 2016
A few days ago, I stumbled upon a copy of a letter from the GAO concerning Palantir Technologies dated May 18, 2016. The letter became available to me a few days after the 18th, and the US holiday probably limited circulation of the document. The letter is from the US Government Accountability Office and signed by Susan A. Poling, general counsel. There are eight recipients, some from Palantir, some from the US Army, and two in the GAO.
Has the US Army put Palantir in an untenable spot? Is there a deus ex machina about to resolve the apparent checkmate?
The letter tells Palantir Technologies that its protest of the DCGS Increment 2 award to another contractor is denied. I don’t want to revisit the history or the details as I understand them of the DCGS project. (DCGS, pronounced “dsigs”, is a US government information fusion project associated with the US Army but seemingly applicable to other Department of Defense entities like the Air Force and the Navy.)
The passage in the letter I found interesting was:
While the market research revealed that commercial items were available to meet some of the DCGS-A2 requirements, the agency concluded that there was no commercial solution that could meet all the requirements of DCGS-A2. As the agency explained in its report, the DCGS-A2 contractor will need to do a great deal of development and integration work, which will include importing capabilities from DCGS-A1 and designing mature interfaces for them. Because the agency concluded that significant portions of the anticipated DCSG-A2 scope of work were not available as a commercial product, the agency determined that the DCGS-A2 development effort could not be procured as a commercial product under FAR part 12 procedures. The protester has failed to show that the agency’s determination in this regard was unreasonable.
The “importing” point is a big deal. I find it difficult to imagine that IBM i2 engineers will be eager to permit the Palantir Gotham system to work like one happy family. In my experience, the importation and manipulation of i2 data in a third-party system is more difficult than opening an RTF file in Word. My recollection is that the unfortunate i2-Palantir legal matter was, in part, related to figuring out how to deal with ANB files. (ANB is i2 shorthand for the Analyst’s Notebook file format, a somewhat complex and closely held construct.)
Net net: Palantir Technologies will not be the dog wagging the tail of IBM i2 and a number of other major US government integrators. The good news is that there will be quite a bit of work available for firms able to support the prime contractors and the vendors eligible and selected to provide for-fee products and services.
Was this a shoot-from-the-hip decision to deny Palantir’s objection to the award? No. I believe the FAR procurement guidelines and the content of the statement of work provided the framework for the decision. However, context is important as are past experiences and perceptions of vendors in the running for substantive US government programs.
Next-Generation Business Intelligence Already Used by Risk Analysis Teams
June 1, 2016
Ideas about business intelligence have certainly evolved with emerging technologies. An article from CIO, “Why Machine Learning Is the New BI,” speaks to this transformation of the concept. The author describes how reactive analytics based on historical data do not optimally assist business decisions; questions about customer satisfaction are best oriented toward proactive future-proofing, according to the article. The author writes,
“Advanced, predictive analytics are about calculating trends and future possibilities, predicting potential outcomes and making recommendations. That goes beyond the queries and reports in familiar BI tools like SQL Server Reporting Services, Business Objects and Tableau, to more sophisticated methods like statistics, descriptive and predictive data mining, machine learning, simulation and optimization that look for trends and patterns in the data, which is often a mix of structured and unstructured. They’re the kind of tools that are currently used by marketing or risk analysis teams for understanding churn, customer lifetimes, cross-selling opportunities, likelihood of buying, credit scoring and fraud detection.”
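The “predictive” half of that list is less mysterious than the jargon suggests. Below is a deliberately tiny sketch of churn scoring using a hand-rolled logistic regression on made-up data. Production systems use proper libraries, feature pipelines, and validation, but the underlying idea is the same: fit weights on historical outcomes, then score new cases.

```java
// Toy sketch of predictive scoring: fit a tiny logistic regression by
// stochastic gradient descent on made-up churn data, then score a new customer.
public class ChurnScoreDemo {
    public static void main(String[] args) {
        // features: {tenureMonths, supportTickets}; label: 1 = churned, 0 = stayed
        double[][] x = {{2, 5}, {3, 4}, {24, 0}, {36, 1}, {5, 3}, {30, 0}};
        int[] y = {1, 1, 0, 0, 1, 0};

        double w0 = 0, w1 = 0, w2 = 0;   // bias plus one weight per feature
        double lr = 0.01;                // learning rate

        for (int epoch = 0; epoch < 5000; epoch++) {
            for (int i = 0; i < x.length; i++) {
                double z = w0 + w1 * x[i][0] + w2 * x[i][1];
                double p = 1.0 / (1.0 + Math.exp(-z));   // predicted churn probability
                double err = y[i] - p;                   // gradient of the log-likelihood
                w0 += lr * err;
                w1 += lr * err * x[i][0];
                w2 += lr * err * x[i][1];
            }
        }

        // Score a new customer: short tenure, several tickets -> high churn risk.
        double z = w0 + w1 * 4 + w2 * 4;
        System.out.printf("churn probability: %.2f%n", 1.0 / (1.0 + Math.exp(-z)));
    }
}
```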
Does this mean that traditional business intelligence, after much hype and millions in funding, is a flop? Or will predictive analytics be a case of polishing up existing technology and presenting it in new packaging? In time, and for some only after much money has been spent, we should have a better idea of the true value.
Megan Feil, June 1, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Financial Institutions Finally Realize Big Data Is Important
May 30, 2016
One of the fears about automation is that human workers will be replaced and there will no longer be jobs for people. Blue-collar jobs are expected to be the first to be automated, but bankers, financial advisors, and other workers in the financial industry also have cause to worry. Algorithms might replace them, because apparently people are getting faster and better responses from automated bank “workers”.
Perhaps one of the reasons bankers and financial advisors are being replaced is the industry’s belated realization that, as ABA Banking Journal puts it, “Big Data And Predictive Analytics: A Big Deal, Indeed.” One would think the financial sector would be the first to embrace big data and analytics in order to keep the upper hand on its competition, earn more money, and maintain its relevancy in an ever-changing world. It has, however, been slow to adapt, slower than retail, search, and insurance.
One of the main reasons the financial sector has been holding back is this:
“There’s a host of reasons why banks have held back spending on analytics, including privacy concerns and the cost for systems and past merger integrations. Analytics also competes with other areas in tech spending; banks rank digital banking channel development and omnichannel delivery as greater technology priorities, according to Celent.”
After the above quote, the article observes that customers are increasingly choosing online banking over branch visits, but it is a rather insipid observation. Big data and analytics offer banks the opportunity to invest in developing better relationships with their customers and even to offer more individualized services as a way to one-up the Silicon Valley competition. Big data also helps financial institutions comply with banking laws and standards to avoid violations.
Banks do need to play catch-up, but this is probably a lot of moaning and groaning for nothing. The financial industry will adapt, especially when it is at risk of losing more money. It will be the same for all industries: adapt or get left behind. The further we move from the twentieth century and from generations unaccustomed to digital environments, the more technology integration we will see.
Whitney Grace, May 30, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
eBay Struggles with Cluttered, Unstructured Data, Deploys Artificial Intelligence Strategy
May 24, 2016
The article on Forbes titled “eBay’s Next Move: Artificial Intelligence To Refine Product Searches” predicts a strong future for eBay as the company moves further into machine learning. For roughly six years eBay has been working with Expertmaker, a Swedish AI and analytics company, and Forbes believes eBay may have recently purchased the firm. The article explains the logic behind the move,
“One of the key turnaround goals of eBay is to encourage sellers to define their products using structured data, making it easier for the marketplace to show relevant search results to buyers. The acquisition of Expertmaker should help the company in this initiative, given its expertise in artificial intelligence, machine learning and big data.”
The acquisition of Expertmaker should allow for a more comprehensive integration of eBay’s “noisy data.” Expertmaker’s AI strategy is rooted in genetics research and has made great strides in extracting hidden value from data. For eBay, a company with hundreds of millions of listings clogging up the platform, Expertmaker’s approach might be the ticket to a more streamlined, categorized search. If we take anything away from this, it is that eBay search currently does not work very well. At any rate, the company is taking steps to improve its platform.
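As a crude illustration of what “structured data” buys a marketplace, the sketch below pulls a couple of attributes out of a free-text listing title with simple pattern matching. Real systems, and presumably Expertmaker’s machine learning, are far more sophisticated; this only shows the shape of the problem: turning noisy titles into fielded, searchable attributes. The brand list and patterns are invented for the example.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Toy attribute extraction from a noisy listing title. Illustrative only;
// the brand list and patterns are made up for the example.
public class ListingStructurer {

    private static final String[] KNOWN_BRANDS = {"acme", "globex", "initech"}; // hypothetical
    private static final Pattern STORAGE =
            Pattern.compile("(\\d+)\\s?(gb|tb)", Pattern.CASE_INSENSITIVE);

    public static Map<String, String> structure(String title) {
        Map<String, String> attributes = new HashMap<>();
        String lower = title.toLowerCase();

        for (String brand : KNOWN_BRANDS) {
            if (lower.contains(brand)) {
                attributes.put("brand", brand);
                break;
            }
        }

        Matcher m = STORAGE.matcher(title);
        if (m.find()) {
            attributes.put("storage", m.group(1) + m.group(2).toUpperCase());
        }

        attributes.put("raw_title", title);  // keep the original text for full-text search
        return attributes;
    }

    public static void main(String[] args) {
        // Prints brand, storage, and raw_title attributes (map order may vary).
        System.out.println(structure("ACME phone 64GB unlocked - mint condition!!"));
    }
}
```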
Chelsea Kerwin, May 24, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph