Palantir and the US Army: Procurement Thrills
July 6, 2016
I read “Palantir Takes Fight with Army to Federal Court.” The write up is quite useful because the reporter Jen Judson was able to glean some information from a document related to the Palantir versus US Army matter. When I looked for the document, it seemed to me that the complaint had been sealed. I learned from the article:
Palantir is arguing the way the Army wrote its requirements in a request for proposals to industry would shut out Silicon Valley companies that provide commercially available products. The company contended that the Army’s plan to award just one contract to a lead systems integrator means commercially available solutions would have to be excluded.
The Defense News story included some interesting factoids. Here are three I noted:
- Palantir perceives the US Army as acting in what is described as an “irrational” way.
- The program for a database, analytics, and visualization tools has consumed billions of dollars and is a development project, not a commercial off the shelf deal.
- Some Army personnel requested Palantir’s software and found the request denied.
Let’s assume that the Army is trying to build a solution which delivers what Palantir Gotham offers: a ready-to-roll system listed on the GSA schedule like photocopying machines.
The questions that arose from my addled goose brain were:
- Why is the Army reluctant to use commercial-off-the-shelf software? My narrow experience with government procurement suggests that there is some other factor or factors making the coding of a system from ground zero or cranking out scripts to hook existing systems together more attractive than buying something that pretty much works.
- Why is Palantir unable to play procurement ball with the other major defense contracting companies? Is there a trust issue in play? Palantir was caught in a sticky wicket with i2 Group over the Analyst’s Notebook file format. As a former adviser to i2 before it became part of IBM, I know that the file format was a bit of information Mike Hunter and his colleagues treated as a close hold.
- What issues do the major vendors involved in the Army’s program have with Palantir’s business methods? Most government-centric vendors generally get along and take a live-and-let-live approach to big projects. If vendors are not willing to play in the same sandbox, some bad vibes exist for some reason.
Unfortunately I don’t have answers to these questions. My view is that tackling the US Army and procurement methods is likely to cause some consternation for folks involved in the statement of work, the procurement, and the legal machinations.
Plus, the procurement guidelines and the actual procurement processes are often complex and somewhat flexible. As a result, when a commercial company lets the legal eagles fly, the US government has some wiggle room.
Finally, this Palantir-versus-the-Army matter strikes me as a reprise of Google’s grousing about not winning the search project for the original version of USA.gov. Big Silicon Valley companies make assumptions. For example, Google tossed around the term “rational” and the word “logical,” as I recall. The problem is that trust, fear, and revenue may not fit into a Venn diagram or a numerical recipe.
Will Silicon Valley triumph over the so-called Beltway Bandits? Will Silicon Valley rationality emerge victorious in the joust with the Army? Stay tuned for the outcome unless the resolution is sealed just like the ANB file format once was.
Stephen E Arnold, July 6, 2016
Rocana and Its Remarkable Claims
July 6, 2016
I read “New Search Engine Makes Data Instantly Searchable, Increases Data Retention.” I like that “instantly” assertion. Keep in mind that for me, “instantly” means immediately. Okay. I also noted the “everything.” Bold assertions.
According to the write up:
This search engine makes data instantly searchable and increases data retention. It’s also crafted so that the archived data doesn’t impact the current inflow. Another feature is an anomaly system, with everything available at the user’s fingertips.
What is Rocana besides a search engine? Well, it turns out that the company states:
Limitless online access and analysis of all operational data gives CIOs and technology leaders a distinct competitive edge in today’s digital economy.
So we have an “all” tossed in for good measure in the explanation of Rocana. A video explains the niche the company’s search technology targets: operational data and anomaly detection. The search engine scales to “volumes of data that none of our competitors can achieve.” The company delivers “results that matter.” The company is going “beyond search.” There you go. Instantly. Everything.
Stephen E Arnold, July 6, 2016
Yahoo Factoid: Email
July 6, 2016
I read “Marissa Mayer Says She’s ‘Heartened’ by Interest in Yahoo.” I noted a factoid I found interesting. Here’s the passage I highlighted:
Another question about Yahoo Mail revealed that about 1% of Yahoo Mail users actually pay for the service. But Yahoo Mail is much more important than that, Mayer said. “For every dollar that we make on Yahoo Mail on advertisements, we will make $3 elsewhere in our network on search or on some of our digital content,” she said. “So mail is incredibly important for us because of the frequency it drives and because of the strength it drives throughout the network.”
Email is a net revenue generator. Too bad some of that money is not invested in improving Yahoo email; for example, bulk deletes that are usable, a reasonable search system, and support for logins from outside the US without wonky behavior. Heartened?
Stephen E Arnold, July 6, 2016
OnionScan Checks for Falsely Advertised Anonymous Sites on Dark Web
July 6, 2016
Dark Web sites are not exempt from false advertising about their anonymity. A recently published article from Vice’s Motherboard shares “A Tool to Check If Your Dark Web Site Is Really Anonymous.” The program is called OnionScan, and it detects issues on sites that may unmask servers or reveal their owners. One example: metadata, such as photo location information, hidden in images on the site. Sarah Jamie Lewis, an independent security researcher who developed OnionScan, told Motherboard:
The first version of OnionScan will be released this weekend, Lewis said. “While doing some research earlier this year I kept coming across the same issues in hidden services—exposed Apache status pages, images not stripped of exif data, pages revealing information about the tools used to build it with, etc. The goal is [to] provide an easy way of testing these things to drive up the security bar,” Lewis added. It works “pretty much the same as any web security scanner, just tailored for deanonymization vectors,” she continued.
It is interesting that this tool appears to have been designed to protect users from the mistakes of website administrators who do not set up their sites properly. We suppose it’s only a matter of time before we start seeing researchers publish the number of truly secure and anonymous Dark Web sites versus those with outstanding issues.
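Lewis’s point about images not stripped of EXIF data can be made concrete. The sketch below is not OnionScan’s actual code; it is a toy, stdlib-only illustration of the idea: walk a JPEG’s segment markers and report whether an Exif APP1 segment (the container for camera and GPS metadata) is still present.

```python
def has_exif(jpeg_bytes: bytes) -> bool:
    """Return True if a JPEG byte stream still carries an Exif APP1 segment.

    Toy detector only: it walks JPEG segment markers until the image
    data starts and looks for an APP1 (0xFFE1) segment whose payload
    begins with "Exif". Real scanners parse the metadata fully.
    """
    i = 2  # skip the SOI marker (0xFF 0xD8)
    while i + 4 <= len(jpeg_bytes) and jpeg_bytes[i] == 0xFF:
        marker = jpeg_bytes[i + 1]
        if marker == 0xE1 and jpeg_bytes[i + 4:i + 8] == b"Exif":
            return True
        if marker in (0xDA, 0xD9):  # start of scan / end of image
            break
        seg_len = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        if seg_len < 2:  # malformed segment length; bail out
            break
        i += 2 + seg_len
    return False
```

A site audit in the OnionScan spirit would run a check like this over every image a hidden service serves and flag any that still embed metadata.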
Megan Feil, July 6, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
The Computer Chip Inspired by a Brain
July 6, 2016
Artificial intelligence is humanity’s attempt to replicate the complicated thought processes of our own brains through technology. IBM is trying to duplicate the human brain, and it has been successful in many ways with the supercomputer Watson. TechRepublic reports that IBM has another success under its belt, except to what end? Check out the article, “IBM’s Brain-Inspired Chip TrueNorth Changes How Computers ‘Think,’ But Experts Question Its Purpose.”
IBM’s TrueNorth is the first computer chip with a one-million-neuron architecture. The chip is a collaboration between Cornell University and IBM under the DARPA SyNAPSE program, using $100 million in public funding. Most computer chips use the von Neumann architecture, but the TrueNorth chip better replicates the human brain. TrueNorth is also more energy efficient.
What is the purpose of the TrueNorth chip, however? IBM created an elaborate ecosystem that uses many state of the art processes, but people are still wondering what the real world applications are:
“ ‘…it provides ‘energy-efficient, always-on content generation for wearables, IoT devices, smartphones.’ It can also give ‘real-time contextual understanding in automobiles, robotics, medical imagers, and cameras.’ And, most importantly, he said, it can ‘provide volume-efficient, unprecedented neural network acceleration capability per unit volume for cloud-based streaming processing and provide volume, energy, and speed efficient multi-modal sensor fusion at an unprecedented neural network scale.’”
Other applications include cyber security, other defense goals, and large-scale computing and hardware running on the cloud. While there might be practical applications, people still want to know why IBM made the chip.
“ ‘It would be as if Henry Ford decided in 1920 that since he had managed to efficiently build a car, we would try to design a car that would take us to the moon,’ [said Nir Shavit, a professor at MIT’s Computer Science and Artificial Intelligence Laboratory]. ‘We know how to fabricate really efficient computer chips. But is this going to move us towards human-quality neural computation?’ Shavit fears that it is simply too early to try to build neuromorphic chips. We should instead try much harder to understand how real neural networks compute.”
Why would a car need to go to the moon? It would be fun to go to the moon, but it doesn’t serve a practical purpose (unless we build a civilization on the moon, although we are a long way from that). The article continues:
“ ‘The problem is,’ Shavit said, ‘that we don’t even know what the problem is. We don’t know what has to happen to a car to make the car go to the moon. It’s perhaps different technology that you need. But this is where neuromorphic computing is.’ ”
In other words, it is the theoretical physics of computer science.
Whitney Grace, July 6, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Government IT Procurement Wobble
July 5, 2016
I read “IT Showdown: Tech Giants Face Off against 18F.” What’s an 18F? If you do work for the US government, you associate 18F with the address of the General Services Administration. The name now evokes some annoyance among established US government contractors. The term 18F refers to a group set up to reduce the time, cost, and hassle of getting IT “done.”
In the good old days, there were people in the US government who did things. Over the years, US government professionals have come to rely on contractors to do certain types of work. In the information technology world, the things range from talking about how one might do something to actually setting up a system to deliver certain outputs.
Along the way, commercial enterprises provided hardware, software, and services. The hardware and software were, for many years, proprietary or custom crafted to meet the needs of a particular government entity. These statements of work made life difficult for a vendor who used what were often perceived as expensive solutions. License agreements made it tricky for a government entity to get another commercial outfit to modify or work around limitations of certain commercial systems.
According to the write up, some of the established vendors are grousing. I learned:
At a House subcommittee hearing on June 10, lobbyists from the IT Alliance for Public Sector (ITAPS) and the Software & Information Industry Association (SIIA) alleged that 18F is hindering profits by acting as both a procurement policymaker and as a tech competitor inside the General Services Administration (GSA). The two groups assert a conflict of interest, and in testimony, have submitted a list of grievances and recommendations intended to curtail 18F’s authority. The hearing was conducted jointly by the House Subcommittees of Government Operations and Information Technology to assess the effectiveness of 18F and the U.S. Digital Service (USDS) — a sister tech consultancy within the White House.
The industry group perceives the 18F outfit as a bit of a threat. Blanket purchase agreements, open source solutions, and giving certain contracts for small coding jobs to non traditional outfits are not what the established information technology vendors want to happen.
I find the dust up amusing. The revenues of established information technology vendors are not likely to suffer sharp declines overnight. The 18F initiative is an example of the US government trying to find a solution to escalating costs for information technology and the gap between the commercial solutions available and actual solutions deployed in a government entity.
Will 18F reduce the gap? One thing is certain. Some vendors associate the term “18F” with some different connotations. Imagine a government professional using a mobile phone app to perform a task for personal work and then using a mainframe application to perform a similar task in a government agency. Exciting.
Stephen E Arnold, July 5, 2016
Palantir Technologies: A Valuation Factoid
July 5, 2016
I read “Palantir Buyback Plan Shows Need for New Silicon Valley Pay System.” (You may have to pay to view this write up. Don’t email me. I don’t think about “real” journalists.) Tucked into the somewhat humorous write up was a factoid. I want to capture it because “real” reporters and “real” information can be tough to track down using an online search system.
Here’s the factoid:
It [Palantir] is offering $7.40 a share to buy back up to 12.5 percent of an employee’s shares…Morgan Stanley recently marked down the value of Palantir’s shares to $5.92.
That $1.48 just hangs there. Too bad the write up did not answer this question:
What valuation did Morgan Stanley assign when Palantir Technologies was pegged at $20 billion? With rainbows, unicorns, and other “real” artifacts in mind, one must assume that Palantir is zipping right along the information superhighway.
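For the curious, the hanging $1.48 can be made explicit with a few lines of arithmetic using only the two figures quoted above; the percentage premium is my own derivation, not a number from the article.

```python
# The two figures quoted in the write up
buyback_price = 7.40  # Palantir's per-share buyback offer
marked_value = 5.92   # Morgan Stanley's marked-down value per share

gap = buyback_price - marked_value  # the $1.48 that just hangs there
premium = gap / marked_value        # the offer's premium over the mark
print(f"gap: ${gap:.2f}; premium over the mark: {premium:.0%}")
```

In other words, the buyback offer sits roughly 25 percent above Morgan Stanley’s mark, which is the gap the write up leaves unexplained.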
Stephen E Arnold, July 5, 2016
The Cloud: Yep, Flying Blind Is Fun
July 5, 2016
Most of the information technology disasters I know about have a common characteristic. Ready for it? Managers did not do their job. The reasons ranged from a lack of informed decision making (this is a nice way of saying “stupid”) to a desire to leave the details to others (this is a nice way of saying “shirk responsibility”). Example: Portland, Oregon’s incompetence.
I thought about information technology crash and burns when I read “75 Percent of IT Pros Lack Visibility into Their Hybrid Clouds.” What I think the write up is trying to say is, “Folks with clouds don’t know what’s happening in the mist and haze.” The desire to get out of having computer systems down the hall is an admirable one. When I fiddled with the IBM mainframe at my third-rate university in the 1960s, who wanted one of those puppies at home? The cloud is little more, in my opinion, than a return to the mainframe-type approach to computing of a half century ago. Life is just easier with a smartphone.
The write up reports:
The study from cloud governance specialist Netwrix reveals that almost 65 percent of organizations do not have complete visibility into user, IT and third-party activity in their IT infrastructure. In addition 75 percent of respondents have partial or no visibility into their cloud and hybrid IT environments. The survey of over 800 people across 30 industries worldwide shows a large majority of respondents (78 percent) saying they are unaware or only partly aware of what is happening across their unstructured data and file storage.
The painful reality is that people who are supposed to be professionals struggle to know what the heck is going on with their cloud computing systems. MBAs and failed middle school teachers as well as bright young sprouts from prestigious university computer science programs have this characteristic too.
Understanding the limits of one’s own knowledge is a difficult undertaking. The confidence with which some “pros” talk about nifty technology is remarkable. Escalating costs, annoyed customers, grousing colleagues, and outright failure are highly likely outcomes.
The baloney about figuring out the context of a user query and the F-35 aircraft which cannot be serviced by a ground crew are examples of how arrogance or human behavior ensures information technology excitement.
Change human behavior or go with a Google and Facebook style smart system? Interesting choice. Or is it a dilemma?
Stephen E Arnold, July 5, 2016
Watson Weekly: IBM Watson Service for Use in the IBM Cloud: Bluemix PaaS, IBM SPSS, Watson Analytics
July 5, 2016
The article on ComputerWorld titled “Review: IBM Watson Strikes Again” relates the recent expansions of the cloud service portfolio of Watson, which is still most famous for winning on Jeopardy. The article begins by evoking that event from 2011, which actually reveals only a small corner of Watson’s functions. The article mentions that to win Jeopardy, Watson basically only needed to absorb Wikipedia, since 95% of the answers were article titles. New services for use in the IBM Cloud include the Bluemix PaaS, IBM SPSS, and Predictive Analytics. Among the Bluemix services is this gem:
“Personality Insights derives insights from transactional and social media data…to identify psychological traits, which it returns as a tree of characteristics in JSON format. Relationship Extraction parses sentences into their components and detects relationships between the components (parts of speech and functions) through contextual analysis. The Personality Insights API is documented for Curl, Node, and Java; the demo for the API analyzes the tweets of Oprah, Lady Gaga, and King James as well as several textual passages.”
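The “tree of characteristics in JSON format” can be pictured with a small sketch. The keys below (`name`, `percentage`, `children`) are illustrative assumptions for the sake of the example, not the documented Watson response schema.

```python
import json

# Hypothetical fragment of the "tree of characteristics" described in
# the quote above; the real Personality Insights response may differ.
sample = json.loads("""
{
  "name": "root",
  "children": [
    {"name": "Big 5", "children": [
      {"name": "Openness", "percentage": 0.81},
      {"name": "Agreeableness", "percentage": 0.64}
    ]}
  ]
}
""")

def flatten(node, out=None):
    """Walk the trait tree and collect (name, percentage) leaf pairs."""
    if out is None:
        out = []
    if "percentage" in node:
        out.append((node["name"], node["percentage"]))
    for child in node.get("children", []):
        flatten(child, out)
    return out

print(flatten(sample))  # the trait leaves with their scores
```

A consumer of the API would walk the returned tree in roughly this way to pull out the individual trait scores for, say, Oprah’s tweets.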
Bluemix also includes AlchemyAPI for text and image content reading, plus Concept Expansion and Concept Insights, which offer text analysis and linking of concepts to Wikipedia topics. The article is less kind to Watson Analytics, a Web app for data analysis with machine learning, which the article claims “tries too hard” and is too distracting for data scientists.
Chelsea Kerwin, July 5, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Wait, the Dark Web Is Legal?
July 5, 2016
For research purposes, I surf the Dark Web on a regular basis. It is like skulking around the back alleys of a major city and witnessing all types of crime, but keeping to yourself. I have seen a few Web sites that could be deemed legal, but most of the content I peruse is illegal: child pornography, prescription drug sales, and even a hitman service. I had begun to think that everything on the Dark Web is illegal, but Help Net Security tells me that “Dark Web Mapping Reveals That Half Of The Content Is Legal.”
The Centre for International Governance Innovation (CIGI) conducted a global survey and discovered that seven in ten (71%) of the respondents believe the Dark Web needs to be shut down. There is speculation whether the participants even had the right definition of the Dark Web and might have confused the terms “Dark Web” and “Dark Net.”
Darksum, however, mapped the Tor end of the Dark Web and discovered some interesting facts:
- “Of the 29,532 .onion sites identified during the sampling period – two weeks in February 2016 – only 46 percent could actually be accessed. The rest were likely short-lived C&C servers used to manage malware, chat clients, or file-sharing applications.
- Of those that have been accessed and analyzed with the companies’ “machine-learning” classification method, less than half (48%) can be classified as illegal under UK and US law. A separate manual classification of 1,000 sites found about 68% of the content to be illegal under those same laws.”
Darksum’s goal is to clear up misconceptions about the Dark Web and to better understand what is actually on the hidden sector of the Internet. The biggest hope is to demonstrate the Dark Web’s benefits.
Whitney Grace, July 5, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph