Exclusive: Interview with DataWalk’s Chief Analytics Officer Chris Westphal, Who Guides an Analytics Rocket Ship
October 21, 2020
I spoke with Chris Westphal, Chief Analytics Officer for DataWalk, about the company’s string of recent contract “wins.” These range from commercial engagements to heavy lifting for the US Department of Justice.
Chris Westphal, founder of Visual Analytics (acquired by Raytheon) brings his one-click approach to advanced analytics.
The firm provides what I have described as an intelware solution. DataWalk ingests data and outputs actionable reports. The company has leap-frogged a number of investigative solutions, including IBM’s Analyst’s Notebook and the much-hyped Palantir Technologies’ Gotham products. This interview took place in a Covid-compliant way. In my previous Chris Westphal interviews, we met at intelligence or law enforcement conferences. Now the experience is virtual, but it was as interesting and informative as our last in-person conversation in July 2019. In my most recent interview with Mr. Westphal, I sought to get more information on what’s causing DataWalk to make some competitors take notice of the company and its use of smart software to deliver what customers want: results, not PowerPoint presentations and promises. We spoke on October 8, 2020.
DataWalk is an advanced analytics tool with several important innovations. On one hand, the company’s information processing system performs IBM i2 Analyst’s Notebook and Palantir Gotham type functions — just with a more sophisticated and intuitive interface. On the other hand, Westphal’s vision for advanced analytics has moved past what he accomplished with his previous venture Visual Analytics. Raytheon bought that company in 2013. Mr. Westphal has turned his attention to DataWalk. The full text of our conversation appears below.
Facebook: Interesting Data If Accurate
October 16, 2020
DarkCyber spotted a factoid of interest to law enforcement professionals in “Facebook Responsible for 94% of 69 Million Child Sex Abuse Images Reported by US Tech Firms.”
Facebook has previously announced plans to fully encrypt communications in its Messenger app, as well as its Instagram Direct service – on top of WhatsApp, which is already encrypted – meaning no one apart from the sender and recipient can read or modify messages.
What about Facebook’s content curation procedures now? End-to-end encryption of ad-supported private messaging services appears to benefit bad actors.
Stephen E Arnold, October 16, 2020
Facebook WhatsApp: Disappearing Media. Really? Gone for Good?
September 28, 2020
Facebook is endlessly entertaining. On one tentacle, the octopus company seeks to lessen the competitive threats from next-generation social media like TikTok-type videos. On another tentacle, Facebook suggests that those in the European Union can do without Facebook. DarkCyber thinks of this as the “take my ball and go home” tactic. Ten-year-olds with minimal physical coordination but a stock of high-end athletic equipment have been known to trot out this argumentative chestnut. Another tentacle semi-cooperates with government officials. Another tentacle balances on a knife edge in order to keep employees happy with the wonderful social environment within the helpful company’s digital walled garden. There are other tentacles too, but I want to focus your attention on “WhatsApp Expiring Media Feature Details Tipped via New Beta Version.” Keep in mind that “beta” does not mean something a thumbtyper will be able to use.
The write up explains:
WhatsApp 2.20.201.6 beta for Android has been released with further references for a new feature called “Expiring Media.” The feature, as its name suggests, would make media files such as images, videos, and GIFs sent to the recipient’s phone over WhatsApp disappear once they are viewed.
Interesting. Just one question:
If media are disappeared for users, are those data deleted from the Facebook servers?
One hopes not; otherwise, some investigations will be slowed or halted.
Stephen E Arnold, September 28, 2020
Predictive Policing: A Work in Progress or a Problem in Action?
September 2, 2020
Amid this year’s protests of police brutality, makers of crime-predicting software took the occasion to promote their products as a solution to racial bias in law enforcement. The Markup ponders, “Data-Informed Predictive Policing Was Heralded as Less Biased. Is It?” Writer Annie Gilbertson observes, as we did, that more than 1,400 mathematicians signed on to boycott predictive policing systems. She also describes problems discovered by researchers at New York University’s AI Now Institute:
“‘Police data is open to error by omission,’ [AI Now Director Rashida Richardson] said. Witnesses who distrust the police may be reluctant to report shots fired, and rape or domestic violence victims may never report their abusers. Because it is based on crime reports, the data fed into the software may be less an objective picture of crime than it is a mirror reflecting a given police department’s priorities. Law enforcement may crack down on minor property crime while hardly scratching the surface of white-collar criminal enterprises, for instance. Officers may intensify drug arrests around public housing while ignoring drug use on college campuses. Recently, Richardson and her colleagues Jason Schultz and Kate Crawford examined law enforcement agencies that use a variety of predictive programs. They looked at police departments, including in Chicago, New Orleans, and Maricopa County, Ariz., that have had problems with controversial policing practices, such as stop and frisk, or evidence of civil rights violations, including allegations of racial profiling. They found that since ‘these systems are built on data produced during documented periods of flawed, racially biased, and sometimes unlawful practices and policies,’ it raised ‘the risk of creating inaccurate, skewed, or systemically biased data.’”
The article also looks at a study from 2016 by the Royal Statistical Society. Researchers supplied PredPol’s algorithm with arrest data from Oakland, California, a city where estimated drug use is spread fairly evenly throughout the city’s diverse areas. The software’s results would have had officers target Black neighborhoods at about twice the rate of white ones. The team emphasized the documented harm over-policing can cause. The write-up goes on to cover a few more studies on the subject, so navigate there for those details. Gilbertson notes that concerns about these systems are so strong that police departments in at least two major cities, Chicago and Los Angeles, have decided against them. Will others follow suit?
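The feedback-loop concern the researchers describe can be illustrated with a toy model. To be clear, this is an invented sketch, not PredPol’s algorithm or any vendor’s product: two neighborhoods with identical true offense rates, with patrols initially skewed toward one, generate arrest data that never corrects the skew.

```python
# Toy feedback loop (hypothetical; not any vendor's actual algorithm).
# Two neighborhoods have identical true offense rates, but patrols start
# skewed toward "A". Arrests scale with patrol presence, and the
# "predictive" step sends patrols wherever past arrests were highest,
# so the data ratifies the initial deployment choice.

true_rate = {"A": 0.10, "B": 0.10}   # identical underlying offense rates
patrols = {"A": 30, "B": 10}         # initial deployment is skewed 3:1

def observed_arrests(patrols, rate):
    # Arrests reflect where officers look, not the true rate alone.
    return {n: patrols[n] * rate[n] for n in patrols}

def reallocate(arrests, total=40):
    # "Predictive" step: allocate next period's patrols in proportion
    # to past arrest counts.
    total_arrests = sum(arrests.values())
    return {n: round(total * a / total_arrests) for n, a in arrests.items()}

for _ in range(10):
    patrols = reallocate(observed_arrests(patrols, true_rate))

print(patrols)  # the 3:1 skew persists despite equal true rates
```

The design choice of equal true rates makes the point crisply: when arrest counts measure enforcement attention as much as crime, feeding them back into deployment preserves whatever bias was there at the start.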
Cynthia Murrell, September 2, 2020
Lexipol: Facing Scrutiny?
September 1, 2020
Should a private company be writing policies for police departments? Increasingly that is the case and, many say, is a major reason it is so difficult to hold police accountable for using excessive force. Mother Jones invites us to “Meet the Company that Writes the Policies that Protect Cops.” Founded in 2003, Lexipol’s focus is unabashedly on crafting policies that protect officers and departments against lawsuits. Much hinges on definitions of words one would think should be straightforward, like “necessary” and “imminent.” In fact, company co-founder (and former cop) Bruce Praet seems especially proud of the slippery language that gives police “flexibility.”
When pressed, Lexipol insists it bases its policies on federal and state standards, laws, court rulings, and “best practices.” However, reporter Madison Pauly writes:
“Some of the company’s policies … depart in significant ways from recommendations by mainstream policing organizations. The National Consensus Policy on Use of Force, a collaboration between 11 major law enforcement groups, requires cops to try de-escalation techniques before using force if possible. Lexipol discourages police departments from requiring them. Lexipol’s policy allows officers to shoot at moving vehicles in some circumstances, a practice that the Police Executive Research Forum recommends against because it may injure or kill civilians and officers. The ACLU has contested Lexipol’s rules for handling immigration violations, which in some states include a provision allowing cops to consider ‘a lack of English proficiency’ when deciding whether someone may have entered the country illegally. Despite these challenges, the company has marketed its policies as a way to decrease cities’ liability in police misconduct lawsuits. In its communications with potential clients, Lexipol has claimed that agencies that use its policies are sued less frequently and pay out smaller settlements, according to a Texas Law Review analysis of public records. The company’s critics argue that it accomplishes this with vague or permissive rules that meet bare-minimum legal requirements rather than holding officers to a higher standard.”
According to the company, Lexipol has vended its policies, training, customizable handbooks, or other services to more than 8,000 public safety agencies, including several large cities. These include corrections, fire, and EMS agencies alongside police departments. In California, an estimated 95 percent of law enforcement agencies use Lexipol policies. See the article for examples where, we’re told, these policies have stood in the way of justice. As with the crafting of state legislation, we suspect many citizens are unaware how much influence these public agencies have handed over to a third party.
Cynthia Murrell, September 1, 2020
The Child Protection System Catches Pedophiles
August 11, 2020
Child pornography plagues the Internet’s underbelly: the Dark Web, peer-to-peer sharing networks, and even simple Google searches. Law enforcement officials want to protect children and stop the spread of child pornography, so new software called the Child Protection System was created. NBC News shares details in the article “Inside the Surveillance Software Tracking Child Porn Offenders Across the Globe.”
The Child Protection System was designed by the Florida nonprofit Child Rescue Coalition. It is a forensic tool that scans file sharing networks and chatrooms to locate computers that download child pornography. It is programmed to search for over two hundred terms related to child sex abuse. These scans are then used as probable cause to gain search warrants. Child Protection System’s scans were used to arrest over 12,000 people. The software can search down to the county level and it also looks for images of children deemed twelve and under. It saves a lot of investigation time:
“The Child Protection System ‘has had a bigger effect for us than any tool anyone has ever created. It’s been huge,’ said Dennis Nicewander, assistant state attorney in Broward County, Florida, who has used the software to prosecute about 200 cases over the last decade. ‘They have made it so automated and simple that the guys are just sitting there waiting to be arrested.’ The Child Rescue Coalition gives its technology for free to law enforcement agencies, and it is used by about 8,500 investigators in all 50 states. It’s used in 95 other countries, including Canada, the U.K. and Brazil. Since 2010, the nonprofit has trained about 12,000 law enforcement investigators globally.”
The Child Rescue Coalition wants to partner with social media platforms, schools, and more in order to discover who is downloading child pornography. These platforms often contain posts in which people discuss suspicious behavior, but such chatter alone does not indicate criminal activity. If data from the Child Protection System and these platforms were cross-matched, it might identify possible bad actors.
Some assert that such surveillance software breaks privacy laws. Handing this much surveillance power to governments requires safeguards to protect individuals’ privacy.
Whitney Grace, August 11, 2020
WhatsApp: Expiring Messages
August 3, 2020
DarkCyber noted “WhatsApp Is Working on Message Deletion Feature.” Encrypted messaging is a communications channel with magnetism. I pointed out in one of my recent lectures that:
messaging can provide many of the functions associated with old style Dark Web sites.
Messaging applications permit encrypted groups, in-app ecommerce, and links which effectively deliver digital content to insiders or customers.
Facebook, owner of WhatsApp, according to the article:
is working on an Expiring Messages feature.
The idea is that the Facebook system will:
“automatically delete a particular message for the sender and the receiver after a particular time.”
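Mechanically, client-side expiry is simple to sketch. What follows is an assumption about the general pattern, not WhatsApp’s actual implementation: each viewed item gets a timestamp, and a purge pass deletes items older than the time-to-live.

```python
# A minimal sketch of client-side "expiring media". This is a guess at
# the general mechanism, not WhatsApp's real code: viewing an item
# starts its clock, and a purge pass removes anything past the TTL.

import time

class ExpiringStore:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.items = {}  # msg_id -> (payload, viewed_at or None)

    def receive(self, msg_id, payload):
        self.items[msg_id] = (payload, None)

    def view(self, msg_id, now=None):
        now = time.time() if now is None else now
        payload, viewed_at = self.items[msg_id]
        if viewed_at is None:             # first view starts the clock
            self.items[msg_id] = (payload, now)
        return payload

    def purge(self, now=None):
        now = time.time() if now is None else now
        self.items = {k: (p, v) for k, (p, v) in self.items.items()
                      if v is None or now - v < self.ttl}

store = ExpiringStore(ttl_seconds=60)
store.receive("m1", b"photo-bytes")
store.view("m1", now=0)
store.purge(now=61)
print("m1" in store.items)  # -> False

# Note: this deletes the *local* copy only. It says nothing about
# whether any copy is retained server-side.
```

The design makes the limits of the feature visible: expiry is a client-side cleanup step, which is exactly why the retention question below matters.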
The innovation, if it is under development, raises the question, “Will Facebook retain copies of the deleted content?”
Stephen E Arnold, August 3, 2020
European Union Tries Panenka to Score Against Encrypted Data
July 31, 2020
Let’s assume this write up is accurate: “EU Plans to Use Supercomputers to Break Encryption But Also Wants Platforms to Create Opportunities to Snoop on End-to-End Communications.”
The “going dark” argument is not moving fast enough for European Union regulators. The fix is a “decryption platform.” The idea is to decrypt certain messages. The interesting part of the tactic is summarized in this passage:
Internet service providers such as Google, Facebook and Microsoft are to create opportunities to read end-to-end encrypted communications. If criminal content is found, it should be reported to the relevant law enforcement authorities. To this end, the Commission has initiated an “expert process” with the companies in the framework of the EU Internet Forum, which is to make proposals in a study. This process could later result in a regulation or directive that would force companies to cooperate.
The article points out:
There’s no way to “create opportunities” to read end-to-end encrypted communications without weakening the latter.
Worth monitoring the idea, its implementation, and its opportunities.
Stephen E Arnold, July 31, 2020
Are High-Technology Companies Obstructionist?
June 11, 2020
When one talks about high-technology companies, the conversation skips over outfits like Siemens, China Mobile, and Telefónica. The parties to the conversation understand that the code “high tech” refers to Amazon, Apple, Facebook, Google, Twitter, and similar outfits. Baidu- and TenCent-type outfits cooperate in some countries, less so in others.
“ASIO Chief Hits Out at Obstructive Tech Companies” is an amplification of US Department of Justice calls for providing enforcement officials with backdoors when data are encrypted. Australia’s intelligence agency is the Australian Security Intelligence Organisation, shortened to ASIO. The article points out that in December 2018 laws came into effect that added “encryption busting powers.”
The bulk of the article consists of statements made by Australia’s spy agency chief Mike Burgess. The selected comments make clear that some high-technology companies have been slow to cooperate. The write up reports:
“As a society, whether we know it or not, we’ve accepted the fact that the police or ASIO can get a warrant to bug someone’s car or someone’s house. Why should cyberspace be any different? “Yet every time we have these conversations with the private sector companies they kind of push back and say, ‘Uh, no, we’re not so sure about that’.”—Mr. Burgess.
The main point is that high-technology companies often adopt the apocryphal mañana approach when minutes count.
DarkCyber anticipates increased requests for encryption backdoors from other members of the Five Eyes. Some of those involved with this group are not amused with going slow.
Stephen E Arnold, June 11, 2020
DarkCyber Exclusive: Steele Aims for the Hearts of Wall Street Short Sellers
May 23, 2020
We posted a follow-up interview with Robert David Steele, a former CIA professional. This video expands on the allegations of widespread, systemic fraud. Steele explains why a government task force is needed. He describes the scope of the audit, involving six financial giants and a back office operation. If you are interested in learning about alleged skyscraper-sized financial misbehavior, you can view the video on Vimeo at this link.
Stephen E Arnold, May 23, 2020