Exclusive: Interview with DataWalk’s Chief Analytics Officer Chris Westphal, Who Guides an Analytics Rocket Ship

October 21, 2020

I spoke with Chris Westphal, Chief Analytics Officer for DataWalk, about the company’s string of recent contract “wins.” These range from commercial engagements to heavy lifting for the US Department of Justice.

Chris Westphal, founder of Visual Analytics (acquired by Raytheon), brings his one-click approach to advanced analytics.

The firm provides what I have described as an intelware solution. DataWalk ingests data and outputs actionable reports. The company has leapfrogged a number of investigative solutions, including IBM’s Analyst’s Notebook and the much-hyped Palantir Technologies’ Gotham products. This interview took place in a Covid-compliant way. In my previous Chris Westphal interviews, including our conversation in July 2019, we met at intelligence or law enforcement conferences. Now the experience is virtual, but just as interesting and informative. In my most recent interview with Mr. Westphal, I sought more information on what is causing competitors to take notice of DataWalk and its use of smart software to deliver what customers want: results, not PowerPoint presentations and promises. We spoke on October 8, 2020.

DataWalk is an advanced analytics tool with several important innovations. On one hand, the company’s information processing system performs IBM i2 Analyst’s Notebook and Palantir Gotham type functions — just with a more sophisticated and intuitive interface. On the other hand, Westphal’s vision for advanced analytics has moved past what he accomplished with his previous venture Visual Analytics. Raytheon bought that company in 2013. Mr. Westphal has turned his attention to DataWalk. The full text of our conversation appears below.

Tickeron: The Commercial System Which Reveals What Some Intel Professionals Have Relied on for Years

October 16, 2020

Are you curious about the capabilities of intelware systems developed by specialized services firms? You can get a good idea about the type of information available to an authorized user:

  • Without doing much more than plugging in an entity with a name
  • Without running ad hoc queries like one does on free Web search systems unless there is a specific reason to move beyond the provided output
  • Without reading a bunch of stuff and trying to figure out what’s reliable and what’s made up by a human or a text robot
  • Without having to spend time decoding a table of numbers, a crazy looking chart, or figuring out weird colored blobs which represent significant correlations.

Sound like magic?

Nope, it is the application of pattern matching and established statistical methods to streams of data.

The company delivering this system, tailored to Robinhood types and small brokerages, has been assembled by Tickeron. There’s original software, some middleware, and some acquired technology. Data are ingested, and the outputs indicate what to buy, what to sell, or, as a country western star crooned, when to “hold ‘em.”
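To make the “pattern matching and established statistical methods” concrete, here is a minimal sketch of the kind of signal such a system might emit: a plain moving-average crossover on a price stream. It is illustrative only; Tickeron has not published its algorithms, and the window sizes below are arbitrary.

```python
# Minimal sketch of "pattern matching plus established statistics" on a price stream.
# Illustrative only; this is not Tickeron's algorithm, and the windows are arbitrary.
from statistics import mean

def crossover_signal(prices, short_window=5, long_window=20):
    """Return 'buy', 'sell', or 'hold' from a simple moving-average crossover."""
    if len(prices) < long_window + 1:
        return "hold"  # not enough history to say anything

    short_now = mean(prices[-short_window:])
    long_now = mean(prices[-long_window:])
    short_prev = mean(prices[-short_window - 1:-1])
    long_prev = mean(prices[-long_window - 1:-1])

    if short_prev <= long_prev and short_now > long_now:
        return "buy"   # short-term average just crossed above the long-term average
    if short_prev >= long_prev and short_now < long_now:
        return "sell"  # short-term average just crossed below the long-term average
    return "hold"      # "know when to hold 'em"

# Synthetic example: a decline followed by a sharp recovery.
prices = [100 - 0.5 * i for i in range(30)] + [85 + 2.0 * i for i in range(10)]
print(crossover_signal(prices))
```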

A rah rah review appeared in The Stock Dork. “Tickeron Review: An AI-Powered Trading Platform That’s Worth the Hype” provides a reasonably good overview of the system. If you want to check out the system, navigate to Tickeron’s Web site.

Here’s an example of a “card,” the basic unit of information output from the system:

[Image: example of a Tickeron output card]

The key elements are:

  • Icon to signal “think about buying” the stock
  • A chart with red and green cues
  • A hot link to text
  • A game angle with the “odds” link
  • A “more” link
  • Hashtags (just like Twitter).
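Taken together, these elements suggest a simple record structure. Below is a hypothetical sketch of how such a card might be modeled; the field names are assumptions for illustration, not Tickeron’s actual schema.

```python
# Hypothetical representation of the "card" elements listed above.
# Field names are assumptions for illustration; this is not Tickeron's schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SignalCard:
    ticker: str
    action_icon: str          # e.g. "consider-buying"
    chart_url: str            # chart with red and green cues
    detail_link: str          # hot link to explanatory text
    odds_link: str            # the game-style "odds" link
    more_link: str            # the "more" link
    hashtags: List[str] = field(default_factory=list)  # just like Twitter

card = SignalCard(
    ticker="XYZ",
    action_icon="consider-buying",
    chart_url="https://example.com/chart/XYZ",
    detail_link="https://example.com/analysis/XYZ",
    odds_link="https://example.com/odds/XYZ",
    more_link="https://example.com/more/XYZ",
    hashtags=["#ai", "#trading"],
)
print(card.ticker, card.action_icon)
```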

Now imagine this type of data presented to an intel officer monitoring a person of interest. Sound useful? The capability has been available for more than a decade. It is interesting to see this type of intelware find its way to those who want to invest like the wizards at the former Bear Stearns (remember that company, the bridge players, the implosion?).

DarkCyber thinks the Wall Street information providers selling high-priced solutions may wonder about the $15-a-month fee for the Tickeron service.

Keep in mind that predictions, if right, can allow you to buy an exotic car, an island, and a nice house in a Covid-free location. If incorrect, there’s van life.

The good news is that the functionality of intelware is finally becoming more widely available.

Stephen E Arnold, October 16, 2020

Rah, Rah, Sis Boom Analytics. No, Wait. Boo, Boo, Hiss, Hiss Analytics

October 16, 2020

One of the DarkCyber researchers alerted me to “Most CMOs Disappointed with Analytics Results.” We are wrapping up an interview with one of the senior technologists at DataWalk, and the complexity lurking in easy-to-use analytics systems was a topic of discussion. Watch for this revealing interview in an upcoming issue of DarkCyber.

The article about disappointed CMOs is not surprising. What is surprising is how widespread the expectation is that smart software will generate just the answer one needs to produce bigly sales.

The write up, citing a study by the mid-tier consulting firm Gartner Group, reports:

“Though CMOs understand the importance of applying analytics throughout the marketing organization, many struggle to quantify the relationship between insights gathered and their company’s bottom line. In fact, nearly half of respondents in this year’s survey say they’re unable to measure marketing ROI,” says Lizzy Foo Kune, senior director analyst in the Gartner Marketing practice. “This inability to measure ROI tarnishes the perceived value of the analytics team.”

Other findings from the study of 415 marketing “leaders” are:

  • Training staff is not a priority
  • Data science and campaigns are behind other analytic use cases
  • Most organizations will spend more for analytics.

These types of surveys deliver results that gild the available lilies.

For those without numerical skills and training, many of today’s analytic tools are likely to disappoint. The digital oracle of Delphi is not working particularly well for many users. Even individuals with a couple of statistics courses on their record have to spend time familiarizing themselves with the analytic tools and their options. Plus, if bad data go in, not even a super smart system can produce silk purses from chubby data pigs. Nevertheless, MBAs believe in analytics and, of course, magic.

Stephen E Arnold, October 16, 2020

Spreadsheet Fever Case Example

October 12, 2020

I have been using the phrase “spreadsheet fever” to describe the impact that fiddling with numbers in Microsoft Excel has on MBAs. With Excel providing the backbone for numerous statistical confections, the sugar hit of magic assumptions should not be underestimated. The mental structure of a crazed investment analyst brooks no interference from common sense.

“Excel: Why Using Microsoft’s Tool Caused Covid-19 Results to Be Lost” provides a possible case example of what happens when thumbtypers and overconfident innumerates tangle with a digital spreadsheet. No green eyeshades and no pencils needed. Calculators? One can hear a 22-year-old ask, “What’s a calculator? I have one on my iPhone.”

The Beeb reports:

PHE [Public Health England, a fine UK entity] had set up an automatic process to pull this data together into Excel templates so that it could then be uploaded to a central system and made available to the NHS Test and Trace team, as well as other government computer dashboards.

And what tool did these overconfident wizards use?

Microsoft Excel, the weapon of choice for business and STEM analysis, of course.

How did the experts wander off the information highway into a thicket of errors? The Beeb explains:

The problem is that PHE’s own developers picked an old file format to do this – known as XLS. As a consequence, each template could handle only about 65,000 rows of data rather than the one million-plus rows that Excel is actually capable of. And since each test result created several rows of data, in practice it meant that each template was limited to about 1,400 cases. When that total was reached, further cases were simply left off.
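In code terms, the missing guard is a row-count check against the format’s hard limit before the export runs. Here is a minimal sketch, assuming a pandas-style workflow; it is not PHE’s actual pipeline.

```python
# Minimal defensive check before pushing rows into a legacy .xls template.
# A sketch only, assuming a pandas-style workflow; this is not PHE's pipeline.
import pandas as pd

XLS_MAX_ROWS = 65_536      # hard row limit of the legacy XLS (BIFF8) format
XLSX_MAX_ROWS = 1_048_576  # row limit of the modern XLSX format

def fits_in_template(df: pd.DataFrame, max_rows: int = XLS_MAX_ROWS) -> bool:
    """True if every row fits; silently truncating is how cases get left off."""
    return len(df) <= max_rows

# Example: a frame larger than the XLS limit fails the check.
results = pd.DataFrame({"case_id": range(70_000)})
if not fits_in_template(results):
    print(f"{len(results):,} rows exceed the .xls limit of {XLS_MAX_ROWS:,}; "
          "use .xlsx, CSV, or a database instead.")
```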

The fix? Can kicking perhaps:

But insiders acknowledge that the current clunky system needs to be replaced by something more advanced that excludes Excel, as soon as possible.

Righto.

Stephen E Arnold, October 12, 2020

9 21

September 20, 2020

One member of the DarkCyber research team came across this chart on the Datawrapper Web site. Datawrapper provides millennial-ready analysis tools. With some data and the firm’s software, anyone can produce a chart like this one with green bars for negative numbers.

[Image: Datawrapper chart of declining Chicago job postings]

What is the chart displaying? The odd green bar shows the decline in job postings. Why green? No idea. What is the source of the data? Glassdoor, a job listings site. The data apply only to Chicago, Illinois. The time period is August 2020 versus August 2019. The idea is that the longer the bar, the greater the decline. Why is the bar green? Isn’t red a more suitable color for negative numbers?

Shown in this image are the top 12 sectors for job loss. To be clear, the longer the bar, the fewer job postings. Fewer job postings, one assumes, translates to reduced opportunities for employment.

What’s interesting is that accounting, consulting, information technology, telecommunications, and computer software and hardware are big losers. Those expensive MBAs, the lost hours studying for the CPA examination, and thumb typing through man pages are gone for now.

Observations:

  • The colors? Red maybe.
  • The decline in high technology work and knowledge work is interesting.
  • The “open jobs” numbers are puzzling. Despite declines, Chicago – the city of big shoulders and big challenges – has thousands of jobs in declining sectors.

Net net: IT and computer software and hardware look promising. The chart doesn’t do the opportunities justice. And the color?

Stephen E Arnold, September 20, 2020

Count Bayesie Speaks Truth

September 10, 2020

Navigate to “Why Bayesian Stats Needs More Monte Carlo Methods.” Each time I read an informed write up about the 18th century Presbyterian minister who could do some math, I think about a fellow who once aspired to be the Robert Maxwell of content management. Noble objective, is it not?

That person grew apoplectic when I explained how Autonomy in the early 1990s was making use of mathematical procedures crafted in the 18th century. I wish I had made a TikTok video of his comical attempt to explain that a human or software system should not under any circumstances inject a data point that was speculative.

Well, my little innumerate content management person, get used to Bayes. Plus there’s another method at which you can rage and bay. Yep, Monte Carlo. If you were horrified by the good Reverend’s idea, wait until you dig into Monte Carlo. Strapping these two statistical stallions to the buggy called predictive analytics is commonplace.
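For readers who want the pairing made concrete, here is a minimal sketch: a beta-binomial Bayesian update whose posterior predictive is approximated by Monte Carlo sampling. The counts and prior are illustrative, not drawn from the write up.

```python
# Minimal sketch of Bayes plus Monte Carlo: a beta-binomial posterior,
# with the posterior predictive approximated by random sampling.
# The counts and prior are illustrative, not from any real data set.
import random

random.seed(42)

# Observed data: 18 "successes" in 30 trials, with a flat Beta(1, 1) prior.
successes, trials = 18, 30
alpha_post = 1 + successes            # posterior Beta parameters
beta_post = 1 + (trials - successes)

# Monte Carlo: draw rates from the posterior, then simulate future trials.
n_samples = 50_000
future_trials = 10
future_successes = []
for _ in range(n_samples):
    rate = random.betavariate(alpha_post, beta_post)   # one posterior draw
    future_successes.append(sum(random.random() < rate for _ in range(future_trials)))

mean_pred = sum(future_successes) / n_samples
p_at_least_7 = sum(k >= 7 for k in future_successes) / n_samples
print(f"Expected successes in next {future_trials} trials: {mean_pred:.2f}")
print(f"P(at least 7 successes): {p_at_least_7:.3f}")
```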

The write up closes poetically, which may be more in line with the fuzzy wuzzy discipline of content management:

It may be tempting to blame the complexity of the details of Bayesian methods, but it’s important to realize that when we are taught the beauty of calculus and analytical methods we are often limited to a relatively small set of problems that map well to the solutions of calc 101. When trying to solve real world problems mathematically complex problems pop up everywhere and analytical solutions either escape or fail us.

Net net: Use what matches the problem. Also, understand the methods. Key word: Understand.

Stephen E Arnold, September 10, 2020

Data Brokers: A Partial List

September 7, 2020

DarkCyber has fielded several inquiries in the last three months about data brokers. My response has been to point out that some data brokers are like quinoa farmers near Cusco: small, subsistence-scale data resellers. Others are like Consolidated Foods: industrialized outfits.

You can review a partial list of data brokers on this Github page. However, I want to point out:

  • Non US data brokers have information as well. Some of that information is particularly interesting, and it is unlikely that the average email phisher or robocall outfit will have access to these data. (No, I am not listing some of these interesting firms.)
  • There are several large data brokers not on this list. In my lectures I mention a giant data broker wanna be, but in most cases when I say “Amazon”, the response is, “My family uses Amazon a couple of times a week.” I don’t push back. I just move forward. What one does not know does not exist for some people.
  • Aggregating services with analytics plumbing are probably more important than individual chunks of data from either the quinoa farmers or from a combine. Why? With three items of data and a pool of “maybe useful” content, it is possible to generate some darned interesting outputs.

Putting the focus on a single type of digital artifact is helpful, sometimes interesting, and may be a surprise to some uninformed big time researcher. But the magic of applied analytics is where the oomph is.
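A toy example of that aggregation effect appears below: two thin, separately unremarkable record sets joined on a shared key yield a richer profile. Every name and field in the sketch is invented.

```python
# Toy illustration of aggregation: two thin, separately unremarkable record sets
# joined on a shared key produce a richer profile. All names and fields are invented.

broker_a = [  # e.g. a small "quinoa farmer" reseller: emails and cities
    {"email": "jdoe@example.com", "city": "Louisville"},
    {"email": "asmith@example.com", "city": "Chicago"},
]

broker_b = [  # e.g. an industrialized combine: emails, phones, purchase interests
    {"email": "jdoe@example.com", "phone": "555-0100", "interest": "fishing gear"},
    {"email": "other@example.com", "phone": "555-0199", "interest": "guitars"},
]

# Index one set by the shared key, then merge matching records.
by_email = {rec["email"]: rec for rec in broker_b}
profiles = []
for rec in broker_a:
    match = by_email.get(rec["email"])
    if match:
        profiles.append({**rec, **match})  # merged, enriched profile

print(profiles)
```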

Stephen E Arnold, September 7, 2020

Facial Recognition: Who Is Against Early Diagnosis of Heart Disease?

September 3, 2020

The anti-facial recognition cohort may have a new challenge on their capable hands. Facial recognition is controversial. What if analysis of a face, for instance in a selfie, can lead to an early diagnosis of heart disease? The person is alerted to visit a doctor. What if a life is saved? Is facial recognition granted a hall pass for a medical application?

I don’t want to dwell on fencing applications of pattern recognition. I would suggest that a quick look at “AI Expected to Detect Heart Disease via Selfies: Chinese Researchers” might be interesting. The write up states:

Facial appearance has long been identified as an indicator of cardiovascular risk. Features such as male pattern baldness, earlobe crease, xanthelasmata (yellowish deposit of fat around or on the eyelids) and skin wrinkling are the most common predictors.

And what about accuracy?

According to the results published in the European Heart Journal, the algorithm had a sensitivity of 80 percent and specificity of 54 percent, outperforming the traditional prediction model of coronary artery disease. Sensitivity refers to the algorithm’s ability to designate a patient with a disease as positive, while specificity is the test’s ability to designate a patient without disease as negative.
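The quoted definitions map directly onto a confusion-matrix calculation. The sketch below uses made-up counts chosen to reproduce the reported 80 percent and 54 percent figures; it is not the study’s data.

```python
# Sensitivity and specificity from a confusion matrix, matching the quoted definitions.
# The counts below are made up for illustration; they are not the study's data.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Share of people with the disease whom the test flags as positive."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Share of people without the disease whom the test flags as negative."""
    return true_neg / (true_neg + false_pos)

# Illustrative counts for 1,000 screened people, 200 of whom have the disease.
tp, fn = 160, 40      # 80% sensitivity: 160 of 200 patients correctly flagged
tn, fp = 432, 368     # 54% specificity: 432 of 800 healthy people correctly cleared

print(f"Sensitivity: {sensitivity(tp, fn):.0%}")  # 80%
print(f"Specificity: {specificity(tn, fp):.0%}")  # 54%
```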

Interesting. How will anti-FR cohorts deal with medical technology which finds its way into different government agencies? DarkCyber does not have an answer. Perhaps pattern recognition will be banned? Perhaps not.

Stephen E Arnold, September 3, 2020

Bringing IT Department into Analytics Decisions: Seems Reasonable

August 25, 2020

Woe to the company that implements a data analysis solution without consulting its IT department. That is the moral of the IT Brief write-up, “Extracting Insights from Data Requires More than Just a Pretty Dashboard.” A slick dashboard is nice to have, and it can offer non-technical workers the comfort of pretty graphs, projections, and generated reports. But what happens when users do not understand the data that underlies these results? Contributor Steve Singer writes:

“If you’re not sure where your data comes from, or how clean it is, you can’t trust the reports you generate from it. In some cases, if you don’t know what you have, you don’t even know how to ask the right questions. Somehow, we all have to get smarter about our approaches to all the data in our organizations and our development of the skill sets needed to capitalize on dashboard analytics. … In some businesses, decisions on dashboard purchases and deployment are made with little or no consultation with the IT department and data specialists. No one carefully considers whether the stores of data are in a suitable form or location to support the new tools. All too often they are not. Business decision-makers then find themselves disappointed when the tools fail to deliver the benefits they expected. Avoiding this scenario requires business units discuss their objectives with IT so that together they can decide on the most effective products and approaches. Data specialists must be able to assess whether tools are fit for purpose and able to be linked to the organization’s existing IT infrastructure.”

A company’s IT department is (or should be) a wealth of technical expertise at decision-makers’ fingertips. Singer offers four tips for working together to make the best choices: Begin with a clear plan that defines objectives, then decide whether infrastructure changes are needed; examine data sources and stores; establish a trust score for available data; then, and only then, select the appropriate dashboard or toolset. Though such collaboration would be a drastic change for some companies, it is well worth the effort when data projects actually produce the desired results. That beats flashy but meaningless graphs any day.
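Singer’s third step, a trust score for available data, can be made concrete with a small sketch. The fields, weights, and thresholds below are invented for illustration; the article does not prescribe a formula.

```python
# Hypothetical "trust score" for a data source, in the spirit of the third tip above.
# The fields, weights, and thresholds are invented; they are not from the article.
from dataclasses import dataclass

@dataclass
class DataSource:
    name: str
    completeness: float   # share of required fields populated, 0.0 to 1.0
    freshness_days: int   # days since the last successful refresh
    documented: bool      # is the lineage/ownership documented?

def trust_score(src: DataSource) -> float:
    """Blend a few signals into a 0-100 score; the weights are arbitrary."""
    freshness = max(0.0, 1.0 - src.freshness_days / 90)  # treat ~90 days as stale
    documentation = 1.0 if src.documented else 0.4
    return round(100 * (0.5 * src.completeness + 0.3 * freshness + 0.2 * documentation), 1)

crm = DataSource("CRM export", completeness=0.92, freshness_days=3, documented=True)
legacy = DataSource("Legacy spreadsheet", completeness=0.55, freshness_days=200, documented=False)
print(trust_score(crm), trust_score(legacy))  # higher score vs. lower score
```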

Cynthia Murrell, August 25, 2020

Predictive Analytics: A Time and a Place, Not Just in LE?

August 17, 2020

The concept seems sound: analyze data from past crimes to predict future crimes and stop them before they happen. However, in practice the reality is not so simple. That is, as Popular Mechanics explains, “Why Hundreds of Mathematicians Are Boycotting Predictive Policing.” Academic mathematicians are in a unique position—many were brought into the development of predictive policing algorithms in 2016 by The Institute for Computational and Experimental Research in Mathematics (ICERM). One of the partners, PredPol, makes and sells predictive policing tools. Reporter Courtney Linder informs us:

“Several prominent academic mathematicians want to sever ties with police departments across the U.S., according to a letter submitted to Notices of the American Mathematical Society on June 15. The letter arrived weeks after widespread protests against police brutality, and has inspired over 1,500 other researchers to join the boycott. These mathematicians are urging fellow researchers to stop all work related to predictive policing software, which broadly includes any data analytics tools that use historical data to help forecast future crime, potential offenders, and victims. … Some of the mathematicians include Cathy O’Neil, author of the popular book Weapons of Math Destruction, which outlines the very algorithmic bias that the letter rallies against. There’s also Federico Ardila, a Colombian mathematician currently teaching at San Francisco State University, who is known for his work to diversify the field of mathematics.”

Linder helpfully explains what predictive policing is and how it came about. The embedded four-minute video is a good place to start (interestingly, it is produced from a pro-predictive policing point of view). The article also details why many object to the use of this technology. Chicago’s Office of the Inspector General has issued an advisory with a list of best practices to avoid bias, while Santa Cruz has banned the software altogether. We’re told:

“The researchers take particular issue with PredPol, the high-profile company that helped put on the ICERM workshop, claiming in the letter that its technology creates racist feedback loops. In other words, they believe that the software doesn’t help to predict future crime, but instead reinforces the biases of the officers.”

Structural bias also comes into play, as well as the consideration that some crimes go underreported, skewing data. The piece wraps up by describing how widespread this technology is, an account that can be summarized by quoting PredPol’s own claim that one in 33 Americans are “protected” by its software.

With physics and other disciplines, not to mention Google’s online advertising, built on probabilities and predictive analytics, what is the scientific limit on real-world applications? Subjective perceptions?

Cynthia Murrell, August 17, 2020
