Smart Software and an Intentional Method to Increase Revenue

July 6, 2020

There is an excellent write up titled “How Researchers Analyzed Allstate’s Car Insurance Algorithm.” My suggestion? Read it.

The “how to” information is detailed and instructive. The article reveals the thought process and logical thinking that allows a giant company with “good hands” to manipulate its revenues.

Here’s the most important statement in the article:

In other words, it appears that Allstate’s algorithm built a “suckers list” that would simply charge the big spenders even higher rates.

The information in the article illustrates how difficult it may be for outsiders to figure out how some smart numerical procedures are assembled into “intentional machines.”

The idea is that data allow the implementation of quite simple big ideas in a slick, automated, obfuscated way.
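The “simple big idea” the article alleges can be sketched in a few lines. This is a hypothetical illustration of the “suckers list” logic described above, not Allstate’s actual model; the cutoff and percentages are invented:

```python
# Hypothetical sketch of the alleged "suckers list" logic: customers already
# paying the most absorb the largest increases. The cutoff and rates below
# are illustrative only, not Allstate's actual figures.

def proposed_premium(current_premium: float, big_spender_cutoff: float = 1500.0) -> float:
    """Apply a larger rate hike to customers above a spending cutoff."""
    if current_premium >= big_spender_cutoff:
        return round(current_premium * 1.20, 2)  # big spenders: 20% increase
    return round(current_premium * 1.05, 2)      # everyone else: 5% increase

premiums = [800.0, 1600.0, 2400.0]
print([proposed_premium(p) for p in premiums])  # → [840.0, 1920.0, 2880.0]
```

Wrapped in enough data plumbing and model jargon, a rule this crude becomes very hard for an outsider to spot.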

As my cranky grandfather observed, “It all comes down to money.”

Stephen E Arnold, July 6, 2020

Australia: Facial Recognition Diffuses

June 17, 2020

Facial recognition is in the news in the US. High-profile outfits have waved brightly colored virtue signaling flags. The flags indicate, “We are not into this facial recognition thing.” Interesting if accurate. “Facial Surveillance Is Slowly Being Trialed around the Country” provides some information about using smart software to figure out who is who. (Keep in mind that Australia uses the Ripper device to keep humans from becoming a snack for a hungry shark.)

The write up reports:

Facial recognition technology uses artificial intelligence to identify individuals based on their unique facial features and match it with existing photos on a database, such as a police watch list. While it’s already part of our everyday lives, from tagging photos on Facebook to verifying identities at airport immigration, its use by law enforcement via live CCTV is an emerging issue.

That’s the spoiler. Facial recognition is useful and the technology is becoming a helpful tool, like a flashlight or hammer.
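The matching step the quote describes boils down to comparing a probe face against a database and returning the closest entry. Here is a minimal sketch under stated assumptions: the vectors, names, and threshold are invented, and real systems use learned embeddings rather than three-element lists:

```python
import math

# Minimal sketch of watch-list matching: compare a probe face vector against
# database entries and return the closest one. Vectors and names are invented;
# production systems use learned embeddings with hundreds of dimensions.

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

watch_list = {
    "person_a": [0.9, 0.1, 0.3],
    "person_b": [0.2, 0.8, 0.5],
}

def best_match(probe, threshold=0.9):
    """Return the closest watch-list entry, or None if below threshold."""
    name, score = max(((n, cosine(probe, v)) for n, v in watch_list.items()),
                      key=lambda t: t[1])
    return (name, score) if score >= threshold else (None, score)

name, score = best_match([0.88, 0.12, 0.29])
print(name)  # → person_a
```

The tool analogy holds: like a flashlight, the output depends entirely on where you point it and what is in the database.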

The article explains that “All states and territories [in Australia] are using facial recognition software.”

Police in all states and territories confirmed to 7.30 they do use facial recognition to compare images in their databases, however few details were given regarding the number of live CCTV cameras which use the technology, current trials and plans for its use in the future.

The interesting factoid in the write up is that real time facial recognition systems are now in use in Queensland and Western Australia and under consideration in New South Wales.

The article points out:

Real-time facial recognition software can simply be added to existing cameras, so it is difficult to tell which CCTV cameras are using the technology and how many around the country might be in operation.

DarkCyber believes that this means real time facial recognition is going to be a feature update, not unlike getting a new swipe action with a mobile phone operating system upgrade.

The article does not identify the vendors providing these features, nor does it provide data about accuracy, costs, or the supporting infrastructure required.

What’s intriguing is that the article raises the thought that Australia might be on the information highway leading to a virtual location where Chinese methods are part of the equipment for living.

Will Australia become like China?

Odd comparison that. There’s the issue of population, the approach to governance, and the coastline to law enforcement ratio.

The write up also sidesteps the point that facial recognition is a subset of pattern recognition, statistical cross correlation, and essential plumbing for Ripper.

Who provides the smart software for that shark spotting drone? Give up? Maybe Amazon, the company not selling facial recognition to law enforcement in the US.

Interesting, right?

Stephen E Arnold, June 17, 2020


Rounding Error? Close Enough for Horse Shoes in Michigan

June 9, 2020

Ah, Michigan. River Rouge, the bridge to Canada, and fresh, sparkling water. These cheerful thoughts diminished when I read “Government’s Use of Algorithm Serves Up False Fraud Charges.”

The write up describes a smart system. The smart system was not as smart as some expected. The article states:

While the agency still hasn’t publicly released details about the algorithm, class actions lawsuits allege that the system searched unemployment datasets and used flawed assumptions to flag people for fraud, such as deferring to an employer who said an employee had quit — and was thus ineligible for benefits — when they were really laid off.
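The flawed inference the lawsuits describe can be sketched in a few lines. Everything here is hypothetical; the state has not released the actual algorithm, so field names and logic are guesses at the behavior the class actions allege:

```python
# Hypothetical sketch of the flawed rule the lawsuits allege: when the
# employer's stated separation reason conflicted with the claimant's, MiDAS
# reportedly deferred to the employer and flagged the claim as fraud.
# The actual algorithm has not been released; this is illustrative only.

def flag_for_fraud(claimant_reason: str, employer_reason: str) -> bool:
    """Flawed rule: any disagreement is resolved against the claimant."""
    return claimant_reason != employer_reason

# A laid-off worker whose employer reports "quit" gets flagged automatically,
# with no human investigator left in the loop to catch the error.
print(flag_for_fraud("laid_off", "quit"))      # → True
print(flag_for_fraud("laid_off", "laid_off"))  # → False
```

Note the missing step: nothing adjudicates which party is telling the truth. The humans who used to do that were laid off alongside the rollout.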

Where did the system originate? A D student in the University of Michigan’s Introduction to Algorithms class? No. The article reports:

The state’s unemployment agency hired three private companies to develop MiDAS, as well as additional software. The new system was intended to replace one that was 30 years old and to consolidate data and functions that were previously spread over several platforms, according to the agency’s 2013 self-nomination for an award with the National Association of State Chief Information Officers. The contract to build the system was for more than $47 million. At the same time as the update, the agency also laid off hundreds of employees who had previously investigated fraud claims.

Cathy O’Neil may want to update her 2016 “Weapons of Math Destruction.” Michigan has produced some casualties. What other little algorithmic surprises are yet to be discovered? Will online learning generate professionals who sidestep these types of mathiness? Sure.

Stephen E Arnold, June 9, 2020

Mathematica Not Available? Give Penrose a Whirl

June 7, 2020

If you want to visualize mathematical procedures, you can use any number of tools. Wolfram Mathematica is a go-to choice for some folks. However, Penrose, a new tool, is available. The system is described in “CMU’s ‘Penrose’ Turns Complex Math Notations Into Illustrative Diagrams.” The article reports:

The CMU team similarly designed Penrose to codify the best practices of mathematical illustrators in a way that is reusable and widely accessible. Ye says Penrose enables users to create diagrams by simply typing in mathematical expressions that describe relationships, whereupon “the tool automatically takes care of laying everything out.”

More information is available at this link.

Stephen E Arnold, June 7, 2020

Facial Recognition: A Partial List

June 3, 2020

DarkCyber noted “From RealPlayer to Toshiba, Tech Companies Cash in on the Facial Recognition Gold Rush.” The write up provides two interesting things and one idea which is like a truck tire retread.

First, the write up points out that facial recognition or FR is a “gold rush.” That’s a comparison which eluded the DarkCyber research team. There’s no land. No seller of heavy duty pants. No beautiful scenery. No wading in cold water. No hydro mining. Come to think of it, FR is not like a gold rush.

Second, the write up provides a partial list of outfits engaged in facial recognition. The word partial is important. There are some notable omissions, but 45 is an impressive number. That’s the point. Just 45?

The aspect of the write up the DarkCyber team ignored is this “from the MBA classroom” observation:

Despite hundreds of vendors currently selling facial recognition technology across the United States, there is no single government body registering the technology’s rollout, nor is there a public-facing list of such companies working with law enforcement. To document which companies are selling such technology today, the best resource the public has is a governmental agency called the National Institute of Standards and Technology.

Governments are doing a wonderful job it seems. Perhaps the European Union should step forward? What about Brazil? China? Russia? The United Nations? With Covid threats apparently declining, maybe the World Health Organization? Yep, governments.

Then, after calling for a central listing of FR vendors, this passage snagged one of my researchers’ attention:

NIST is a government organization responsible for setting scientific measurement standards and testing novel technology. As a public service, NIST also provides a rolling analysis of facial recognition algorithms, which evaluates the accuracy and speed of a vendor’s algorithms. Recently, that analysis has also included aspects of the facial recognition field like algorithmic bias based on race, age, and sex. NIST has previously found evidence of bias in a majority of algorithms studied.

Yep, NIST. The group has done an outstanding job for enterprise search. Plus the bias in algorithms has been documented and run through the math grinding wheel for many years. Put in snaps of bad actors and the FR system does indeed learn to match one digital watermark with a similar digital watermark. Run kindergarten snaps through the system and FR matches are essentially useless. Bias? Sure enough.

Consider these ideas:

  • An organization, maybe Medium, should build a database of FR companies
  • An organization, maybe Medium, should test each of the FR systems using available datasets or better yet building a training set
  • An organization, maybe Medium, should set up a separate public policy blog to track government organizations which are not doing the job to Medium’s standards.

There is an interest in facial recognition because there is a need to figure out who is who. There are some civil disturbances underway in a certain high profile country. FR systems may not be perfect, but they may offer a useful tool to some. On the other hand, why not abandon modern tools until they are perfect?

We live in an era of good enough, and that’s what is available.

Stephen E Arnold, June 3, 2020

Google and Its Hard-to-Believe Excuse of the Week

May 27, 2020

I taught for one or two years when I was in graduate school. Did I ever hear a student say, “My dog ate my homework”? I sure did. I heard other excuses as well; for example, “I was shot on Thanksgiving Day” (a true statement; the student showed me the bullet wound in his arm). I also heard, “I had to watch my baby sister, and she was sick, so I couldn’t do the homework.” True. As it turned out, the kid was an only child.

But I never heard, “The algorithm did it.”

Believe it or not, Engadget reported this: “YouTube Blames Bug for Censoring Comments on China’s Ruling Party.” I think Engadget should have written “about China’s” but these real journalists use Grammarly, like, you know.

The article states:

Whatever way the error made its way into YouTube, Google has been slow to address it.

For DarkCyber, the important point is that software, not a work-from-home or soon-to-be-RIFed human, made the error.

The Google plays the “the algorithm did it” card.

Despite Google’s wealth of smart software, the company’s voice technology has said nothing about the glitch.

Stephen E Arnold, May 27, 2020

Discrete Mathematics: Free and Useful

May 18, 2020

DarkCyber notes that Oscar Levin’s Discrete Mathematics: An Open Introduction can be downloaded from this link. The volume, now in its third edition, includes new exercises. Our favorite section addresses Graph Theory. There are exercises and old chestnuts like coloring a map. If you want to work on those policeware relationship maps, you will find the book useful. In fact, the book provides enough information to allow one to understand the basics of determining who knows whom and other interesting insights in data from Facebook-type aggregators.
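The “who knows whom” analysis the book equips one to perform amounts to basic graph operations. Here is a small sketch with an invented edge list standing in for data from a Facebook-type aggregator; names and connections are illustrative:

```python
# Sketch of a "who knows whom" query over a relationship graph. The edge
# list below is invented; a real one would come from an aggregator dump.

from collections import defaultdict

edges = [("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
         ("carol", "dave"), ("bob", "dave")]

# Build an undirected adjacency structure.
graph = defaultdict(set)
for a, b in edges:
    graph[a].add(b)
    graph[b].add(a)

def mutual_contacts(x, y):
    """People who know both x and y -- the likely brokers between them."""
    return sorted(graph[x] & graph[y])

print(mutual_contacts("alice", "dave"))  # → ['bob', 'carol']
```

Alice and Dave share no direct edge, yet the intersection of their neighbor sets reveals who can connect them. That is the kernel of a policeware relationship map.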

Stephen E Arnold, May 18, 2020

Bayesian Math: Useful Book Is Free for Personal Use

May 11, 2020

The third edition of Bayesian Data Analysis (updated on February 13, 2020) is available at this link. The authors are Andrew Gelman, John B. Carlin, Hal S. Stern, David B. Dunson, Aki Vehtari, and Donald B. Rubin. With the Bayes’ principles in hand, making sense of some of the modern smart systems becomes somewhat easier. The book covers the basics and advanced computation. One of the more interesting sections is Part V: Nonlinear and Nonparametric Models. You may want to add this to your library.
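A small worked example shows the flavor of the Bayes’ rule reasoning the book builds on: how probable is a hypothesis once a smart system raises an alert? The rates below are illustrative, not drawn from the book:

```python
# Worked Bayes' rule example: posterior probability that an alert from a
# smart system is a true hit. All rates below are illustrative.

def posterior(prior: float, sensitivity: float, false_positive_rate: float) -> float:
    """P(hypothesis | alert) via Bayes' rule."""
    evidence = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / evidence

# A system that catches 99% of true cases and false-alarms only 1% of the
# time, applied to a population where just 1 in 1,000 cases is real:
p = posterior(prior=0.001, sensitivity=0.99, false_positive_rate=0.01)
print(round(p, 3))  # → 0.09
```

Even an impressively accurate system yields alerts that are wrong more than 90 percent of the time when the base rate is low. That arithmetic is why base rates matter when evaluating the smart systems discussed elsewhere in this blog.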

Stephen E Arnold, May 11, 2020

LAPD Shutters Predictive Policing During Shutdown

May 7, 2020

Police departments are not immune to the economic impact of this pandemic. We learn the Los Angeles Police Department is shutting down its predictive policing program, at least for now, in TechDirt’s write-up, “LAPD’s Failed Predictive Policing Program the Latest COVID-19 Victim.” Writer Tim Cushing makes it perfectly clear he has never been a fan of the analytics approach to law enforcement:

“For the most part, predictive policing relies on garbage data generated by garbage cops, turning years of biased policing into ‘actionable intel’ by laundering it through a bunch of proprietary algorithms. More than half a decade ago, early-ish adopters were expressing skepticism about the tech’s ability to suss out the next crime wave. For millions of dollars less, average cops could have pointed out hot crime spots on a map based on where they’d made arrests, while still coming nothing close to the reasonable suspicion needed to declare nearly everyone in a high crime area a criminal suspect. The Los Angeles Police Department’s history with the tech seems to indicate it should have dumped it years ago. The department has been using some form of the tech since 2007, but all it seems to be able to do is waste limited law enforcement resources to violate the rights of Los Angeles residents. The only explanations for the LAPD’s continued use of this failed experiment are the sunk cost fallacy and its occasional use as a scapegoat for the department’s biased policing.”

Now, though, an April 15 memo from the LAPD declares the department is ceasing to use the PredPol software immediately due to COVID-19 related financial constraints. As one might suppose, Cushing hopes the software will remain off the table once the shutdown is lifted. Hey, anything is possible.

Cynthia Murrell, May 7, 2020

Google Recommendations: A Digital Jail Cell?

May 5, 2020

A team of researchers at the Centre Marc Bloch in Berlin have closely studied filter bubbles (scientifically called “confinement”) on YouTube. While the phenomenon of filter bubbles across the Web has been a topic of study for several years, scientists Camille Roth, Antoine Mazieres, and Telmo Menezes felt the role of the recommendation algorithm on YouTube had been under-examined. In performing research to plug this gap, they found the dominant video site may produce the most confining bubbles of all. The team shares their main results in “Tubes and Bubbles: Topological Confinement of Recommendations on YouTube.” They summarize:

“Contrarily to popular belief about so-called ‘filter bubbles’, several recent studies show that recommendation algorithms generally do not contribute much, if at all, to user confinement; in some cases, they even seem to increase serendipity [see e.g., 1, 2, 3, 4, 5, 6]. Our study demonstrates however that this may not be the case on YouTube: be it in topological, topical or temporal terms, we show that the landscape defined by non-personalized YouTube recommendations is generally likely to confine users in homogeneous clusters of videos. Besides, content for which confinement appears to be most significant also happens to garner the highest audience and thus plausibly viewing time.”

The abstract to the team’s paper on the study describes their approach:

“Starting from a diverse number of seed videos, we first describe the properties of the sets of suggested videos in order to design a sound exploration protocol able to capture latent recommendation graphs recursively induced by these suggestions. These graphs form the background of potential user navigations along non-personalized recommendations. From there, be it in topological, topical or temporal terms, we show that the landscape of what we call mean-field YouTube recommendations is often prone to confinement dynamics.”
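A toy version of that exploration protocol makes the idea concrete: start from a seed video, recursively follow the non-personalized recommendations, and see which videos are reachable. The graph below is invented for illustration; in a confined landscape the reachable set never escapes the seed’s topical cluster:

```python
# Toy sketch of the paper's exploration protocol: recursively expand the
# recommendation graph from a seed and inspect the reachable set. The graph
# below is invented; real data would come from scraped suggestion lists.

recommendations = {
    "cat_1": ["cat_2", "cat_3"], "cat_2": ["cat_1", "cat_3"],
    "cat_3": ["cat_2"],
    "news_1": ["news_2"], "news_2": ["news_1"],
}

def reachable(seed):
    """Breadth-first expansion of the recommendation graph from a seed."""
    seen, frontier = {seed}, [seed]
    while frontier:
        video = frontier.pop()
        for nxt in recommendations.get(video, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

# Starting from a cat video, the recommendations never leave the cat cluster.
print(sorted(reachable("cat_1")))  # → ['cat_1', 'cat_2', 'cat_3']
```

When the reachable set from any seed stays inside one homogeneous cluster, the landscape is confining in exactly the topological sense the authors measure.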

To read about the study in great, scientific detail, complete with illustrations, turn to the full paper published at the PLOS ONE peer-reviewed journal site. Established in 2012, the Centre Marc Bloch’s Computational Social Science Team enlists social scientists alongside computer scientists and modelers to study the social dynamics of today’s digital landscapes. If you are curious what that means, exactly, their page includes an interesting five-minute video describing their work.

Cynthia Murrell, May 5, 2020
