False Positives: The New Normal

January 1, 2019

And this is why so many people are wary of handing too much power to algorithms. TechDirt reports, “School Security Software Decided Innocent Parent Is Actually a Registered Sex Offender.” It seems a bit of common sense on the part of the humans involved would have prevented the unwarranted humiliation. The false match occurred at an Aurora, Colorado, middle school event, where parent Larry Mitchell presumably just wanted to support his son. When office staff scanned his license, however, the Raptor system flagged him as a potential offender. Reporter Tim Cushing writes:

“Not only did these stats [exact name and date of birth] not match, but the photos of registered sex offenders with the same name looked nothing like Larry Mitchell. The journalists covering the story ran Mitchell’s info through the same databases — including Mitchell’s birth name (he was adopted) — and found zero matches. What it did find was a 62-year-old white sex offender who also sported the alias ‘Jesus Christ,’ and a black man roughly the same age as Mitchell, who is white. School administration has little to say about this botched security effort, other than that policies and protocols were followed. But if so, school personnel need better training… or maybe at least an eye check. Raptor, which provides the security system used to misidentify Mitchell, says photo-matching is a key step in the vetting process….”

Cushing also notes:

“Even if you move past the glaring mismatch in photos (the photos returned in the Sentinel’s search of Raptor’s system are embedded in the article), neither the school nor Raptor can explain how Raptor’s system returned results that can’t be duplicated by journalists.”

This looks like a mobile version of the PEBCAK error (problem exists between chair and keyboard), and such mistakes will only increase as these verification systems continue to be implemented at schools and other facilities across the country. Cushing rightly describes the problem as “an indictment of the security-over-sanity thinking.” Raptor, a private company, is happy to tout its great success at keeping registered offenders out of schools, but it does not reveal how often its false positives have ruined an innocent family’s evening, or worse. How much control is our society willing to hand over to AIs (and those who program them)?

Cynthia Murrell, January 1, 2019
