Algorithms Are Objective As Long As You Write Them
September 8, 2015
I read “Big Data’s Neutral Algorithms Could Discriminate against Most Vulnerable.” Ridiculous. Objective procedures cannot discriminate. The numerical recipes do what they do.
Ah, but when a human weaves together methods and look-up tables, sets thresholds, and applies Bayesian judgments, well, maybe a little bit of bias can be baked in.
The write up reports:
So how will the courts address algorithmic bias? From retail to real estate, from employment to criminal justice, the use of data mining, scoring software and predictive analytics programs is proliferating at an exponential rate. Software that makes decisions based on data like a person’s ZIP code can reflect, or even amplify, the results of historical or institutional discrimination. “[A]n algorithm is only as good as the data it works with,” Solon Barocas and Andrew Selbst write in their article “Big Data’s Disparate Impact,” forthcoming in the California Law Review. “Even in situations where data miners are extremely careful, they can still effect discriminatory results with models that, quite unintentionally, pick out proxy variables for protected classes.”
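The “proxy variables” point is easy to see in a toy sketch. Below is a hypothetical example (made-up ZIP codes, made-up default rates, synthetic applicants) of a scoring rule that never looks at group membership yet still produces wildly different approval rates, because ZIP code stands in for the protected class:

```python
import random

random.seed(0)

# Hypothetical setup: two neighborhoods whose residents mostly belong to
# different groups. The historical data from one neighborhood is worse
# because of past discrimination, not individual merit.
def make_applicant():
    zip_code = random.choice(["10001", "20002"])
    # Group membership correlates strongly with ZIP -- the proxy link.
    group = "A" if (zip_code == "10001") == (random.random() < 0.9) else "B"
    return {"zip": zip_code, "group": group}

applicants = [make_applicant() for _ in range(10_000)]

# A "neutral" rule: approve anyone from a ZIP whose historical default
# rate is under 10 percent. Group is never an input. (Rates are invented
# for illustration.)
historical_default_rate = {"10001": 0.05, "20002": 0.30}

def approve(applicant):
    return historical_default_rate[applicant["zip"]] < 0.10

def approval_rate(group):
    members = [a for a in applicants if a["group"] == group]
    return sum(approve(a) for a in members) / len(members)

print(f"Group A approval rate: {approval_rate('A'):.0%}")
print(f"Group B approval rate: {approval_rate('B'):.0%}")
```

Run it and group A gets approved around nine times out of ten while group B almost never does, even though the code contains no reference to group at all. Objective, right?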
And I liked this follow-on:
It’s troubling enough when Flickr’s auto-tagging of online photos labels pictures of black men as “animal” or “ape,” or when researchers determine that Google search results for black-sounding names are more likely to be accompanied by ads about criminal activity than search results for white-sounding names. But what about when big data is used to determine a person’s credit score, ability to get hired, or even the length of a prison sentence?
Shift gears. Navigate to “Microsoft Is Trying to Stop Users from Downloading Chrome or Firefox.” Objective, right?
Two thoughts. The math-oriented legal eagles will sort this out. Lawyers are really good at math. Also, write your own algorithm and tune it to deliver what you want. No bias there. You are expressing your inner self.
It’s just a process and billable.
Stephen E Arnold, September 8, 2015