Computers Learn Discrimination from Their Programmers

September 14, 2015

One of the greatest lessons one can learn from the Broadway classic South Pacific is that children are not born racist; rather, they learn racism from their parents and other adults.  Computers are supposed to be infallible, objective machines, but according to Gizmodo’s article, “Computer Programs Can Be As Biased As Humans,” that assumption is wrong.  In this case, the computers are the “children,” and they pick up discriminatory behavior from their programmers.

As an example, the article explains how companies use job application software to sift through prospective employees’ resumes.  Algorithms search for keywords related to experience and skills, with the goal of remaining unbiased with respect to sex and ethnicity.  The same algorithms can also screen out resumes that contain certain phrases or other information.
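To make the idea concrete, here is a minimal sketch in Python of that kind of keyword screen.  The skill keywords, disqualifying phrase, and match threshold are all hypothetical, invented for illustration rather than drawn from any real hiring product.

# A minimal sketch of keyword-based resume screening.
# The keyword lists and threshold below are hypothetical,
# not details from the article or any real product.

REQUIRED_SKILLS = {"python", "sql", "data analysis"}   # hypothetical keywords
DISQUALIFIERS = {"no work authorization"}              # hypothetical phrase

def screen_resume(text: str, min_matches: int = 2) -> bool:
    """Return True if the resume passes the keyword screen."""
    lowered = text.lower()
    # Reject outright if any disqualifying phrase appears.
    if any(phrase in lowered for phrase in DISQUALIFIERS):
        return False
    # Otherwise, count how many required skills the resume mentions.
    matches = sum(1 for skill in REQUIRED_SKILLS if skill in lowered)
    return matches >= min_matches

resume = "Five years of Python and SQL experience in data analysis."
print(screen_resume(resume))  # True: all three keywords match

The screen itself looks neutral, which is exactly why the biases the article describes are easy to miss: nothing in the code mentions sex or ethnicity, yet the choice of keywords and phrases can still correlate with them.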

“Recently, there’s been discussion of whether these selection algorithms might be learning how to be biased. Many of the programs used to screen job applications are what computer scientists call machine-learning algorithms, which are good at detecting and learning patterns of behavior. Amazon uses machine-learning algorithms to learn your shopping habits and recommend products; Netflix uses them, too.”

The machine-learning algorithms are mimicking the discriminatory habits of the humans who trained them.  To catch these computer-generated biases, other machine-learning algorithms are being deployed to keep the first set in check.  Another option is to reload the training data in a different manner so the algorithms do not fall into the old habits.  From a practical standpoint this makes sense: if something does not work the first few times, change the way it is done.
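The article does not spell out how these watchdog algorithms work.  One common auditing technique, sketched here purely as an assumption, is a disparate-impact check that compares selection rates across demographic groups; the 0.8 cutoff below is the “four-fifths rule” from U.S. employment guidelines, not a detail taken from the article.

from collections import defaultdict

# A sketch of one way a second program could audit the first:
# compare selection rates across groups and flag disparate impact.
# Group labels and the 0.8 threshold are illustrative assumptions.

def selection_rates(decisions):
    """decisions: list of (group, was_selected) pairs."""
    selected = defaultdict(int)
    total = defaultdict(int)
    for group, was_selected in decisions:
        total[group] += 1
        selected[group] += int(was_selected)
    return {g: selected[g] / total[g] for g in total}

def passes_four_fifths(decisions, threshold=0.8):
    """Flag bias if any group's rate falls below 80% of the best rate."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return all(rate >= threshold * best for rate in rates.values())

decisions = [("A", True), ("A", True), ("A", False),
             ("B", True), ("B", False), ("B", False)]
print(selection_rates(decisions))     # A ~0.67, B ~0.33
print(passes_four_fifths(decisions))  # False: B is below 0.8 of A's rate

A check like this only measures outcomes; fixing a flagged model still requires retraining it, which is where the article’s second remedy of presenting the data differently comes in.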

Whitney Grace, September 14, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
