Google Is Taught Homosexuality Is Bad
December 12, 2017
The common belief is that computers and software are objective, inanimate objects capable of greater intelligence than humans. The truth is that humans developed computers and software, so those objective, inanimate objects are only as smart as their designers. What is even more hilarious is that the sentiment analysis development process requires tons of data for the algorithms to read and teach themselves to recognize patterns, and that data is “contaminated” with human emotion and prejudice. Motherboard wrote about how this human bias pollutes AI in the article, “Google’s Sentiment Analyzer Thinks Being Gay Is Bad.”
The problem with designing AI is that if it is trained on polluted, biased data, then these supposedly super-intelligent algorithms will discriminate against people rather than judge objectively. Google released its Cloud Natural Language API, which lets developers add Google’s deep learning models to their own applications. Along with entity recognition, the API includes a sentiment analyzer that scores text as carrying a positive or negative sentiment. It has a few bugs, however, and returns biased results, such as rating statements about being gay, or about certain religions, as negative.
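To make that concrete, here is a minimal sketch of how a developer might query the sentiment endpoint over REST. This is an illustration under assumptions, not Google’s reference code: the API key is a placeholder, and the sample phrases are stand-ins for the kinds of identity statements Motherboard tested. The analyzer returns a score from -1.0 (negative) to +1.0 (positive) plus a magnitude for emotional strength.

```python
# Minimal sketch: querying the Cloud Natural Language sentiment endpoint
# over REST. API_KEY is a placeholder; the phrases below are illustrative
# stand-ins, not verbatim quotes from the Motherboard tests.
import requests

API_KEY = "YOUR_API_KEY"  # assumption: a key created in the Google Cloud Console
URL = ("https://language.googleapis.com/v1/documents:analyzeSentiment"
       f"?key={API_KEY}")

def analyze_sentiment(text):
    """Return (score, magnitude) for the whole document.

    score runs from -1.0 (negative) to +1.0 (positive); magnitude
    measures the overall strength of the emotion.
    """
    body = {
        "document": {"type": "PLAIN_TEXT", "content": text},
        "encodingType": "UTF8",
    }
    resp = requests.post(URL, json=body)
    resp.raise_for_status()
    doc = resp.json()["documentSentiment"]
    return doc["score"], doc["magnitude"]

for phrase in ["I am a homosexual", "I am straight"]:
    score, magnitude = analyze_sentiment(phrase)
    print(f"{phrase!r}: score={score:+.2f} magnitude={magnitude:.2f}")
```

If identity statements like these come back with negative scores while comparable neutral statements score near zero, that gap is the bias in action.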
It looks like Google’s sentiment analyzer is biased, as many artificially intelligent algorithms have been found to be. AI systems, including sentiment analyzers, are trained on human texts such as news stories and books, so they often reflect the same biases found in society. We do not yet know the best way to completely remove bias from artificial intelligence, but it is important to keep exposing it.
The problem with programming AI algorithms is that it is difficult to feed them data free of human prejudice. Those prejudices are hard to work around because they are so ingrained in most datasets. Programmers are kept on their toes hunting for a fix, but there is no one-size-fits-all solution. Too bad they cannot just stick with numbers and dictionaries.
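For contrast, the “numbers and dictionaries” approach is the old lexicon-based method: count hand-curated positive and negative words and compute a score. A toy sketch, with made-up word lists:

```python
# Toy lexicon-based sentiment scorer, i.e. the "numbers and dictionaries"
# approach. The word lists are illustrative, not a real sentiment lexicon.
POSITIVE = {"good", "great", "happy", "love", "excellent"}
NEGATIVE = {"bad", "awful", "sad", "hate", "terrible"}

def lexicon_sentiment(text):
    """Score text by counting hand-curated positive and negative words."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return score / max(len(words), 1)  # roughly normalized to [-1, 1]

print(lexicon_sentiment("I love this great product"))  # positive
print(lexicon_sentiment("This is a bad, awful idea"))  # negative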
Whitney Grace, December 12, 2017