Machine Learning Search Algorithms Reflect Stereotypes About Women
August 26, 2016
The article on MediaPost titled Are Machine Learning Search Algorithms To Blame for Stereotypes? poses a somewhat misleading question about the role of search engines such as Google and Bing in prejudice and bias. Ultimately, the algorithms are not the root of the problem, but rather a reflection of their creators. Looking at the images returned when searching for “beautiful” and “ugly” women, researchers found the following.
“In the United States, searches for “beautiful” women return pictures that are 80% white, mostly between the ages of 19 and 28. Searches for “ugly” women return images of those about 60% white and 20% black between the ages of 30 to 50. Researchers admit they are not sure of the reason for the bias, but conclude that they may stem from a combination of available stock photos and characteristics of the indexing and ranking algorithms of the search engines.”
While it might be appealing to think that machine learning search algorithms have somehow magically fallen in line with human stereotypes, they are simply regurgitating the bias in their training data. Alternatively, perhaps they learn prejudice from the humans who select and tune the algorithms. Either way, it is an unfortunate record of the harmful attitudes and racial bias of our time.
Chelsea Kerwin, August 26, 2016