Algorithms: Bias and Shoes for the Shoemaker

July 13, 2015

I read the Slashdot item “CSTA: Google Surveying Educators on Unconscious Biases of Students, Parents.” Interesting, but I think there are also biases in curricula, textbooks, and instructors. My hunch is that some schools have biases baked into the walls, like the odor of a grade school hall at 10 am on a snowy day in November.

I thought of a shoemaker whose children had lousy shoes. Did the family focus its attention on meeting the needs of customers? Did the family simply forget that the children’s shoes might help sell more pairs if they were collectible sneakers with the image of a basketball star on them?

I thought about this item from the gray lady: “When Algorithms Discriminate.” You may have to pay for this gem. Don’t hassle me if the link goes dead. Collar a New York Times executive and express your opinion about disappeared content, paywalls, and the weird blend of blog and “real” content. I don’t care.

The gray lady’s write-up points out that

Google’s online advertising system, for instance, showed an ad for high-income jobs to men much more often than it showed the ad to women, a new study by Carnegie Mellon University researchers found.

So the idea is that Google’s algorithms discriminate because humans wrote the code?
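For the curious, here is a minimal sketch of the arithmetic behind that kind of finding: a two-proportion z-test on made-up impression counts. The numbers are hypothetical, and this is not the Carnegie Mellon team’s method (they ran automated browsing experiments); it only illustrates how a difference in ad-serving rates gets flagged as more than chance.

# A minimal sketch, not the CMU methodology. All counts are hypothetical.
from statistics import NormalDist

# Hypothetical impression counts: how many simulated "male" and "female"
# browsing profiles were shown the high-income job ad.
shown_m, total_m = 1852, 10000   # assumed figures, for illustration only
shown_f, total_f = 318, 10000

p_m = shown_m / total_m
p_f = shown_f / total_f

# Two-proportion z-test: is the gap in serving rates larger than
# random variation would explain?
p_pool = (shown_m + shown_f) / (total_m + total_f)
se = (p_pool * (1 - p_pool) * (1 / total_m + 1 / total_f)) ** 0.5
z = (p_m - p_f) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided

print(f"rate(men)={p_m:.3f}, rate(women)={p_f:.3f}, z={z:.1f}, p={p_value:.2g}")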

Will Google (the shoemaker) turn its attention to its children’s shoes?

In my forthcoming Information Today column, “The Bing Summer Fling,” I note that Bing also fiddles with search results.

I know search systems reflect the humans who build them. Perhaps Microsoft and Google can cooperate to determine how much discrimination surfaces in their next-generation, state-of-the-art, smart, objective, super duper systems?

My hunch is that the financial stakes may make such introspection unpopular. That’s why it is far safer to research students and parents. Who wants to look closely at his or her own shoes?

Stephen E Arnold, July 13, 2015

