Gender Bias in Voice Recognition Software

February 21, 2017

A recent study seems to confirm what some have suspected: “Research Shows Gender Bias in Google’s Voice Recognition,” reports the Daily Dot. Not that this is anything new. Writer Selena Larson reminds us that voice recognition tech has a history of understanding men better than women, from a medical tracking system to voice-operated cars. She cites a recent study by linguistics researcher Rachael Tatman, who found that YouTube’s auto captions performed better on male voices than on female ones by about 13 percent—no small discrepancy. (YouTube is owned by Google.)

Though no one is accusing the tech industry of purposely making voice recognition work worse for women, developers probably could have avoided this problem with some forethought. The article explains:

“Language varies in systematic ways depending on how you’re talking,” Tatman said in an interview. Differences could be based on gender, dialect, and other geographic and physical attributes that factor into how our voices sound. To train speech recognition software, developers use large datasets, either recorded on their own or provided by other linguistic researchers. And sometimes, these datasets don’t include diverse speakers.

Tatman recommends a purposeful and organized approach to remedying the situation. Larson continues:

Tatman said the best first step to address issues in voice tech bias would be to build training sets that are stratified. Equal numbers of genders, different races, socioeconomic statuses, and dialects should be included, she said.
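To make the idea of a stratified training set concrete, here is a minimal sketch of how one might draw equal-sized samples from each demographic group in a corpus of speaker records. The `records` structure, the `gender` field, and the `stratified_sample` helper are all hypothetical illustrations, not part of any actual speech toolkit.

```python
import random
from collections import defaultdict

def stratified_sample(records, key, per_group, seed=0):
    """Draw the same number of records from each group defined by `key`."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key]].append(rec)

    rng = random.Random(seed)  # fixed seed for reproducibility
    sample = []
    for name, items in groups.items():
        if len(items) < per_group:
            raise ValueError(f"group {name!r} has only {len(items)} records")
        sample.extend(rng.sample(items, per_group))
    return sample

# Toy corpus: 30 female-voiced clips and 50 male-voiced clips.
corpus = [{"clip": i, "gender": g}
          for i, g in enumerate(["f"] * 30 + ["m"] * 50)]

# A balanced subset: 20 clips from each gender, regardless of the
# imbalance in the raw corpus.
balanced = stratified_sample(corpus, "gender", per_group=20)
```

The same pattern extends to dialect, race, or socioeconomic labels: stratify on whichever attribute the raw data underrepresents, so no group dominates training.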

Automated technology is developed by humans, so our human biases can seep into the software and tools we are creating, supposedly to make our lives easier. But when systems fail to account for human bias, the results can be unfair and potentially harmful to groups underrepresented in the field in which these systems are built.

Indeed, that’s the way bias works most of the time—it is more often the result of neglect than of malice. Avoiding it requires recognizing there may be a problem in the first place and working to prevent it from the outset. I wonder what other technologies could benefit from that understanding.

Cynthia Murrell, February 21, 2017

