Amazon and Google Voice Recognition Easily Fooled

January 25, 2018

Voice recognition technology has vastly improved over the past decade, but it still has a long way to go before it responds like a quick-thinking science-fiction computer.  CNET shares how funny and harmful voice recognition technology can be in the article, “Fooling Amazon and Google’s Voice Recognition Isn’t Hard.”  What exactly is the problem with voice recognition technology?  If someone sounds like you, smart speakers like Google Home or Amazon Echo with Alexa will allow that person to use your credit cards and access your personal information.

The smart speakers can be trained to recognize voices so that they respond to each individual.  For example, families can program the smart speakers to recognize individual members so each person can access his or her own information.  It is quite easy, however, to fool Alexa’s and Google’s voice recognition: purchases can be made vocally and personal information can be exposed.  There are precautions to take, such as disabling voice purchasing, and there are features to turn off the broadcasting of your personal information.

In its defense, Google said voice recognition should not be used as a security feature:

Google warns you when you first set up voice recognition that a similar voice might be able to access your info. In response to this story, Kara Stockton on the Google Assistant team offered the following statement over email: ‘Users shouldn’t rely upon Voice Match as a security feature. It is possible for a user to not be identified, or for a guest to be identified as a connected user. Those cases are rare, but they do exist and we’re continuing to work on making the product better.’

Maybe silence is golden after all.  It keeps credit cards and purchases safe from voice-based theft.

Whitney Grace, January 25, 2018
