AI's Newest Hurdle Happens When the Machines Hallucinate
November 27, 2017
Artificial Intelligence has long been pitched as an answer to airport security and similar problems. The idea of intelligent machines spotting the bad guys is a good one in theory. But what if the machines aren't as clever as we think? A stunning new article in The Verge, “Google’s AI Thinks This Turtle is a Gun and That’s a Problem,” made us sit up and take notice.
As you can guess from the title, Google’s AI made a huge flub recently:
This 3D-printed turtle is an example of what’s known as an “adversarial image.” In the AI world, these are pictures engineered to trick machine vision software, incorporating special patterns that make AI systems flip out. Think of them as optical illusions for computers. You can make adversarial glasses that trick facial recognition systems into thinking you’re someone else, or can apply an adversarial pattern to a picture as a layer of near-invisible static. Humans won’t spot the difference, but to an AI it means that panda has suddenly turned into a pickup truck.
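The core trick described above can be made concrete with a toy sketch. The snippet below is an illustration only, not the attack used against Google's classifier: it applies the fast-gradient-sign idea (nudging every input value a small step in the direction that most changes the model's score) to a made-up linear "classifier." The class names, the weight vector, and the epsilon value are all assumptions for the demo.

```python
import numpy as np

# Toy linear "classifier": a positive score means "turtle", negative means "rifle".
# Real attacks target deep networks, but the gradient-sign idea is the same.
rng = np.random.default_rng(0)
w = rng.normal(size=64)            # hypothetical model weights
x = w / np.linalg.norm(w)          # an input the model confidently calls "turtle"

def predict(v):
    return "turtle" if w @ v > 0 else "rifle"

# Gradient-sign perturbation: for this linear score, the gradient with
# respect to the input is just w, so we step against sign(w) to push the
# score toward "rifle". Each individual value changes by at most epsilon.
epsilon = 0.25
x_adv = x - epsilon * np.sign(w)

print(predict(x))      # turtle
print(predict(x_adv))  # rifle -- a small, structured nudge flips the label
```

The unsettling part is that each component of the input moved by only a fixed, small amount; to a human looking at an image perturbed this way, nothing appears to have changed.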
This adversarial image news is especially concerning when you consider how quickly airports are adopting this technology. Dubai International Airport is already using self-driving carts for luggage. It’s only a matter of time until security screening goes the same way. You’d best hope the adversarial image problem is ironed out before then.
Patrick Roland, November 27, 2017