Neural Network Image Analysis: A Resource for Comedy Central

September 18, 2015

I read “This Neural Network’s Hilariously Bad Image Descriptions Are Still Advanced AI.” Describing a parrot as a cat sitting on a commode struck me as clever.

The write up reminded me:

Samim [a narrative engineer, whatever that is] recently asked a neural net to caption a series of pop culture videos and clips from movies to illustrate the huge variance in how accurate these algorithms are, producing some amazingly stupid and funny machine-written descriptions of Kanye West, Luke Skywalker, and even Big Dog.

I learned:

His most recent computational comedy project popped up on his blog last week. In it, he set up an experiment that tested how well neural networks could caption videos from pop culture. He used an open source model developed by Google and Stanford called NeuralTalk, which looks at an image and describes it with a brief caption.

The result is digital humor.
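For the curious, the image in, caption out pattern Samim used is easy to approximate. The sketch below is my own illustration, not his setup: it leans on the Hugging Face Transformers image-to-text pipeline and a ViT-GPT2 captioning model as stand-ins for NeuralTalk, and the frame file name is hypothetical.

# Minimal sketch of NeuralTalk-style captioning (stand-in tooling, not Samim's code)
from transformers import pipeline

# Load a pretrained image-captioning model (assumed stand-in for NeuralTalk)
captioner = pipeline("image-to-text", model="nlpconnect/vit-gpt2-image-captioning")

# Caption a single frame pulled from a video clip (hypothetical file name)
result = captioner("video_frame.jpg")
print(result[0]["generated_text"])

Run against frames sampled from a clip, the same call produces the sort of captions Samim strung together into his videos.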

One quick question: What if smart software focused on curing cancer, like Watson, or reversing a genetic disorder, as Alphabet wants to do, decides to do stand-up instead of filling out required forms? Subrogation may not have much of a funny bone.

Stephen E Arnold, September 18, 2015
