Have You Heard the AI Joke About…? Yeah, Over and Over Again

June 23, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Developers have been unable to program one key facet of human intelligence into AI: a sense of humor. Oh, ChatGPT has jokes, but its repertoire is limited. And when asked to explain why something is or is not funny, it demonstrates it just doesn’t get it. Ars Technica informs us, “Researchers Discover that ChatGPT Prefers Repeating 25 Jokes Over and Over.”


A young person in the audience says to the standup comedian: “Hey, dude. Your jokes suck. Did an AI write them for you?” Despite my efforts to show the comedian being bombarded with apple cores, bananas, and tomatoes, this sanitized image was the only output I could get. It’s great, right? Thanks, MidJourney.

Reporter Benj Edwards writes:

“Two German researchers, Sophie Jentzsch and Kristian Kersting, released a paper that examines the ability of OpenAI’s ChatGPT-3.5 to understand and generate humor. In particular, they discovered that ChatGPT’s knowledge of jokes is fairly limited: During a test run, 90 percent of 1,008 generations were the same 25 jokes, leading them to conclude that the responses were likely learned and memorized during the AI model’s training rather than being newly generated.”
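The repetition finding comes down to a simple frequency count over sampled outputs. As a rough sketch (this is not the researchers’ actual code, and the function name and toy sample are invented for illustration), one could measure how much of a sample the 25 most common responses cover:

```python
from collections import Counter

def repetition_stats(generations, top_n=25):
    """Return the fraction of a sample covered by the top_n most
    frequent responses. The paper reports this kind of figure:
    the same 25 jokes accounted for 90 percent of 1,008 outputs."""
    counts = Counter(g.strip().lower() for g in generations)
    covered = sum(n for _, n in counts.most_common(top_n))
    return covered / len(generations)

# Toy sample: one heavily repeated "joke" plus a one-off.
sample = ["Why did the chicken cross the road?"] * 9 + ["a unique joke"]
print(repetition_stats(sample))
```

With only two distinct responses, the top 25 cover the whole toy sample, so the function returns 1.0; on the researchers’ real sample it would return roughly 0.9.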

See the article, if curious, for the algorithm’s top 10 dad jokes and their frequencies within the 1,008 joke sample. There were a few unique jokes in the sample, but the AI seems to have created them by combining elements of others. And often, those mashups were pure nonsense. We learn:

“The researchers found that the language model’s original creations didn’t always make sense, such as, ‘Why did the man put his money in the blender? He wanted to make time fly.’ When asked to explain each of the 25 most frequent jokes, ChatGPT mostly provided valid explanations according to the researchers’ methodology, indicating an ‘understanding’ of stylistic elements such as wordplay and double meanings. However, it struggled with sequences that didn’t fit into learned patterns and couldn’t tell when a joke wasn’t funny. Instead, it would make up fictional yet plausible-sounding explanations.”

Plausible-sounding, perhaps, but gibberish nonetheless. See the write-up for an example. ChatGPT simply does not understand what it means for something to be funny. Humor, after all, is a quintessentially human characteristic. Algorithms may get better at mimicking it, but we must never lose sight of the fact that AI is software, incapable of amusement or any other emotion. If we begin thinking of AI as human, we are in danger of forgetting the very real limits of machine learning as a lens on the world.

Cynthia Murrell, June 23, 2023

