ChatGPT Use Goes Up, But Its Election Info Is Not Trusted

April 19, 2024

ChatGPT was released more than a year ago, and Americans’ usage of the generative content engine continues to increase. The Pew Research Center found that 23% of American adults have used ChatGPT, up from 18% in July 2023. While the number of people using ChatGPT continues to rise, many users are skeptical about the information it shares, particularly information related to elections. The Pew Research Center posted a press release about this topic: “Americans’ Use of ChatGPT Is Ticking Up, But Few Trust Its Election Information.”

The Pew Research Center conducted a survey in February 2024 about how Americans use ChatGPT, such as for fun, learning, or workplace tasks. Respondents said they use the AI chatbot for these activities, but they are wary about trusting any information it spits out about the 2024 US presidential election. About four in ten adults have not too much or no trust in ChatGPT for accurate election information. Only 2% have a great deal or quite a bit of trust in the chatbot.

Pew found that younger adults (those under thirty years old) are the most likely to use ChatGPT: 43% have done so, a ten-point increase from 2023. Other age groups are using the chatbot more too, but the younger crowd remains the largest. Americans with more education are also more likely to use ChatGPT; 37% of those with postgraduate or other advanced degrees have used it.

It’s also interesting to see how Americans are using ChatGPT: for entertainment, learning, or work.

“The share of employed Americans who have used ChatGPT on the job increased from 8% in March 2023 to 20% in February 2024, including an 8-point increase since July. Turning to U.S. adults overall, about one-in-five have used ChatGPT to learn something new (17%) or for entertainment (17%). These shares have increased from about one-in-ten in March 2023. Use of ChatGPT for work, learning or entertainment has largely risen across age groups over the past year. Still, there are striking differences between these groups (those 18 to 29, 30 to 49, and 50 and older).”

When it comes to the 2024 election, about four in ten Americans (38%) don’t trust ChatGPT’s information: 18% have not too much trust and 20% have no trust at all. At the other extreme, 2% have a great deal or quite a bit of trust, while 10% of Americans have some trust. The remaining groups are the 15% of Americans who aren’t sure whether they should trust ChatGPT and the 34% who have never heard of the chatbot. Distrust crosses party lines: about four in ten Republicans and four in ten Democrats don’t trust ChatGPT. It’s also noteworthy that very few people have turned to ChatGPT for election information.

Tech companies have pledged to prevent AI from being misused, but talk is cheap. Chatbots and big tech are programmed to return information that keeps users’ eyes glued to screens, in the same vein as clickbait. Information does need to be curated, verified, and controlled to prevent misinformation. However, that draws a fine line between freedom of speech and suppression of information.

Whitney Grace, April 19, 2024
