
AI Could Reduce People’s Beliefs in Conspiracy Theories, Study Suggests

September 17, 2024

AI chatbots could be effective at eroding people's belief in conspiracy theories, a new study suggests, after researchers used ChatGPT to counter those theories with fact-checked information.

The conventional understanding of why people believe in conspiracy theories holds that once someone goes down the rabbit hole—be it ancient aliens, the Illuminati, or the theory that Princess Diana was murdered—it’s almost impossible to convince them otherwise, even when confronted with compelling evidence to the contrary.

Researchers from MIT, Cornell, and American University say they have come up with a solution involving an AI chatbot. Across two experiments, the researchers had ChatGPT (running on the GPT-4 Turbo model) interact with more than 2,000 Americans about a conspiracy theory they believed in.

Within three rounds of conversation with the chatbot, participants’ belief in their chosen conspiracy theory was reduced by 20 percent on average. Following up two months later, they found that the participants hadn’t reverted to their previous strongly held beliefs.

The researchers observed even broader effects: "The debunking also spilled over to reduce beliefs in unrelated conspiracies, indicating a general decrease in conspiratorial worldview, and increased intentions to rebut other conspiracy believers," said the paper published in the journal Science.

A significant portion of the U.S. population believes in one or more conspiracy theories—between 25 and 50 percent, depending on the conspiracy, according to a 2014 study in the American Journal of Political Science.

But why does this matter? When it comes to issues like public health, for example, COVID-19 conspiracy theories included claims that the virus was a hoax, made up to exert control over an unsuspecting public. The spread of such beliefs can have an impact on public health, democratic beliefs, and the tendency toward extremism, experts say.

In the political sphere, from Pizzagate to the 'birther' theory and, most recently, the 'immigrants eating pets' rumor, there have always been conspiracy theories designed to stir up public opinion.

[Image] A man in a tin foil hat and blanket holds a sign saying 'Earth is Flat'. A new study published in Science shows AI could reduce people's beliefs in conspiracy theories. Dmytro Sheremeta/Getty Images

Much of this genre of conspiracy belief and political misinformation peaked around 2020 with the emergence of QAnon, aided by viral dissemination across social media platforms. The phenomenon was so pronounced that it gave rise to a new branch of science known as infodemiology, which studies how misinformation and conspiracy theories can spread like a disease.

When it comes to dissuading people from believing such conspiracies, why did ChatGPT seemingly work where humans have failed? The AI chatbot, according to the study, was providing detailed counterevidence in a neutral manner to the holder of the conspiracy belief and then following up with more evidence when asked.

More importantly, the generative AI language model tailored the information to the individual (this information was verified by an independent fact-checker, according to the research paper).

The researchers found that some respondents found the chatbot more convincing than previous human rebuttals to their conspiracy belief: “Now this is the very first time I have gotten a response that made real, logical, sense. I must admit this really shifted my imagination when it comes to the subject of Illuminati. I think it was extremely helpful in my conclusion of [whether] the Illuminati is actually real.”

The study authors provided a list of all the conspiracies discussed by participants, along with the full conversations they had with ChatGPT. There were 15 categories in total, ranging from more recent conspiracies, including the theory that Jeffrey Epstein was murdered and the belief that the 2020 election was 'stolen', to the 9/11 'insider' conspiracy and more general theories about secret government agendas and big corporations. John F. Kennedy's assassination proved the most popular.

One participant stated that "the theory that Lee Harvey Oswald did not kill JFK is one that is compelling to me" and rated their confidence in it at 62 percent. After talking to ChatGPT, they lowered that rating to 34 percent.

After ChatGPT debunked the "magic bullet theory", the participant asked how "someone with no marksman skills could have been so precise". The chatbot acknowledged this as "a very valid question" and provided additional information, leading the participant to reply that "it could have been plausible".

But how does this kind of intervention work on a practical level, given that people holding certain conspiracy beliefs would not necessarily volunteer to have them challenged? The researchers suggest some ways to implement their findings.

“Internet search terms related to conspiracies could be met with AI-generated summaries of accurate information—tailored to the precise search—that solicit the user’s response and engagement,” the researchers outline.

“Similarly, AI-powered social media accounts could reply to users who share inaccurate conspiracy-related content (providing corrective information for the potential benefit of both the poster and observers),” the authors added.

Newsweek reached out to one of the study co-authors via email for additional information and comment.

Do you have an AI story to share with Newsweek? Do you have a question about conspiracy theories? Let us know via science@newsweek.com.

Reference

Costello, T. H., et al. (2024). Durably reducing conspiracy beliefs through dialogues with AI. Science, 385. DOI: 10.1126/science.adq1814

Copyright © 2025 Misleading.
Misleading is not responsible for the content of external sites.
