EGU2026 - Presentation about the Skeptical Science Experiment

Skeptical Science Puts Climate Rebuttals to the Test: Key Insights from a Four-Year Experiment


Vienna — Researchers at the European Geosciences Union General Assembly revealed results from a novel online experiment that probed the power of targeted climate messaging. Skeptical Science, a long-standing platform dedicated to countering misinformation, tracked how its detailed rebuttals influenced visitors’ views on contentious climate topics. The findings, shared in a session on geoethics, highlight both successes in shifting perceptions and areas where refinements could amplify impact.

Origins of a Volunteer-Driven Effort

Skeptical Science began as a solo venture in 2007 when John Cook launched the site to tackle persistent myths about human-driven climate change. Over time, it grew into a collaborative non-profit with global volunteers maintaining a library exceeding 250 rebuttals, each grounded in peer-reviewed studies. The platform draws millions of organic visitors, primarily through search engines seeking answers to common skeptic arguments.

Organizers grew curious about their content’s real-world influence. They sought evidence on whether rebuttals not only diminished myth acceptance but also bolstered factual understanding. Such data promised to guide improvements, reveal audience profiles, and quantify broader effects on public discourse.

Crafting the Real-World Survey

The experiment targeted English-language rebuttal pages from November 2021 through July 2025. Visitors arriving via Google search were greeted by a pop-up modal inviting them to take a brief survey. Those who consented first rated their agreement with a statement tied to the page's topic, randomly assigned as either a scientific fact or a corresponding myth, on a six-point Likert scale ranging from strongly agree to strongly disagree.

After scrolling through the full rebuttal, participants repeated the identical survey. The setup captured reading times and ensured brevity to encourage completion. A companion list of all statements appears in the project’s published analysis, covering topics like climate model reliability.
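The pre/post design described above can be sketched in a few lines. This is a hypothetical illustration of the flow, not Skeptical Science's actual code; the statement texts, function names, and scale labels are invented for the example.

```python
import random

# Six-point Likert scale with no neutral midpoint, as described in the article.
LIKERT = ["strongly agree", "agree", "somewhat agree",
          "somewhat disagree", "disagree", "strongly disagree"]

def assign_statement(fact: str, myth: str, rng: random.Random) -> tuple[str, str]:
    """Randomly show either the fact or the corresponding myth (illustrative)."""
    kind = rng.choice(["fact", "myth"])
    return kind, fact if kind == "fact" else myth

rng = random.Random(42)
kind, statement = assign_statement(
    fact="Climate models have successfully predicted observed warming.",
    myth="Climate models are unreliable.",
    rng=rng,
)
# The pre-survey response is recorded, the visitor reads the rebuttal,
# then the identical statement is rated again in the post-survey, so the
# shift in agreement can be attributed to the rebuttal itself.
```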

Participant Snapshot and Baseline Views

Out of 858,016 visitors shown the prompt, 13,432 agreed to the pre-survey, with 6,261 finishing the post-rebuttal version. Roughly equal numbers rated facts (3,146) and myths (3,115). Most arrived leaning toward scientific consensus: nearly half (46.3 percent) strongly endorsed facts or rejected myths outright.
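The participation funnel reduces to simple arithmetic; this sketch just recomputes the rates from the figures reported above.

```python
# Participation funnel from the reported figures.
shown = 858_016      # visitors who saw the survey prompt
pre = 13_432         # completed the pre-survey
post = 6_261         # also completed the post-rebuttal survey
facts, myths = 3_146, 3_115  # random fact/myth split among completers

opt_in_rate = pre / shown        # roughly 1.6% of prompted visitors opted in
completion_rate = post / pre     # roughly 47% of starters finished both surveys

assert facts + myths == post     # the fact/myth split accounts for every completer
print(f"opt-in {opt_in_rate:.1%}, completion {completion_rate:.1%}")
```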

This skew suggested the site primarily served those already aligned with climate science, potentially reinforcing their positions rather than converting skeptics. Still, a meaningful portion identified as undecided or dismissive, offering a window into varied starting points. The data underscored the platform’s role in equipping convinced readers with ammunition against misinformation.

Measuring Change: Progress with Pockets of Backslide

Overall, the rebuttals yielded positive shifts. Myth belief declined across the board, including among initially dismissive users who strongly backed myths or rejected facts pre-reading. Accuracy improved on average, signaling that detailed explanations resonated even with resistant audiences.

Challenges emerged in subsets. Participants who entered highly certain in their agreement with facts occasionally dipped in accuracy after reading, an unexpected outcome researchers attributed to nuanced content unsettling firm priors. Among rebuttals with at least 50 completions, patterns sharpened the picture. High performers consistently supplied a clear replacement fact and pinpointed the logical fallacies in myths. Low performers often skipped these elements, a pattern that correlated with perception declines.

To illustrate:

  • Top rebuttals (positive shifts): Emphasized substitute facts; routinely flagged fallacies like false dichotomies or appeals to ignorance.
  • Bottom rebuttals (negative shifts): Lacked explicit fact replacements; seldom dissected myth logic.

This dissection, drawn from comparing the top three and bottom three performers, pointed to structural tweaks for better outcomes. Because the survey deliberately omitted direct feedback questions to stay brief and keep participation high, the researchers relied on these correlations rather than self-reported explanations.
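The per-rebuttal comparison amounts to scoring each pre/post pair, averaging the shift, and ranking only rebuttals that cleared the completion threshold. This is a minimal sketch with invented data; the page names, sample sizes, and scoring convention are assumptions, not the study's actual pipeline.

```python
from statistics import mean

# Invented (pre, post) Likert-score pairs keyed by rebuttal page. The study
# ranked only rebuttals with at least 50 completed pre/post pairs.
responses = {
    "models-are-unreliable": [(2, 1), (5, 6), (3, 2)] * 20,  # 60 pairs
    "co2-is-plant-food":     [(4, 4), (3, 3)] * 20,          # only 40 pairs
}

MIN_COMPLETIONS = 50

def mean_shift(pairs):
    """Average post-minus-pre change. The sign convention depends on whether
    the rated statement was a fact or a myth; here it is purely illustrative."""
    return mean(post - pre for pre, post in pairs)

# Rank qualifying rebuttals from most positive to most negative mean shift.
ranked = sorted(
    ((name, mean_shift(pairs))
     for name, pairs in responses.items()
     if len(pairs) >= MIN_COMPLETIONS),
    key=lambda item: item[1],
    reverse=True,
)
# Only "models-are-unreliable" survives the completion-threshold filter here.
```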

Relaunch and Road Ahead

The experiment coincided with a full website overhaul at Skeptical Science. New rebuttals now adopt a fact-myth-fallacy framework, directly informed by the data to enhance debunking potency. A related blog details these updates, timed alongside the EGU presentation.

Future iterations will resume post-relaunch with refined surveys, including targeted or open-ended prompts to clarify motivations and influences. The volunteer team behind the setup combined expertise: Cook supplied research framing and statements; Doug Bostrom handled backend infrastructure; Collin Maessen and Timo Lubitz managed programming integration.

Full results appeared open-access in Geoscience Communication on April 2, 2026, titled “Quantifying the impact of Skeptical Science rebuttals in reducing climate misperceptions.” The EGU talk, titled “Results of the Skeptical Science experiment and impacts on relaunched website,” occurred in session EOS4.1 on Geoethics. As misinformation persists, these insights equip communicators with evidence-based tools to foster clearer climate understanding.

About the author
Lucas Hayes
