The backfire effect occurs when a correction increases belief in the very misconception it is attempting to correct, and it is often cited as a reason not to correct misinformation. The current study aimed to test whether correcting misinformation increases belief more than a no-correction control.
Patterns in candidate emergence affect whom voters can choose from, and thus the quality of representative democracy. Despite extensive work on patterns in interest in running for office and, separately, on patterns in emerged candidacies, there is little empirical evidence regarding the transition from being interested in running for office to emerging as a candidate.
Following the 2020 general election, Republican elected officials, including then-President Donald Trump, promoted conspiracy theories claiming that Joe Biden's close victory in Georgia was fraudulent. The extant literature suggests multiple hypotheses regarding the effects these conspiracy theories could have had on Republican turnout in the Senate runoff elections that took place the following January.
Science rarely proceeds beyond what scientists can observe and measure, and sometimes what can be observed proceeds far ahead of scientific understanding. The twenty-first century offers such a moment in the study of human societies. A vastly larger share of behaviours is observed today than would have been imaginable at the close of the twentieth century.
This report is based on work supported by the National Science Foundation under grants SES-2029292 and SES-2029297. Any opinions, findings, and conclusions or recommendations expressed here are those of the authors and do not necessarily reflect the views of the National Science Foundation. This research is also supported in part by a generous grant from the Knight Foundation.
An individual’s issue preferences are non-separable when they depend on other issue outcomes (Lacy 2001a), presenting measurement challenges for traditional survey research. We extend this logic to the broader case of conditional preferences, in which policy preferences depend on the status of conditions that carry inherent uncertainty and are not necessarily policies themselves.
Social media platforms rarely provide data to misinformation researchers. This is problematic because platforms play a major role in the diffusion and amplification of mis- and disinformation narratives. Scientists are often left working with partial or biased data and must rush to archive relevant data as soon as it appears on the platforms, before it is suddenly and permanently removed by deplatforming operations.
One of the most concerning notions for science communicators, fact-checkers, and advocates of truth is the backfire effect: a correction leading an individual to strengthen their belief in the very misconception the correction aims to rectify.
Agent-based models present an ideal tool for interrogating the dynamics of communication and exchange. Such models allow individual aspects of human interaction to be isolated and controlled in a way that yields new insight into complex behavioral phenomena. This approach is particularly valuable in settings beset by confounding factors and mixed empirical evidence.
Research at the intersection of machine learning and the social sciences has provided critical new insights into social behavior. At the same time, a variety of issues have been identified with the machine learning models used to analyze social data.