New work led by Miranda Na

The first article resulting from a collaboration led by my graduate student Miranda Na has been published in the Journal of Health Communication. The article introduces the idea that “emotional congruence” can influence whether or not someone believes an unsubstantiated claim. When the emotions associated with the claim are consistent with the emotions that the individual is already experiencing, belief becomes more likely. The results of the study, which used a simulated health crisis as a test, are consistent with this idea.

This is the first in a series of studies, and it is at the core of Miranda’s dissertation.  Here’s the abstract:

Rumors pose a significant challenge to officials combating a public health crisis. The flow of unsubstantiated and often inaccurate information can dilute the effects of more accurate messaging. Understanding why rumors thrive in this context is a crucial first step to constraining them. We propose a novel mechanism for explaining rumor acceptance during a health crisis, arguing that the congruence between one’s emotional state and the emotion induced by a rumor leads people to believe the rumor. Data collected using a novel experimental design provide preliminary evidence for our emotional congruence hypothesis. Participants who felt angry were more likely to accept anger-inducing rumors than those who were not angry. We discuss the implications of this insight for public health officials combating rumors during a health crisis.

Na, K., Garrett, R. K., & Slater, M. D. (2018). Rumor Acceptance during Public Health Crises: Testing the Emotional Congruence Hypothesis. Journal of Health Communication, 23(8), 791-799. doi: 10.1080/10810730.2018.1527877

New papers on misinformation by OSU alumni

Congratulations to Rachel Neo and Dustin Carnahan on their recent publications related to misinformation. Rachel and Dustin both worked with me when they were grad students at OSU, and I’m delighted to see their work getting this well-deserved recognition.

Dustin’s paper, “Feeling Fine about Being Wrong: The Influence of Self-Affirmation on the Effectiveness of Corrective Information” (DOI: 10.1093/hcr/hqy001) was published in the July issue of Human Communication Research. The paper uses a pair of experiments to demonstrate that self-affirmation can reduce biased responses to corrective information.

Rachel’s paper, “The Limits of Online Consensus Effects: A Social Affirmation Theory of How Aggregate Online Rating Scores Influence Trust in Factual Corrections” (DOI: 10.1177/0093650218782823) is online at Communication Research. This paper adds nuance to our understanding of bandwagon effects by demonstrating that people’s faith in online ratings can be influenced by their political views.

Policy brief now available at the Scholars Strategy Network

I recently joined the Scholars Strategy Network, an organization that “seeks to improve public policy and strengthen democracy by connecting scholars and their research to policymakers, citizens associations, and the media.” I’ve written a short essay that draws on several of my research projects to help make sense of “fake news” and the post-truth era.

Fake News is a Symptom— Not the Cause— of Americans’ Growing Reluctance to Accept Shared Facts

Journal issue focuses on “post-truth” era

If you’re interested in misinformation, you may want to check out the December 2017 issue of the Journal of Applied Research in Memory and Cognition. Stephan Lewandowsky and colleagues wrote the lead article, “Beyond Misinformation”, and invited several other scholars across a range of fields to comment on their work. The result is a collection of 11 articles by 19 authors dealing with the question of how we “understand and cope with the ‘post-truth’ era”. In my response, “The ‘Echo Chamber’ Distraction”, I argue that we need to focus less on “echo chambers” and more on disinformation campaigns. Audiences aren’t as fragmented as many people seem to think, but efforts to spread politically motivated falsehoods are evolving rapidly.

For the next 50 days, access to the article is free using this link: https://authors.elsevier.com/a/1WD5w7spf36NAm

News media coverage for PLOS ONE study

My work with Brian Weeks has generated a bit of news coverage this past week. The work has been covered by U.S. News & World Report, among others. It also led to a couple of TV spots. There was a short segment on the local ABC affiliate, and I was a panelist on Face the State, a Sunday talk show produced by the Columbus CBS station.

I’ve also written a short essay discussing the work for The Conversation, which has since been picked up by Salon.  You can hear a brief interview with me about the work over at BYU radio.

New paper on epistemic beliefs & misperceptions

Brian Weeks and I have a new paper in PLOS ONE that describes a set of measures that can be used to assess people’s epistemic beliefs–their beliefs about the nature of knowledge–and uses these measures to help predict Americans’ beliefs in conspiracy theories and high-profile political and scientific falsehoods. Here’s the abstract and link:

Widespread misperceptions undermine citizens’ decision-making ability. Conclusions based on falsehoods and conspiracy theories are by definition flawed. This article demonstrates that individuals’ epistemic beliefs–beliefs about the nature of knowledge and how one comes to know–have important implications for perception accuracy. The present study uses a series of large, nationally representative surveys of the U.S. population to produce valid and reliable measures of three aspects of epistemic beliefs: reliance on intuition for factual beliefs (Faith in Intuition for facts), importance of consistency between empirical evidence and beliefs (Need for evidence), and conviction that “facts” are politically constructed (Truth is political). Analyses confirm that these factors complement established predictors of misperception, substantively increasing our ability to explain both individuals’ propensity to engage in conspiracist ideation, and their willingness to embrace falsehoods about high-profile scientific and political issues. Individuals who view reality as a political construct are significantly more likely to embrace falsehoods, whereas those who believe that their conclusions must hew to available evidence tend to hold more accurate beliefs. Confidence in the ability to intuitively recognize truth is a uniquely important predictor of conspiracist ideation. Results suggest that efforts to counter misperceptions may be helped by promoting epistemic beliefs emphasizing the importance of evidence, cautious use of feelings, and trust that rigorous assessment by knowledgeable specialists is an effective guard against political manipulation.

DOI: 10.1371/journal.pone.0184733

Partisan media contribute to misperceptions; not as simple as “echo chambers”

I have a new paper out, in collaboration with Brian Weeks and Rachel Neo, which argues that using partisan news sites can encourage users to adopt beliefs that are inconsistent with what they know about the evidence. The paper is forthcoming in the Journal of Computer-Mediated Communication, and an electronic version is available now:  dx.doi.org/10.1111/jcc4.12164.  An OSU press release summarizing the work is also available here: https://news.osu.edu/news/2016/08/10/media-wedge/

Selective exposure workshop in Israel

I was very fortunate to have the opportunity to participate in a workshop, “New Frontiers in Selective Exposure Research”, organized by Yariv Tsfati, Shira Dvir-Gvirsman, and Lilach Nir. There was an amazing group of scholars in attendance, the presentations were provocative, and the conversations lively. It was a great opportunity for Cornelia Mothes and me to get some feedback on our ongoing collaboration.