Understanding Problematic Sharing Behavior on Facebook

A research team that I lead is one of 12 inaugural recipients of the Social Media and Democracy Research Grants from the Social Science Research Council and its partner, Social Science One. The team includes Rob Bond (Ohio State University), Ceren Budak (University of Michigan), Jason Jones (Stony Brook University), and Drew Margolin (Cornell University).

The award provides unprecedented access to anonymous data from Facebook on the sharing of online content. These data will be used to examine a variety of behaviors that could harm people’s understanding of science, politics, and their communities, most notably the sharing of inaccurate information.

A brief description of the project is here: https://www.ssrc.org/fellowships/view/social-media-and-democracy-research-grants/grantees/garrett/

Other grantees are listed here: https://www.ssrc.org/fellowships/view/social-media-and-democracy-research-grants/grantees/

And more details about the types of data to be analyzed are here: https://newsroom.fb.com/news/2019/04/election-research-grants/

Social media’s contribution to political misperceptions

I have a new paper, just published, that provides more evidence that the effects of social media on political beliefs have tended to be small. Other scholars have published work reaching similar conclusions, but I think that seeing consistent results across a range of methods is useful. This paper uses fixed-effects models of panel data collected from large, representative samples during both the 2012 and 2016 elections. The paper is open access and is available here:

https://doi.org/10.1371/journal.pone.0213500

UPDATE: Tom Jacobs provides a nice overview of the work at Pacific Standard, and there’s a short write-up by Alex Lardieri in U.S. News & World Report, too.

New work led by Miranda Na

The first article resulting from a collaboration led by my graduate student Miranda Na has been published in the Journal of Health Communication. The article introduces the idea that “emotional congruence” can influence whether or not someone believes an unsubstantiated claim. When the emotions associated with the claim are consistent with the emotions that the individual is already experiencing, belief becomes more likely. The results of the study, which used a simulated health crisis as a test, are consistent with this idea.

This is the first in a series of studies, and it is at the core of Miranda’s dissertation. Here’s the abstract:

Rumors pose a significant challenge to officials combating a public health crisis. The flow of unsubstantiated and often inaccurate information can dilute the effects of more accurate messaging. Understanding why rumors thrive in this context is a crucial first step to constraining them. We propose a novel mechanism for explaining rumor acceptance during a health crisis, arguing that the congruence between one’s emotional state and the emotion induced by a rumor leads people to believe the rumor. Data collected using a novel experimental design provide preliminary evidence for our emotional congruence hypothesis. Participants who felt angry were more likely to accept anger-inducing rumors than those who were not angry. We discuss the implications of this insight for public health officials combating rumors during a health crisis.

Na, K., Garrett, R. K., & Slater, M. D. (2018). Rumor Acceptance during Public Health Crises: Testing the Emotional Congruence Hypothesis. Journal of Health Communication, 23(8), 791-799. doi: 10.1080/10810730.2018.1527877

New papers on misinformation by OSU alumni

Congratulations to Rachel Neo and Dustin Carnahan on their recent publications related to misinformation. Rachel and Dustin both worked with me when they were grad students at OSU, and I’m delighted to see their work getting this well-deserved recognition.

Dustin’s paper, “Feeling Fine about Being Wrong: The Influence of Self-Affirmation on the Effectiveness of Corrective Information” (DOI: 10.1093/hcr/hqy001), was published in the July issue of Human Communication Research. The paper uses a pair of experiments to demonstrate that self-affirmation can reduce biased responses to corrective information.

Rachel’s paper, “The Limits of Online Consensus Effects: A Social Affirmation Theory of How Aggregate Online Rating Scores Influence Trust in Factual Corrections” (DOI: 10.1177/0093650218782823) is online at Communication Research. This paper adds nuance to our understanding of bandwagon effects by demonstrating that people’s faith in online ratings can be influenced by their political views.

Policy brief now available at the Scholars Strategy Network

I recently joined the Scholars Strategy Network, an organization that “seeks to improve public policy and strengthen democracy by connecting scholars and their research to policymakers, citizens associations, and the media.” I’ve written a short essay that draws on several of my research projects to help make sense of “fake news” and the post-truth era.

Fake News is a Symptom— Not the Cause— of Americans’ Growing Reluctance to Accept Shared Facts

Journal issue focuses on “post-truth” era

If you’re interested in misinformation, you may want to check out the December 2017 issue of the Journal of Applied Research in Memory and Cognition. Stephan Lewandowsky and colleagues wrote the lead article, “Beyond Misinformation”, and invited several other scholars across a range of fields to comment on their work. The result is a collection of 11 articles by 19 authors dealing with the question of how we “understand and cope with the ‘post-truth’ era”. In my response, “The ‘Echo Chamber’ Distraction”, I argue that we need to focus less on “echo chambers” and more on disinformation campaigns. Audiences aren’t as fragmented as many people seem to think, but efforts to spread politically motivated falsehoods are evolving rapidly.

For the next 50 days, access to the article is free using this link: https://authors.elsevier.com/a/1WD5w7spf36NAm