New Paper – Better Crowdcoding: Strategies for Promoting Accuracy in Crowdsourced Content Analysis

I have a new paper out in Communication Methods and Measures, produced in collaboration with Ceren Budak and Daniel Sude. The work is the first in a series of studies that I’ve conducted with faculty at the University of Michigan’s School of Information examining strategies for improving content analysis conducted using crowdsourced workers (e.g., MTurk). The publisher has provided a limited number of free eprints. If you are interested, you can download a copy here.

Abstract:

In this work, we evaluate different instruction strategies to improve the quality of crowdcoding for the concept of civility. We test the effectiveness of training, codebooks, and their combination through 2 × 2 experiments conducted on two different populations – students and Amazon Mechanical Turk workers. In addition, we perform simulations to evaluate the trade-off between cost and performance associated with different instructional strategies and the number of human coders. We find that training improves crowdcoding quality, while codebooks do not. We further show that relying on several human coders and applying majority rule to their assessments significantly improves performance.
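The majority-rule aggregation the abstract describes can be illustrated with a minimal sketch. This is not the paper's implementation; the function name and the example ratings are hypothetical, assuming each item receives a categorical label (e.g., civil vs. uncivil) from several coders.

```python
# Hypothetical sketch of majority-rule aggregation over multiple coders.
# All names and data here are illustrative, not taken from the paper.
from collections import Counter

def majority_label(labels):
    """Return the label assigned by the most coders.

    Ties are broken arbitrarily by Counter.most_common's ordering.
    """
    return Counter(labels).most_common(1)[0][0]

# Example: five coders rate a comment as civil (1) or uncivil (0).
coder_ratings = [1, 0, 1, 1, 0]
print(majority_label(coder_ratings))  # -> 1
```

Adding coders raises the cost linearly, which is why the paper's simulations weigh the number of coders against the accuracy gains from aggregation.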

Facebook funding

Erik Nisbet, Rob Bond, and I have received a grant from Facebook for about $50K to study misinformation in the 2020 election. Ours was one of 25 proposals funded out of a pool of more than 1,000 applications. The project focuses on quantifying the harms of misinformation during the election. The announcement is here: https://research.fb.com/blog/2020/08/announcing-the-winners-of-facebooks-request-for-proposals-on-misinformation-and-polarization/

The effect of on-line or memory-based processing when correcting misinformation

Dustin Carnahan and I have a new paper out in the International Journal of Public Opinion Research that uses a pair of experiments to assess how judgment processing strategies influence individuals’ responses to corrections. We find that on-line processing is associated with less bias when updating beliefs in response to corrections than memory-based processing. The article doi is 10.1093/ijpor/edz037

The limited influence of corrective messages is one of the most striking observations in the misperceptions literature. We elaborate on this well-known outcome, showing that correction effectiveness varies according to recipients’ judgment strategy. Using data from two online experiments, we demonstrate that individuals’ responses to corrective messages are less biased by prior attitudes when they engage in on-line rather than memory-based processing. We also show that individuals are more responsive to one-sided messages under conditions of on-line rather than memory-based processing. Unexpectedly, two-sided messages, which repeat the inaccuracy before correcting it, performed better than one-sided messages among individuals using memory-based processes. These findings contribute to our understanding of fact-checking, and suggest strategies that could help promote greater responsiveness to corrective messages.

New paper linking partisan media to misperceptions via affective polarization

We have a new paper in the Journal of Communication arguing that one way that partisan media may be promoting misperceptions is by increasing affective polarization among those who consume it.

Garrett, R. K., Long, J., & Jeong, M. (2019). From Partisan Media to Misperception: Affective Polarization as Mediator. Journal of Communication. doi:10.1093/joc/jqz028 (free access link)

This article provides evidence that affective polarization is an important mechanism linking conservative media use to political misperceptions. Partisan media’s potential to polarize is well documented, and there are numerous ways in which hostility toward political opponents might promote the endorsement of inaccurate beliefs. We test this mediated model using data collected via nationally representative surveys conducted during two recent U.S. presidential elections. Fixed effects regression models using three-wave panel data collected in 2012 provide evidence that conservative media exposure contributes to more polarized feelings toward major-party presidential candidates, and this growing favorability gap is associated with misperceptions critical of the Democrats. Further, these effects are more pronounced among Republicans than among Democrats. Cross-sectional analyses using data collected in 2016 provide additional evidence of the mediated relationship. The theoretical and real-world significance of these results is discussed.

Note: The supplemental information is not yet posted on the publisher’s website. It is available here.

Ostracism and falsehood endorsement

We have a new paper in Political Communication demonstrating that being socially excluded can make people less receptive to a fact check of a belief that is widely held within their party.

50 free eprints are available here: https://www.tandfonline.com/eprint/BATWR42RRJBJDF7HRJ5B/full?target=10.1080/10584609.2019.1666943

Research suggests that ostracism could promote endorsement of partisan falsehoods. Socially excluded individuals are uniquely attentive to distinctions between in-groups and out-groups, and act in ways intended to promote group belonging, potentially including a greater willingness to accept claims made by other group members. We test this assertion with a 2 (ostracism) × 2 (anonymity) × 2 (topic) mixed factorial design using the Ostracism Online paradigm with a demographically diverse online sample of Americans (N = 413). Results suggest that when ostracized, both Democrats and Republicans are more likely to endorse party-line falsehoods about the 2016 U.S. presidential election. These effects are contingent on several individual-level differences, including strength of ideological commitment, cognitive reflection, and faith in intuition for facts. These patterns failed to replicate with fracking, a politically charged science topic.

Flagging falsehoods on Facebook

Shannon Poulsen and I have a new paper out in the Journal of Computer-Mediated Communication. We tested several different ways of flagging false information posted on Facebook and found that people were most responsive to warnings indicating that the content came from a satirical news site. The open-access paper is available here: 10.1093/jcmc/zmz012

Update: There’s a short write-up of the research over at Fast Company.

Update 2: The publisher hasn’t yet uploaded the Supplemental Information file. It is also available here.

New collaboration with Brian Weeks

Brian Weeks and I have a chapter in James Katz’s new collection, Social media and journalism’s search for truth. In it we consider how the emotionally evocative nature of social media may be contributing to users’ propensity to hold misperceptions. It’s an impressive collection of essays; just look at the book’s table of contents. If you’re interested, you can see a preview of the chapter here.

Weeks, B., & Garrett, R. K. (2019). The Emotional Characteristics of Social Media and Political Misperceptions. In J. E. Katz (Ed.), Social media and journalism’s search for truth (pp. 236-250). Oxford University Press.