I'm delighted to share the news that two of my students were recently recognized for their exceptional contributions to the OSU School of Communication. Qin Li received the Outstanding Peer Award for her efforts to mentor and support other students in the graduate program, and Jorge Cruz Ibarra received the Albert Warren Scholarship Award for his exceptional teaching.
New work on information sharing
A team that I’m collaborating with has just published a new article in Scientific Reports that compares the influence of novelty to that of belief consistency on sharing. Using observational data collected on Twitter and a pair of experiments, we demonstrate that belief consistency tends to be the stronger predictor.
Here’s the abstract:
In the classical information theoretic framework, information “value” is proportional to how novel/surprising the information is. Recent work building on such notions claimed that false news spreads faster than truth online because false news is more novel and therefore surprising. However, another determinant of surprise, semantic meaning (e.g., information’s consistency or inconsistency with prior beliefs), should also influence value and sharing. Examining sharing behavior on Twitter, we observed separate relations of novelty and belief consistency with sharing. Though surprise could not be assessed in those studies, belief consistency should relate to less surprise, suggesting the relevance of semantic meaning beyond novelty. In two controlled experiments, belief-consistent (vs. belief-inconsistent) information was shared more despite consistent information being the least surprising. Manipulated novelty did not predict sharing or surprise. Thus, classical information theoretic predictions regarding perceived value and sharing would benefit from considering semantic meaning in contexts where people hold pre-existing beliefs.
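For readers unfamiliar with the information-theoretic framing the abstract invokes, the classical "value" of a message is its self-information, −log₂ p(x): the less probable (more novel) the message, the more surprising and valuable it is assumed to be. Here's a minimal sketch of that idea; the example probabilities are invented for illustration.

import math

def surprisal(p: float) -> float:
    """Shannon self-information in bits: rarer events are more 'surprising'."""
    return -math.log2(p)

# A claim judged unlikely given prior beliefs carries more surprisal than an
# expected one -- the classical basis for equating novelty with value.
print(surprisal(0.5))   # 1.0 bit  (an expected, belief-consistent claim)
print(surprisal(0.01))  # ~6.6 bits (a rare, belief-inconsistent claim)

The paper's point is that this equation of novelty with value breaks down when people hold pre-existing beliefs: the belief-consistent (least surprising) information was shared most.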
Social Science One paper now out
A paper based on a collaborative project using data provided by Social Science One is now available. We used news stories shared on Facebook to capture the publication practices of both high- and low-credibility outlets and analyzed the subject matter of those stories. Results suggest that although the two news ecosystems exhibit similar patterns in publication volume, there are striking differences in subject matter. There's a press release describing the work here, and the article can be found here.
New papers published
I’ve done a poor job of announcing my publications recently, so here’s a quick summary of new work.
Today a paper led by a talented graduate student, Qin Li, was published in the Journal of Communication. We use a pair of panel studies collected by Rob Bond, Erik Nisbet, and me in 2019 and 2020 to assess whether regional geographic differences might help explain Americans' ability to distinguish between political truths and falsehoods. We find that both battleground-state status and state-level political homogeneity were influential in the election year. We take these results to indicate that the political and social communication contexts in which Americans live have a meaningful influence on their belief sensitivity.
Before that, Rob Bond and I published a paper in PNAS Nexus that sheds new light on the virality of true and false claims. Analyzing observational data collected on Reddit, we find that fact-checked posts found to be true elicit wider-reaching, longer-lasting conversations than posts found to be false. This stands in stark contrast to well-known research on this topic using Twitter data, suggesting that the sociotechnical context matters when assessing virality.
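For readers curious how "wider-reaching" and "longer-lasting" might be operationalized, here is a minimal sketch of computing a conversation's reach and lifespan from threaded comment data. The field names and schema are hypothetical, not the paper's actual pipeline.

from datetime import datetime

def cascade_metrics(comments: list[dict]) -> tuple[int, float]:
    """Return (reach, lifespan): unique participants and thread duration in hours.

    Each comment dict is assumed to carry 'author' and 'created' keys;
    this schema is illustrative only.
    """
    reach = len({c["author"] for c in comments})
    times = [datetime.fromisoformat(c["created"]) for c in comments]
    lifespan = (max(times) - min(times)).total_seconds() / 3600
    return reach, lifespan

thread = [
    {"author": "a", "created": "2020-01-01T00:00:00"},
    {"author": "b", "created": "2020-01-01T05:30:00"},
    {"author": "a", "created": "2020-01-02T12:00:00"},
]
print(cascade_metrics(thread))  # (2, 36.0)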
Finally, back in January my accomplished (now former) graduate student, Dr. Shannon Poulsen, led a paper published in PLOS One examining whether Americans' beliefs in false claims rooted in literal interpretations of satire differ from their beliefs in false claims based on other types of misleading content. The evidence suggests that misperceptions based on satire are not as widespread as those based on other sources, but that there are systematic differences in who holds these two kinds of misperceptions. For example, Republicans are more likely than Democrats to believe false claims with non-satiric origins, and social media engagement is more strongly correlated with belief in satire-based falsehoods than with other types of misperceptions.
Abstracts and links to all the papers can be found on the papers section of this website.
NSF Convergence Accelerator
I’m delighted to announce that I am part of an interdisciplinary team that has received an NSF award as part of the Convergence Accelerator program.
Here’s the project abstract:
High volume, rapidly changing, diverse information, which often includes misinformation, can easily overwhelm decision makers during a crisis. Decisions made both during and long after a crisis affect the trust between responsible decision makers and citizens (many from vulnerable populations) who are impacted by those decisions. This project seeks to help decision makers manage information, promoting reliance on authentic knowledge production processes while also reducing the impact of intentional disinformation and unintended misinformation. The project team will develop a suite of prototype tools that bring timely, high-quality integrated content to bear on decision making and governance, as a routine part of operations, and especially during a crisis. Integrated and authenticated content comprising scientific facts and technical information coupled with citizen and stakeholder viewpoints assures the accuracy of safety decisions and the appropriate prioritization of relief efforts. The project team will synthesize convergent expertise across multiple disciplines; engage and build stakeholder communities through partnerships with government and industry to guide tool development; build a prototype tool for authenticating data and managing misinformation; and validate the tool using real-world crisis scenarios.
The project team will create use-inspired personalized AI-driven sensemaking prototype tools for decision makers to comprehend and authenticate dynamic, uncertain, and often contradictory information to facilitate effective decisions during crises. The tools will focus on curation while accounting for source and explainable content credibility. Guidance from community stakeholders obtained using ethnographic methods will ensure that the resulting tools are practical, timely, and relevant for informed decision making. These tools will capitalize on features of the information environment, human cognitive abilities and limitations, and algorithmic approaches to managing information. In particular, content and network analyses can reveal constellations of sources with a higher probability of producing credible information, while knowledge graphs can help surface and organize important materials being shared while facilitating explainability. The project team will also design and develop a microworld environment to examine and improve tool robustness while simultaneously helping to train decision makers in real-world settings such as school districts and public health settings. This project represents a convergence of disciplines spanning expertise in computer science, social sciences, linguistics, network science, public health, cognitive science, operations, and communication that are necessary to achieve its goals. Partnerships between communities, government, industry, and academia will ensure the deliverables are responsive to stakeholder needs.
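As a concrete (and entirely hypothetical) illustration of the knowledge-graph idea the abstract mentions, related content can be stored as subject-predicate-object triples and queried to surface connected material. The schema and example triples below are invented, not part of the project's design.

# Toy triple store: (subject, predicate, object) statements about shared content.
triples = [
    ("outlet_A", "published", "claim_1"),
    ("claim_1", "contradicts", "claim_2"),
    ("agency_B", "verified", "claim_2"),
]

def neighbors(node: str) -> list[tuple[str, str, str]]:
    """Return every triple mentioning a node -- a trivial 'surfacing' query."""
    return [t for t in triples if node in (t[0], t[2])]

print(neighbors("claim_2"))  # everything directly connected to claim_2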
Interim Director
I’m honored to have been asked to serve as Interim Director of the School of Communication at Ohio State University.
Are US conservatives more susceptible to misinformation?
Rob Bond and I have a new paper in Science Advances that explores this question. The short answer is yes, but maybe not for the reasons you might think. Asked to evaluate the veracity of hundreds of political claims over a six-month period, conservatives were consistently less accurate than liberals. We further demonstrate that the media environment plays a significant role in explaining why. We based the statements we asked people to evaluate on news stories that got the most engagement on social media, and we found that falsehoods in that collection most often benefited conservatives, while truths tended to benefit liberals. Ultimately, our results suggest that both liberals and conservatives are biased, but that these biases have different implications for the two groups. The more conservatives believe claims that are good for their in-group, the less accurate they are; liberals who exhibit a similar bias become more accurate.
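To see why the same bias cuts in opposite directions, consider a toy calculation. The numbers below are invented; only the direction of the asymmetry mirrors what the paper reports.

# Invented pool of 100 viral claims: 50 true, 50 false. As in the paper's
# collection, most falsehoods favor conservatives and most truths favor liberals.
truths_favoring = {"liberals": 30, "conservatives": 20}
falsehoods_favoring = {"liberals": 15, "conservatives": 35}

def accuracy_with_extreme_ingroup_bias(group: str, outgroup: str) -> float:
    """Accuracy of someone who believes every claim favoring their group
    and rejects every claim favoring the other group."""
    correct = truths_favoring[group]          # in-group truths, believed: correct
    correct += falsehoods_favoring[outgroup]  # out-group falsehoods, rejected: correct
    return correct / 100

print(accuracy_with_extreme_ingroup_bias("liberals", "conservatives"))  # 0.65
print(accuracy_with_extreme_ingroup_bias("conservatives", "liberals"))  # 0.35

The identical belief rule yields higher accuracy for liberals and lower accuracy for conservatives purely because of what the pool of viral claims contains.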
Report assessing the CPD’s response to 2020 BLM protests
Researchers at The Ohio State University have released a report examining the Columbus Police Department's response to the protests that followed the murder of George Floyd. The report is based on interviews with more than 170 people, including police officers and protesters, and is divided into several chapters covering themes such as citizen-police relations, city leadership, and policy and training. The report culminates in more than two dozen recommendations. This is an important document, and I am honored to have played a small role as a member of the advisory team. Here's how the report begins:
The murder of George Floyd, a Black man, by Derek Chauvin, a White Minneapolis, Minnesota, police officer on May 25, 2020, sparked months-long protests about racism and policing across the country and around the globe, including Columbus, Ohio. Captured on video and spread quickly through social media, Floyd’s death galvanized Americans to take to the streets in the midst of a global health pandemic to voice their anger and frustration about the many Black Americans who had been killed by police. The fairness of policing practice as applied to communities of color, particularly Black communities, and more fundamentally, the existence of the police as a legally sanctioned public institution were the clear motivations for the protests.
A recording of a video conference held shortly before the public release of the report is available here.
New Paper – Better Crowdcoding: Strategies for Promoting Accuracy in Crowdsourced Content Analysis
I have a new paper out in Communication Methods and Measures, produced in collaboration with Ceren Budak and Daniel Sude. The work is the first in a series of studies conducted with faculty at the University of Michigan's School of Information examining strategies for improving content analysis performed by crowdsourced workers (e.g., MTurk). The publisher has provided a limited number of free eprints. If you are interested, you can download a copy here.
Abstract:
In this work, we evaluate different instruction strategies to improve the quality of crowdcoding for the concept of civility. We test the effectiveness of training, codebooks, and their combination through 2 × 2 experiments conducted on two different populations – students and Amazon Mechanical Turk workers. In addition, we perform simulations to evaluate the trade-off between cost and performance associated with different instructional strategies and the number of human coders. We find that training improves crowdcoding quality, while codebooks do not. We further show that relying on several human coders and applying majority rule to their assessments significantly improves performance.
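The majority-rule finding has a simple intuition, akin to the Condorcet jury theorem: independent coders who are each right more often than not become collectively more reliable as their number grows. Here is a toy simulation under that independence assumption; the coder accuracy and counts are invented for illustration.

import random

def majority_vote(labels: list[int]) -> int:
    """Return the binary label chosen by a strict majority of coders."""
    return 1 if sum(labels) * 2 > len(labels) else 0

def simulate(n_coders: int, coder_accuracy: float = 0.7, trials: int = 10_000) -> float:
    """Estimate how often majority rule recovers the true label."""
    correct = 0
    for _ in range(trials):
        truth = random.randint(0, 1)
        votes = [truth if random.random() < coder_accuracy else 1 - truth
                 for _ in range(n_coders)]
        correct += majority_vote(votes) == truth
    return correct / trials

for n in (1, 3, 5, 7):  # odd counts avoid ties
    print(n, round(simulate(n), 3))  # estimated accuracy rises with more coders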
Congratulations to Chloe Mortenson
Chloe has successfully defended her MA thesis, “The Indirect Threat of Misinformation to Democracy.” She will join the doctoral program at Northwestern University’s School of Communication this fall, where she will be working with Erik Nisbet.