Sandra González-Bailón, David Lazer, Pablo Barberá, William Godel, Hunt Allcott, Taylor Brown, Adriana Crespo-Tenorio, Deen Freelon, Matthew Gentzkow, Andrew M. Guess, Shanto Iyengar, Young Mie Kim, Neil Malhotra, Devra Moehler, Brendan Nyhan, Jennifer Pan, Carlos Velasco Rivera, Jaime Settle, Emily Thorson, Rebekah Tromble, Arjun Wilkins, Magdalena Wojcieszak, Chad Kiewiet de Jonge, Annie Franco, Winter Mason, Natalie Jomini Stroud, Joshua A. Tucker
Sociological Science December 11, 2024
10.15195/v11.a41
Abstract
Social media creates the possibility for rapid, viral spread of content, but how many posts actually reach millions? And is misinformation special in how it propagates? We answer these questions by analyzing the virality of and exposure to information on Facebook during the U.S. 2020 presidential election. We examine the diffusion trees of the approximately 1 billion posts that were re-shared at least once by U.S.-based adults from July 1, 2020, to February 1, 2021. We differentiate misinformation from non-misinformation posts to show that (1) misinformation diffused more slowly, relying on a small number of active users who spread misinformation via long chains of peer-to-peer diffusion that reached millions; non-misinformation spread primarily through one-to-many affordances (mainly, Pages); (2) the relative importance of peer-to-peer spread for misinformation was likely due to an enforcement gap in content moderation policies designed to target mostly Pages and Groups; and (3) periods of aggressive content moderation proximate to the election coincide with dramatic drops in the spread and reach of misinformation and (to a lesser extent) political content.
This work is licensed under a Creative Commons Attribution 4.0 International License.
Reproducibility Package: Deidentified data and analysis code from this study are deposited in the Social Media Archive at ICPSR, part of the University of Michigan Institute for Social Research. The data are available for university IRB-approved research on elections or to validate the findings of this study. ICPSR will receive and vet all applications for data access. Access through the ICPSR Archive ensures that the data and code are used only for the purposes for which they were created and collected. The code would also be more difficult to navigate separately from the data, which is why both are housed in the same space. Website: https://socialmediaarchive.org/collection/US2020.
- Citation: González-Bailón, Sandra, David Lazer, Pablo Barberá et al. 2024. “The Diffusion and Reach of (Mis)Information on Facebook During the U.S. 2020 Election.” Sociological Science 11: 1124-1146.
- Received: September 9, 2024
- Accepted: October 24, 2024
- Editors: Arnout van de Rijt, Cristobal Young
- DOI: 10.15195/v11.a41