
Inequality and Total Effect Summary Measures for Nominal and Ordinal Variables

Trenton D. Mize, Bing Han

Sociological Science February 5, 2025
10.15195/v12.a7


Many of the topics most central to the social sciences involve nominal groupings or ordinal rankings. There are many cases in which a summary of a nominal or ordinal independent variable’s effect, or the effect on a nominal or ordinal outcome, is needed and useful for interpretation. For example, for nominal or ordinal independent variables, a single summary measure is useful to compare the effect sizes of different variables in a single model or across multiple models, as with mediation. For nominal or ordinal dependent variables, there are often an overwhelming number of effects to examine, and understanding the holistic effect of an independent variable, or how effect sizes compare within or across models, is difficult. In this project, we propose two new summary measures using marginal effects (MEs). For nominal and ordinal independent variables, we propose ME inequality as a summary measure of a nominal or ordinal independent variable’s holistic effect. For nominal and ordinal outcome models, we propose a total ME measure that quantifies the comprehensive effect of an independent variable across all outcome categories. Our methods provide intuitive and substantively meaningful effect size metrics and can be applied across a wide range of models, including linear, nonlinear, categorical, multilevel, longitudinal, and more.
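The article defines its estimators formally (and its reproducibility package includes Stata and R templates). Purely as a hypothetical illustration of the intuition, not the authors' ME inequality estimator, one could summarize a nominal variable's holistic effect as the average absolute pairwise difference of category-specific predictions:

```python
from itertools import combinations

def me_inequality(group_predictions):
    """Average absolute pairwise difference across category-specific
    predictions: one number summarizing how unequal a nominal
    variable's categories are on the outcome."""
    pairs = list(combinations(group_predictions, 2))
    return sum(abs(a - b) for a, b in pairs) / len(pairs)

# Hypothetical predicted outcomes for four categories of a nominal
# independent variable (e.g., from a fitted regression model):
summary = me_inequality([0.42, 0.35, 0.58, 0.40])
```

With the four hypothetical predictions above, the summary is roughly 0.12: the categories differ by about 12 percentage points on average.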
This work is licensed under a Creative Commons Attribution 4.0 International License.

Trenton D. Mize: Departments of Sociology & Statistics (by courtesy) and The Methodology Center at Purdue University
E-mail: tmize@purdue.edu

Bing Han: Department of Sociology, Purdue University
E-mail: han644@purdue.edu

Acknowledgements: We thank Shawn Bauldry and the audience at The Methodology Center at Purdue’s work-in-progress series for their helpful comments on this article. We also thank Jonathan Horowitz for a well-timed question that pushed us to further develop the methods for nominal and ordinal outcomes.

Reproducibility Package: All data and coding files needed to reproduce all results shown in this article are available at both www.trentonmize.com/research and OSF (osf.io/myehf/). In addition to the replication files, simplified template/example Stata and R files are also available in the same locations.

  • Citation: Mize, Trenton D., and Bing Han. 2025. “Inequality and Total Effect Summary Measures for Nominal and Ordinal Variables.” Sociological Science 12: 115-157.
  • Received: November 27, 2024
  • Accepted: January 7, 2025
  • Editors: Arnout van de Rijt, Kristian B. Karlson
  • DOI: 10.15195/v12.a7


Validating Factorial Survey Experiments: Response to Comment

Andrea G. Forster, Martin Neugebauer

Sociological Science January 27, 2025
10.15195/v12.a6


In Forster and Neugebauer (2024), we examine to what extent a factorial survey (FS) on invitations of fictitious applicants can replicate the findings of a nearly identical field experiment conducted with the same employers. In addition to exploring the conditions under which FSs provide valid behavioral predictions, we varied the topic sensitivity and tested whether behavioral predictions were more accurate after filtering out respondents who provided socially desirable answers or did not exert sufficient effort in responding to FS vignettes. Across these conditions, the FS results did not align well with the real-world benchmark. We conclude that researchers must exercise caution when using FSs to study (hiring) behavior. In this rejoinder, we respond to the critique of our study by Pickett (2025).
This work is licensed under a Creative Commons Attribution 4.0 International License.

Andrea G. Forster: Utrecht University
E-mail: a.g.forster@uu.nl (Corresponding author)

Martin Neugebauer: Karlsruhe University of Education
E-mail: martin.neugebauer@ph-karlsruhe.de

  • Citation: Forster, Andrea G., and Martin Neugebauer. 2025. “Validating Factorial Survey Experiments: Response to Comment.” Sociological Science 12: 106-114.
  • Received: December 6, 2024
  • Accepted: December 6, 2024
  • Editors: Arnout van de Rijt
  • DOI: 10.15195/v12.a6




Invalidating Factorial Survey Experiments Using Invalid Comparisons Is Bad Practice: Learning from Forster and Neugebauer (2024)

Justin T. Pickett

Sociological Science January 27, 2025
10.15195/v12.a5


Forster and Neugebauer’s (2024) invalidation study is invalid. Their conclusion that factorial survey (FS) experiments “are not suited for studying hiring behavior” (P. 901) is unjustified, because their claim that they conducted a field experiment (FE) and FS with “nearly identical” designs is false (P. 891). The two experiments included: (1) different factor levels (for three factors), (2) different unvalidated applicant names (to manipulate ethnicity), (3) different applicant photos, (4) different fixed factors (e.g., applicant stories about moving), and (5) different experimental settings (e.g., testing, instrumentation, and conditions of anonymity). In the current article, I discuss each of these major design differences and explain why it invalidates Forster and Neugebauer’s (2024) comparison of their FE and FS findings. I conclude by emphasizing that social scientists are better served by asking why FE and FS findings sometimes differ than by assuming that any difference in findings across the experimental designs invalidates FS.
This work is licensed under a Creative Commons Attribution 4.0 International License.

Justin T. Pickett: School of Criminal Justice, University at Albany, SUNY
E-mail: jpickett@albany.edu

  • Citation: Pickett, Justin T. 2025. “Invalidating Factorial Survey Experiments Using Invalid Comparisons Is Bad Practice: Learning from Forster and Neugebauer (2024).” Sociological Science 12: 97-105.
  • Received: September 27, 2024
  • Accepted: October 1, 2024
  • Editors: Arnout van de Rijt, Stephen Vaisey
  • DOI: 10.15195/v12.a5




The Genetics of Partnership Dissolution

Ruth Eva Jørgensen, Rosa Cheesman, Ole A. Andreassen, Torkild Hovde Lyngstad

Sociological Science January 20, 2025
10.15195/v12.a4


There is a genetic component to divorce risk, but little is known about which and how genetically influenced traits are involved. This study makes three major contributions to address these gaps. First, we link genetic data from the Norwegian Mother, Father, and Child Cohort Study (MoBa) to population register data and estimate the total influence of common genetic variants on partnership dissolution (N = 121,408). Then, we identify heritable traits associated with partnership dissolution using event-history analysis and a broad set of polygenic indices. Finally, we assess whether associations are robust to controls for confounding in within-sibling models. Significant heritability estimates were found for both females (h²SNP = 0.09; SE = 0.01; p < 0.0001) and males (h²SNP = 0.03; SE = 0.01; p < 0.0001). Genetic dispositions for educational attainment and other sociodemographic factors decrease the probability of partnership dissolution, whereas dispositions for internalizing symptoms and risk behavior increase the likelihood of partnership dissolution. Integrating genetics and sociodemographic approaches can shed new light on the causes of partnership dynamics by helping us understand what drives the selection processes throughout the life course.
This work is licensed under a Creative Commons Attribution 4.0 International License.

Ruth Eva Jørgensen: Department of Sociology and Human Geography, University of Oslo
E-mail: r.e.jorgensen@sosgeo.uio.no

Rosa Cheesman: PROMENTA, Department of Psychology, University of Oslo
E-mail: r.c.g.cheesman@psykologi.uio.no

Ole A. Andreassen: NORMENT, Department of Clinical Medicine, University of Oslo
E-mail: ole.andreassen@medisin.uio.no

Torkild Hovde Lyngstad: Department of Sociology and Human Geography, University of Oslo
E-mail: t.h.lyngstad@sosgeo.uio.no

Acknowledgements: This research is part of the OPENFLUX project, which has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation program (grant agreement no. 818420). The Norwegian Mother, Father, and Child Cohort Study is supported by the Norwegian Ministry of Health and Care Services and the Ministry of Education and Research. We are grateful to all the participating families in Norway who take part in this on-going cohort study. We thank the Norwegian Institute of Public Health (NIPH) for generating high-quality genomic data. This research is part of the HARVEST collaboration, supported by the Research Council of Norway (#229624). We also thank the NORMENT Centre for providing genotype data, funded by the Research Council of Norway (#223273), Southeast Norway Health Authority, and KG Jebsen Stiftelsen. We further thank the Center for Diabetes Research, the University of Bergen, for providing genotype data and performing quality control and imputation of the data funded by the ERC AdG project SELECTionPREDISPOSED, Stiftelsen Kristian Gerhard Jebsen, Trond Mohn Foundation, the Research Council of Norway, the Novo Nordisk Foundation, the University of Bergen, and the Western Norway Health Authorities (Helse Vest). The version of the genotype data used is MobaPyschGen v1. We are very grateful to Alexandra Havdahl and Elizabeth Corfield for access to this version of the data. The authors are grateful to the Sociological Science editors and reviewers for feedback.

Supplemental Materials

Reproducibility Package: See paper for data availability statement. Access to administrative data from Statistics Norway can be applied for at Statistics Norway (https://www.ssb.no/mikrodata/) and access to MoBa Genetics can be applied for at the Norwegian Public Health Institute (https://www.fhi.no/studier/moba/). Code for data preparation and analysis is available at https://github.com/torkildl/genetics-dissolution.

  • Citation: Jørgensen, Ruth Eva, Rosa Cheesman, Ole A. Andreassen, and Torkild Hovde Lyngstad. 2025. “The Genetics of Partnership Dissolution.” Sociological Science 12: 76-96.
  • Received: October 24, 2024
  • Accepted: December 7, 2024
  • Editors: Arnout van de Rijt, Bart Bonikowski
  • DOI: 10.15195/v12.a4


Straight Jacket: The Implications of Multidimensional Sexuality for Relationship Quality and Stability

Yue Qian, Yang Hu

Sociological Science January 15, 2025
10.15195/v12.a3


The quality and stability of couple relationships have far-reaching consequences for the well-being of individual partners and patterns of family change. Although much research has compared the quality and stability of same-sex and different-sex relationships, the multidimensional nature of sexuality has received insufficient attention in this scholarship. Individuals in same-sex (different-sex) partnerships do not necessarily identify as gay/lesbian (straight) or report exclusive same-sex (different-sex) attraction—a phenomenon we term “identity/attraction–partnership inconsistency.” By analyzing nationally representative longitudinal data collected between 2017 and 2022, we show that identity/attraction–partnership inconsistency is common among U.S. adults, ranging from 2 percent of men in different-sex partnerships to 41 percent of women in same-sex partnerships. Regression results show that such inconsistency is associated with lower relationship quality and higher relationship instability, and these negative ramifications are particularly pronounced among individuals, notably men, in different-sex partnerships. Our findings uncover the implications of multidimensional sexuality for relationship dynamics and outcomes given the rigid institutionalization of different-sex couplehood and the close normative regulation of men’s heterosexuality. Our study highlights the importance of incorporating multiple dimensions of sexuality and their interplays into research on couple relationships and family change.
This work is licensed under a Creative Commons Attribution 4.0 International License.

Yue Qian: Department of Sociology, The University of British Columbia
E-mail: yue.qian@ubc.ca

Yang Hu: Department of Sociology, Lancaster University
E-mail: yang.hu@lancaster.ac.uk

Acknowledgements: Yue Qian and Yang Hu share joint first authorship.

Supplemental Materials

Reproducibility Package: The data analyzed in this article are available from the Stanford Libraries Social Science Data Collection at https://data.stanford.edu/hcmst2017. All replication codes for this project are available at https://osf.io/n32ty/.

  • Citation: Qian, Yue, and Yang Hu. 2025. “Straight Jacket: The Implications of Multidimensional Sexuality for Relationship Quality and Stability.” Sociological Science 12: 51-75.
  • Received: October 21, 2024
  • Accepted: December 7, 2024
  • Editors: Ari Adut, Michael Rosenfeld
  • DOI: 10.15195/v12.a3



Getting a Foot in the Door: A Meta-Analysis of U.S. Audit Studies of Gender Bias in Hiring

So Yun Park, Eunsil Oh

Sociological Science January 9, 2025
10.15195/v12.a2


For the past three decades, scholars have conducted field experiments to examine gender-based hiring discrimination in the United States. However, these studies have produced mixed results. To further interpret these findings, we performed a meta-analysis of 37 audit studies conducted between 1990 and 2022. Using an aggregated sample of 243,202 fictitious job applications, the study finds no evidence of statistically significant gender discrimination at the study level. However, a series of more focused meta-analyses reveal important variations in the extent of discrimination by occupation type and applicant race. First, the gender composition of an occupation predicts gender bias in hiring. Second, the intersection of gender and race is critical—in female-dominated jobs, White female applicants receive more callbacks than their male counterparts, but Black female applicants experience no such benefit. The study contributes to the literature on labor market and gender (in)equality by synthesizing the findings of field experiments.
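The pooling at the heart of such a meta-analysis is typically inverse-variance weighting of study-level effects. A minimal sketch with made-up numbers (not the study's data, and not its full model, which also handles moderators such as occupation type and applicant race):

```python
import math

def pool_fixed_effect(effects, variances):
    """Inverse-variance weighted (fixed-effect) pooled estimate and
    its standard error -- the basic pooling step of a meta-analysis."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical log odds ratios of callbacks (female vs. male
# applicants) from three audit studies, with sampling variances:
est, se = pool_fixed_effect([0.10, -0.05, 0.02], [0.04, 0.09, 0.02])
```

More precise studies (smaller variances) pull the pooled estimate toward their own effect, which is why a few large audit studies can dominate a synthesis.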
This work is licensed under a Creative Commons Attribution 4.0 International License.

So Yun Park: Department of Sociology, University of Wisconsin–Madison
E-mail: spark553@wisc.edu

Eunsil Oh: Department of Sociology, University of Wisconsin–Madison
E-mail: eunsil.oh@wisc.edu

Acknowledgements: We wish to thank Kangwook Lee, Ziqian Lin, Yuchen Zeng, and the members of the University of Wisconsin-Madison Gender Seminar (FemSem) for their helpful feedback. We also appreciate comments provided by the editors and reviewers of Sociological Science. Support for this research was provided by the University of Wisconsin-Madison Office of the Vice Chancellor for Research and Graduate Education with funding from the Wisconsin Alumni Research Foundation (grant #AAI9746).

Supplemental Materials

Reproducibility Package: Data and code are available at the Open Science Framework via https://osf.io/kp5df/.

  • Citation: Park, So Yun, and Eunsil Oh. 2025. “Getting a Foot in the Door: A Meta-Analysis of U.S. Audit Studies of Gender Bias in Hiring.” Sociological Science 12: 26-50.
  • Received: September 28, 2024
  • Accepted: November 14, 2024
  • Editors: Ari Adut, Vida Maralani
  • DOI: 10.15195/v12.a2



The Risk Creates the Reward: Reputational Returns to Legal and Quality Risks in Online Illegal Drug Trade

William Holtkamp, Scott Duxbury, Dana L. Haynie

Sociological Science January 6, 2025
10.15195/v12.a1


Although buyers in unregulated markets depend heavily on reputational information in the absence of state oversight, few studies examine how the riskiness of a good may condition reputational effects on prices. We capitalize on novel data on 10,465 illegal drug exchanges on one online “darknet” illegal drug market and computational text analysis to evaluate how distinct types of legal and quality risks moderate reputational effects on illegal drug prices. Our results suggest that quality risk considerations are especially acute, where the effect of numeric sales ratings and the sentiment expressed in sales review text are both increased for non-prescription drugs and attenuated for prescription drugs. In contrast, we find limited evidence that legal risks moderate reputational effects on illegal drug prices. These results underscore the importance of quality risks in illegal purchasing decisions, identify quality risk as a determinant of reputational premiums for illegal drug prices, and shed light on how the riskiness of a specific good can guide economic action in unregulated trade settings.
This work is licensed under a Creative Commons Attribution 4.0 International License.

William Holtkamp: University of North Carolina at Chapel Hill
E-mail: whkamp@live.unc.edu

Scott Duxbury: University of North Carolina at Chapel Hill
E-mail: duxbury@email.unc.edu

Dana L. Haynie: The Ohio State University
E-mail: haynie.7@osu.edu

Acknowledgements: We thank the editors and reviewers for their helpful comments, the audience at the annual meeting of the American Sociological Association in the section on Social Psychology, Status, Trust, and Social Exchange for discussions and feedback, and Srinivasan Parthasarathy and Mohit Jangid for coding assistance. This research was supported by two National Science Foundation grants (00046370 and 1949037).

Supplemental Materials

Reproducibility Package: All data, code, and analysis are available on Dataverse, https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/E3UDF0.

  • Citation: Holtkamp, William, Scott Duxbury, and Dana L. Haynie. 2025. “The Risk Creates the Reward: Reputational Returns to Legal and Quality Risks in Online Illegal Drug Trade.” Sociological Science 12: 1-25.
  • Received: June 21, 2024
  • Accepted: November 7, 2024
  • Editors: Ari Adut, Werner Raub
  • DOI: 10.15195/v12.a1



New OMB’s Race and Ethnicity Standards Will Affect How Americans Self-Identify

René D. Flores, Edward Telles, Ilana M. Ventura

Sociological Science December 16, 2024
10.15195/v11.a42


In March 2024, the U.S. Office of Management and Budget (OMB) approved major changes to the ethnic and racial self-identification questions used by all federal agencies, including the U.S. Census Bureau. These modifications include merging the separate race and Hispanic ethnicity questions into a single combined question and adding a Middle Eastern and North African category. Government officials and researchers have requested evidence on how Americans might react to these changes. We conducted a survey experiment with a nationally representative sample of 7,350 adult Americans. Participants were randomly assigned to answer either the existing separate race and ethnicity questions or a combined question proposed by the OMB. We find that the combined question decreases the percentage of Americans identifying as white and as some other race. We identify the key mechanism driving these effects: Hispanics decrease their identification in other categories when a Hispanic category is available in the combined question format. This results in statistically significant decreases in key minority populations, including Afro-Latinos and indigenous Latinos.
This work is licensed under a Creative Commons Attribution 4.0 International License.

René D. Flores: Department of Sociology, University of Chicago
E-mail: renedf@uchicago.edu

Edward Telles: Department of Sociology, University of California, Irvine
E-mail: e.telles@uci.edu

Ilana M. Ventura: NORC at the University of Chicago
E-mail: ventura-ilana@norc.org

Acknowledgements: We thank Maria Abascal, Constance Citro, Steven Pedlow, and Abigail Weitzman for their valuable comments and suggestions. YouGov staff provided expert data collection assistance. The Center for the Study of Race, Politics, and Society and the Neubauer Family Assistant Professors Program at the University of Chicago and the University of California at Irvine provided generous support. Flores thanks the Center for Advanced Study in the Behavioral Sciences at Stanford University for granting a year of leave. All errors are uniquely ours.

Competing Interest: The authors declare that there are no competing interests.

Supplemental Materials

Reproducibility Package: A replication package containing all data and code used in this analysis is available through the Harvard Dataverse: https://doi.org/10.7910/DVN/NLDF3N.

  • Citation: Flores, René D., Edward Telles, and Ilana M. Ventura. 2024. “New OMB’s Race and Ethnicity Standards Will Affect How Americans Self-Identify.” Sociological Science 11: 1147-1169.
  • Received: August 4, 2024
  • Accepted: October 21, 2024
  • Editors: Arnout van de Rijt, Vida Maralani
  • DOI: 10.15195/v11.a42



The Diffusion and Reach of (Mis)Information on Facebook During the U.S. 2020 Election

Sandra González-Bailón, David Lazer, Pablo Barberá, William Godel, Hunt Allcott, Taylor Brown, Adriana Crespo-Tenorio, Deen Freelon, Matthew Gentzkow, Andrew M. Guess, Shanto Iyengar, Young Mie Kim, Neil Malhotra, Devra Moehler, Brendan Nyhan, Jennifer Pan, Carlos Velasco Rivera, Jaime Settle, Emily Thorson, Rebekah Tromble, Arjun Wilkins, Magdalena Wojcieszak, Chad Kiewiet de Jonge, Annie Franco, Winter Mason, Natalie Jomini Stroud, Joshua A. Tucker

Sociological Science December 11, 2024
10.15195/v11.a41


Social media creates the possibility for rapid, viral spread of content, but how many posts actually reach millions? And is misinformation special in how it propagates? We answer these questions by analyzing the virality of and exposure to information on Facebook during the U.S. 2020 presidential election. We examine the diffusion trees of the approximately 1 billion posts that were re-shared at least once by U.S.-based adults from July 1, 2020, to February 1, 2021. We differentiate misinformation from non-misinformation posts to show that (1) misinformation diffused more slowly, relying on a small number of active users who spread misinformation via long chains of peer-to-peer diffusion that reached millions, whereas non-misinformation spread primarily through one-to-many affordances (mainly Pages); (2) the relative importance of peer-to-peer spread for misinformation was likely due to an enforcement gap in content moderation policies designed to target mostly Pages and Groups; and (3) periods of aggressive content moderation proximate to the election coincide with dramatic drops in the spread and reach of misinformation and (to a lesser extent) political content.
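The contrast between long peer-to-peer chains and one-to-many broadcast can be quantified by a diffusion tree's depth versus its breadth. A minimal sketch on a hypothetical re-share tree (an illustration of the concept, not the authors' pipeline):

```python
from collections import defaultdict, deque

def cascade_stats(edges, root):
    """Depth (longest re-share chain) and maximum breadth (most nodes
    at any single level) of a diffusion tree given (parent, child)
    re-share edges."""
    children = defaultdict(list)
    for parent, child in edges:
        children[parent].append(child)
    level_counts = defaultdict(int)
    queue = deque([(root, 0)])
    depth = 0
    while queue:
        node, level = queue.popleft()
        level_counts[level] += 1
        depth = max(depth, level)
        for child in children[node]:
            queue.append((child, level + 1))
    return depth, max(level_counts.values())

# Hypothetical cascade: an original post p0 re-shared by u1 and u2,
# then passed along a chain u1 -> u3 -> u4:
depth, breadth = cascade_stats(
    [("p0", "u1"), ("p0", "u2"), ("u1", "u3"), ("u3", "u4")], "p0"
)
```

A deep, narrow tree signals peer-to-peer chains; a shallow, wide tree signals one-to-many broadcast (e.g., a Page posting to many followers at once).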
This work is licensed under a Creative Commons Attribution 4.0 International License.

Sandra González-Bailón: lead author with control rights; Annenberg School for Communication, University of Pennsylvania
E-mail: sandra.gonzalez.bailon@asc.upenn.edu

David Lazer: lead author with control rights; Network Science Institute, Northeastern University
E-mail: d.lazer@northeastern.edu

Pablo Barberá: lead Meta author; Meta
E-mail: us2020research@meta.com

William Godel: lead Meta author; Meta
E-mail: us2020research@meta.com

Hunt Allcott: Environmental and Energy Policy Analysis Center, Stanford University
E-mail: allcott@stanford.edu

Taylor Brown: Meta
E-mail: us2020research@meta.com

Adriana Crespo-Tenorio: Meta
E-mail: us2020research@meta.com

Deen Freelon: Annenberg School for Communication, University of Pennsylvania
E-mail: dfreelon@upenn.edu

Matthew Gentzkow: Department of Economics, Stanford University
E-mail: gentzkow@stanford.edu

Andrew M. Guess: Department of Politics and School of Public and International Affairs, Princeton University
E-mail: aguess@princeton.edu

Shanto Iyengar: Department of Political Science, Stanford University
E-mail: siyengar@stanford.edu

Young Mie Kim: School of Journalism and Mass Communication, University of Wisconsin-Madison
E-mail: ymkim5@wisc.edu

Neil Malhotra: Graduate School of Business, Stanford University
E-mail: neilm@stanford.edu

Devra Moehler: Meta
E-mail: us2020research@meta.com

Brendan Nyhan: Department of Government, Dartmouth College
E-mail: nyhan@dartmouth.edu

Jennifer Pan: Department of Communication, Stanford University
E-mail: jp1@stanford.edu

Carlos Velasco Rivera: Meta
E-mail: us2020research@meta.com

Jaime Settle: Department of Government, William & Mary
E-mail: jsettle@wm.edu

Emily Thorson: Department of Political Science, Syracuse University
E-mail: ethorson@gmail.com

Rebekah Tromble: School of Media and Public Affairs and Institute for Data, Democracy, and Politics, The George Washington University
E-mail: rtromble@email.gwu.edu

Arjun Wilkins: Meta
E-mail: us2020research@meta.com

Magdalena Wojcieszak: Department of Communication, University of California, Davis; Center for Excellence in Social Science, University of Warsaw
E-mail: mwojcieszak@ucdavis.edu

Chad Kiewiet de Jonge: Meta research lead; Meta
E-mail: us2020research@meta.com

Annie Franco: Meta research lead; Meta
E-mail: us2020research@meta.com

Winter Mason: Meta research lead; Meta
E-mail: us2020research@meta.com

Natalie Jomini Stroud: co-last author and academic research lead; Moody College of Communication and Center for Media Engagement, University of Texas at Austin
E-mail: tstroud@austin.utexas.edu

Joshua A. Tucker: co-last author and academic research lead; Wilf Family Department of Politics and Center for Social Media and Politics, New York University
E-mail: joshua.tucker@nyu.edu

Acknowledgements: The Facebook Open Research and Transparency (FORT) team provided substantial support in executing the overall project. We are grateful for support on various aspects of project management from Chaya Nayak, Sadaf Zahedi, Lama Ahmad, Akshay Bhalla, Clarice Chan, Andrew Gruen, Bennet Hillenbrand, Pamela McLeod, and Dáire Rice; engineering and research management from Da Li and Itamar Rosenn; engineering from Yuxi Chen, Shiyang Chen, Tegan Lohman, Robert Pyke, and Yixin Wan; data engineering from Suchi Chintha, John Cronin, Devanshu Desai, Vikas Janardhanan, Yann Kiraly, Xinyi Liu, Anastasiia Molchanov, Sandesh Pellakuru, Akshay Tiwari, Chen Xie, and Beixian Xiong; data science and research from Hannah Connolly-Sporing; academic partnerships from Rachel Mersey, Michael Zoorob, Lauren Harrison, Simone Aisiks, Yair Rubinstein, and Cindy Qiao; privacy and legal assessment from Kamila Benzina, Frank Fatigato, John Hassett, Subodh Iyengar, Payman Mohassel, Ali Muzaffar, Ananth Raghunathan and Annie Sun; and content design from Caroline Bernard, Jeanne Breneman, Denise Leto, and Melanie Jennings. NORC at the University of Chicago partnered with Meta on this project to conduct the fieldwork with the survey participants. We are particularly grateful for the partnership of NORC Principal Investigator J.M. Dennis and NORC Project Director Margrethe Montgomery.

Supplemental Materials

Reproducibility Package: Deidentified data and analysis code from this study are deposited in the Social Media Archive at ICPSR, part of the University of Michigan Institute for Social Research. The data are available for university IRB-approved research on elections or to validate the findings of this study. ICPSR will receive and vet all applications for data access. Access through the ICPSR Archive ensures that the data and code are used only for the purposes for which they were created and collected. The code would also be more difficult to navigate separately from the data, which is why both are housed in the same space. Website: https://socialmediaarchive.org/collection/US2020.

  • Citation: González-Bailón, Sandra, David Lazer, Pablo Barberá et al. 2024. “The Diffusion and Reach of (Mis)Information on Facebook During the U.S. 2020 Election.” Sociological Science 11: 1124-1146.
  • Received: September 9, 2024
  • Accepted: October 24, 2024
  • Editors: Arnout van de Rijt, Cristobal Young
  • DOI: 10.15195/v11.a41



The Multiracial Complication: The 2020 Census and the Fictitious Multiracial Boom

Paul Starr, Christina Pao

Sociological Science December 3, 2024
10.15195/v11.a40


The Census Bureau set off reports of a “multiracial boom” when it announced that, according to the 2020 census, multiracial people accounted for 10.2 percent of the U.S. population. Only the year before, the bureau’s American Community Survey had estimated their share as 3.4 percent. We provide evidence that the multiracial boom was largely a statistical illusion resulting from methodological changes that confounded ancestry with identity and mistakenly equated national origin with race. Under a new algorithm, respondents were auto-recoded as multiracial if, after marking a single race, they listed an “origin” that the algorithm did not recognize as falling within that race. However, origins and identity are not the same; confounding the two did not improve racial statistics. The fictitious multiracial boom highlights the power of official statistics in framing public and social-science understanding and the need to keep ancestry and identity distinct in both theory and empirical practice.
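The recode rule described in the abstract can be sketched as a small function. The origin table and category names below are illustrative reconstructions of the abstract's description, not the Census Bureau's actual algorithm or code list:

```python
def auto_recode(marked_race, written_origin, recognized_origins):
    """Sketch of the described 2020 recode rule: a single-race
    respondent is reclassified as multiracial when their written
    origin is not on the list recognized for that race."""
    recognized = recognized_origins.get(marked_race, set())
    if written_origin and written_origin not in recognized:
        return "Multiracial"
    return marked_race

# Hypothetical recognized-origin table (illustrative only):
table = {"White": {"German", "Irish"}, "Black": {"Jamaican", "Nigerian"}}
result = auto_recode("White", "Lebanese", table)
```

Under this rule, a respondent who marks "White" and writes in an unrecognized origin is reclassified, which is how ancestry responses could inflate the multiracial count without any change in self-identification.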
This work is licensed under a Creative Commons Attribution 4.0 International License.

Paul Starr: Sociology Department, Princeton University
E-mail: starr@princeton.edu

Christina Pao: Sociology Department, Princeton University
E-mail: christina.pao@princeton.edu

Acknowledgements: Presentations by Ricardo Lowe and colleagues helped inform the empirical strategy in our article. We also thank participants in the May 2024 Conference of the American Association for Public Opinion Research for their responses to our research.

Reproducibility Package: R code for replication is available on the Open Science Framework (OSF), https://osf.io/8ebup/?view_only=67a953b996684d128c9384d4841ed1c5. Data are available from IPUMS USA (Ruggles et al. 2024): https://usa.ipums.org/usa/index.shtml.

  • Citation: Starr, Paul, and Christina Pao. 2024. “The Multiracial Complication: The 2020 Census and the Fictitious Multiracial Boom.” Sociological Science 11: 1107-1123.
  • Received: September 17, 2024
  • Accepted: October 21, 2024
  • Editors: Ari Adut, Michael Rosenfeld
  • DOI: 10.15195/v11.a40

