Tag Archives | Online Experiment

The Social Contagion of Antisocial Behavior

Milena Tsvetkova, Michael W. Macy

Sociological Science, February 4, 2015
DOI 10.15195/v2.a4

Previous research has shown that reciprocity can be contagious when there is no option to repay the benefactor and the recipient instead channels repayment toward strangers. In this study, we tested whether retaliation can also be contagious. Extending previous work on “paying it forward,” we examined two mechanisms for the social contagion of antisocial behavior: generalized reciprocity (a victim of antisocial behavior is more likely to pay it forward) and third-party influence (an observer of antisocial behavior is more likely to emulate it). We used an online experiment with randomized trials to test the two hypothesized mechanisms and their interaction by manipulating the extent to which participants experienced and observed antisocial behavior. We found that people are more likely to harm others if they have been harmed, and less likely to do so if they observe that others do not harm.
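To make the design concrete, here is a minimal Python sketch (not the authors' code) of the kind of 2x2 randomized assignment the abstract describes, crossing experienced harm with observed harm. The response probabilities and effect sizes are invented for illustration only.

```python
# Sketch of a 2x2 randomized design: (experienced harm) x (observed harm).
# All probabilities below are hypothetical, chosen only to illustrate the
# two mechanisms named in the abstract.
import random
from itertools import product
from statistics import mean

random.seed(42)

# Conditions: (harmed, observed_harm)
CONDITIONS = list(product([False, True], repeat=2))

def simulate_participant(harmed, observed_harm):
    """Hypothetical response model: being harmed raises the chance of
    paying harm forward (generalized reciprocity); observing that others
    do not harm lowers it (third-party influence)."""
    p = 0.15
    if harmed:
        p += 0.20
    if not observed_harm:
        p -= 0.05
    return random.random() < p

results = {c: [] for c in CONDITIONS}
for _ in range(2000):
    cond = random.choice(CONDITIONS)      # randomized assignment
    results[cond].append(simulate_participant(*cond))

for (harmed, observed), outcomes in results.items():
    print(f"harmed={harmed!s:5} observed_harm={observed!s:5} "
          f"rate of harming others: {mean(outcomes):.3f}")
```

Comparing the rates across the four cells is the simplest way to read off the main effects of each mechanism and their interaction.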
Milena Tsvetkova: Department of Sociology, Cornell University. Email: mvt9@cornell.edu

Michael W. Macy: Department of Sociology and Department of Information Science, Cornell University. Email: m.macy@cornell.edu

  • Citation: Tsvetkova, Milena, and Michael W. Macy. 2015. “The Social Contagion of Antisocial Behavior.” Sociological Science 2:36-49.
  • Received: November 24, 2014
  • Accepted: January 5, 2015
  • Editors: Jesper Sørensen, Gabriel Rossman
  • DOI: 10.15195/v2.a4

Comparing Data Characteristics and Results of an Online Factorial Survey between a Population-Based and a Crowdsource-Recruited Sample

Jill Weinberg, Jeremy Freese, David McElhattan

Sociological Science, August 4, 2014
DOI 10.15195/v1.a19

Compared to older kinds of sample surveys, online platforms provide a fast and low-cost means of fielding factorial surveys, as well as a more demographically diverse alternative to student samples. Two distinct recruitment strategies have emerged: using panels built from population-based samples versus recruiting people actively seeking to complete online tasks for money. The latter is much cheaper but prompts various concerns about data quality and generalizability. We compare the results of three vignette experiments conducted using the leading online panel that uses a population-based paradigm (Knowledge Networks, now GfK) and the leading platform for crowdsource recruitment (Amazon Mechanical Turk). Our data show that, while demographic differences exist, most notably in age, the actual results of our experiments are very similar, especially once these demographic differences have been taken into account. Indicators of data quality were actually slightly better among the crowdsource subjects. Although more evidence is plainly needed, our results add to the accumulating evidence for the promise of crowdsource recruitment for online experiments, including factorial surveys.
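As a hedged illustration of the comparison described above (not the authors' analysis), the following Python sketch fits a treatment-by-sample interaction model on simulated vignette data. The variable names, the age gap, and every coefficient are assumptions made up for the example.

```python
# Sketch: do two recruitment sources yield the same treatment effect once
# demographics are adjusted for? Data are simulated; the age gap and all
# coefficients are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "source": rng.choice(["panel", "crowdsource"], size=n),
    "treatment": rng.integers(0, 2, size=n),   # vignette condition
})
# Crowdsource respondents skew younger, mirroring the main demographic gap.
df["age"] = np.where(df.source == "crowdsource",
                     rng.normal(35, 12, n),
                     rng.normal(50, 15, n)).clip(18, 90)
# Same underlying treatment effect in both samples (the paper's conclusion).
df["rating"] = (3.0 + 0.8 * df.treatment + 0.01 * df.age
                + rng.normal(0, 1, size=n))

# A treatment:source interaction near zero indicates the two samples yield
# similar experimental results once demographics are taken into account.
model = smf.ols("rating ~ treatment * C(source) + age", data=df).fit()
print(model.summary().tables[1])
```

The quantity of interest is the interaction term: if it is small and insignificant, the cheaper crowdsourced sample reproduces the panel's experimental result.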

Jill D. Weinberg: Northwestern University and American Bar Foundation. Email: jweinberg@abfn.org

Jeremy Freese: Northwestern University. Email: jfreese@northwestern.edu

David McElhattan: Northwestern University. Email: DavidMcElhattan2017@northwestern.edu

  • Citation: Weinberg, Jill D., Jeremy Freese, and David McElhattan. 2014. “Comparing Data Characteristics and Results of an Online Factorial Survey between a Population-Based and a Crowdsource-Recruited Sample.” Sociological Science 1:292-310.
  • Received: April 23, 2014
  • Accepted: May 22, 2014
  • Editors: Jesper Sørensen, Delia Baldassarri
  • DOI: 10.15195/v1.a19
