Algorithmic bias in social research: A meta-analysis

PLoS One. 2020 Jun 8;15(6):e0233625. doi: 10.1371/journal.pone.0233625. eCollection 2020.

Abstract

Both the natural and the social sciences are currently facing a deep "reproducibility crisis". Two important factors in this crisis have been the selective reporting of results and methodological problems. In this article, we examine a fusion of these two factors. More specifically, we demonstrate that the uncritical import of Boolean optimization algorithms from electrical engineering into some areas of the social sciences in the late 1980s has induced algorithmic bias on a considerable scale over the last quarter century. Potentially affected are all studies that have used a method nowadays known as Qualitative Comparative Analysis (QCA). Drawing on replication material for 215 peer-reviewed QCA articles from across 109 high-profile management, political science, and sociology journals, we estimate the scale this problem has assumed in empirical work. Our results suggest that one in three studies is affected, and one in ten severely so. More generally, our article cautions scientists against letting methods and algorithms travel too easily across disparate disciplines without sufficient prior evaluation of their suitability for the context at hand.
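The bias at issue stems from the Boolean minimization step on which QCA relies. As a rough illustration (a minimal Python sketch of our own, using a hypothetical truth table; it is not the authors' replication code nor the minimization routine of any QCA software package), the brute-force search below shows that a single truth table can admit several equally minimal sum-of-products solutions, so an implementation that silently reports only one of them builds an arbitrary choice into the published result:

    # Illustrative sketch only: not the authors' method or any QCA package's code.
    # Brute-force minimization of a hypothetical 3-variable Boolean function,
    # showing that one truth table can have several equally minimal solutions.
    from itertools import combinations, product

    N = 3                        # variables A, B, C
    ONES = {0, 1, 2, 5, 6, 7}    # hypothetical truth-table rows evaluating to true

    def implicants():
        """All product terms (cubes) whose coverage lies inside ONES; '-' = don't care."""
        for cube in product("01-", repeat=N):
            covered = {m for m in range(2 ** N)
                       if all(c == "-" or c == format(m, f"0{N}b")[i]
                              for i, c in enumerate(cube))}
            if covered and covered <= ONES:
                yield "".join(cube), covered

    TERMS = list(implicants())

    def minimal_covers():
        """Smallest sets of implicants that jointly cover every true row."""
        for size in range(1, len(TERMS) + 1):
            found = [combo for combo in combinations(TERMS, size)
                     if set().union(*(cov for _, cov in combo)) == ONES]
            if found:
                return found
        return []

    for cover in minimal_covers():
        print(" + ".join(cube for cube, _ in cover))

For this toy truth table the search prints two equally minimal solutions, 00- + 1-1 + -10 and 0-0 + 11- + -01 (i.e. A'B' + AC + BC' versus A'C' + AB + B'C). Which of the two a given tool reports is an internal implementation choice, and ambiguity of this general kind is what makes an unreflected transfer of minimization algorithms consequential for substantive conclusions.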

Publication types

  • Meta-Analysis
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Bias*
  • Reproducibility of Results
  • Research Design*
  • Social Sciences*

Grants and funding

The Swiss National Science Foundation has generously funded this research under grant award number PP00P1_170442 to AT. URL: http://www.snf.ch. The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.