
Sensitivity Bias in Surveys: New Insights from 30 Years of List Experiments

Tags: List Experiments | Social Reference Theory | Sensitivity Bias | Measurement | Methodology | @APSR | Dataverse

This paper addresses how people respond to sensitive survey questions.

➡️ New Theory

The authors introduce a social reference theory to explain why respondents conceal their true views on topics where they fear judgment from others, terming the resulting distortion sensitivity bias.

➡️ Practical Challenge

Researchers face a trade-off between direct questions and indirect "list experiment" designs, a key consideration when designing political surveys on sensitive topics.
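To make the indirect design concrete, here is a minimal sketch of the standard list-experiment difference-in-means estimator: the control group reports how many of J baseline items apply to them, the treatment group sees the same list plus the sensitive item, and the gap in mean counts estimates the sensitive item's prevalence. The simulation parameters (item rates, sample size) are illustrative, not drawn from the paper.

```python
import random

random.seed(42)

def simulate_list_experiment(n, baseline_rates, sensitive_rate):
    """Simulate a list experiment.

    Control respondents see only the baseline items; treatment
    respondents also see the sensitive item. Everyone reports
    only the COUNT of items that apply, never which ones.
    """
    treatment, control = [], []
    for i in range(n):
        count = sum(random.random() < p for p in baseline_rates)
        if i % 2 == 0:  # alternate assignment to the two groups
            treatment.append(count + (random.random() < sensitive_rate))
        else:
            control.append(count)
    return treatment, control

def list_experiment_estimate(treatment, control):
    """Difference in mean counts identifies the prevalence
    of the sensitive item (under the design's assumptions)."""
    return sum(treatment) / len(treatment) - sum(control) / len(control)

treatment, control = simulate_list_experiment(
    n=10_000, baseline_rates=[0.3, 0.5, 0.7], sensitive_rate=0.25)
est = list_experiment_estimate(treatment, control)
print(round(est, 3))  # should be close to the true rate of 0.25
```

The trade-off the paper analyzes is visible here: the estimator avoids asking the sensitive question directly, but its variance is inflated by the baseline items, so list experiments need much larger samples than direct questions to achieve the same precision.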

➡️ Meta-Analysis Findings

A meta-analysis of all available list experiments from three decades finds that sensitivity bias is typically small, often less than 10 percentage points, and sometimes negligible.

When to Worry About Sensitivity Bias: A Social Reference Theory and Evidence from 30 Years of List Experiments, by Graeme Blair, Alexander Coppock, and Margaret Moor, was published in the American Political Science Review (Cambridge University Press) in 2020.
American Political Science Review