
Partisan 'Cheerleading' Accounts For About a Quarter of Survey Bias


What the Study Asks

Matthew Graham (APSR) reexamines two decades of research on partisan expressive responding—the idea that people sometimes report more partisan beliefs on surveys than they actually hold. The paper asks how large and consistent expressive responding is across studies, and whether common explanations for it (misreporting, aka "cheerleading," and congenial inference) match the empirical evidence.

How Evidence Was Compiled

This article presents a meta-reanalysis of 44 studies drawn from 25 published articles, covering 242 survey questions where researchers deployed experimental treatments intended to reduce expressive responding. Rather than pooling raw data into a single effect, the analysis compares how measured partisan differences change when researchers implement designs meant to curb partisan misreporting.

Key Findings

  • Treatments aimed at reducing expressive responding shrink measured partisan bias by about 25% on average.
  • Across the 242 questions, these treatments increase the correlation between Democrats’ and Republicans’ average reported beliefs from 0.81 to 0.86, indicating greater cross-party agreement once expressive responding is reduced.
  • Contrary to predictions from the two leading theories—misreporting ("cheerleading") and congenial inference—there is no evidence that expressive responding grows with partisan identity strength or with respondents’ educational attainment.
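The core quantity behind the first finding, the average proportional shrinkage of the measured partisan gap under treatment, is simple to compute. The sketch below is purely illustrative and is not the paper's code or data: the per-question gap values are invented, and the variable names are hypothetical; it only shows the shape of the calculation.

```python
# Illustrative sketch (NOT the paper's code): averaging the proportional
# reduction in measured partisan gaps across survey questions when a
# treatment designed to curb expressive responding is applied.
# All gap values below are made-up placeholders.

# Each tuple: (partisan gap in control condition, gap under treatment)
# for one hypothetical survey question.
gaps = [
    (0.40, 0.30),
    (0.25, 0.20),
    (0.50, 0.36),
    (0.10, 0.08),
]

# Proportional reduction in the measured gap for each question.
reductions = [(control - treated) / control for control, treated in gaps]

# Average shrinkage across questions, the analogue of the ~25% figure.
avg_reduction = sum(reductions) / len(reductions)
print(f"average gap reduction: {avg_reduction:.0%}")
```

In the actual meta-reanalysis this comparison is made across 242 questions from 44 studies, with appropriate weighting and uncertainty estimates rather than a raw mean.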

Implications for Scholars and Survey Practice

The results show that expressive responding is a real and measurable phenomenon, but one that reduces—rather than overturns—observed partisan disagreement in surveys by a modest amount. This matters for researchers who use surveys to map public opinion: measurement design can meaningfully change estimated polarization, but expressive responding alone does not appear to explain most partisan gaps.

What Comes Next

Graham argues that the field should move toward more design-based tests of mechanisms: targeted experiments that isolate why respondents give partisan answers and whether those survey-driven effects reflect differences in real-world political judgments. Such work will clarify when and how survey evidence on polarization reflects durable belief differences versus expressive signaling in the interview context.

Partisan Expressive Responding: Lessons from Two Decades of Research, by Matthew Graham, was published in the American Political Science Review (Cambridge University Press, 2026).