
Why This Problem Matters
Counterfactual inference—asking “what if” questions and estimating causal effects—underpins prediction and policy advice in political science. Gary King and Langche Zeng show that when proposed counterfactuals lie far outside the observed data, conclusions from even well-specified statistical models can rest on speculation and untestable modeling assumptions rather than empirical evidence. This undermines confidence in many high-stakes claims about institutional change and international interventions.
How the Authors Tackle It
King and Zeng develop straightforward, easy-to-apply diagnostic methods that reveal when counterfactual claims are driven by model-dependence rather than by the data. Crucially, these procedures do not require performing sensitivity tests over pre-specified classes of alternative models. Instead, the diagnostics assess whether the data contain enough information to support the counterfactual inferences researchers are drawing.
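One intuitive version of such a diagnostic asks how far a proposed counterfactual sits from the nearest observed data point, with each covariate scaled by its observed range (Gower's distance) so variables on different scales are comparable. A minimal sketch in Python, using toy data and illustrative function names (this is not the authors' software, just the idea):

```python
# Sketch of a "distance to the data" diagnostic in the spirit of
# King and Zeng's approach. Function and variable names here are
# illustrative, not their software's API.

def gower_distance(a, b, ranges):
    """Mean range-normalized absolute difference across covariates."""
    return sum(abs(x - y) / r for x, y, r in zip(a, b, ranges)) / len(a)

def nearest_data_distance(counterfactual, data):
    """Smallest Gower distance from the counterfactual to any observed row."""
    cols = list(zip(*data))
    ranges = [max(c) - min(c) or 1.0 for c in cols]  # guard zero ranges
    return min(gower_distance(counterfactual, row, ranges) for row in data)

# Toy sample: three observations of two covariates.
data = [(0.0, 1.0), (1.0, 2.0), (2.0, 3.0)]
print(nearest_data_distance((1.0, 2.0), data))  # on-support point -> 0.0
print(nearest_data_distance((5.0, 9.0), data))  # far outside the data -> 2.25
```

A large distance signals that any answer to the "what if" question requires extrapolating well beyond the evidence, exactly the situation the diagnostics are designed to flag.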
What They Do in Practice
The core proposal is a pair of simple checks. First, determine whether each counterfactual falls inside the convex hull of the observed covariate data; if it does not, answering the question requires extrapolation. Second, measure how far the counterfactual lies from the nearest observed data point, using Gower's range-normalized distance so that covariates on different scales are comparable. Counterfactuals that fail these checks can only be evaluated through modeling assumptions rather than evidence.
Key Findings
Applying the diagnostics to prominent published work, including research on international peacebuilding, the authors find that many widely cited counterfactuals lie far outside the observed data. For such questions, seemingly innocuous changes in model specification can reverse substantive conclusions, even though every specification fits the observed data about equally well.
Practical Tools and Implications
King and Zeng provide free software (the WhatIf package for R) implementing all suggested diagnostics, enabling researchers to check the robustness of counterfactual claims before drawing policy-relevant conclusions. Their work encourages more cautious interpretation of “what if” arguments and offers concrete tools to improve transparency and credibility in causal inference across comparative and international political science.

When Can History Be Our Guide? The Pitfalls of Counterfactual Inference, by Gary King and Langche Zeng, was published by Oxford in International Studies Quarterly (ISQ) in 2007.