Resources to support systematic reviews
9-point guide to spotting a dodgy statistic
The Guardian’s guide to spotting dodgy statistics, written by David Spiegelhalter.
- Format: Text
- Language/s: English
- Target Audience: Self-directed learning
- Duration: 5-15 mins
- Difficulty: Intermediate
Key Concepts addressed
- 1-2c Association is not the same as causation
- 2-3d Fair comparisons with few people or outcome events can be misleading
Numbers allow us to get a sense of magnitude, to measure change, to put claims in context. But despite their bold and confident exterior, numbers are delicate things and that’s why it upsets me when they are abused. And since there’s been a fair amount of number abuse going on recently, it seems a good time to have a look at the classic ways in which politicians and spin doctors meddle with statistics.
Every statistician is familiar with the tedious "Lies, damned lies, and statistics" gibe, but Tim Harford, the economist, writer and presenter of Radio 4's More or Less, has identified the habit of some politicians as not so much lying – to lie means having some knowledge of the truth – as "bullshitting": a carefree disregard for whether the number is appropriate or not.
So here, with some help from the UK fact-checking organisation Full Fact, is a nine-point guide to what’s really going on.