There’s a particular form of serendipity that comes from learning something in one area which resolves a puzzle, or fills a gap in your thinking, in another area entirely. It’s all the more serendipitous – and pleasant – if you didn’t realise the gap was there.
This line of thought was prompted by this piece on the excellent FactCheck blog, which made me realise that I’d always been a bit dubious about the notion of “policy-based evidence”. OK, it’s a neat reversal – and all too often people who say they’re making evidence-based policy are doing nothing of the sort – but is the alternative really policy-based evidence? Doesn’t that amount to accusing them of just making it up?
Thanks to Cathy Newman at FactCheck, I realise now that I was looking at this question the wrong way. Actually “policy-based evidence” means something quite specific, and it hasn’t (necessarily) got anything to do with outright fraud. Watch closely:
Iain Duncan Smith has been celebrating the government’s benefits cap. Part of the welfare reform bill, the cap will limit state handouts to £26,000 a year so that “no family on benefits will earn more than the average salary of a working family,” i.e. £35,000 a year before tax.
Today, the work and pensions secretary was delighted to cite figures released by his department which he said were evidence that the policy is already driving people back into work. Of 58,000 claimants sent a letter saying their benefits were to be capped, 1,700 subsequently moved into work. Another 5,000 said they wanted support to get back into work, according to the figures.
OK, this is fairly simplistic thinking – We did a new thing! Something happened! Our thing worked! – but it’s something like a legitimate way to analyse what’s going on, surely. It may need more sophisticated handling, but the evidence is there, isn’t it?
Well, no, it isn’t.
In order to know how effective the policy had been, we would need to know the rate at which people on benefits worth more than £26,000 went into work before the letter announcing the changes was sent, and compare it to after the letter was received. But those figures aren’t available.
“[These figures do] not reveal the effect of the policy,” Robert Joyce, senior researcher at the Institute for Fiscal Studies, told us. Mr Joyce went on: “Indeed, this number is consistent with the policy having had no effect at all. Over any period, some fraction of an unemployed group will probably move into work, regardless of whether a benefits cap is about to be implemented. The number of people who moved into work as a result of the policy is 1,700 minus the number of people who would have moved into work anyway. We do not know the latter number, so we do not know the effect of the policy.”
The number of people, in a given group of claimants, who signed off over a given period is data. Collecting data is the easy part: take five minutes and you can do it now if you like. (Number of objects on your desk: data. Number of stationary cars visible from your window: data. Number of heartbeats in five minutes: data.) It’s only when the data’s been analysed – it’s only when we’ve compared the data with other conditions, compared variations in the data with variations in those conditions and eliminated chance fluctuations – that data turns into evidence. The number of people who moved into work as a result of the policy is 1,700 minus the number of people who would have moved into work anyway: that number would be evidence, if we had it (or had reliable means of estimating it). The figure of 1,700 is data.
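The arithmetic of the IFS point can be sketched in a few lines. The claimant and move-into-work figures below are the ones from the article; the 3% baseline exit rate is a purely hypothetical illustration, not a DWP figure — the whole problem is that the real baseline is unknown:

```python
# Data from the article: letters sent and subsequent moves into work.
claimants = 58_000
moved_into_work = 1_700

# Hypothetical assumption: suppose ~3% of such a group would have
# moved into work over the same period with no cap at all.
assumed_baseline_rate = 0.03
baseline_moves = round(claimants * assumed_baseline_rate)

# The evidence we'd want: observed moves minus counterfactual moves.
policy_effect = moved_into_work - baseline_moves

print(baseline_moves)  # 1740
print(policy_effect)   # -40
```

Under this (made-up) baseline, 1,700 moves would mean the cap had achieved nothing at all — which is exactly why the raw figure, on its own, is data rather than evidence.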
One final quote:
A spokesman for the Department for Work and Pensions said: “The Secretary of State believes that the benefits cap is having an effect.”
Et voilà: policy-based evidence.