Building on its success so far, Full Fact has revamped its website, making it more user-friendly and making fact-checked information on specific topics easier to find. It is also ramping up its ability to quash statistical howlers by live-blogging and tracking major events in real time.
To find out more about where Full Fact has been and what the future holds, we talked to director Will Moy about keeping media and political rhetoric accurate.
What kind of examples of statistical spin or fabrication did you see before Full Fact came about in the UK?
One of the arguments made in favour of the previous Health and Social Care Act, which merged three regulators, was that there were too many organisations with the right to review hospitals - 69 was the number that emerged. After some digging we found the source in the appendix of a report. Included in that number were not only organisations you’d expect, like the Healthcare Commission, but others with a limited involvement in healthcare, such as the Information Commissioner, and some far-fetched candidates, like the National Autistic Society, which might, at some point, be invited to visit a hospital.
If we’d existed in 2007, we would have fact checked this. Regardless of whether or not it was the right decision to integrate health and social care regulation, the debate needn’t have rested upon make-believe numbers. Today our biggest frustration is seeing claims made that we don’t have the resources or expertise to tackle, which is why we are recruiting a network of experts to help us.
How did media organisations and politicians treat Full Fact when it first started out and how long was it before they began to sit up and take notice of your work?
Initially we were met with bafflement: Ministers’ press officers would tell us, ‘no one else has asked that.’ But it was striking how quickly things changed and we saw the ‘they know we check’ effect taking hold - in our first year someone at a think tank asked, ‘I suppose a back of the envelope calculation wouldn’t be good enough?’ And when we live-fact checked BBC Question Time as its 'Extra Guest', it was striking how many times the panellists mentioned that they were being checked.
We’ve talked to a lot of journalists and editors at different papers who have been constructive and taken the trouble to correct inaccuracies - and there’s no neat split between tabloids and broadsheets. When we fact checked the polemicist Toby Young, he responded by writing an article with the title ‘Putting the record straight’. There are lots of people in the media who want to report accurately. That said, not all responses are like that. We gave over 80 pages of examples to the Leveson Inquiry including details of more obstructive cases.
People began responding to our work when we started seeking corrections. After the 2008 election the US fact checking site FactCheck.org carried out a poll. Even though they were doing more aggressive fact checking than ever before, they found that ‘millions of voters were bamboozled anyway’, by a series of inaccurate claims from both sides. Instead of being discouraged by this, we concluded that we had to get claims corrected at source, before they’re spread further and drift into public consciousness.
How do you decide which newspaper stories or political pronouncements to investigate?
Each morning and throughout the day we monitor a range of sources for possible claims. Potential claims have to meet various practical criteria, such as being susceptible to fact checking (for example, we can’t fact check claims that rely on future evidence), the availability of primary sources, avoiding trivia and so on. Overall we look for claims that are interesting, important, or potentially influential.
We monitor Hansard, government and party press releases, major speeches, national daily papers and significant news and current affairs programmes, and other things depending on circumstance. For example, around Budget day we might pay more attention to economic publications. We also rely on suggestions from experts and the public, but have to take care that we don’t prioritise any particular interest group.
We focus on five core subject areas which consistently top Ipsos MORI’s Issues Facing Britain poll, which we’ve translated into the following topics: the economy, health and social care, immigration, crime and justice, and education. We’ve tried to make our website more relevant to the things the public cares about by restructuring it around issues, rather than politics more generally.
What is your view of the government's current open data initiative?
Open data is a great thing of which we have seen too little, too late, and in the case of statistics particularly, too poorly done. Because it has been done poorly, take-up has been limited. But if we blindly apply the broader work of open data to the particular challenge of making official statistics open, we risk becoming a country that knows the format of everything and the meaning of nothing.
A statistic isn’t a number on its own. It’s the number plus the explanation of where it comes from, how it’s generated, how that has changed over time, the caveats, definitions and what it can and can’t be compared to. Openness about this entire context is a big part of what makes official statistics trustworthy.
So I worry that the open data agenda doesn't quite apply to statistics. When we gave evidence to the Public Administration Select Committee on open data, we tried to illustrate what it would take to adapt Tim Berners-Lee's excellent and valuable five star model for open data to statistics - things like making sure metadata, explanations and caveats are included in the output.
Without that kind of concern, you get things like data.gov.uk, which I think hasn't done justice to the richness of what we know about the world through official statistics.
How important are statisticians in your research? Are there some technical topics where you have had to refer to a professional statistician in order to get an accurate picture?
We’re lucky to have two capable statisticians on our team: Mike Hughes, who was Director of Policy at the Office for National Statistics, and Emily Barnett, who’s on secondment from the Ministry of Justice.
In 1995 Jack Straw made a speech to the RSS, describing the decline of lengthy reporting of speeches in the Commons, and the subsequent abandonment of ‘straight reporting of Parliament’:
'One consequence of this is that the allegedly factual report has replaced the speech as a key political weapon. Those of us who have spent as long as I have in opposition know that a statistically based report on this or that will command far greater attention than ever a speech will.'
This places a premium on the nature and availability of official statistics, he argued, and statisticians have changed from relatively neutral observers into active referees.
We’ve found over the past few years that in-depth fact checking requires a sound understanding of statistics. As well as recruiting Mike and Emily, we often seek out statisticians who can help with specific cases of inaccuracy. When the children's heart surgery unit was suspended at Leeds General Infirmary and a huge public outcry ensued, we asked a perinatal epidemiologist (an expert in pregnancy and early childhood) to help us explain why the mortality statistics being quoted couldn’t yet be taken at face value: the numbers were rough estimates, and very sensitive to being misinterpreted when they became public.
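The Leeds example turns on a general statistical point: an observed mortality rate from a small number of operations carries wide uncertainty, so two units with the same headline rate can be very differently placed. The sketch below is purely illustrative - the figures are invented and this is not the epidemiologist's actual analysis - but it shows, using a standard Wilson score interval, how the plausible range around the same 4% rate narrows as the number of operations grows.

```python
import math

def wilson_interval(deaths, operations, z=1.96):
    """95% Wilson score confidence interval for an observed proportion."""
    p = deaths / operations
    denom = 1 + z**2 / operations
    centre = (p + z**2 / (2 * operations)) / denom
    half = z * math.sqrt(p * (1 - p) / operations
                         + z**2 / (4 * operations**2)) / denom
    return centre - half, centre + half

# Hypothetical figures: the same 4% observed mortality rate is far
# less certain when it comes from a small number of operations.
for deaths, ops in [(4, 100), (40, 1000)]:
    lo, hi = wilson_interval(deaths, ops)
    print(f"{deaths}/{ops}: {lo:.1%} to {hi:.1%}")
```

With 100 operations the interval spans roughly 1.6% to 9.8%; with 1,000 it tightens to roughly 3.0% to 5.4% - which is why rough, small-sample estimates are so sensitive to misinterpretation once they become public.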
This month we will be working on the next phase of our new website, which will reflect our merger with Straight Statistics. We'll also be seeking contributions from statisticians.
How do you see Full Fact evolving and how will you be approaching the run up to the 2015 general election?
The debate is going to be faster and noisier in the run up to the election. The main challenge for us will be fact checking thoroughly enough and fast enough that our work reaches all the people it needs to.
We’ll only be successful if we find experts who are willing to help us fact check. Our trial partnership with BBC Question Time has proved a good example of this: each week we received invaluable comments, challenges and tips from experts, and we hope that can continue outside that context.
We aim to give our users the tools and advice they need to fact check claims for themselves. That ranges from relatively simple things, such as a better interface to the Treasury’s GDP deflator, to tools that we need for our own work as well as to help others, like Finder - a searchable guide to major sources of information that are used, or should be used, in public debate.
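The GDP deflator mentioned above is the standard way of re-expressing nominal spending figures in the prices of a chosen base year, so that amounts from different years can be compared fairly. A minimal sketch of the calculation follows - the deflator index values here are made up for illustration, not taken from Treasury publications:

```python
def to_real_terms(nominal, deflator_year, deflator_base):
    """Convert a nominal amount into real (inflation-adjusted) terms.

    The GDP deflator is an index of whole-economy prices; dividing a
    nominal figure by its own year's deflator and multiplying by the
    base year's deflator re-expresses it in base-year prices.
    """
    return nominal * deflator_base / deflator_year

# Illustrative (invented) deflator index values, base year 2010 = 100.0
deflators = {2005: 88.0, 2010: 100.0}

# A nominal £50bn spent in 2005, expressed in 2010 prices:
real = to_real_terms(50.0, deflators[2005], deflators[2010])
print(f"£{real:.1f}bn in 2010 prices")
```

A £50bn figure from 2005 is worth more in 2010 prices (here about £56.8bn), which is exactly the kind of adjustment a claim comparing spending "then and now" should make - and the kind of check a better interface to the deflator would let anyone perform.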
In our experience there are three things that have most impact on how effectively we can stop specific misuses of facts: how early we can spot them, how quickly we can analyse them, and how clearly the rebuttal is presented. We’re creating three tools to meet these needs: we’ve already launched Finder and are adding to it week by week, and there are two more ambitious tools still in the pipeline.