How I caught my council using dodgy stats and why local open data is so important

Written by John Murray. Posted in Features

In the introduction to their 2014 annual report, Cheshire West and Chester Council stated ‘We are delighted that customer satisfaction with the council is up from 70.5% in 2009 to 96% in 2013/14, a remarkable increase during these difficult times.’

As a practitioner with over 30 years' experience of data analysis, particularly of customer surveys, I found the 96% figure remarkably high, particularly for a local council whose decisions about services are often controversial.

So I decided to use the Freedom of Information Act (FOI) to request the survey fieldwork and details of the methodology, including any weighting applied. The request was submitted on October 13 2014.

By law, the council had 20 working days to respond to the request, which meant I should have received a response by November 11; however, nothing was forthcoming. Following a chase, the council asked for a two-week extension on the grounds that ‘It is taking a little while longer to gather the information to answer your response’.

The initial response was received on November 25, but subsequent analysis showed that the data covered only the period July 2010 to October 2014 and omitted April 2009 to June 2010, the period critical to assessing the council’s claimed increase in its satisfaction rating. In addition, the responses to some questions in the survey were not consistent with the file layout documentation provided with the response.

In their response, the council said ‘I can confirm that the statement in the annual report is correct in that it relates to customer satisfaction with the council’s contact centre, as the conduit for all enquiries about services delivered by the council. The council recognises that the statement requires clarification and has therefore amended it online to explain that customer satisfaction is assessed using the feedback it receives from customers raising enquiries via its contact centre.’

I responded by asking the council to comply with the request in full and to clarify the record inconsistencies.

The missing data and clarification were received on December 18, 48 working days after the original request was made. I joined the two sets of responses together, removing the duplicate records that arose from the overlap between the two datasets supplied.
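For anyone repeating this kind of check, the joining step is straightforward. Below is a minimal sketch in Python, assuming the two FOI releases are saved as CSV files; the file names are hypothetical, and the columns follow whatever layout the council’s documentation specifies.

```python
import pandas as pd

# Hypothetical file names for the two FOI releases; the real column
# layout is described in the council's file layout documentation.
first_release = pd.read_csv("foi_release_2014-11-25.csv")
second_release = pd.read_csv("foi_release_2014-12-18.csv")

# Stack the two releases and drop the duplicate records caused by the
# overlap between the periods they cover.
combined = pd.concat([first_release, second_release], ignore_index=True)
combined = combined.drop_duplicates()

print(f"{len(combined)} unique survey records after de-duplication")
```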

What the data actually showed

As a result of my FOI request, the council amended the wording of the introduction to the online version of the annual report to read ‘We are delighted that customer satisfaction with the council’s contact centre is up from 70.5% in 2009 to 96% in 2013/14. The contact centre is the conduit for all enquiries about the services delivered by the council, and satisfaction is assessed using the feedback that customers give when raising enquiries via the contact centre.’

On March 13 of this year, the council leader, Mike Jones, stated in his monthly Q&A session (4:50 in) on Chester local radio that ‘We have got a very good satisfaction level of service delivery within the borough, 96% of the people, I think the number is, are satisfied with council services, which is really good.’

My FOI request showed that the claim of 96% satisfaction with the council is false. The figure actually relates to the first question in an automated telephone survey of those who have contacted the council’s customer services department by telephone, and it relates solely to how their enquiry was handled. The wording of the question is ‘How would you rate your overall satisfaction with the service you received today?’ The customer has four choices of response, 1 to 4 (excellent, good, adequate, poor).

The survey is conducted via an automated system which calls the customer and asks them to key their responses using the telephone keypad. As the telephone is just one method of contacting the council’s customer services centre, and the survey excludes online and face-to-face contact, it is not a representative sample of all users of the contact centre, and I suggest it will be biased towards those who prefer the telephone channel. No attempt was made to weight the responses to the survey, and those who did not answer were excluded.
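To make concrete what weighting would involve, here is a minimal sketch. The channel shares below are purely illustrative assumptions; the council has not published how its enquiries split between telephone, online and face-to-face contact.

```python
# Illustrative post-stratification weights by contact channel.
# All shares below are assumed placeholders, not figures from the council.

# Assumed share of all contact-centre enquiries made through each channel
channel_share = {"telephone": 0.60, "online": 0.30, "face_to_face": 0.10}

# Share of survey responses from each channel: the automated survey only
# calls telephone customers, so the other channels contribute nothing.
sample_share = {"telephone": 1.00, "online": 0.00, "face_to_face": 0.00}

# A simple design weight is population share divided by sample share.
for channel, pop in channel_share.items():
    samp = sample_share[channel]
    weight = pop / samp if samp > 0 else None
    label = f"{weight:.2f}" if weight else "undefined (no respondents)"
    print(f"{channel:>12}: population {pop:.0%}, sample {samp:.0%}, weight {label}")
```

With no respondents at all from the online and face-to-face channels, no choice of weights can make the sample representative of all contact-centre users; weighting can only correct imbalance among groups that are actually present in the sample.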

Furthermore, the claimed increase of 26 percentage points in satisfaction with the contact centre channel was also found to be false, as the 2009 figure included only those who had responded ‘excellent’, whereas the 2013/14 figure aggregated ‘excellent’ and ‘good’ responses. Instead of an increase of 26 points, the true figure was an increase of around 1.5 percentage points.

The responses aggregated by financial year are:

Q1: How would you rate your overall satisfaction with the service you received today?

Response             2013-14         2012-13         2011-12         2010-11         2009-10
No answer            1410            1682            1733            979             525
1 - Excellent        6327 (81.5%)    7262 (76.3%)    7569 (82.1%)    4032 (79.2%)    2318 (70.8%)
2 - Good             1171 (15.1%)    1754 (18.4%)    1408 (15.3%)    911 (17.9%)     760 (23.2%)
3 - Adequate         186 (2.4%)      338 (3.5%)      210 (2.3%)      121 (2.4%)      133 (4.1%)
4 - Poor             76 (1.0%)       168 (1.8%)      34 (0.4%)       28 (0.5%)       61 (1.9%)
Excellent or Good    96.6%           94.7%           97.4%           97.1%           94.1%

(Percentages are calculated over those who answered; ‘No answer’ records are excluded.)
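The bottom row of the table can be reproduced directly from the counts above; a minimal sketch in Python, using the counts as released under FOI, is:

```python
# Counts by financial year from the FOI release ('No answer' excluded,
# since the published percentages are based on those who responded).
counts = {
    "2013-14": {"excellent": 6327, "good": 1171, "adequate": 186, "poor": 76},
    "2012-13": {"excellent": 7262, "good": 1754, "adequate": 338, "poor": 168},
    "2011-12": {"excellent": 7569, "good": 1408, "adequate": 210, "poor": 34},
    "2010-11": {"excellent": 4032, "good": 911, "adequate": 121, "poor": 28},
    "2009-10": {"excellent": 2318, "good": 760, "adequate": 133, "poor": 61},
}

for year, c in counts.items():
    answered = sum(c.values())
    excellent_only = c["excellent"] / answered
    excellent_or_good = (c["excellent"] + c["good"]) / answered
    print(f"{year}: excellent only {excellent_only:.1%}, "
          f"excellent or good {excellent_or_good:.1%}")
```

Running it reproduces both bases of comparison: the ‘excellent only’ share from which the 2009 baseline was drawn, and the combined ‘excellent or good’ share behind the 96% figure.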

In the interests of transparency, it is important that claims made by democratic bodies, particularly in important documents, are true and accurate. The Chartered Institute of Public Relations, the Market Research Society and the Royal Statistical Society have jointly produced a best-practice guide for using statistics in communications.

Section 7 of the guidance document states:

Common pitfalls that can undermine your message

• Using a sample that is not representative of the audience being represented

The way in which a sample of the required audience is selected is of key importance to how representative the survey results are. A common mistake is to think that a particular group is representative of the target community, if many people are asked. People with similar views tend to have similar behaviour, for example going to work on the early morning trains, or attending certain sporting events, or being in the town centre in midmorning. A random sample of a lot of these people will clearly not be representative of a more general population.

• Getting bias through self-selection

Getting people in a community to ‘have their say’ is commonly used to gather views. This can be useful information but is very unlikely to be representative of the whole community as individuals have been able to opt in to the survey. This is because people with strong views on an issue are more likely to respond than those without strong views.
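The self-selection pitfall is easy to demonstrate with a small simulation. In the sketch below the population size and response probabilities are purely illustrative assumptions, not figures from the council’s survey.

```python
import random

random.seed(1)

# Hypothetical population of 100,000 service users, 70% of them satisfied.
population = [1] * 70_000 + [0] * 30_000

# Self-selection: assume dissatisfied users are three times as likely to
# 'have their say' as satisfied ones (illustrative probabilities only).
def responds(satisfied):
    return random.random() < (0.05 if satisfied else 0.15)

respondents = [s for s in population if responds(s)]

print(f"True satisfaction in the population:   {sum(population) / len(population):.1%}")
print(f"Satisfaction among those who opted in: {sum(respondents) / len(respondents):.1%}")
```

Even with thousands of responses, the opt-in figure lands well below the true 70%, which is the guidance’s point: volume does not fix an unrepresentative sample.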

In terms of lessons learned from the exercise, I believe a campaign is needed for honesty in public statistics, drawing attention to the guidance and covering not just figures published in tables but also those quoted in reports. Sources and references should always be provided, either as an appendix or as a web link.

Furthermore, I would call for the fieldwork for all council surveys to be published as open data as a matter of course.

 

The views expressed in the Opinion section of StatsLife are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of The Royal Statistical Society.
