As the old saying goes, there are lies, damned lies and statistics. The major comparative analysis of the performance of the NHS across the four countries of the UK that we published last week brought this into sharp focus.
There has been much debate and consternation over the accuracy of one particular figure for the NHS in Scotland – and bear in mind this was an official figure published by the Office for National Statistics that has been in the public domain for three years. What these discussions have shown is the fundamental importance of comparative data on the performance of public services.
Let’s start by setting the record straight: the figure in question referred to the number of hospital medical and dental staff (whole time equivalents per 1,000 population) employed by the home nations. Unlike the figures for England, Wales and Northern Ireland, the statistic for Scotland also included dentists working in the community, thereby swelling its figure for 2006 from 10,161 (as it should have been) to 12,880. To reiterate, this was an official figure published by the ONS, but the statistics were compiled on a different basis across the four nations. It had remained undisputed until now.
Many of you may have seen the Scottish Health Secretary Nicola Sturgeon criticising the validity of our research as a result, but it is clear that the error was not the result of our analysis and research, which is conducted to the highest possible academic standards. Based on new data supplied by Scotland, we have amended the relevant graphs on our website – the full report will also be amended. Scottish officials have now come back with a list of questions about the officially published statistics, which we will be discussing with them and the ONS.
This has served to drive home two much more fundamental issues. First, the dearth of good comparative data on the performance of the health services of the UK. The authors of this report had to go to great lengths to obtain data that were comparable across the four nations. Although the UK Statistics Authority has a crucial role in monitoring the quality of statistics produced by each country, it does not have the powers to require the four UK countries to produce robust comparative data.
Second, the implication that comparative analyses such as the one in our report can hardly be conducted if such basic statistics prove to be inaccurate. Is not HM Treasury interested in value for money for UK taxpayers? There is much that practitioners and policymakers from across the UK could learn from the natural experiment now occurring across our health services, with countries adopting very different policies and systems of governance.
Our research provides a strong start. It compares the performance of the NHS across the four UK countries at three time points – 1996/7, 2002/3 and 2006/7. It also examines the performance of the ten English regions and compares them with the NHS in England as a whole and the NHS in each of the devolved countries in 2006/7 – the first time such an analysis has been conducted.
The report looks only at statistics that can be measured in the same way in the English regions and the devolved countries at three selected time points. It is therefore possible that the comparative statistics that are available fail to capture some important dimensions of performance. For example, the report does not look at quality of care for patients because a comprehensive analysis of this was published last year by the Health Foundation and showed no consistent differences across the four countries.
Some of the differences and trends we found between the countries may be because of the historical differences in funding levels, which are not directly related to policies implemented after devolution. But some will reflect the different policies pursued by each of the four nations since 1999, in particular the greater pressure put on NHS bodies in England to improve performance in a few key areas such as waiting and efficiency, via targets, strong performance management, public reporting of performance by regulators, and financial incentives.
Of course, this is but one management tool – and often a very crude one – but the NHS in England is well placed to demonstrate, at least partially, value for money from the unprecedented investment in health services that has characterised the 2000s.
However, while the report raises challenging questions particularly for the devolved nations, there is no room for complacency in England. This report and others show wide unexplained variations in performance within England. Our research therefore raises important questions about the efficiency of care across all four countries.
But back to our core theme, and a warning for UK policymakers: the lack of comparable data that would allow differences in performance across England, Scotland, Wales and Northern Ireland to be analysed in depth in future is a real concern.
Without such comparable data, it simply will not be possible for UK taxpayers and HM Treasury to know whether they are securing value for money from their health services.
Dixon J (2010) ‘Lies, damned lies and statistics’. Nuffield Trust comment, 27 January 2010. https://www.nuffieldtrust.org.uk/news-item/lies-damned-lies-and-statistics