1. Methodology background


 National Statistic: Yes
 Survey name: Public service productivity: total
 Frequency: Annual
 How compiled: Based on third party data
 Geographic coverage: UK
 Last revised: 9 January 2019


2. About this Quality and Methodology Information report

This quality and methodology report contains information on the quality characteristics of the data (including the European Statistical System five dimensions of quality) as well as the methods used to create it. The information in this report will help you to:

  • understand the strengths and limitations of the data

  • learn about existing uses and users of the data

  • understand the methods used to create the data

  • decide suitable uses for the data

  • reduce the risk of misusing data


3. Important points

  • The estimate for public service productivity is displayed as an index, showing the change over time of the amount of output provided for each unit of input.

  • To remove the effect of price changes over time, public service output and inputs are measured in quantity terms (also referred to as volume terms), instead of expenditure terms.

  • Some public service area outputs are also adjusted for changes in the quality of activities and services provided, as recommended by the Atkinson Review (PDF, 1.07MB); this is so the outcome of a public service can be observed, rather than the output alone.

  • Productivity estimates included in this article are multi-factor productivity estimates as opposed to labour productivity estimates (a single factor productivity measure), and so are not comparable with our headline measures of whole-economy labour productivity.

  • These estimates are produced to measure the productivity of total UK public services, but do not measure value for money or the wider performance of public services.


4. Quality summary

Overview

Total public service productivity is estimated by comparing growth in the total output provided with growth in the total inputs used. If the growth rate of output exceeds the growth rate of inputs, productivity increases, meaning that more output is being produced for each unit of input. Conversely, if the growth rate of inputs exceeds the growth rate of output, then productivity will fall, indicating that less output is being produced for each unit of input.
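As a stylised numerical illustration (invented figures, not from the publication): if total output grows by 3.0% over a year while total inputs grow by 1.0%, the productivity index grows by approximately

\[
\frac{1.030}{1.010} - 1 \approx 0.020 = 2.0\%
\]

so productivity rises because each unit of input yields more output than in the previous year.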

Output, inputs and productivity for total public services are estimated by combining growth rates for individual services using their relative share of total government expenditure as weights.

Productivity estimates included in this article are multi-factor productivity estimates as opposed to labour productivity estimates, and so are not comparable with our measures of whole-economy labour productivity. This is because the inputs for public service productivity include goods and services and capital inputs, in addition to labour input. The public service productivity measures included in this article are also not directly comparable with our market sector multi-factor productivity estimates due to differences in the methodology used. For further information on multi-factor productivity methodology, please see A simple guide to multi-factor productivity.

These estimates are produced to measure the productivity of total UK public services. They do not measure value for money or the wider performance of public services. They do not indicate, for example, whether the inputs have been purchased at the lowest possible cost, or whether the desired outcomes are achieved through the output provided.

The methodology for calculating these statistics is based on the recommendations of the Atkinson Review (PDF, 1.07MB) on the measurement of government output and productivity for the national accounts. Estimates are published on a calendar year basis to be consistent with the UK National Accounts, and estimates are available both for total and for an individual service area breakdown. These are included in Public service productivity, total, UK.

More information on the methodology and sources used can be found in Sources and Methods for Public Service Productivity Estimates: Total Public Services (PDF, 111KB).

Uses and users

Users of our public service productivity measures include:

  • departments within UK government such as the Cabinet Office, HM Treasury and regulatory bodies

  • the National Audit Office

  • press and general public

  • the Office for Budget Responsibility

  • the Institute for Fiscal Studies (IFS)

  • the Nuffield Trust

  • academia

  • international statistical bodies

These organisations use the productivity estimates in a number of ways. Total public service productivity estimates have informed previous IFS Green Budgets, have been used directly by the Nuffield Trust, and are regular inputs into briefings for Cabinet Office ministers and permanent secretaries. We have also advised government departments on how to incorporate the general methodology of the estimates into their own work.

Feedback from users is received via user surveys and consultation events. Acting on such feedback, we are undertaking a development programme to improve public service productivity statistics across all service areas. As well as the annual estimates that are the focus of this release, we also publish experimental estimates of quarterly public service productivity – allowing for statistics to be provided on a more timely basis.

Strengths and limitations

Strengths of Public service productivity: total, UK include:

  • the majority of data we use are administrative data and as such we are not reliant on surveys

  • the dataset is decomposed in multiple ways, allowing us to provide comparable but different insights; for example, decomposing by service area shows how the quality adjustment for education can affect the total productivity figure, while decomposing by component can show how an increase in spending on a particular factor of inputs changes the higher-level figures

  • it shows the impact of quality change on public services output

  • the dataset is continuously improved under an open revisions policy – the estimates are not constrained by Blue Book procedures

Limitations of Public service productivity: total, UK include:

  • there is a two-year time lag in producing the estimates because of data availability – in this release, the time series covers the period 1997 to 2016; to account for this lag, we now also produce experimental quarterly estimates of public service productivity

  • several different ways of measuring output are used in producing the statistic – some service areas are quality adjusted, some are directly measured, and for the remainder, output is assumed to be equal to inputs, implying zero productivity growth for these service areas; work will continue to develop the output measurements

  • there is no geographical breakdown of the estimate – the numbers given are for the UK as a whole

Recent improvements

Two major improvements have been made to the statistic as part of Public service productivity: total, UK, 2016.

Changes to adult social care

Improvements have been made to the inputs and output data, and a new quality adjustment measure has been developed to apply to the output.

For inputs, local authority expenditure on housing and social protection not elsewhere classified has been excluded from the adult social care (ASC) expenditure. There has also been a source change for NHS funding.

Improvements have been made to the deflators used for ASC inputs – for labour and intermediate consumption under local authority own provision, and for direct payments and other services under care provided by others.

For output, the following improvements have been made:

  • introduction of a new cost-weighted activity index for the new data source for England from financial year ending (FYE) 2015 onwards

  • improvements to the cost-weighting of activity for the years before FYE 2015, including the incorporation of NHS spending and merging the cost weights for local authority (LA) and independent sector provided residential care

  • modification of activity data to better align with inputs and NHS Digital’s unit costs analysis

  • incorporation of “output-equals-inputs” elements of output, for services where data on the number of activities carried out is not available, including direct payments

  • use of “output-equals-inputs” estimation for the output growth rate between FYE 2014 and FYE 2015

A new quality adjustment measure has also been developed to apply to the output indices. It is based on the Adult Social Care Outcomes Framework (ASCOF) and the concept of adjusted social care-related quality of life introduced within it. The data used for this measure are available only for England but, owing to the lack of similar data sources for the devolved administrations, the adjustment is applied to the whole of the UK. The data come from a survey, conducted each financial year, which asks individuals in local authority-supported care to rate the quality of the care they receive against different criteria, such as food and nutrition, safety, and social participation. From these responses, two quality adjustment measures are built: one for community care and one for residential and nursing care.

The quality adjustment has been applied to the series from 2011 onwards.

A new splining process

To convert reported data onto a calendar-year basis, a recognised statistical technique known as cubic splining is used. This technique allows academic year volume estimates and annual financial spending measures to be split into monthly data, which are then re-aggregated to create calendar year figures.
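As an illustration of the general idea only, the sketch below splits hypothetical financial-year totals into months using a standard cubic spline on the cumulative series, then re-aggregates them to a calendar year. It is a simplified example with invented figures, not the X-13ARIMA-SEATS based procedure described below.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical financial-year totals (FYE 2014 = April 2013 to March 2014).
fy_totals = {2014: 98.0, 2015: 100.0, 2016: 103.0, 2017: 107.0}

# Fit a cubic spline to the cumulative totals at financial-year boundaries,
# then difference it to obtain monthly values that sum back to the annual figures.
fy_end_months = np.arange(1, len(fy_totals) + 1) * 12          # months elapsed at each FY end
cumulative = np.concatenate(([0.0], np.cumsum(list(fy_totals.values()))))
spline = CubicSpline(np.concatenate(([0.0], fy_end_months)), cumulative)

month_grid = np.arange(0, fy_end_months[-1] + 1)               # month 0 = start of April 2013
monthly = np.diff(spline(month_grid))                          # 48 monthly values

# Re-aggregate to a calendar year: January 2014 is the 10th month of the series,
# so calendar year 2014 covers months 10 to 21 (indices 9 to 20).
calendar_2014 = monthly[9:21].sum()
print(round(calendar_2014, 2))
```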

In the previous annual release, a combination of software packages was used: X-13ARIMA-SEATS for stock data and X-11ARIMA-SEATS for flow data. For the upcoming publication, the method used will be updated, with X-13ARIMA-SEATS used throughout, for both stock and flow data, in line with latest national accounts practices. X-13ARIMA-SEATS has improved methods for forecasting and backcasting data, as well as an updated method for interpolating a trend within a data series.

In addition, the data are now splined on a monthly, rather than quarterly, basis. The change to monthly has been made to allow for a more accurate specification of the start of the academic year when re-aggregating to calendar year series, and to better account for unevenly spaced data series.

As a result of incorporating these changes, some of the annual growth rates may change as the volume of output and inputs calculated from reported data will be apportioned between calendar years in a slightly different manner. However, previous similar changes resulted in a minimal overall effect.


5. Quality characteristics of the data

Relevance

(The degree to which the statistical product meets user needs for both coverage and content.)

The UK Centre for the Measurement of Government Activity (UKCeMGA) was launched in 2005 to take forward the recommendations of the Atkinson Review (PDF, 1.07MB), with the aim of improving the measurement of government output, inputs and productivity, and establishing a regular reporting schedule.

In the years since the review was published, we have developed estimates of healthcare and education output, inputs and productivity. These estimates are updated annually, and any methods changes are explained in papers and articles that we have published previously. We also periodically update estimates of output, inputs and productivity for the remaining areas of government final consumption, based on Classification of the Functions of Government (COFOG) expenditure. These are:

  • adult social care

  • children’s social care

  • social security administration

  • public order and safety

  • police

  • defence

  • other government services (includes economic affairs, general public services, recreation, housing and environmental protection)

There are three different statistical outputs published in Public service productivity: total, UK:

  • a volume index of total public services output and indices of output by service area

  • a volume index of total public services inputs and indices of inputs by service area

  • a derived index for total public services productivity and by service area (output per unit of inputs)

Accuracy and reliability

(The degree of closeness between an estimate and the true value.)

Both the output and inputs series for each service area are constructed using a variety of administrative and national accounts data. The accuracy of the derived series therefore depends on the accuracy of the source data. Unless we have introduced substantial methodological changes, the main source of revisions to each service area’s productivity estimates will be changes in source data and expenditure weights.

As there is no other source of public service productivity estimates that is comparable in methodology, validating our results is difficult. Instead, validation is supported through regular triangulation articles, as set out in the Atkinson Review.

It is difficult to provide a confidence interval around our estimates given the multiple sources of data on which the estimates are based. There will inevitably be some margin for error from a “true” measure of productivity, which is unknown. We collate triangulation evidence from other government departments and independent sources, which provides additional context to inform the interpretation of the public service productivity statistics.

Coherence and comparability

(Coherence is the degree to which data that are derived from different sources or methods, but refer to the same topic, are similar. Comparability is the degree to which data can be compared over time and domain, for example, geographic level.)

Assessing coherence of the data in Public service productivity: total, UK is difficult as there are currently no comparable measures published. We convert some source data from financial year to calendar year and aggregate results to a UK level, which makes it difficult to make comparisons at a country level. Service areas are also defined by the Classification of the Functions of Government (COFOG) rather than by administrative department or devolved administration. The different methodologies developed for healthcare and education, and the “output-equals-inputs” treatment of three service areas (police, defence and other), mean that direct comparisons between service areas should not be made.

The four different methods that we use to measure output and their distribution between service areas can be seen in Figure 1.

The estimates cover the UK and, where possible, are based on data for England, Scotland, Wales and Northern Ireland. Where data are not available for all four countries, the assumption is made that the available data are representative of the UK. This can happen for quality adjustment, output or inputs data.

Finally, in instances where the data are available for all four countries of the UK, there may be slight variations in definitions or reporting conventions that introduce additional, largely unquantifiable effects on our estimates.

Accessibility and clarity

(Accessibility is the ease with which users can access the data, also reflecting the format in which the data are available and the availability of supporting information. Clarity refers to the quality and sufficiency of the release details, illustrations and accompanying advice.)

Our recommended format for accessible content is a combination of HTML web pages for narrative, charts and graphs, with data being provided in usable formats such as CSV and Excel. We also offer users the option to download the narrative in PDF format. In some instances, other software may be used, or may be available on request. For further information, please refer to the contact details at the beginning of this page.

For information regarding conditions of access to data, please refer to the following links:

In addition to this Quality and Methodology Information, basic quality information relevant to each release is available in the Quality and methodology section of the relevant article.

Notifications of changes in methodology are published on the public service productivity topic-specific methodology page, and historic changes are available in the guidance and methodology area of our archive website.

Timeliness and punctuality

(Timeliness refers to the lapse of time between publication and the period to which the data refer. Punctuality refers to the gap between planned and actual publication dates.)

Estimates of output, inputs and productivity in the total public sector are published on a calendar-year basis, and generally refer to the period (t-2), with t being the current year of publication. This degree of timeliness is consistent with the publication of data relating to expenditure weights for government services, which are published with a (t-2) reference period in the European System of Accounts (ESA) Table 11. If the reference period were moved, for example, to (t-1), there would be a significant increase in the use of estimation to fill data gaps in the productivity articles ahead of the publication of these datasets.

For more details on related releases, the GOV.UK release calendar provides 12 months’ advance notice of release dates. In the unlikely event of a change to the pre-announced release schedule, public attention will be drawn to the change and the reasons for the change will be explained fully at the same time, as set out in the Code of Practice for Statistics.

To date, each Public service productivity: total, UK article has been published as scheduled.

Concepts and definitions

(Concepts and definitions describe the legislation governing the output and a description of the classifications used in the output.)

Our analysis of productivity in UK public services represents internationally pioneering work. Measurement of outputs follows guidance in the System of National Accounts 1993: SNA 1993 and subsequent SNA 2008, as well as the European System of Accounts 1995: ESA 1995 and subsequent ESA 2010. Measurement of outputs (including the need to measure change in quality), inputs and productivity follows the principles in the Atkinson Review. The estimates presented in the article are for service areas classified by the Classification of the Functions of Government (COFOG).

Geography

Estimates are published on a UK geographic basis, with no further geographic breakdown provided.

Output quality

This statistic is a National Statistic and so meets the quality requirements of this status. It measures total productivity and the productivity of nine different service areas, offering comprehensive coverage of the data required by users.

Why you can trust our data

The Public service productivity: total, UK statistic is produced in accordance with the best practices set out in the Statistics Authority’s Code of Practice and the ONS’s Data Policies.

Any revisions to the data are clearly identified as such and limitations are made known to all users.


6. Methods used to produce the data

Main data sources

A range of data sources are used to provide a comprehensive picture of UK public services. A summary of these data sources is documented in Sources and Methods for Total Public Service Productivity Estimates: Total Public Services (PDF, 111KB).

How we process the data

The following section outlines the main statistical methods used to compile estimates of public service inputs, output and productivity. A detailed explanation of the methods used is given in Sources and Methods for Total Public Service Productivity Estimates: Total Public Services (PDF, 111KB). Significant methods changes are published in advance on the topic specific methodology page to inform users of both the nature and the likely impact of methodological changes.

Measuring output

The methods of measuring output vary between and within service areas. Table 1 provides a breakdown of these.

The output measures used are based on, or taken in chained volume terms from, the Blue Book. Given that most public services are supplied free of charge or at cost price, they are considered non-market output. The output of most services is measured by the activities and services delivered. These are usually referred to as “direct output” measures. The activities are measured and aggregated into a single output volume according to their relative cost or share of service area expenditure. This is referred to as a Cost-Weighted Activity Index.
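In general terms, a cost-weighted activity index combines the growth in each activity using its share of expenditure as the weight. As a stylised illustration (the notation here is illustrative rather than that of the published sources and methods), with $x_{a,t}$ the count of activity $a$ in period $t$ and $c_{a,t-1}$ its expenditure weight, the index grows as:

\[
\frac{X_t}{X_{t-1}} = \frac{\sum_a c_{a,t-1}\left(\dfrac{x_{a,t}}{x_{a,t-1}}\right)}{\sum_a c_{a,t-1}}
\]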

For “collective services” – those that are not provided to an individual, such as defence – it is difficult to define and measure the nature of their output. It is assumed for such services that the volume of output is equal to the volume of inputs used to create them. This is referred to as the “output-equals-inputs” convention.

In addition, a quality adjustment factor is applied to the activity volume index of several service areas. The purpose of these quality adjustment factors is to reflect the extent to which the services succeed in delivering their intended outcomes and the extent to which services are responsive to users’ needs. This results in estimates differing from those used in the national accounts.
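Schematically, and as an illustration of the role of the adjustment rather than the exact published formula, quality-adjusted output can be thought of as the unadjusted activity index scaled by an index of quality change:

\[
O^{\text{adjusted}}_t = O^{\text{activity}}_t \times Q_t
\]

where $Q_t$ is the quality adjustment index for the service area.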

There are currently four service areas that include such an adjustment:

  • healthcare

  • education

  • public order and safety

  • adult social care

Healthcare

The healthcare productivity quality adjustment is a compound measure made up of five components:

  • short-term post-operative survival rates

  • estimated health gain from procedures

  • waiting times

  • primary care outcomes achievement under the Quality and Outcomes Framework

  • National Patient Surveys scores

This quality adjustment process is applied from 2001 onwards. In the national accounts series, no quality adjustment is applied to healthcare output at present.

Further detail can be found in Source and Methods Public Service Productivity Estimates: Healthcare (PDF, 328.6KB).

Education

Quality adjustments are applied to primary and secondary schools output as well as initial teacher training (ITT) output. The process for schools is applied for the entire time series, while for ITT the quality adjustment process is applied from 2001 onwards.

The current schools’ adjustment for England uses Level 2 attainment, which measures the proportion of students attaining the threshold of five approved GCSE subjects or a Level 2 vocational qualification of equivalent size. This method replaces use of average point scores from 2008 onwards. Prior to 2008, GCSE average point scores continue to be used. The new quality adjustment for England will continue to be applied for Northern Ireland while a suitable data source on attainment is investigated.

Average point scores for Scotland are no longer available because of major changes to the education system, and identification of a suitable alternative is not yet feasible while changes are ongoing. As the average point score for Scotland has followed a consistent trend over time and Scotland makes up a small proportion of UK education expenditure, as a short-term solution the quality adjustment for Scotland has been forecast as a geometric average of the preceding five years (see the sketch below). Average point scores for GCSEs continue to be used for Wales, where there have not been significant changes to the education system. Finally, ITT output is adjusted using the proportion of students who pass their Qualified Teacher Status assessment. In the national accounts series, no quality adjustment is applied to education output at present.
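Purely as an illustration of a geometric average forecast of this kind (the exact specification used for Scotland is not set out here, and the figures are invented), the sketch below extrapolates a series one year ahead using the geometric mean of its previous five growth rates.

```python
import numpy as np

# Hypothetical quality adjustment index levels for the preceding six years.
levels = np.array([100.0, 100.8, 101.5, 102.1, 102.9, 103.6])

# Geometric mean of the previous five year-on-year growth factors.
growth_factors = levels[1:] / levels[:-1]
geo_mean = growth_factors.prod() ** (1.0 / len(growth_factors))

# Forecast the next year by rolling the latest level forward at that rate.
forecast = levels[-1] * geo_mean
print(round(forecast, 2))
```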

Further detail can be found in Source and Methods Public Service Productivity Estimates: Education.

Public order and safety

Quality adjustments are applied to the criminal justice system elements of public order and safety output. This includes output associated with Crown Courts, magistrates’ courts, legal aid, the Crown Prosecution Service, and prison and probation services. The adjustment has two main parts. The first adjusts the whole series by a severity-adjusted measure of total reoffences per offender. The second looks more closely at the different service areas. For prisons, this includes escapes and safety inside prisons, measured using the number of incidents and their severity. For courts, it uses the timeliness with which courts process the cases passed on to them by the police.

Further detail can be found in Quality adjustment of public service public order and safety output: current method.

Adult social care

The Adult Social Care Outcomes Framework (ASCOF) is used to produce two different quality adjustment measures, for community care and for residential and nursing care. These make up the quality adjustment used for adult social care (ASC) measures. The data are taken from the Adult Social Care Survey (ASCS), which in the 2017 financial year had a sample size of 58,000, and includes questions on the individual’s needs, the degree to which they are met, and demographics.

Respondents are asked to rate how well their care needs are met in eight domains, such as food and nutrition, accommodation and safety. Each level of response is then weighted by its importance to quality of life, using weights derived from another survey of community care users.

There are two alternative methods for isolating the impact of ASC services on the respondents’ care-related quality of life from other factors outside the control of ASC services.

For community care, factors from the calculations used in the ASCOF are applied to the person-level data in the ASCS to remove the influence on care-related quality of life of: clients’ age, health status, suitability of clients’ home for meeting their needs, and clients’ ease of travelling around outside in their local environment.

As the factors used in the ASCOF relate only to community care users, a regression model is used for residential and nursing care to estimate the impact of ASC services on care-related quality of life, controlling for these external factors.
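As a stylised sketch of this kind of adjustment (not the ONS model specification; the variable names, coefficients and data are invented for illustration), a regression of care-related quality of life on factors outside the control of care services can be used to strip out their estimated influence:

```python
import numpy as np

# Hypothetical person-level survey data: a care-related quality of life score
# plus factors outside the control of adult social care services.
rng = np.random.default_rng(0)
n = 500
age = rng.uniform(65, 95, n)
health = rng.integers(1, 6, n).astype(float)          # self-reported health, 1 (poor) to 5 (very good)
home_suitable = rng.integers(0, 2, n).astype(float)   # home suitability indicator
qol = 0.6 - 0.001 * age + 0.02 * health + 0.05 * home_suitable + rng.normal(0, 0.05, n)

# Ordinary least squares of quality of life on the external factors.
X = np.column_stack([np.ones(n), age, health, home_suitable])
coef, *_ = np.linalg.lstsq(X, qol, rcond=None)

# Remove the estimated influence of the external factors, leaving an adjusted
# score that is attributed to the care services themselves.
adjusted_qol = qol - X[:, 1:] @ coef[1:]
print(round(adjusted_qol.mean(), 3))
```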

The quality adjustment has been applied to the series from 2011 onwards.

Further detail can be found in Measuring adult social care productivity in the UK and England: 2016.

Measuring inputs

The inputs measures used are based on or taken from a mixture of expenditure, deflator and administrative data sources. They consist of compensation of employees, intermediate consumption and consumption of fixed capital of each service by central government and local government.

Central government expenditure data are sourced in current prices from HM Treasury’s public spending database – Online System for Central Accounting and Reporting (OSCAR) – that collects financial information from across the public sector. Annual estimates are derived from monthly profiles of spending for the current financial year and modified to meet national accounts requirements.

Most local government expenditure data are sourced from financial year returns by local authorities, apportioned across calendar years.

Expenditure data are subsequently adjusted for price changes (deflated) using a suitable deflator (price index), so that input volumes are measured indirectly.
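In stylised terms (illustrative notation only), the volume of an input in period $t$ is obtained by dividing its current price expenditure by the relevant price index:

\[
V_t = \frac{E_t}{P_t / 100}
\]

where $E_t$ is expenditure in current prices and $P_t$ is the deflator with its reference period set equal to 100.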

For a number of inputs – in particular, most healthcare and education labour inputs – volume series are measured directly using administrative data sources (for example, full-time equivalent staff numbers from NHS staff resources).

Deflator or price indices

A suitable deflator (price index) – or composite deflator – is applied to each current price expenditure to estimate a volume series. The Atkinson Review (PDF, 1.07MB) recommends that deflators are applied separately for each factor and that the price indices should be specific for each service. Price indices for labour and procurements should be sufficiently disaggregated to allow for changes in the compositions of the inputs. Currently, deflators are taken from a range of different sources to best represent changes in prices for each service input. Where suitable data are unavailable, the gross domestic product (GDP) implied deflator (acting as a generic price index) is used instead.

These series are aggregated to form an overall estimate of the volume of inputs used to provide each of the public services included in the total public services measure.

Further detail can be found in the Sources and Methods for Public Service Productivity Estimates: Total Public Services (PDF, 111.4KB).

Aggregating service area inputs and output

The expenditure shares of each public service component are calculated using a breakdown of general government final consumption expenditure (GGFCE) by the Classification of the Functions of Government (COFOG). ONS publishes this breakdown in ESA Table 11 and provides the data to Eurostat for the Excessive Deficit Procedure (EDP) in accordance with the Maastricht Treaty.

The EDP source is used for the following reasons:

  • consistent time series are available for all public service components

  • the data are published on a regular basis

  • a detailed breakdown is available, allowing us to separate, for example, adult social care and children’s social care from social protection

Aggregating output

Estimates of total public sector output are produced by weighting and then aggregating the volume of output in each service area. The weights used in this process are the service area COFOG expenditure weights, and are applied to form a chain-linked Laspeyres volume index of total public service output.

To identify the impact of including three sectors in which an “output-equals-inputs” convention is used, Public service productivity: total, UK reports the headline results of a sensitivity test that excludes these sectors. The following equation is used:

\[
O_t = O_{t-1} \times \frac{\sum_j e_{j,t-1}\left(\dfrac{O_{j,t}}{O_{j,t-1}}\right)}{\sum_j e_{j,t-1}}
\]

Where:

  • O is a Laspeyres index of output

  • e is expenditure

  • t and j index time and service areas respectively

  • O_{t=0} is set equal to 100
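As a minimal sketch of this chain-linking calculation (hypothetical service area data, not ONS figures), the function below aggregates service area volume indices into a total index using previous-period expenditure shares as weights:

```python
import numpy as np

def chain_linked_laspeyres(indices: np.ndarray, expenditure: np.ndarray) -> np.ndarray:
    """Chain-link service-area volume indices into a total index (base period = 100).

    indices:     array of shape (periods, service_areas) of volume indices
    expenditure: array of the same shape, in current prices, used as weights
    """
    periods = indices.shape[0]
    total = np.empty(periods)
    total[0] = 100.0
    for t in range(1, periods):
        growth = indices[t] / indices[t - 1]                     # service-area growth factors
        weights = expenditure[t - 1] / expenditure[t - 1].sum()  # previous-period shares
        total[t] = total[t - 1] * (weights * growth).sum()
    return total

# Hypothetical example: two service areas over three periods.
output_indices = np.array([[100.0, 100.0],
                           [102.0, 101.0],
                           [104.0, 101.5]])
spend = np.array([[60.0, 40.0],
                  [62.0, 41.0],
                  [63.0, 41.5]])
print(chain_linked_laspeyres(output_indices, spend).round(2))
```

The same function, applied to service area inputs indices, would produce the total inputs index described in the next subsection.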

Aggregating inputs

Estimates of total public sector inputs are produced in a similar manner. This involves weighting and then aggregating the volume of inputs in each service area, using the same COFOG expenditure weights as in the calculation of aggregate output. This produces a chain-linked Laspeyres volume index of inputs for total public services, which is calculated using the following equation:

\[
I_t = I_{t-1} \times \frac{\sum_j e_{j,t-1}\left(\dfrac{I_{j,t}}{I_{j,t-1}}\right)}{\sum_j e_{j,t-1}}
\]

Where:

  • I is a Laspeyres index of inputs

  • e is expenditure

  • t and j index time and service areas respectively

  • I_{t=0} is set equal to 100

Measuring productivity

Estimates of total public sector productivity are calculated using the aggregate output and inputs indices produced using the approach discussed previously.
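In index form, and consistent with the indices defined above, the productivity index is the ratio of the output index to the inputs index, expressed with the reference period equal to 100:

\[
P_t = \frac{O_t}{I_t} \times 100
\]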

Including the police, defence and other government services in the calculation of productivity will limit the growth in total public service productivity, pushing estimates of productivity growth towards zero. The extent to which they affect growth of total public service productivity is proportional to their share of total expenditure. During periods when productivity in other sectors is positive, the “output-equals-inputs” convention will reduce productivity growth. During periods when productivity in other sectors is negative, the inclusion of the police, defence and other sectors will tend to raise productivity growth estimates.
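As a stylised arithmetic illustration (invented figures): if the “output-equals-inputs” sectors account for a 20% share of expenditure and the remaining services record productivity growth of 1.0%, total measured productivity growth is approximately

\[
0.8 \times 1.0\% + 0.2 \times 0.0\% = 0.8\%
\]

so the convention pulls the total towards zero in proportion to the expenditure share of those sectors.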

How we analyse and interpret the data

The contributions of each service area to total growth in output, inputs and productivity are calculated, as are the levels of revisions. These findings are shown in a series of charts for stakeholders within the Office for National Statistics (ONS), and the reasons behind changes in the figures are identified as far as possible.

The data are then published for use by various external stakeholders, who are welcome to provide feedback, show us how they use the statistic and advise on where we should focus future work on public service productivity.

How we quality assure and validate the data

A number of procedures are followed to quality assure the data. These processes are applied at all stages of the production process – at both granular and aggregate levels.

Internal quality assurance is carried out at all main stages of processing. This is followed by larger-scale quality assurance involving stakeholders and key individuals. A new addition to this year’s quality assurance was a parallel run of two aggregation systems (by service area and by component). This made it possible to check the accuracy of the data and the processing system simultaneously.

Visual presentations are created from the processed data. These presentations are used for an internal analysis to highlight significant data points or patterns that may warrant further investigation.

How we disseminate the data

The Public service productivity: total, UK releases are published free of charge on the ONS website. They are published once a year, within the Public services productivity section of the ONS website. Supporting documents are clearly linked and accessible to users. Additional data can be provided on request.

How we review and maintain the data processes

Further revisions to the estimates may be required in response to, for example, changes to source data. This follows our Revisions Policy. A guide to statistical revisions is also available.


7. Other information

Assessment of user needs and perceptions

(The processes for finding out about uses and users, and their views on the statistical products.)

Our productivity releases have a range of users, as set out in the Uses and users section.

We have developed two main ways of obtaining information on users and uses of our public service productivity estimates:

  1. user consultation meetings and regular healthcare and education functional board meetings – these meetings allow the exchange of information on data sources, development issues and methods changes that affect our public service productivity estimates

  2. a user feedback questionnaire is circulated to those who make enquiries about public service productivity

Useful links


Contact details for this Methodology

Katherine Kent
productivity@ons.gov.uk
Telephone: +44 (0)1633 455829