- National Statistic: yes
- Survey name: Public service productivity
- Data collection: administrative data, some survey data
- Frequency: annual
- How compiled: based on third party data
- Geographic coverage: UK
- Related publications:
- Public service productivity: total, UK: 2020
- Public service productivity: quarterly, UK, April to June 2023
- Public service productivity, healthcare, England: financial year ending 2021
- Public service productivity, adult social care, England: financial year ending 2021
- Public service productivity, UK: 1997 to 2022
This quality and methodology report contains information on the quality characteristics of the data (including the European Statistical System five dimensions of quality) as well as the methods used to create it.
The information in this report will outline:
- the strengths and limitations of the data
- existing uses and users of the data
- the methods used to create the data
- suitable uses for the data
The estimate for public service productivity is displayed as an index, showing the change over time of the amount of output provided for each unit of input.
To remove the effect of price changes over time, public service output and inputs are measured in quantity terms (also referred to as volume terms), instead of expenditure terms.
Some public service area outputs are also adjusted for changes in the quality of activities and services provided, as recommended by the Atkinson Review (PDF, 1.08MB). A quality adjustment is a statistical estimate of the change in the quality of a public service. This allows for observation of the outcome of a public service being provided, rather than the output alone.
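As a simple illustration of this principle, the sketch below combines quantity growth with a measured quality change multiplicatively; the figures are invented for illustration and are not ONS data.

```python
# Illustrative sketch (not ONS production code): applying a quality
# adjustment to volume output growth, in the spirit of the Atkinson Review.
# All figures are invented.

def quality_adjusted_growth(quantity_growth: float, quality_change: float) -> float:
    """Combine quantity growth and quality change multiplicatively.

    Both arguments are proportions, e.g. 0.02 for 2%.
    """
    return (1 + quantity_growth) * (1 + quality_change) - 1

# A service whose activity volume rose 2% while measured quality fell 0.5%:
growth = quality_adjusted_growth(0.02, -0.005)
print(round(growth, 4))  # 0.0149
```

The multiplicative form means a fall in quality can offset growth in the quantity of activity, so the adjusted series moves closer to the outcome of the service rather than its raw output.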
Productivity estimates included in this article are multi-factor productivity estimates as opposed to labour productivity estimates (a single factor productivity measure), and are not comparable with our headline measures of whole-economy labour productivity. More information can be found in our How to compare and interpret ONS productivity measures methodology.
These estimates are produced to measure the productivity of total UK public services, but do not measure value for money or the wider performance of public services.
The statistics produced for each service area within public service productivity (PSP) are based on the Classification of the functions of government (COFOG). These classifications align data with specific industries and service areas, but not with government departments. The Office for National Statistics (ONS) has investigated the links between COFOG and government departments to determine how data series captured within COFOG align with government departments.
As a result of the coronavirus (COVID-19) pandemic, notable adjustments, which included alteration of data sources and methods, were required to capture non-quality and quality adjusted output that reflected activities for separate service areas. Adjustments were performed predominantly because of the lack of conventional data (for example, GCSE attainment data which are used for quality adjustment in education), and to capture additional activities that arose because of the coronavirus pandemic. The adjustments applied were unique to individual service areas and will be described throughout this article.
Following the Chancellor of the Exchequer’s announcement in June 2023, the ONS is reviewing PSP to improve the measurement and reporting of public service productivity in the UK. An important output of this work is producing baselines and nowcasts of the expected indexes for PSP and its service areas. The baselines use the compound annual growth rate (CAGR) for the period of 1997 to 2019 and show the long-term trend of growth, or a shortened baseline period for service areas where new methods have been introduced. The nowcasts aim to estimate the 2021 and 2022 index values, using dynamic regression and the annualised quarterly series. The new method we have developed differs from the annualised quarterly estimates published in our Public service productivity, quarterly, UK bulletin.
Total public service productivity is estimated by comparing growth in the total output provided with growth in the total inputs used. If the growth rate of output exceeds the growth rate of inputs, productivity increases, meaning that more output is being produced for each unit of input. Conversely, if the growth rate of inputs exceeds the growth rate of output, then productivity will fall, indicating that less output is being produced for each unit of input.
Output, inputs and productivity for total public services are estimated by combining growth rates for individual services using their relative share of total government expenditure as weights.
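The aggregation described above can be sketched as follows; the service areas, growth rates and expenditure weights are invented for illustration and are not ONS figures.

```python
# Minimal sketch (assumed figures, not ONS data): total productivity growth
# from service-area output and input growth rates, weighted by each area's
# share of total government expenditure.

services = {
    # name: (output growth, input growth, expenditure weight)
    "healthcare": (0.030, 0.020, 0.40),
    "education":  (0.010, 0.015, 0.30),
    "other":      (0.000, 0.000, 0.30),  # an "output-equals-inputs" area
}

# Aggregate output and input growth as expenditure-weighted averages.
total_output_growth = sum(g_out * w for g_out, _, w in services.values())
total_input_growth = sum(g_in * w for _, g_in, w in services.values())

# Productivity growth is the growth of the output index relative to the
# inputs index: positive when output grows faster than inputs.
productivity_growth = (1 + total_output_growth) / (1 + total_input_growth) - 1

print(round(total_output_growth, 4))  # 0.015
print(round(total_input_growth, 4))   # 0.0125
print(round(productivity_growth, 4))
```

Note that the "output-equals-inputs" area contributes zero to the productivity change, consistent with the convention described later in this report.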
The public service productivity measures included in this article are also not directly comparable with our market sector multi-factor productivity estimates owing to differences in the methodology used. For further information, see How to compare and interpret ONS productivity measures and A simple guide to multi-factor productivity.
These estimates are produced to measure the productivity of total UK public services. They do not measure value for money or the wider performance of public services. They do not indicate, for example, whether the inputs have been purchased at the lowest possible cost, or whether the desired outcomes are achieved through the output provided.
The methodology for calculating these statistics is based on the recommendations of the Atkinson Review (PDF, 1.08MB) on the measurement of government output and productivity for the national accounts. Estimates are published on a calendar year basis to be consistent with the UK National Accounts, and estimates are available both for total and for an individual service area breakdown. These are included in Public service productivity: Total, UK: 2020.
More information on the methodology and sources used can be found in Sources and methods for public service productivity estimates.
Uses and users
Users of our public service productivity measures include:
- departments within UK government, such as the Cabinet Office, HM Treasury and regulatory bodies
- the National Audit Office
- the press and general public
- the Office for Budget Responsibility
- the Institute for Fiscal Studies (IFS)
- the Nuffield Trust
- international statistical bodies
These organisations use the productivity estimates in a number of ways. Total public service productivity estimates have informed previous IFS Green Budgets, have been used directly by the Nuffield Trust, and are regular inputs into briefings for Cabinet Office ministers and permanent secretaries. We have similarly advised government departments on how to incorporate the general methodology of the estimates into their own work.
Feedback from users is received via user surveys and consultation events. Acting on such feedback, we are undertaking a development programme to improve public service productivity statistics across all service areas. Changes and improvements to our statistics are published in methodology articles. As well as the annual estimates that are the focus of this release, we also publish experimental estimates of quarterly public service productivity, allowing statistics to be provided on a more timely basis.
Strengths and limitations
Strengths of Public service productivity
The majority of the data we use are administrative data, which means we are not reliant on surveys. The data can be disaggregated in multiple ways.
Estimates for inputs, output and productivity are disaggregated by service area. Inputs estimates are disaggregated by component (labour, goods and services, and consumption of fixed capital). Productivity estimates are calculated with and without adjustments for the quality of output.
Some data can therefore be used to estimate the impact of quality change on public services output. Our open revisions policy allows us to continuously improve the dataset, which means that the estimates are not constrained by Blue Book procedures.
Limitations of Public service productivity
There is a two-year time lag in producing the annual estimates because of data availability; our last Public service productivity publication covers the period 1997 to 2020. To account for this, we are now also producing our Public service productivity quarterly bulletin.
Several different ways of measuring output are used in producing the statistic. Some service areas' outputs are quality adjusted, some are directly measured, and for the remainder output is assumed to be equal to inputs. For areas where output is assumed equal to inputs, productivity growth is zero.
We are currently working with HM Treasury and other government departments to further develop the output measurements, and to comprehensively measure government activity under each service, as outlined in our Public services productivity review. This is a continually evolving area, and so productivity figures may not represent all activities.
There is no geographical breakdown of the estimate; the numbers given are for the UK as a whole.
Several major changes have been made to the associated statistics in the last year.
Data were not available to calculate output volumes for Northern Ireland in 2020. The UK-level estimates are therefore based on output growth in England, Scotland and Wales only for 2020.
For England, the transition from using NHS reference costs to Patient Level Information and Costing System (PLICS) data to measure mental health activity and unit cost data meant that we could not calculate volumes for these services using our usual data source, the NHS Cost Collection. Instead, growth in mental health output was estimated using a selection of proxy activity indicators from data sources used to estimate output in the quarterly national accounts. More information can be found in our Improvements to healthcare volume output in the quarterly national accounts methodology.
For healthcare quality adjustment, the patient satisfaction and general practice outcome measures for the financial year ending (FYE) 2021 were not included because of data availability and quality issues.
Adult social care
The NHS adult social care (ASC) activity and finance report describes the activity and expenditure elements associated with the delivery of adult social care. Where activity data are available, output is measured directly. Where activity data are not available, output is indirectly measured using the “output-equals-inputs” approach. This assumes that the level of spending that goes into the service results in a corresponding increase in the services provided.
From the publication of our Public service productivity, adult social care, England: financial year ending 2021 article onwards, expenditure on “commissioning and service delivery” has been removed from the indirectly measured output, as it captures overhead costs for the provision of ASC rather than additional services provided to clients. This category of expenditure captured additional funding from local authorities to support service providers during the coronavirus (COVID-19) pandemic that did not directly lead to corresponding increases in the quantity of care provided. Therefore, the removal of this spending from the entire time series results in an output measure that more closely represents the quantity of care provided.
New measures relating to output for the devolved nations were also introduced for ASC as outlined in our Public service productivity: total, UK, 2020 article. These new measures include an updated measure for Scotland, using activity data for home care and residential care from Public Health Scotland (PHS) and spending data from the Scottish Government, to produce a direct volume output measure from financial year ending (FYE) 2017 onwards. Other services for Scotland are included on an “output-equals-inputs” basis.
Measures for Northern Ireland are now included on the directly measured basis for the period FYE 2007 to FYE 2020 using data from the Department of Health, Northern Ireland. As this data source was discontinued after FYE 2020, output is measured indirectly after this point. No activity data are available for Wales, therefore the “output-equals-inputs” assumption is applied to expenditure data from the Welsh Government. Output across the nations is aggregated into a UK index series using the respective expenditure shares of each nation.
Social security administration
In 2018, several benefits and tax credits in the UK were replaced with Universal Credit, which saw legacy benefits combined into a single framework. However, this has presented issues relating to unit costs, and therefore in interpreting the volume of output.
Historically, legacy benefits were assigned a corresponding unit cost, and these weighted components were aggregated into a chain-volume measure to represent output. However, upon transfer of benefits to Universal Credit, there are no updated unit costs available that accurately capture output within the Universal Credit framework. Until updated unit costs for Universal Credit become available, or an alternative measurement is proposed that represents Social Security Administration (SSA) output, an “output-equals-inputs” assumption will be applied. This approach holds SSA productivity growth at zero until a solution for determining output is identified.
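The historical cost-weighted approach can be illustrated with a minimal sketch; the benefit names, unit costs and caseloads below are entirely hypothetical.

```python
# Hypothetical sketch: cost-weighted output growth for benefit
# administration, in the spirit of the pre-Universal Credit approach
# described above. Names, unit costs and caseloads are invented.

unit_costs = {"benefit_a": 50.0, "benefit_b": 120.0}  # cost per claim processed
claims_y1 = {"benefit_a": 1000, "benefit_b": 400}
claims_y2 = {"benefit_a": 1100, "benefit_b": 380}

# Value each year's activity at fixed (year 1) unit costs, then take the
# ratio: a Laspeyres-type volume index of output.
value_y1 = sum(unit_costs[b] * claims_y1[b] for b in unit_costs)
value_y2 = sum(unit_costs[b] * claims_y2[b] for b in unit_costs)
output_index = value_y2 / value_y1

print(round(output_index, 4))  # 1.0265
```

Because the index weights each activity by its unit cost, the loss of reliable unit costs under Universal Credit is exactly what prevents this calculation from being carried forward.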
Education
Student enrolment figures and absence rates are obtained from the National Accounts to assess the non-quality adjusted output of education. Following revisions within the National Accounts, the estimates for student enrolment rates have been revised up from Quarter 2 (Apr to June) 2020. This led to higher estimates of final consumption expenditure from Quarter 2 2020 onwards. The figures have been updated from Quarter 1 (Jan to Mar) 2022 onwards, while estimates from Quarter 2 2020 to Quarter 4 (Oct to Dec) 2021 will be updated at the earliest opportunity.
This change has been included for the first time in our Public service productivity, quarterly, UK: April to June 2023 bulletin, and we will incorporate this change in our Public service productivity: total, UK, 2021 article.
Adjustments made in response to the coronavirus (COVID-19) pandemic
Several major adjustments were required to determine public service productivity statistics as a result of the coronavirus (COVID-19) pandemic. The coronavirus pandemic caused widespread disruption to the provision of public services, and this led to adjustment of the data sources and methods to better reflect the output of public services during this period. The statistical outcomes are described in our latest Public service productivity: total, UK, 2020 article, which measures total public service productivity. In summary, the adjustments can be described as follows.
Coronavirus (COVID-19) adjustment to education
Enrolment figures within nursery, primary, secondary, and special schools (aggregated figures for local authority schools and academies) are a major component of education output. Prior to the coronavirus (COVID-19) pandemic, enrolment figures were adjusted by the average absence rate, which is sourced from national accounts.
When schools were closed in response to lockdown protocols, a “discounted rate” had to be determined that accounted for absences (those in remote learning are not included), attendance rates during periods of in-person teaching, and remote learning. The average absence rate and attendance rates during in-person teaching are obtained from national accounts. More information can be found in our Coronavirus and the impact of measures of UK government education output: March 2020 to February 2021 article.
During periods of remote learning, it was necessary to calculate a remote learner discount that represented teaching activity delivered remotely, while capturing the lower rate of output that would have been delivered in comparison with in-person teaching. Survey questions that aimed to capture the reduction in material covered, and the proportion of learning dependent on parents, were circulated to teachers via the Teacher Tapp app.
The absence rate, attendance rate during in-person teaching, and remote learners discount are weighted by student numbers within each education setting across the UK, and are aggregated into a weighted full-time equivalent (FTE) discount rate. This value essentially represents the volume of activity that was delivered relative to 1 FTE unit of conventional teaching. This is then used to adjust enrolment figures to generate non-quality adjusted output.
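A minimal sketch of this weighting is shown below, using invented student numbers and discount rates rather than the survey-derived values described above.

```python
# Illustrative sketch of the weighted full-time equivalent (FTE) discount:
# each setting's discount (output relative to one FTE unit of conventional
# teaching) is weighted by its student numbers. All figures are invented.

settings = {
    # name: (students, discount rate)
    "primary":   (100_000, 0.80),
    "secondary": (80_000, 0.75),
    "special":   (5_000, 0.90),
}

total_students = sum(n for n, _ in settings.values())
fte_discount = sum(n * d for n, d in settings.values()) / total_students

# The discount then scales enrolment to a volume of delivered activity.
adjusted_output = total_students * fte_discount

print(round(fte_discount, 4))  # 0.7811
```

A discount below 1 records that each enrolled pupil received less than one FTE unit of conventional teaching during the disrupted period.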
In addition, quality adjusted output for education required intervention because of the lack of comparable attainment data with years prior to the coronavirus pandemic. As a result of lockdown protocols, and the disruption incurred by the coronavirus pandemic, conventional exam practices across the UK were cancelled and students’ academic performances were determined via teacher-assessed grades. Concerns related to potential bias and comparability with pre-coronavirus pandemic attainment presented issues in the choice of teacher-assessed grades as the quality adjustment measure.
In addition, no data were available relating to bullying and the Key Stage 2 (KS2) disadvantaged gap index (DAG) during the coronavirus pandemic, which also contribute to the quality adjustment metric. Therefore, these data were replaced by the estimated mean learning loss in months in reading (primary and secondary) and mathematics (primary), which were published by the Department for Education for the 2020 to 2022 academic years. More information can be found in their Pupils' progress in the 2020 to 2022 academic years report.
The differences in average learning loss between students on free school meals (FSM) and non-FSM students served as a proxy measure for the KS2 DAG index. As a result of the disruption that arose from the coronavirus pandemic, the quality of education suffered notably during this time, and the learning loss metric helped to understand the trends in learning outcomes during periods when the education system faced disruption.
Coronavirus (COVID-19) adjustment to healthcare
Within healthcare output, new activities were included in 2020 to capture the volume of coronavirus-related testing, tracing and vaccination output provided. This applied the same methods established to capture volume output in the UK national accounts. More information can be found in our Measuring the economic output of COVID-19 testing, tracing and vaccinations: April 2020 to June 2021 report. These services were established to manage and mitigate the impact of COVID-19 and represented a sizeable contribution to public service healthcare output in the FYE 2021.
Expenditure relating to goods and services procured to combat the coronavirus pandemic was reported in the Department of Health and Social Care’s (DHSC) annual accounts for the FYE 2021, and is reported as goods and services in our measurement of healthcare inputs for 2020. This includes the operational costs of NHS Test and Trace, personal protective equipment, and other equipment and consumables procured by the department.
Given that elements of this expenditure capture goods and services used across the UK, the total amounts reported in the DHSC annual accounts were split between England and the devolved nations based on the latest population shares at the time of processing (2021).
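The apportionment can be sketched as follows; the expenditure total and population counts are invented for illustration and are not the figures used in processing.

```python
# Hypothetical sketch: apportioning a UK-wide expenditure total between
# England and the devolved nations by population share. The expenditure
# figure and population counts are invented.

total_expenditure = 10_000.0  # £ million, assumed
populations = {
    "England": 56_500_000,
    "Scotland": 5_500_000,
    "Wales": 3_100_000,
    "Northern Ireland": 1_900_000,
}

uk_population = sum(populations.values())
shares = {nation: pop / uk_population for nation, pop in populations.items()}
allocation = {nation: total_expenditure * s for nation, s in shares.items()}

# The shares sum to one, so the allocation exhausts the total.
print(round(sum(allocation.values()), 6))
```

Because the shares sum to one by construction, no expenditure is lost or double-counted in the split.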
Coronavirus (COVID-19) adjustment to adult social care
The principle of the quality-adjustment procedure for adult social care (ASC) is to adjust output to account for how well clients’ needs are being met within care settings, across domains such as accommodation, safety, and dignity. Data relating to these are obtained from the Personal social services adult social care survey conducted by the NHS. Up to 151 councils with adult social service responsibilities (CASSR) usually participate, representing a sample size of roughly 60,000 clients.
However, as a result of the coronavirus pandemic, participation in the survey was voluntary, and 18 CASSRs (with a sample size of 6,695) participated in 2020. This resulted in a substantially reduced sample size compared with previous years, raising doubts concerning the accuracy, precision and bias of the data.
Quality adjustment metrics are produced separately for residential and nursing care, and community care. In the case of community care, the data provided by the 18 CASSRs in 2020 were compared with the same 18 CASSRs from 2019, and the growth rates in scores were used as the quality measure. For residential and nursing care, coefficients for the relationship between personal circumstances and social care related quality of life are usually updated annually.
The small sample size would contribute fewer observations to the regression model, and the limited coverage of these 18 CASSRs would not be representative of England. Therefore, the predicted coefficient for 2019 was used for 2020, under the assumption that the relationship between personal circumstances and social care-related quality of life remains relatively similar across time. The regular approach to measuring the change in social care-related quality of life while controlling for changes in personal circumstances is then applied, but using only the data from the 18 CASSRs present in both years’ data.
Coronavirus (COVID-19) adjustment to public order and safety
Several parameters are included in the quality-adjustment measure for public order and safety (POS), one of which is reoffending rates, published by the Ministry of Justice. The data show proven reoffences that occurred within the following year. In the context of assessing public service productivity in 2020, this means that proven reoffences for 2019 were followed up in 2020.
The levels of reoffending for these cohorts did fall significantly as a result of the coronavirus pandemic. This was because of the impact of lockdowns and delays in the rate at which reoffences were proven.
As a result, these data are inconsistent and incomparable with how data on reoffending were collected for years prior to the coronavirus pandemic. In response, the reoffending rate for Quarter 4 2018 was kept constant throughout 2019 and 2020. This effectively removed the contribution of reoffending rates to the aggregated quality adjustment metric.
As mentioned above, following the Chancellor of the Exchequer’s announcement in June 2023, the Office for National Statistics (ONS) is reviewing public service productivity (PSP) to improve the measurement and reporting of public service productivity in the UK. Details on these improvements will be published in the future.
In addition, more information on adjustments to output measurements and how they compare internationally can be found in our International comparisons of the measurement of non-market output during the coronavirus (COVID-19) pandemic methodology.
The baseline growth rates for public service productivity (PSP) are the compound annual growth rates (CAGR) for each service area from 1997 to 2019 (or a shortened time period where more appropriate). These provide a long-term average yearly growth rate for each service area within this period.
The 1997 to 2019 period was chosen because of its wide-ranging political and economic landscape and to avoid the coronavirus (COVID-19) pandemic’s effect on the growth rate. Because of methodological changes in adult social care (ASC) and children’s social care (CSC), their periods have been altered to begin from when the change was implemented. For ASC this is 2011 to 2019 and for CSC it is 2010 to 2019.
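For reference, the CAGR calculation used for the baselines can be sketched as follows; the index values are invented for illustration.

```python
# Sketch of the compound annual growth rate (CAGR) used for the baselines.
# Index values are invented; the published ONS series would be used in
# practice.

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Average annual growth rate compounding start_value to end_value."""
    return (end_value / start_value) ** (1 / years) - 1

# An index moving from 100.0 in 1997 to 105.0 in 2019 (22 years of growth):
rate = cagr(100.0, 105.0, 2019 - 1997)
print(round(rate, 5))  # 0.00222
```

Shortened baseline periods, such as 2011 to 2019 for ASC, simply change the start value and the number of years compounded.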
A brief overview of our nowcasting approach for estimating 2021 and 2022 PSP values is described here. It should be noted that revised data in the future may change the optimal model. Several forecasting methodologies were tested:
- exponential smoothing (ETS)
- autoregressive integrated moving average (ARIMA)
- seasonal and trend decomposition using loess (STL)
- simple average of ETS, ARIMA, and STL (combined)
- dynamic regression of annual values on annualised quarterly values (dynamic)
The dynamic regression had the greatest accuracy for most of the series, so it has been used for the 2021 and 2022 nowcasts. Dynamic regression also allows modelling using selected quarterly PSP series as predictor variables, and accounts for the time series nature of the data. The regression errors are assumed to follow an ARIMA process.
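A highly simplified sketch of the regression step is shown below. It uses ordinary least squares of invented annual values on invented annualised quarterly values, and omits the ARIMA error structure that the model described above assumes.

```python
# Simplified sketch of the nowcasting idea: regress the annual PSP index on
# the annualised quarterly series, then predict a year where only the
# quarterly series is available. All data are invented, and the ARIMA
# treatment of the regression errors is omitted.

# Years where both annual and annualised quarterly values exist.
quarterly = [100.0, 101.2, 102.5, 103.1, 104.0]
annual = [100.0, 101.0, 102.3, 103.0, 103.8]

n = len(quarterly)
mean_x = sum(quarterly) / n
mean_y = sum(annual) / n

# Ordinary least squares slope and intercept.
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(quarterly, annual))
sxx = sum((x - mean_x) ** 2 for x in quarterly)
slope = sxy / sxx
intercept = mean_y - slope * mean_x

# Nowcast a year with a quarterly value but no annual estimate yet.
nowcast = intercept + slope * 105.2
print(round(nowcast, 2))
```

In the full model, the residuals of this regression would themselves be modelled as an ARIMA process, so the nowcast also reflects recent errors rather than the fitted line alone.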
See our Public service productivity, UK: 1997 to 2022 article for a more detailed description of the method.
The nowcast presented in this model differs from the productivity estimates for 2021 and 2022 published in Figure 4 of our Public service productivity, quarterly, UK: April to June 2023 bulletin. Figure 4 places the inputs, output and productivity in an annual context, simply combining our annual estimates from 1997 to 2020 with the annualised quarterly data from 2021 onwards.
Relevance
(The degree to which the statistical product meets user needs for both coverage and content.)
The UK Centre for the Measurement of Government Activity (UKCeMGA) was launched in 2005 to take forward the recommendations of the Atkinson Review (PDF, 1.08MB), with the aim of improving the measurement of government output, inputs and productivity, and establishing a regular reporting schedule. UKCeMGA closed in 2016, when development and production were continued by the Office for National Statistics (ONS).
In the years since the Atkinson Review was published, we have developed estimates of output, inputs and productivity for different service areas. These estimates are updated annually, and any changes in methods are explained in the papers and articles we have published. Service areas are based on the Classification of the functions of government (COFOG). These are:
- Adult social care
- Children’s social care
- Social security administration
- Public order and safety
- Other government services (this includes general government services, economic affairs, environmental protection, housing, recreation, and other public order and safety)
There are three different statistical outputs published in Public service productivity: Total, UK: 2020:
- a volume index of total public services output and indices of output by service area
- a volume index of total public services inputs and indices of inputs by service area
- a derived index for total public services productivity and by service area (output per unit of inputs)
In June 2023, the Chancellor of the Exchequer asked the National Statistician, Sir Ian Diamond, to review how the measurement of public service productivity can be improved. Following this announcement, the ONS is working closely with users, analysts and public service experts in other government departments and academia to develop our data sources and methods, to drive forward better public service productivity measures and quality-adjusted estimates of public sector output.
In addition, as mentioned in Section 6, the ONS is doing additional work to establish the link between service areas and government departments.
Accuracy and reliability
(The degree of closeness between an estimate and the true value.)
Both the output and inputs series for each service area are constructed using a variety of administrative and national accounts data. The accuracy of the derived series therefore depends on the accuracy of the source data. Unless we have introduced substantial methodological changes, the main source of revisions to each service area's productivity estimates will be changes in source data and expenditure weights.
As there is no other source of public service productivity estimates that is comparable in methodology, validating our results is difficult. Instead, validation is carried out through regular triangulation articles, as set out in the Atkinson Review.
It is difficult to provide a confidence interval around our estimates given the multiple sources of data on which the estimates are based. There will inevitably be some margin for error from a "true" measure of productivity, which is unknown. We collate triangulation evidence from other government departments and independent sources, which provides additional context to inform the interpretation of the public service productivity statistics.
Concerning data adjustments in response to the coronavirus (COVID-19) pandemic, efforts have been made to source alternative data that capture output as accurately as possible. For example, the mechanisms that were put in place to capture remote learning, attendance and learning loss for education provide the most accurate indication of how education services were delivered. Conversely, the reduced sample size for the quality adjustment measure for adult social care has likely generated a less accurate indication of how care needs were being met during this period. Considering the novelty of these changes in relation to our conventional practices, some caution will be required when interpreting the overall reliability of the data.
Coherence and comparability
(Coherence is the degree to which data that are derived from different sources or methods, but refer to the same topic, are similar. Comparability is the degree to which data can be compared over time and domain, for example, geographic level.)
Assessing the coherence of the data in Public service productivity: Total, UK: 2020 is difficult, as there are currently no comparable measures published. We convert some source data from financial year to calendar year and aggregate results to a UK level, which makes it difficult to make comparisons at a country level. Service areas are also defined by the Classification of the functions of government (COFOG) rather than by administrative department or devolved administration. The different methodologies developed for healthcare and education, and the “output-equals-inputs” treatment of several service areas (police, defence, SSA, and other), mean that some direct comparisons between service areas should not be made.
The estimates cover the UK and, where possible, are based on data for England, Scotland, Wales and Northern Ireland. Where data are not available for all four countries, the assumption is made that the available data are representative of the UK. This can happen for quality adjustment, output or inputs data.
Finally, in instances where the data are available for all four countries of the UK, there may be slight variations in definitions or reporting conventions that introduce additional, largely unquantifiable effects on our estimates.
A good degree of coherence was maintained when identifying alternative data sources to serve as replacement measurements during the coronavirus (COVID-19) pandemic. For example, data on absences and attendance for education all came from the same source in the national accounts. New sources such as Teacher Tapp were included in the adjustments, and the data were sufficiently relevant to the topic to justify their inclusion. There are, however, some notable concerns related to the comparability of the data. For education, healthcare, ASC, and POS, the conventional data series have been interrupted and replaced with proxy measurements; despite the justification for their use, this makes comparisons with pre-coronavirus years difficult. The reduced sample size in the quality adjustment measure for ASC and the learning loss metric for education are notable examples of this.
Accessibility and clarity
(Accessibility is the ease with which users can access the data, also reflecting the format in which the data are available and the availability of supporting information. Clarity refers to the quality and sufficiency of the release details, illustrations and accompanying advice.)
Our recommended format for accessible content is a combination of HTML webpages for narrative, charts and graphs, with data being provided in usable formats such as CSV and Excel. We also offer users the option to download the narrative in PDF format. In some instances, other software may be used, or may be available on request. The datasets associated with this release have been modified in accordance with the accessibility legislation.
For information regarding conditions of access to data, please refer to the following links:
In addition to this Quality and Methodology Information, basic quality information relevant to each release is available in the background notes of the relevant article.
Notifications of changes in methodology are published on the public service productivity topic-specific methodology page, and historic changes are available in the guidance and methodology area of our archive website.
Timeliness and punctuality
(Timeliness refers to the lapse of time between publication and the period to which the data refer. Punctuality refers to the gap between planned and actual publication dates.)
Estimates of output, inputs and productivity in the total public sector are published on a calendar-year basis, and generally refer to the period (t-2), with t being the current year of publication. If the reference period were to be moved, for example to (t-1), there would be a significant increase in the use of estimation to fill data gaps in the productivity articles, in advance of the publication of these datasets.
For more details on related releases, the GOV.UK release calendar provides 12 months' advance notice of release dates. In the unlikely event of a change to the pre-announced release schedule, public attention will be drawn to the change and the reasons for the change will be explained fully at the same time, as set out in the Code of Practice for Official Statistics.
Concepts and definitions
(Concepts and definitions describe the legislation governing the output and a description of the classifications used in the output.)
Our analysis of productivity in UK public services represents internationally pioneering work. Measurement of outputs follows the guidance in the System of National Accounts (SNA) 1993 and subsequent SNA 2008, as well as the European System of Accounts (ESA) 1995 and subsequent ESA 2010. Measurement of outputs (including the need to measure the change in quality), inputs and productivity follows the principles in the Atkinson Review. The estimates presented in the article are for service areas classified by the Classification of the Functions of Government (COFOG).
Estimates are published on a UK geographic basis, with no further geographic breakdown provided. This is unchanged from last year’s publication.
This statistic is a National Statistic and so meets the quality requirements of this status. It measures total productivity and the productivity of nine different service areas, offering comprehensive coverage of the data required by users.
Why you can trust our data
Any revisions to the data are clearly identified as such and limitations are made known to all users.
How we collect the data, main data sources and accuracy
A range of data sources are used to provide a comprehensive picture of UK public services. A summary of these data sources is documented in Sources and methods for public service productivity estimates.
How we process the data
The following section outlines the main statistical methods used to compile estimates of public service inputs, output and productivity. A detailed explanation of the methods used is given in Sources and methods for public service productivity estimates. Significant methods changes are published in advance on the topic specific methodology page to inform users of both the nature and the likely impact of methodological changes.
The methods of measuring output vary between and within service areas. This section provides a breakdown of the methods of measuring output, by output measure, including definitions, the service areas covered and their coverage percentages.
The expenditure shares among service areas were:
- Healthcare (41.4%)
- Education (16.3%)
- "Other" government services (16.0%): general government services, economic affairs, environmental protection, housing, recreation, and other public order and safety
- Defence (9.1%)
- Adult social care (5.8%)
- Police (4.0%)
- Public order safety (3.1%)
- Children's social care (2.8%)
- Social security administration (1.6%)
Quantity output measure
Definition: the number of activities performed and services delivered. Growth values in individual activities are weighted together using the relative cost of delivery.
Percentage of service area with quantity measures only:
- Children’s social care: 4%
- Public order and safety: 23%
- Healthcare: 9%
- Education: 14%
Quality adjusted output measure
Definition: Quantity output is adjusted for the quality of the services delivered. If the quality adjustment is positive, estimates of output growth will increase.
Percentage of service area that is quality adjusted:
- Adult social care: 100%
- Education: 86%
- Healthcare: 79%
- Public order and safety: 77%
- Children’s social care: 61%
Output-equals-inputs measure
Definition: For some services, we cannot measure output directly, so we assume the volume of output equals the volume of inputs used to create them, meaning that productivity growth will always be zero.
Percentage of service area that is 'output-equals-inputs':
- Police: 100%
- Defence: 100%
- Other government services: 100%
- Social security administration: 100%
- Children’s social care: 35%
- Healthcare: 12%
The output measures used are based on, or taken in chained volume terms from, the Blue Book. Given that most public services are supplied free of charge or at cost price, they are considered non-market output. The output of most services is measured by the activities and services delivered. These are usually referred to as "direct output" measures. These activities are measured and aggregated into a single volume of output according to their relative cost or share of service area expenditure. This is referred to as a Cost-Weighted Activity Index.
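To illustrate the Cost-Weighted Activity Index described above, the following sketch weights the growth of each activity by its share of expenditure in the earlier period. The activity names, counts and unit costs are invented for the example and are not drawn from the published estimates.

```python
# Illustrative sketch of a Cost-Weighted Activity Index (CWAI).
# Activity counts and unit costs below are hypothetical.

def cwai_growth(activities_prev, activities_curr, unit_costs_prev):
    """Cost-weighted growth in activity volume between two periods.

    Each activity's volume growth is weighted by its share of total
    expenditure (count x unit cost) in the earlier period.
    """
    expenditure = {a: activities_prev[a] * unit_costs_prev[a]
                   for a in activities_prev}
    total = sum(expenditure.values())
    return sum((expenditure[a] / total) *
               (activities_curr[a] / activities_prev[a])
               for a in activities_prev)

# Hypothetical counts of two hospital activities
prev = {"elective": 1000, "outpatient": 5000}
curr = {"elective": 1050, "outpatient": 5100}
costs = {"elective": 3000.0, "outpatient": 150.0}

growth = cwai_growth(prev, curr, costs)
# Elective care carries 80% of expenditure, so its 5% growth dominates
# the 2% growth in outpatient activity: growth = 0.8*1.05 + 0.2*1.02 = 1.044
```

Weighting by expenditure share means that expensive activities move the index more than cheap ones, which is the intent of cost weighting.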
For "collective services" — those that are not provided to an individual, such as defence — it is difficult to define and measure the nature of their output. It is assumed for such services that the volume of output is equal to the volume of inputs used to create them. This is referred to as the "output-equals-inputs" convention.
In addition, a quality adjustment factor is applied to the volume of activity index of several service areas. The purpose of these quality adjustment factors is to reflect the extent to which the services succeed in delivering their intended outcomes and the extent to which services are responsive to users' needs. This results in estimates differing from those used in the national accounts.
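A minimal sketch of how a quality adjustment factor can modify a volume output index is shown below. The multiplicative application and all figures are illustrative assumptions, not the published adjustment values.

```python
# Sketch: applying a quality adjustment factor to a volume output index.
# Multiplicative application and all figures are illustrative assumptions.

quantity_index = [100.0, 102.0, 103.5]   # cost-weighted activity index
quality_index = [1.000, 1.004, 1.006]    # estimated change in service quality

quality_adjusted = [q * adj for q, adj in zip(quantity_index, quality_index)]
# A positive quality adjustment raises measured output growth:
# year 2: 102.0 * 1.004 = 102.408
```

This reflects the point made above: a positive quality adjustment increases estimates of output growth relative to the unadjusted quantity measure.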
There are currently five service areas that include such an adjustment:
- Healthcare
- Education
- Public order and safety
- Adult social care
- Children's social care
The quality adjusted measures are now described, but users should be aware of the changes applied in 2020 (described in Section 4: Quality summary).
Healthcare
The healthcare productivity quality adjustment is a compound measure made up of five components:
- short-term post-operative survival rates
- estimated health gain from procedures
- waiting times
- primary care outcomes achievement under the Quality and Outcomes Framework
- National Patient Surveys scores
This quality adjustment process is applied from 2001 onwards. In the national accounts series, no quality adjustment is applied to healthcare output at present.
Further detail can be found in Source and Methods Public Service Productivity Estimates: Healthcare.
Education
Education productivity is quality adjusted using four components:
- attainment for primary and secondary school
- disadvantaged attainment gap index at Key Stage 2
- qualified teacher status
For the years in which schools faced disruption because of coronavirus (COVID-19), these four components were replaced with a single component, the average months of learning lost in reading and mathematics.
Further detail can be found in Sources and methods for public service productivity estimates, and in our Improved methodology articles published in 2021 and 2022.
Public order and safety
Quality adjustments are applied to the criminal justice system elements of public order and safety output. This includes output associated with Crown Courts, magistrates' courts, legal aid, the Crown Prosecution Service, and prison and probation services. There are two main sections. The first adjusts the whole series by a severity-adjusted measure of total reoffences per offender. The second looks more closely at the individual service areas. For prisons, this includes escapes from, and safety inside, prisons, using the number of incidents and their severity. For courts, it uses the timeliness with which courts process cases passed to them by the police.
In Public service productivity: total, UK, 2019, an adjustment was made to the recidivism indicator, because the data on reoffending for the last quarter of 2018 and all of 2019 were affected by coronavirus (COVID-19). Following on from this, the adjustment was also applied to the 2020 data. More information about these aspects can be found in our methodology article. Further detail can be found in Quality adjustment of public service public order and safety output: current method.
Adult social care
A new quality adjustment in ASC was introduced, applying the concept of adjusted social care-related quality of life and data from the Adult Social Care Survey. Respondents are asked to rank how well their care needs are met in eight domains, such as food and nutrition, accommodation and safety. Each level of response is then weighted by its importance to quality of life, using weights derived from another survey of community care users.
The quality adjustment is produced separately for working age adults with learning disabilities, other working age adults and older adults for each of residential and nursing care, and community care. The final six components are then weighted together using the same measure of public expenditure as used in the inputs and output. The quality adjusted output is obtained from the rate of change in the aggregate quality adjustment for each year and then applied to the corresponding year of the output index. More information on the methodological developments can be found in Public service productivity: adult social care QMI.
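The aggregation just described can be sketched as follows. The six client-group and care-setting components, their expenditure weights and the output figure are all invented for illustration; only the mechanics (expenditure-weighted aggregation, then applying the rate of change to the output index) follow the description above.

```python
# Sketch of the ASC quality adjustment aggregation described above.
# Component scores, expenditure weights and output figures are hypothetical.

# Six components: three client groups, each for residential/nursing
# and community care; scores for two consecutive years
components = {
    "LD_residential": [0.80, 0.81],
    "LD_community": [0.78, 0.78],
    "WA_residential": [0.75, 0.76],
    "WA_community": [0.77, 0.77],
    "older_residential": [0.72, 0.73],
    "older_community": [0.74, 0.75],
}
# Public expenditure shares used as weights (sum to 1)
weights = {"LD_residential": 0.15, "LD_community": 0.20,
           "WA_residential": 0.10, "WA_community": 0.15,
           "older_residential": 0.25, "older_community": 0.15}

def aggregate(year):
    """Expenditure-weighted aggregate quality adjustment for a year."""
    return sum(weights[c] * components[c][year] for c in components)

# Rate of change in the aggregate adjustment, applied to the output index
quality_change = aggregate(1) / aggregate(0)
output_index = 103.0                      # unadjusted output, later year
adjusted_output = output_index * quality_change
```

The design choice mirrors the text: the components are combined with the same expenditure measure used for inputs and output, and only the year-on-year change in the aggregate adjustment feeds into the output index.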
Inputs
The input measures used are based on or taken from a mixture of expenditure, deflator and administrative data sources. They consist of compensation of employees, intermediate consumption and consumption of fixed capital of each service by central government and local government.
Central government expenditure data are sourced in current prices from HM Treasury's public spending database - Online System for Central Accounting and Reporting (OSCAR) - that collects financial information from across the public sector. Annual estimates are derived from monthly profiles of spending for the current financial year and modified to meet national accounts requirements.
Most local government expenditure data are sourced from financial year returns by local authorities, apportioned across calendar years.
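As a sketch of the apportionment step, the UK financial year runs from April to March, so a calendar year overlaps two financial years. A simple quarter-weighted split is assumed below; in practice, apportionment may use more detailed monthly spending profiles.

```python
# Sketch: apportioning UK financial-year (April-March) expenditure
# across calendar years. A simple quarter-weighted split is assumed;
# all figures are hypothetical.

def calendar_year_estimate(fy_ending_march_t, fy_ending_march_t_plus_1):
    """Calendar year t = Jan-Mar (last quarter of the FY ending March t)
    plus Apr-Dec (first three quarters of the FY ending March t+1)."""
    return 0.25 * fy_ending_march_t + 0.75 * fy_ending_march_t_plus_1

# Hypothetical local authority expenditure (GBP million)
cy_2020 = calendar_year_estimate(400.0,   # FY 2019/20 (ends March 2020)
                                 440.0)   # FY 2020/21 (ends March 2021)
# -> 0.25 * 400 + 0.75 * 440 = 430.0
```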
Expenditure data are subsequently adjusted for price changes (deflated) using a suitable deflator (price index), allowing input volumes to be measured indirectly.
For a number of inputs - in particular most healthcare and education labour inputs - volume series are measured directly using administrative data sources (that is, full-time equivalent staff numbers from NHS staff resources).
Deflator or price indices
A suitable deflator (price index), or composite deflator, is applied to each current price expenditure series to estimate a volume series. The Atkinson Review (PDF, 1.08MB) recommends that deflators are applied separately for each factor and that the price indices should be specific to each service. Price indices for labour and procurement should be sufficiently disaggregated to allow for changes in the composition of the inputs. Currently, deflators are taken from a range of different sources to best represent changes in prices for each service input. Where suitable data are unavailable, the gross domestic product (GDP) implied deflator (acting as a generic price index) is used instead.
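The deflation step itself is straightforward: dividing a current-price expenditure series by the relevant price index yields a volume (constant-price) series. The figures below are illustrative.

```python
# Sketch: deflating current-price expenditure with a price index to
# obtain a volume (constant-price) series. Figures are illustrative.

expenditure = [100.0, 105.0, 112.0]   # current prices, GBP billion
deflator = [1.00, 1.02, 1.05]         # price index, base year = 1.00

volume = [e / d for e, d in zip(expenditure, deflator)]
# Year 2: 112.0 / 1.05 ~= 106.7, so the 12% cash increase is roughly
# a 6.7% volume increase once price changes are removed
```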
These series are aggregated to form an overall estimate of the volume of inputs used to provide each of the public services identified in the total public service estimates.
Further detail can be found in the Sources and methods for public service productivity estimates.
Aggregating service area inputs and output
The expenditure shares of each public service component are calculated using a breakdown of general government final consumption expenditure (GGFCE) by Classification of the Functions of Government (COFOG).
Estimates of total public sector output are produced by weighting and then aggregating the volume of output in each service area. The weights used in this process are the service area COFOG expenditure weights, and are applied to form a chain-linked Laspeyres volume index of total public service output.
This method follows the formula:

L(t-1, t) = Σᵢ w(i, t-1) × R(i, t-1, t)

Where w(i, t-1) is the value share of item i in the base period t-1, and R(i, t-1, t) is the volume relative (the ratio of the quantity of an activity to the quantity of the same activity in the base period).
In the context of public service output, the weights (wᵢ) are indicative of the relative value of different activities (qᵢ). Unit costs can be used to approximate the "price" of an activity (pᵢ), given the difficulty of accurately estimating the relative social and economic value of different activities. However, in practice, expenditure shares from public finance data are generally used to approximate the relative value (wᵢ) of activities. The weights for different activities are taken from the first year of each activity pair (the base year, t-1).
For example, if we were combining activity series for each of the devolved UK nations for 2010, we would weight each of the activity growths from 2009 to 2010 for England, Scotland, Wales or Northern Ireland by their respective expenditure shares in 2009.
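The chain-linking described above can be sketched as follows. Each year's growth is the expenditure-weighted sum of volume relatives, using the previous year's expenditure shares, and the growths are chained into a level index. The activity counts and expenditure figures are hypothetical.

```python
# Sketch of a chain-linked Laspeyres volume index, following the
# formula above. All activity counts and expenditure are hypothetical.

def laspeyres_growth(quantities_prev, quantities_curr, expenditure_prev):
    """Expenditure-weighted sum of volume relatives, base period t-1."""
    total = sum(expenditure_prev.values())
    return sum((expenditure_prev[i] / total) *
               (quantities_curr[i] / quantities_prev[i])
               for i in quantities_prev)

def chain_link(quantities_by_year, expenditure_by_year, base=100.0):
    """Chain successive Laspeyres growths into a level index."""
    index = [base]
    for t in range(1, len(quantities_by_year)):
        growth = laspeyres_growth(quantities_by_year[t - 1],
                                  quantities_by_year[t],
                                  expenditure_by_year[t - 1])
        index.append(index[-1] * growth)
    return index

# Hypothetical activity counts and expenditure shares for two nations
q = [{"England": 1000, "Scotland": 100},
     {"England": 1020, "Scotland": 103},
     {"England": 1045, "Scotland": 104}]
e = [{"England": 900.0, "Scotland": 100.0},
     {"England": 930.0, "Scotland": 101.0}]

index = chain_link(q, e)
# Year 1: 100 * (0.9 * 1.02 + 0.1 * 1.03) = 102.1
```

Note that the weights are re-based every year (hence "chain-linked"): the growth from year 1 to year 2 uses year 1 expenditure shares, not the original base year's.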
Estimates of total public sector inputs are produced in a similar manner. This involves weighting and then aggregating the volume of inputs in each service area, using the same COFOG expenditure weights as in the calculation of aggregate output. This produces a chain-linked Laspeyres volume index of inputs for total public services.
Estimates of total public sector productivity are calculated using the aggregate output and inputs indices produced using the approach just discussed.
Including the police, defence and other government services in the calculation of productivity will limit the growth in total public service productivity, pushing estimates of productivity growth towards zero. The extent to which they affect the growth of total public service productivity is proportional to their share of total expenditure. During periods when productivity in other sectors is positive, the "output-equals-inputs" convention will reduce productivity growth. During periods when productivity in other sectors is negative, the inclusion of the police, defence and other sectors will tend to raise productivity growth estimates.
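The damping effect described above can be made concrete with a small sketch. Productivity is the ratio of the output index to the inputs index; for simplicity the sectoral indices below are combined with fixed arithmetic expenditure weights, and all figures are illustrative.

```python
# Sketch: productivity as output divided by inputs, and how an
# "output-equals-inputs" sector pulls aggregate productivity growth
# towards zero. Figures and weights are illustrative assumptions.

def productivity(output_index, inputs_index, base=100.0):
    """Productivity index as the ratio of output to inputs indices."""
    return base * output_index / inputs_index

# Measured sector: output grows faster than inputs
measured = productivity(103.0, 101.0)   # positive productivity growth

# Add an "output-equals-inputs" sector (productivity growth of zero,
# since its output index always equals its inputs index) with a 40%
# expenditure share; combine indices with fixed expenditure weights:
share_measured, share_oei = 0.6, 0.4
aggregate_output = share_measured * 103.0 + share_oei * 102.0
aggregate_inputs = share_measured * 101.0 + share_oei * 102.0
aggregate_prod = productivity(aggregate_output, aggregate_inputs)
# Aggregate growth sits between the measured sector's growth and zero,
# in proportion to the output-equals-inputs expenditure share.
```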
How we analyse and interpret the data
The contributions of each service area to total growth in output, inputs and productivity are calculated, as are the levels of revisions. These different findings are shown in a series of charts for stakeholders within the Office for National Statistics (ONS), and the reasons behind changes in the figures are identified as far as possible.
The data are then published for use by various external stakeholders, who are welcome to provide feedback, show us how they use the statistics and provide guidance on where we should focus future work in public service productivity.
How we quality assure and validate the data
A number of procedures are followed to quality assure the data. These processes are applied at all stages of the production process - at both granular and aggregate levels.
Internal quality assurance is carried out at all key stages of processing. This is followed by larger-scale quality assurance involving stakeholders and key individuals. Quality assurance also follows a parallel run of two aggregation systems (by service area and by component), which makes it possible to check the accuracy of the data and the processing system simultaneously.
Visual presentations are created from the processed data. These presentations are used for internal analysis to highlight significant data points or patterns that may warrant further investigation.
How we disseminate the data
The Public service productivity: Total, UK releases are published free of charge on the ONS website. They are published once a year, within the Public Services Productivity section of the ONS website. Supporting documents are clearly linked and accessible to users. Additional data can be provided on request.
How we review and maintain the data processes
Further revisions to the estimates may be required in accordance with, for example, changes to source data. This follows the ONS Revisions Policy. A guide to statistical revisions is also available.
Assessment of user needs and perceptions
(The processes for finding out about uses and users, and their views on the statistical products.)
Our productivity releases have a range of users, as given in the earlier section.
We have developed two main ways of obtaining information on users and uses of our public service productivity estimates:
- User consultation meetings and regular functional board meetings on healthcare, education and adult social care. These meetings allow the exchange of information on data sources, development issues and methods changes that affect our public service productivity estimates.
- A user feedback questionnaire circulated to those who make enquiries about public service productivity.
For further information, please contact the Productivity team via email at firstname.lastname@example.org.
Contact details for this Methodology
Telephone: +44 1633 455759
- Public service productivity: total, UK, 2020
- Public service productivity, quarterly, UK: April to June 2023
- Public service productivity, healthcare, England: financial year ending 2021
- Public service productivity, adult social care, England: financial year ending 2021
- Public service productivity, UK: 1997 to 2022