Table of contents
- Overview
- Healthcare
- Education
- Defence
- Adult social care
- Policing and immigration
- Public order and safety
- Children’s social care
- Social security administration
- Tax administration
- Other government services
- Difference between annual and quarterly statistics
- User and stakeholder needs
- Cite this methodology
1. Overview
This methodology article sets out the sources and methods used to construct estimates of productivity for total public services, most recently presented in our article Public service productivity: total, UK, 2022.
It contains a summary of the data sources used and a breakdown of how the Office for National Statistics (ONS) calculates estimates of productivity in each service area.
In addition to the annual statistics, we also publish quarterly estimates. While this article focuses on the sources and data used for the annual statistics, a summary of the main differences between annual and quarterly is included in Section 12: Difference between annual and quarterly statistics.
For the impact of recent methodological changes, refer to our article Public Services Productivity Review, impact of improved methods on total public service productivity: 1997 to 2021. Further information on the methodology used and details on the strengths and limitations are included in the Public service productivity: total, UK Quality and Methodology Information (QMI).
The main concepts and methods common to all service areas are explained in this section, with specific detail on each service area's output and inputs measures contained in Sections 2 to 11.
Productivity
At the most aggregate level, productivity is the measure of how many units of output are produced from one unit of inputs and is calculated by dividing total output by total inputs. Adopting P, O and I to indicate productivity, output and inputs, respectively, and including a subscript t for time periods:

$$P_t = \frac{O_t}{I_t}$$
Total public service output and inputs indices are calculated by aggregating output and inputs for the following service areas:
- healthcare
- education
- defence
- adult social care
- policing and immigration
- public order and safety
- children's social care
- social security administration
- tax administration
- other government services
Total public service productivity is then calculated by dividing this index of output by the index of inputs.
Statistics are published on a UK geographic basis from 1997 to the latest available year, usually two years prior to the publication date.
Output and inputs indices for each service area are aggregated together using their relative general government (combined central and local government) expenditure weight, using data from the UK National Accounts on a Classification of the Functions of Government (COFOG) basis.
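As a simple illustration of this aggregation and of the productivity calculation described above, the sketch below weights hypothetical service-area output and inputs growth rates by previous-year expenditure shares and divides the results; all names and figures are illustrative, and in practice the annual links are chained over time.

```python
# Illustrative sketch: aggregating service-area output and inputs growth into total
# productivity growth for a single year-on-year link. The service areas shown, the
# growth rates and the expenditure shares are all hypothetical.

output_growth = {"healthcare": 1.030, "education": 1.010, "defence": 1.000}
inputs_growth = {"healthcare": 1.025, "education": 1.015, "defence": 1.000}
expenditure_share = {"healthcare": 0.5, "education": 0.3, "defence": 0.2}  # previous-year shares

total_output_growth = sum(expenditure_share[s] * output_growth[s] for s in expenditure_share)
total_inputs_growth = sum(expenditure_share[s] * inputs_growth[s] for s in expenditure_share)

# Productivity growth is total output growth divided by total inputs growth
productivity_growth = total_output_growth / total_inputs_growth
print(f"Total productivity growth: {100 * (productivity_growth - 1):.2f}%")
```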
Quantity output
Different measurement techniques for output are adopted for different service areas.
For most service areas, output is measured in direct volume terms by the number of activities performed by that service area. Activities are weighted together into a cost-weighted activity index (CWAI), which calculates the change in the number of activities undertaken, weighting each activity by its cost so that a change of one unit of a high-cost activity has a greater effect on output than a change of one unit of a low-cost activity. Healthcare, education, adult social care, children's social care, social security administration, tax administration, and public order and safety all involve some degree of direct volume measurement in the form of a CWAI.
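A minimal sketch of a CWAI for a single year-on-year link is shown below; the activities, counts and unit costs are hypothetical, and the weights are each activity's share of expenditure in the earlier year.

```python
# Illustrative cost-weighted activity index (CWAI) for one year-on-year link.
# Activity names, counts and unit costs are hypothetical.

activities_prev = {"hip_replacement": 1000, "outpatient_visit": 50000}    # year t-1 counts
activities_curr = {"hip_replacement": 1100, "outpatient_visit": 51000}    # year t counts
unit_costs_prev = {"hip_replacement": 6000.0, "outpatient_visit": 150.0}  # year t-1 unit costs

# Expenditure share of each activity in the base (previous) year
expenditure = {a: activities_prev[a] * unit_costs_prev[a] for a in activities_prev}
total_expenditure = sum(expenditure.values())
weights = {a: expenditure[a] / total_expenditure for a in expenditure}

# Laspeyres-type growth: base-year-weighted arithmetic mean of volume relatives
cwai_growth = sum(weights[a] * activities_curr[a] / activities_prev[a] for a in weights)
print(f"Cost-weighted activity growth: {100 * (cwai_growth - 1):.2f}%")
```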
Three service areas (police, defence and other public services) are largely "collective" services, which impact the population as a whole rather than an individual, and therefore output from these sectors is more difficult to measure directly. Instead, an "output-equals-inputs" convention is applied, where output volume is assumed to equal the volume of inputs used to create them. In this case, productivity is constant.
Within healthcare, approximately 13% of output is measured indirectly; this output covers services delivered by non-NHS providers. Inputs and output are also equal for GP prescribing; however, in this case the volume of output is calculated directly using cost-weighted activity.
Within children's social care (CSC), 71% of output is directly measured. Since the financial year ending (FYE) 2015, direct measurements have been available for services including safeguarding, non-secure accommodation, secure accommodation, adoptions, and care leavers.
Within UK adult social care (ASC), 34% of output is directly measured; in particular, directly measured output for England is only available for residential and nursing care settings from FYE 2015 (approximately 34% of the England measure). All Wales output is indirectly measured, and Northern Ireland output is indirectly measured after FYE 2020. Approximately 58% of Scotland output is directly measured, and this relates to care home and home care activity.
Within the tax administration service area, a "revenue adjustment" is applied that adjusts the cost weights by the revenue raised per £ of administrative cost for different taxes. This enables efficiency improvements from changes in the number of tax payments made for low-cost taxes relative to high-cost taxes to be represented in the measure.
In total, approximately 40% of output is measured using the "output-equals-inputs" convention; the other 60% is measured directly. All figures stated here refer to Public service productivity: total, UK, 2022. In our publications, "quantity output" and "non-quality-adjusted output" have the same meaning.
Quality adjustment
Where data are available and relevant, output measures are quality adjusted. Quality adjustments are currently applied to six service areas:
healthcare
education
children's social care
adult social care
public order and safety
social security administration
Following the Review into public services productivity, we have begun to introduce quality adjustment for social security administration.
A quality adjustment is, in its simplest terms, a statistical estimate of the change in the quality of a public service. This provides a more accurate picture of the link between output and the desired outcomes, for example, increased GCSE-level attainment scores for the education service area. In the market sector, quality is accounted for through differences in the prices of goods and services. However, in the non-market sector there is no market price, so prices cannot be used in this way. For more detail on quality adjustments, see A guide to quality adjustment in public service productivity measures.
The reasons for quality-adjusting public service output are well-documented and follow from recommendations made in the Atkinson Review (PDF, 1.07MB).
"Quality-adjusted output" describes this concept in our publications.
Inputs
Inputs comprise volume estimates of labour, goods and services (intermediate inputs), and capital assets used in delivering public services. These series are aggregated together to form an overall estimate of the volume of inputs used to provide each of the public services identified in the total public service productivity articles.
For some service areas, inputs are measured indirectly by using current expenditure adjusted by a suitable deflator. In most areas inputs are measured directly, such as the number of full-time equivalent staff.
Deflation
Where direct inputs volume measures are unavailable, or indirect volume measures are more precise, expenditure from the UK National Accounts is deflated by an appropriate price deflator to remove the effect of price inflation. Where a single appropriate deflator is not available, composite deflators are constructed, with broader indices (such as CPI: All Items) used where there is not enough evidence to support further granularity.
Composite deflators are constructed by sourcing more relevant data on the prices and quantities of specific inputs. The changes in the prices of different inputs are aggregated into a chain-linked Paasche price index, which weights the changes in prices by their relative volumes in the current year.
For example, the growth in average gross pay for different prison service staffing groups (a price change in labour) are weighted together using staffing numbers (the quantity) to create a composite labour deflator. This better approximates the overall price changes in labour for a service area with fairly homogeneous labour inputs.
Public sector procurement data for different service areas are used to create composite intermediate consumption deflators that better reflect price changes in the cost of goods and services relevant to specific service areas. Procurement data are sourced from the Online System for Central Accounting and Reporting (OSCAR) dataset for central government expenditure and from the Subjective Analysis Return, which is published as an annex to Local authority revenue expenditure and financing for local government expenditure. Relevant price indices are mapped to each expenditure category, and these are chain-linked according to the expenditure weight of the category they represent.
Therefore, where inputs are measured indirectly, revisions to inputs estimates can result both from changes to expenditure and changes to the deflator used.
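The sketch below illustrates, with invented prison-staff pay and staffing figures, how a composite deflator link can be formed as a current-year-weighted harmonic mean of price relatives (a Paasche link) and then used to remove price inflation from nominal expenditure growth.

```python
# Illustrative Paasche price-index link for a composite labour deflator.
# Staffing groups, pay levels and staff numbers are hypothetical.

pay_prev = {"prison_officers": 30000.0, "operational_support": 22000.0}  # average pay, year t-1
pay_curr = {"prison_officers": 31500.0, "operational_support": 22660.0}  # average pay, year t
staff_curr = {"prison_officers": 20000, "operational_support": 5000}     # staff numbers, year t

# Current-year expenditure shares (weights)
expenditure_curr = {g: pay_curr[g] * staff_curr[g] for g in pay_curr}
total_curr = sum(expenditure_curr.values())
weights_curr = {g: expenditure_curr[g] / total_curr for g in expenditure_curr}

# Paasche link: harmonic mean of price relatives, weighted by current-year shares
price_relatives = {g: pay_curr[g] / pay_prev[g] for g in pay_curr}
paasche_link = 1 / sum(weights_curr[g] / price_relatives[g] for g in weights_curr)

# Deflating nominal expenditure growth removes the estimated price change
nominal_growth = 1.06          # hypothetical growth in labour spending
volume_growth = nominal_growth / paasche_link
print(f"Deflator link: {paasche_link:.4f}, implied volume growth: {volume_growth:.4f}")
```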
Splining
Where data are received on a financial (April to March) or academic (September to August) year basis, a statistical technique known as splining is used to align these data to the calendar year (January to December). We use a cubic spline method, which calculates a quarterly path for the annual data (in the financial or academic year). The method follows a set of constraints to ensure that the quarterly path experiences no artificial changes in the growth or level of the series and that the average or sum of the four quarters for a particular academic or financial year is equal to the annual data used. This quarterly path can then be re-aggregated up to a calendar year by averaging or summing the four quarters of the calendar year.
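A simplified sketch of this splining idea is shown below. It fits a cubic spline through cumulative financial-year totals so that the four quarters of each financial year sum exactly back to the annual figure, and then re-aggregates the quarterly path to calendar years. The production method applies further constraints on the quarterly path, so this is an illustration of the concept rather than the exact algorithm, and the figures are hypothetical.

```python
# Simplified illustration of aligning financial-year (April to March) totals to calendar years.
# A cubic spline is fitted through cumulative totals, so each financial year's four quarters
# sum exactly to the annual figure; the production method applies additional constraints.
import numpy as np
from scipy.interpolate import CubicSpline

fy_totals = np.array([400.0, 420.0, 450.0, 470.0])  # hypothetical FYE 2020 to FYE 2023 totals

# Cumulative totals at the end of each financial year (x measured in quarters since the start)
x_knots = np.arange(0, 4 * len(fy_totals) + 1, 4)             # 0, 4, 8, 12, 16
cumulative = np.concatenate(([0.0], np.cumsum(fy_totals)))     # running total at each knot
spline = CubicSpline(x_knots, cumulative)

# Quarterly path: differences of the cumulative spline at each quarter end
quarter_ends = np.arange(1, 4 * len(fy_totals) + 1)
quarterly = np.diff(spline(np.concatenate(([0.0], quarter_ends))))

# Calendar year 2020 = Q4 of FYE 2020 (Jan to Mar 2020) plus Q1 to Q3 of FYE 2021 (Apr to Dec 2020)
calendar_2020 = quarterly[3] + quarterly[4:7].sum()
print(f"Quarterly path: {np.round(quarterly, 1)}")
print(f"Calendar-year 2020 estimate: {calendar_2020:.1f}")
```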
Index numbers
Indices can be used to determine how changes in the monetary value of economic transactions can be attributed to changes in price (to measure inflation) and changes in quantity (to measure sales volume or economic output) over time. Different indices are used depending on the data type and purpose. The approach taken is consistent with ONS methodology guidance, the Consumer Prices Indices technical manual and calculations carried out in the UK National Accounts.
Volume activity series are constructed using a cost-weighted Laspeyres index (base year-weighted arithmetic mean).
This method follows the formula:

$$\mathrm{Index}_{0,t} = \sum_i w_{i,0} \, R_{i,0,t}$$

where $w_{i,0}$ is the value share of item $i$ in the base period 0, and $R_{i,0,t}$ is the volume relative (the ratio of the quantity of an activity in period $t$ to the quantity of the same activity in the base period).
In the context of public service output, the weights (wi) are indicative of the relative value of different activities. Unit costs can be used to approximate the "price" of an activity (pi) given the difficulty of accurately estimating the relative social and economic value of different activities. The weights for different activities are those taken from the first year of each activity pair (the base year 0). For example, if we were combining activity series for each of the devolved UK nations for 2010, we would weight each of the activity growths from 2009 to 2010 for England, Scotland, Wales or Northern Ireland by their respective expenditure shares in 2009.
Where prices indices (for example, deflators) are weighted together, these are constructed using Paasche indices (current year-weighted harmonic mean).
This method follows the formula:

$$\mathrm{Index}_{0,t} = \left( \sum_i w_{i,t} \, R_{i,0,t}^{-1} \right)^{-1}$$

where $w_{i,t}$ is the value share of item $i$ in the current period $t$, and $R_{i,0,t}$ is the price relative (the ratio of the price of a good or service in period $t$ to its price in the base period).
For example, data on price changes in the cost of labour, goods or services and their quantities are used to construct composite price indices. Nominal expenditure data are deflated to real expenditure by dividing by the appropriate price index.
Further guidance on indices methodology can be found at ONS's index numbers guidance.
Comparability
Unlike other measures of productivity produced by the ONS, public service productivity estimates include goods and services, as well as labour and capital, as inputs. This is necessitated by the fact that public service output measures are gross output (total output) measures, rather than value added measures as used in labour productivity and multi-factor productivity, meaning that estimates are not comparable. For more information on how to compare the three measures of productivity, see our article How to compare and interpret ONS productivity measures.
2. Healthcare
Quantity output
The quantity of healthcare is estimated using data on a range of healthcare services provided within:
hospital and community health services (HCHS); this includes hospital services, community care, mental health and ambulance services
primary care and preventive health services, formerly known as family health services (FHS); this includes publicly funded general practice, dentistry and ophthalmic services, services provided via NHS phonelines and websites, and preventive care services commissioned by local authorities that are not provided by NHS trusts or primary care providers
community prescribing; this represents prescribed medicines and other medical goods that are dispensed in the community
non-NHS provision; this represents healthcare funded by the government but provided by the private or third sector (outside of primary care providers previously referenced) and is indirectly measured using the "output-equals-inputs" approach
COVID-19-related testing, tracing and vaccinations; this represents specific services established during the coronavirus (COVID-19) pandemic to mitigate the effects of the disease as well as ongoing vaccination campaigns
With the exception of non-NHS provision, non-quality-adjusted (or "quantity") output growth for all of these components is measured by cost-weighting different types of activities undertaken using a Laspeyres index. Healthcare quantity output is first compiled at a nation level by aggregating different activities into a cost-weighted activity index (CWAI) using unit costs at the most granular level available. UK-level aggregation is then achieved by combining the nation-level output indices using expenditure weights taken from HM Treasury's Country and regional analysis.
HCHS and primary care services output is adjusted to account for differences in the number of days in a given year. If services are not provided on every day of the year, we apply a working-days adjustment to account for changes in the number of weekends and bank holidays in a given financial year. Otherwise, a total-days adjustment is applied to account for leap years.
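The precise form of the adjustment is not set out here, but one simple interpretation, sketched below with hypothetical figures, is to express activity per working day before computing growth, so that a change in the number of working days between years does not distort measured output.

```python
# Hedged sketch of a working-days adjustment: activity is expressed per working day
# before computing growth, so an extra bank holiday or leap day does not distort output.
# The day counts and activity figures are hypothetical, as is this exact formulation.

activity = {"FYE2023": 1_000_000, "FYE2024": 1_020_000}     # recorded activity counts
working_days = {"FYE2023": 253, "FYE2024": 251}             # working days in each year

rate_prev = activity["FYE2023"] / working_days["FYE2023"]
rate_curr = activity["FYE2024"] / working_days["FYE2024"]

adjusted_growth = rate_curr / rate_prev
unadjusted_growth = activity["FYE2024"] / activity["FYE2023"]
print(f"Unadjusted growth: {unadjusted_growth:.4f}, working-days adjusted: {adjusted_growth:.4f}")
```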
In some instances, changes to the data sources used to capture activity growth for different services can make the data incomparable from year to year, for example, changes to the definition of an activity, or data collections halting. Depending on the data issue, we have different approaches to try to continue to measure the activity in question wherever possible to do so. These approaches include estimating activity growth at a higher level of aggregation or using growth rates from proxy data sources.
Where possible, we look to liaise with data producers to identify the most suitable methods. When it is not possible to reconcile data differences, we exclude the activity from the overall measure for that year. When we do adopt alternative methods for estimating output growth, these are detailed in the methods articles accompanying a release.
As in previous publications, non-NHS provision is calculated by deflating expenditure data, using the same deflator that is used in the inputs. Therefore, non-NHS provision is an "output-equals-inputs" component. Similarly, the output measure produced for community prescribing is also used in the inputs on an "inputs-equals-output" basis. Given output and inputs measures are the same, no productivity growth can be observed for these components.
Data sources and geographic coverage for healthcare quantity output
For England, we mostly collect a variety of open data published by the Department of Health and Social Care or NHS England, with some unpublished data also provided. For Scotland, Wales and Northern Ireland, equivalent data are provided in the form of unpublished direct data submissions by the devolved health administrations to the Office for National Statistics (ONS).
We typically measure admitted patient care activity as finished consultant episodes, with different activity measures used for other services (for example, number of examinations for diagnostic imaging, number of critical care bed-days, or number of tests for pathology services).
While not an exhaustive source list, our main data sources used to measure health activity in England are:
National Cost Collection (hospital and community health services)
Appointments in General Practice (General Practice)
Dental statistics (dentistry)
General Ophthalmic Services (GOS) activity data (ophthalmic services)
Prescription Cost Analysis (community prescribing)
NHS England (NHS COVID-19 vaccinations)
Department of Health and Social Care Annual Report and Accounts (non-NHS expenditure)
Quality adjustment
The Quality adjustment of public service health output: current method (PDF, 152KB) provides a detailed description of the quality adjustment methodology.
We apply a series of quality adjustments to quantity output to account for the change in quality of a service provided in instances where this cannot be ascertained from the activity and unit cost data. A positive quality adjustment indicates that the quality of healthcare services provided, as defined by the selection of indicators used in the quality adjustment, has improved. The quality adjustment is applied to UK output on a calendar-year basis, but also to our England-only financial year statistics. Currently the quality adjustment is produced from England-only data.
The health quality adjustment has three components. The first two are related to achieving outcomes, and the third relates to meeting user needs:
health gain for elective and non-elective hospital procedures
the degree to which GPs are following best practice in the treatment of certain ongoing conditions
the quality of the patient experience for various primary and secondary care services
Our quality adjustment for hospital elective and non-elective procedures uses a dataset provided by the Centre for Health Economics at the University of York. The adjustment includes:
short-term post-operative survival rates, derived from Hospital Episode Statistics (HES); short-term survival is used to adjust day cases, elective inpatients and non-elective inpatients
estimates of health benefit from procedures, derived from research studies, ONS Life Tables and Patient Reported Outcome Measures; we apply this adjustment to day cases, elective inpatients and non-elective inpatients
waiting times from HES; waiting times are used as a quality adjustment for day cases and elective inpatients
Further information on how this quality adjustment is applied is available in our Quality adjustment of public service health output: current method (PDF, 152KB), which provides a detailed description of the quality adjustment methodology.
Elements within the hospital and community healthcare sector include:
- national patient experience surveys, from NHS England, used as an adjustment for day cases, elective inpatients, non-elective inpatients, emergency care and mental health
Our quality adjustment accounting for outcomes in general practice relies on aggregate data on clinical measures recorded on GP practice computers, from the Quality and Outcomes Framework (QOF). We use the change in achievement scores from a selection of clinical indicators that relate most closely to actual health outcomes. Indicators relate to changes in outcomes for patients with the following common health conditions:
diabetes
asthma
chronic heart disease
hypertension
stroke and transient ischaemic attack
chronic kidney disease
The quality adjustment assesses the change in the achievement rate for the chosen indicators, which is subsequently weighted to account for the share of patients with the given health condition.
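A hedged sketch of this weighting, using invented achievement rates and patient shares, is shown below; the actual indicator set and weighting detail follow the published quality adjustment methodology.

```python
# Hedged sketch: weighting changes in QOF achievement rates by the share of patients
# with each condition. Conditions, achievement rates and patient shares are hypothetical.

achievement_prev = {"diabetes": 0.90, "asthma": 0.86, "hypertension": 0.88}
achievement_curr = {"diabetes": 0.91, "asthma": 0.87, "hypertension": 0.88}
patient_share = {"diabetes": 0.4, "asthma": 0.3, "hypertension": 0.3}   # shares sum to 1

# Weighted average of the change in achievement rates across conditions
quality_change = sum(
    patient_share[c] * (achievement_curr[c] / achievement_prev[c]) for c in patient_share
)
print(f"GP outcomes quality index growth: {quality_change:.4f}")
```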
Changes in patients' experience of care are measured through responses collected in a selection of national surveys that are part of the Care Quality Commission's NHS patient survey programme, as well as the GP Patient Survey.
Currently, we apply patient satisfaction measures for:
inpatient care (elective inpatient, day cases and non-elective)
A&E
mental health services
general practice
dentistry
Changes in patient experience are determined by changes in the average patient experience scores and are weighted according to the relative value of patient satisfaction for each service sector.
Hospital and community health services (HCHS) and primary care services are quality adjusted, but no quality adjustment is applied to community prescribing, COVID-19 testing, tracing and vaccinations, or non-NHS provided services.
Inputs
Labour inputs are mainly measured through a Laspeyres cost-weighted labour index (CWLI), which uses administrative data on the health service's workforce to measure growth in full-time equivalent staff numbers weighted by their cost, in a similar manner to the cost-weighted activity index used for quantity output. However, it should be noted that agency staff are included in intermediate consumption inputs because they are not employed by the NHS, while NHS bank staff are included in labour inputs, because they are NHS employees.
The intermediate consumption of goods and services used in the provision of healthcare is also calculated using expenditure data deflated by relevant deflators to account for the cost inflation faced by the health service. From our Public service productivity, healthcare, UK: 2017 onwards, many of the deflators used are taken from the NHS Cost Inflation Index (NHSCII), which is produced by the Department of Health and Social Care (DHSC). This includes the overall NHSCII, sector-specific components of the NHSCII and a version specific to NHS providers' intermediate consumption produced by the ONS. A change to the methodology for deflating agency staff expenditure, which makes use of mandatory data collections undertaken by NHS England and NHS Improvement on agency staff spending, was incorporated in the data for financial year ending (FYE) 2019 onwards.
The volume of capital inputs is measured by consumption of fixed capital, which covers the cost of depreciation of capital goods (items that are anticipated to be in use over several years, such as buildings and vehicles) over time. Data used for this element are estimated in the UK National Accounts using the perpetual inventory method.
The total inputs index is created by weighting the three components of healthcare input together according to their share of total healthcare expenditure recorded in the UK National Accounts. Where data are not provided by a country, it is assumed that this component grows in line with the rest of the UK.
Geographical coverage of inputs data varies across the countries of the UK.
For labour inputs, we include information for hospital and community health services (HCHS), GP services and bank staff. Data from England, Wales, Scotland and Northern Ireland are available for HCHS and GP services, however, bank staff are only included for England.
The sources for HCHS and GP services are:
NHS England (NHSE) for England
Welsh Government for Wales
Scottish Government for Scotland
Department of Health Northern Ireland (DH NI) for Northern Ireland
The sources for bank staff are:
- NHS England (NHSE) for England
For goods and services inputs, we include information on HCHS, dental services, ophthalmic services, pharmaceutical services, GP services, community health and miscellaneous services (CHMS), GP drugs, non-NHS provision, agency staff expenditure, welfare food and health administration.
Sources for HCHS and dental services are:
NHSE for England
Welsh Government for Wales
Scottish Government for Scotland
Sources for non-NHS provision, ophthalmic and pharmaceutical services are:
NHSE for England
Welsh Government for Wales
Scottish Government for Scotland
Sources for GP services are:
NHSE for England
Welsh Government for Wales
Scottish Government for Scotland
Sources for CHMS are:
DHSC for England
Welsh Government for Wales
Sources for GP drugs are:
prescription cost analysis (PCA) for England
Welsh Government analysis for Wales
Scottish Government analysis for Scotland
DH NI analysis for Northern Ireland
Sources for agency staff expenditure and welfare food are:
DHSC for England
Welsh Government for Wales
Sources for health administration are:
- DHSC for England
Capital inputs include information on UK capital consumption, for which data are available from the UK National Accounts.
The geographic coverage of the deflators, which are either UK-wide or England-only, is now described. Where deflators are available for England only, the same rate of price increase is assumed for the other countries of the UK.
Intermediate consumption other than that specified (includes NHS providers) in:
FYE 2015 to FYE 2023 is deflated using an ONS intermediate consumption-specific version of the NHS Cost Inflation Index (NHSCII) NHS providers non-pay deflator
FYE 1996 to FYE 2015 is deflated using an ONS intermediate consumption-specific version of the Health Service Cost Index (HSCI)
Non-NHS provided services in:
FYE 2015 to FYE 2023 are deflated using the NHSCII NHS providers deflator including both pay and non-pay elements
FYE 1996 to FYE 2015 are deflated using a non-NHS deflator combining an ONS intermediate consumption-specific version of the Health Service Cost Index (HSCI) and ONS pay cost index covering Hospital and Community Health Services (HCHS) staff
NHS bank staff costs for FYE 2016 to FYE 2023 are deflated using the NHSCII pay cost deflator for NHS providers.
Agency staff costs in:
FYE 2018 to FYE 2023 are deflated using NHSCII agency cost deflator
FYE 2015 to FYE 2018 are deflated using NHSCII pay cost deflator for NHS providers
FYE 1996 to FYE 2015 are deflated using ONS pay cost index covering HCHS staff
General practice intermediate consumption for FYE 1996 to FYE 2023 is deflated using the Consumer Price Index including owner occupiers' housing costs (CPIH).
Dental services in:
FYE 2008 to FYE 2023 are deflated using NHSCII dental cost deflator and an ONS equivalent for earlier years
FYE 1996 to FYE 2008 are deflated using a non-NHS deflator combining an ONS intermediate consumption-specific version of the HSCI and an ONS pay cost index covering HCHS staff
Pharmaceutical services (excluding drug costs) in:
FYE 2015 to FYE 2023 are deflated using overall NHSCII
FYE 1996 to FYE 2015 are deflated using a non-NHS deflator combining an ONS intermediate consumption-specific version of the HSCI and an ONS pay cost index covering HCHS staff
General ophthalmic services in:
FYE 2015 to FYE 2023 are deflated using the overall NHSCII
FYE 1996 to FYE 2015 are deflated using a non-NHS deflator combining an ONS intermediate consumption-specific version of the HSCI and an ONS pay cost index covering HCHS staff
The inputs for hospital and community health service employees (other than bank staff), staff working in general practice, and GP-prescribed drugs are not deflated, as these inputs are directly measured using a cost-weighted labour index or cost-weighted drug index.
Capital consumption inputs are obtained from the national accounts in volume terms and so need no further deflation.
3. Education
Quantity output
Education quantity output is the sum of full-time equivalent (FTE) publicly funded student numbers within the following sectors across the UK:
pre-school education, which is composed of students in local authority (LA)-maintained pre-primary schools and places funded in the private, voluntary and independent sector (PVI)
LA-maintained primary, secondary and special schools
for England – primary, secondary and special academies
for England – alternative provision (AP) for LA-maintained schools
for England – AP for academies
further education (FE), composed of adult learners aged 16 to 19 years
further education training of healthcare professionals from 1997 to 2011
Healthcare training is only included up to 2011 because, from this period, it is captured in the non-profit institutions serving households (NPISH) sector in the UK National Accounts, following the move of healthcare training into the higher education (HE) category.
Enrolment figures for primary, secondary, special schools and AP (composed of separate figures for LA-maintained schools and academies) are adjusted for attendance to produce activity metrics for these school phases. From 2020 onwards, additional adjustments for the delivery of remote learning, absence rates and attendance during periods of in-person teaching are made to account for the impact of the coronavirus (COVID-19) pandemic on education activity. More information on this adjustment can be found in our articles Coronavirus and the impact on measures of UK government education output: March 2020 to February 2021 and Remote schooling through the coronavirus (COVID-19) pandemic, England: April 2020 to June 2021.
Activity and expenditure data are splined into calendar year, then activity figures are weighted according to the cost of providing education to each school phase for each individual nation to produce a cost-weighted activity index (CWAI). The CWAI is produced using the Laspeyres approach.
By 2022, the following phases had the corresponding weights:
pre-school: 5.5%
primary (LA-maintained and academies): 41.9%
secondary (LA-maintained and academies): 37.0%
special (LA-maintained and academies): 6.7%
alternative provision (including pupil referral units): 0.9%
FE: 7.9%
Sources of education output data
Data for quantity and expenditure in schools are:
England: Department for Education (DfE), RO1 local authority outturn – Ministry of Housing, Communities and Local Government (MHCLG)
Wales: Welsh Government
Scotland: Scottish Government
Northern Ireland: HM Treasury
Data for quantity and expenditure in FE are:
England: DfE (quantity), Education and Skills Funding Agency (expenditure)
Wales: Welsh Government (quantity and expenditure)
Scotland: Scottish Funding Council (quantity and expenditure)
Northern Ireland: Department for the Economy Northern Ireland (quantity; Welsh unit costs are used for expenditure)
Quality adjustment
Attainment for primary and secondary schools, and further education
Education quality adjustment is informed by several components, with attainment bearing the most weight in the model, given that exam performance is the most important outcome of schooling. Output in primary and secondary schools (including LA-maintained schools and academies) across the UK is quality adjusted according to attainment measures. FE output in England is also quality adjusted according to attainment measures.
Because education is a devolved policy area with courses and syllabi specific to each nation, different data sources are used to inform attainment according to school phase and nation. FE attainment in England is a new quality adjustment measure for education following the PSP Review and is based on the percentage of students meeting the minimum requirement for Level 2 and Level 3 qualifications by age 19 years. Separate attainment indices for Level 2 and Level 3 are prepared (these are processed by the "cohort-split" model, discussed later in this section), and these are weighted by the percentage of students completing each qualification.
During the coronavirus (COVID-19) pandemic, historical data sources were no longer valid, either because data were not published or because of concerns around grade inflation following teacher-assessed grading practices. Therefore, the National Reference Test (NRT) is used to inform attainment from 2019 to 2020 onwards for primary and secondary schools.
The NRT is independent of teacher-assessed grades, and a robust indicator of GCSE-level performance. Although the NRT takes place in England only, it is used to inform attainment for schools across the UK because of the absence of similar metrics specific to each of the devolved administrations.
The NRT is used to inform attainment for primary schools in the absence of alternative data. Because primary schools bear significant weight, there was a concern that leaving data gaps untreated would underestimate the broader effects of the pandemic on attainment. In addition, the NRT represents performance in academic subjects, which are the focus in primary schools. This is unlike FE, where there is a shift towards technical qualifications in addition to academic ones; the NRT is therefore not used to inform attainment for FE from 2019 to 2020.
A summary of historical data sources and how the NRT is being used to treat each school phase is now outlined. The Office for National Statistics (ONS) will keep under review both the application of the NRT and the point from which historical attainment data sources can again be used, once they are published and are independent of the marking practices that arose during the pandemic.
England
For the primary school phase, the data source up to 2018 to 2019 was the percentage of pupils meeting the expected standard in reading, writing and maths. Following the coronavirus (COVID-19) pandemic, the NRT was used to inform a data gap for 2019 to 2020 before resuming original data from 2020 to 2021.
For the secondary school phase, the data source up to 2018 to 2019 was the average attainment 8 score. The NRT has informed this measure from 2019 to 2020.
For FE, the data source up to 2018 to 2019 was the percentage of pupils meeting the minimum requirement for Level 2 and Level 3, respectively. There were no alternative data that can inform FE during the coronavirus pandemic, therefore the index is kept constant from 2019 to 2020.
Wales
For the primary school phase, the data source up to 2018 to 2019 was the percentage of pupils reaching expected level in English, Welsh and maths. The NRT has informed this measure from 2019 to 2020.
For the secondary school phase, the data source up to 2018 to 2019 was the average capped 9 score per pupil. The NRT has informed this measure from 2019 to 2020.
Scotland
For the primary school phase, the data source up to 2018 to 2019 was the percentage of pupils achieving the Curriculum for Excellence (CfE) level in reading, writing and numeracy. The NRT was used to inform a data gap for 2019 to 2020 before resuming original data from 2020 to 2021.
For the secondary school phase, the data source up to 2018 to 2019 was a composite measure of Level 2, skills for work and personal development, and National 5 attainment. The NRT has informed this measure from 2019 to 2020.
Northern Ireland
For the primary school phase, the data source up to 2018 to 2019 was the percentage of pupils achieving Level 4 or above in communication, maths, and information and communication technology (ICT). The NRT has informed this measure from 2019 to 2020.
For the secondary school phase, the data source up to 2018 to 2019 was the percentage of students achieving five or more GCSEs at grades A* to C including English and maths. The NRT has informed this measure from 2019 to 2020.
Processing attainment – the "cohort-split" model
The ONS applies a "cohort-split" model to account for the cumulative nature of education while processing attainment for each school phase. The model is applied to the primary, secondary and FE phases within each nation separately.
The model considers exam performance achieved at the end of each school phase (Year 6 for primary, Year 11 for secondary, age 19 years for FE) and retrospectively apportions contributions to individual year groups within the cohort that build up towards the exam outcome.
For FE, attainment data are published according to age group, not academic year group, therefore retrospective contributions are assigned to age groups for FE. For a given academic year, weighted contributions are obtained from individual year groups from across separate cohorts to produce an attainment value that accounts for academic performance across the educational journey. Were this approach not applied, the model would assume that exam outcomes are solely attributed to performance at the end of the school journey (for example, Year 11 for secondary schools), which is not conceptually accurate. The weighted contributions assigned to year groups within each school phase are outlined in Table 1.
Table 1: Contributions to attainment by year group

| Year group | Contribution to attainment (%) |
|---|---|
| **Primary schools** | |
| Reception | 14.29 |
| Year 1 | 14.29 |
| Year 2 | 14.29 |
| Year 3 | 14.29 |
| Year 4 | 14.29 |
| Year 5 | 14.29 |
| Year 6 | 14.29 |
| **Secondary schools** | |
| Year 7 | 20.00 |
| Year 8 | 20.00 |
| Year 9 | 20.00 |
| Year 10 | 20.00 |
| Year 11 | 20.00 |
| **Further education** | |
| 16 years | 25.00 |
| 17 years | 25.00 |
| 18 years | 25.00 |
| 19 years | 25.00 |
These weights were reviewed and revised following the PSP Review. For example, for attainment data released for secondary schools in 2018 to 2019, 20% of the score is applied to 2018 to 2019 to represent the contribution from Year 11. Likewise, 20% of the 2018 to 2019 score is retrospectively applied to 2017 to 2018, when that cohort was in Year 10. The remaining 60% is retrospectively added to the previous years.
In the latest years, there will be an incomplete number of contributions from some year groups because of those students not yet sitting their examinations and thus not having any scores to back-cast. In such circumstances, available year group contributions are re-scaled to total 100%, and these are re-weighted and updated as additional year groups feed into the model every production round.
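A simplified sketch of the cohort-split calculation for secondary schools is shown below, using hypothetical attainment scores and the equal 20% year-group weights from Table 1; it also shows the rescaling of available contributions for recent years, but ignores the pandemic-period interventions described next.

```python
# Simplified sketch of the "cohort-split" model for secondary schools: each cohort's final
# exam score is apportioned backwards to the years it spent in Years 7 to 11, and academic
# years whose later cohorts have not yet sat exams have their available weights rescaled.
# The attainment scores are hypothetical.

attainment_by_exam_year = {2016: 100.0, 2017: 101.0, 2018: 102.5, 2019: 103.0}
year_group_weight = 0.20        # equal 20% contribution from each of Years 7 to 11
n_groups = 5

def attainment_for_year(academic_year: int) -> float:
    """Weighted contributions to a given academic year from cohorts sitting exams later."""
    contributions, weights_used = [], []
    for offset in range(n_groups):                 # offset 0 = Year 11 cohort, 4 = Year 7 cohort
        exam_year = academic_year + offset
        if exam_year in attainment_by_exam_year:
            contributions.append(year_group_weight * attainment_by_exam_year[exam_year])
            weights_used.append(year_group_weight)
    # Rescale so the available contributions sum to 100% of the weight
    return sum(contributions) / sum(weights_used)

print({y: round(attainment_for_year(y), 2) for y in range(2016, 2020)})
```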
The coronavirus (COVID-19) pandemic violated the typical assumptions of the "cohort-split" model; in particular, it would not be conceptually correct to retrospectively apportion exam outcomes achieved during the pandemic to previous years.
Attainment did fall during the pandemic (as informed by the NRT and available primary school data), therefore the previous model would have assigned more negative contributions to previous year groups leading up to the pandemic. This would have been unwarranted, as the pandemic, rather than performance before and leading up to the pandemic, was predominantly responsible for exam outcomes during this time. Ultimately, this would have meant that the quality-adjusted output back-series would have been unfairly penalised because of false assumptions. To navigate this challenge, we introduced two main interventions to the model.
Firstly, instead of retrospectively apportioning coronavirus-affected exam scores to year groups before the pandemic, these groups' scores would be informed by taking the average performance of similar year groups over the previous five years.
Secondly, residual adjustments are applied to year groups directly affected by the pandemic (2020 to 2021 and 2021 to 2022) so that the sum of contributions from within a cohort equals the final achieved attainment value. For example, for the first cohort affected by the pandemic (Year 11 in 2020 to 2021) the Year 11 score is informed by 20% of the final attainment value but is adjusted so that the Year 7 to Year 11 contributions equal the final attainment value. The Year 11 residual adjustment is then applied to other year groups within that academic year to avoid arbitrary adjustments to other year groups. This process continues for the following years, considering residual adjustments applied to previous year groups. These adjustments will automatically drop out of the model when the last cohort who attended school during the pandemic take their exams.
In essence, these adjustments confine the effects of the pandemic to year groups that were directly impacted by the pandemic and allow the cumulative nature of education to be accounted for in ongoing PSP estimates.
Student well-being
Primary and secondary schools are also quality adjusted according to student well-being, which is based on data from the Understanding societies harmonised UK Household Longitudinal Survey (UKHLS).
Student well-being is determined from weighted responses to the questions "How do you feel about your school?" and "How do you feel about your schoolwork?". Responses are given on a Likert scale, ranging from 1 (completely happy) to 7 (completely unhappy). The total positive responses to both questions (1 to 3 on the Likert scale) are compiled, and the proportion of positive responses relative to neutral and negative responses is determined.
From this, an index is derived based on growth in relative positive responses. The well-being index is weighted based on percentage of expenditure allocated to addressing pupil deprivation as declared in the Department for Education's (DfE's) National Funding Formula.
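A hedged sketch of the well-being calculation, using invented response counts, is shown below; it computes the share of positive responses (1 to 3 on the scale) in each year and expresses the change as a growth index.

```python
# Hedged sketch of the student well-being measure: the share of positive responses
# (1 to 3 on the 7-point scale) is computed for each year and turned into a growth index.
# Response counts are hypothetical.

responses_prev = {1: 300, 2: 250, 3: 200, 4: 120, 5: 70, 6: 40, 7: 20}   # year t-1
responses_curr = {1: 320, 2: 260, 3: 190, 4: 110, 5: 70, 6: 35, 7: 15}   # year t

def positive_share(responses: dict[int, int]) -> float:
    """Proportion of responses rated 1 to 3 (happy) out of all responses."""
    positive = sum(count for score, count in responses.items() if score <= 3)
    return positive / sum(responses.values())

wellbeing_growth = positive_share(responses_curr) / positive_share(responses_prev)
print(f"Well-being index growth: {wellbeing_growth:.4f}")
```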
Key stage 2 disadvantaged gap index
Disadvantaged pupils are defined by the DfE as those who attend primary school and have been eligible for free school meals at any point in the last six years, children looked after by a local authority, and children who left local authority care in England and Wales. Equity of attainment is an important priority for the UK education system.
This quality adjustment is based on data for England, as no equivalent measures are available covering other parts of the UK. As such, we have applied it to the output measure for all parts of the UK.
The disadvantaged attainment gap index is published by the DfE. As the index approaches 0, this reflects the gap between disadvantaged pupils and their peers being closed. Therefore, to be consistent with quality adjustments in public service productivity, the index growth is inverted such that a fall in the index (as it gets closer to 0) reflects an improvement in quality.
Data on the disadvantaged gap index were not published for the 2019 to 2020 and 2020 to 2021 academic years because of the coronavirus (COVID-19) pandemic. Therefore, the index has been held constant over these years, however, data were available from 2021 to 2022.
To weight the disadvantaged attainment gap quality series together with the other quality metrics in the education service area, the proportion of primary school funding that is specifically pupil premium funding is used. The pupil premium is defined as "funding to improve education outcomes for disadvantaged pupils in schools in England".
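A minimal sketch of the inversion is shown below; the reciprocal-of-growth form and the index values are assumptions used purely to illustrate that a narrowing gap registers as a quality improvement.

```python
# Hedged sketch of inverting the disadvantaged attainment gap index so that a narrowing
# gap (a falling index) registers as a quality improvement. The reciprocal-of-growth form
# and the index values are assumptions for illustration only.

gap_index_prev = 3.20
gap_index_curr = 3.10      # gap narrows

quality_growth = 1 / (gap_index_curr / gap_index_prev)
print(f"Disadvantage-gap quality growth: {quality_growth:.4f}")   # above 1 = improvement
```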
Inputs
The ONS publishes estimates of publicly funded education inputs in the UK from 1997 onwards. The components that make up inputs (labour, intermediate consumption and capital consumption) are organised in education as:
local authority (LA) direct labour
central government indirect labour
goods and services (provision)
goods and services (administration)
consumption of fixed capital
A direct measurement of labour within schools (including LA-maintained schools and academies in England) is based on full-time equivalent (FTE) teacher and support staff numbers, which are weighted using salary data. Various sources are used to inform salary data for occupations across the devolved administrations, including the ONS's Annual Survey of Hours and Earnings (ASHE).
Numbers of FTE staff are initially gathered on an academic year basis, and these are splined to calendar year before subsequent aggregation steps. Numbers of FTE staff are then aggregated by staff type and school phase and are adjusted by their hours worked overtime.
Currently, data on hours worked overtime are obtained from the Labour Force Survey (LFS). Country-specific FTE numbers adjusted by hours worked overtime are weighted by salary data and aggregated into a direct labour index that represents the UK. During the PSP Review, an exercise was undertaken to update salary data informed by ASHE to ensure salary weights were as accurate as possible. As a result, salary weights were updated, and the impact of the changes can be observed in our article Public Services Productivity Review, impact of improved methods on total public service productivity: 1997 to 2021. An indirectly measured labour index is prepared for central government labour, which is informed by deflating central government labour expenditure according to the Classification of the Functions of Government (COFOG 9).
The indirect labour measure also includes FE inputs, as these are not currently captured in the direct measure, and academy expenditure is transferred to local government expenditure. This is because academies receive funding and support from central government but are being accounted for in the direct labour measure, which is weighted by local government shares. The average labour compensation per hour (ALCH) industry O (public administration and defence; compulsory social security) is used to deflate central government expenditure, and this replaces the Average Weekly Earnings (AWE): Public Administration Index. This is because the ALCH covers all costs of labour such as pension and National Insurance contributions, whereas these are not reflected in the AWE, which is more of a pure price measure.
The direct and indirect labour indices are weighted and aggregated into a total labour index based on their general government expenditure shares. By 2022, direct and indirect labour weights were approximately 92% and 8%, respectively.
While goods and services (provision) relate to intermediate consumption within schools, goods and services (administration) refers to intermediate consumption within the public administration component of education.
Goods and services expenditure data according to provision and administration are determined from national accounts expenditure (COFOG 9) and are deflated by separate deflators.
Provision is deflated using a composite Paasche intermediate consumption deflator, which is constructed from the relevant CPIs, SPPIs and PPIs for each area of expenditure, weighted according to the expenditure shares declared in the DfE's annual reports and accounts.
Administration is deflated using the gross domestic product-implied deflator.
Provision and administration are weighted based on their relative expenditure shares to produce an intermediate consumption index. By 2022, the weights for provision and administration were approximately 82% and 18%, respectively.
Consumption of fixed capital national accounts expenditure data is deflated using a constructed education general government capital deflator to produce a consumption of fixed capital index.
The labour, goods and services, and capital indices are aggregated together using their respective UK National Accounts general government expenditure shares to form a chain-linked Laspeyres volume index. By 2022, the approximate weights for labour, goods and services, and capital, were 66%, 26%, and 8%, respectively.
Sources of input data
Direct labour – school staff numbers:
England: DfE
Wales: Welsh Government
Scotland: Scottish Government
Northern Ireland (teaching only, no support staff): Department for Education Northern Ireland
Direct labour – salary data:
England: DfE for teaching staff, ASHE for support staff
Wales: Welsh Government for teaching staff, ASHE for support staff
Scotland: ASHE for teaching and support staff
Northern Ireland: ASHE for teaching and support staff
Direct labour – hours worked overtime:
- LFS
Indirect labour:
expenditure: ONS UK National Accounts
deflator: ALCH industry O
Goods and services (provision):
expenditure: ONS UK National Accounts
deflator: constructed composite Paasche intermediate consumption deflator
Goods and services (administration):
expenditure: ONS UK National Accounts
deflator: GDP-implied deflator
Capital expenditure:
expenditure: ONS UK National Accounts
deflator: constructed education general government capital deflator
4. Defence
Defence is a service area in which output is indirectly measured. This is because of difficulties in identifying and measuring the collective nature of services delivered. There are also significant data barriers to estimating output. We therefore apply the "output-equals-inputs" convention and assume productivity growth to be zero. Defence was prioritised for improvements by the National Statistician's Independent Review of the Measurement of Public Services Productivity, resulting in new adjustments applied to our inputs measure.
Inputs
Previously, total defence inputs were measured indirectly by deflating current price expenditure of all inputs components, in accordance with the Classification of the Functions of Government (COFOG 2), by an implied deflator reflecting the whole industry.
In 2022, we improved our defence labour measurement by transitioning from an indirect to a direct labour measurement approach, a methodological development implemented based on existing available data.
The direct measure is based on the numbers of staff employed, adjusted for hours worked (given by full-time equivalent hours), grouped by grade, military rank or skill level. Full-time equivalent (FTE) hours growth per employment rank is cost-weighted by rank-specific implied expenditure shares, using departmental personnel statistics pay data. This produces a volume series, which is weighted alongside other inputs components by labour current price expenditure shares, in accordance with COFOG 2. This general formula outlines the direct labour estimation approach for an individual rank, i:

$$g_{i,t} = w_{i,t-1} \times \frac{\mathrm{FTE}_{i,t}}{\mathrm{FTE}_{i,t-1}}$$

where $g_{i,t}$ is the cost-weighted contribution of rank $i$ to labour volume growth in period $t$, $w_{i,t-1}$ is the rank's expenditure share in the previous period, and $\mathrm{FTE}_{i,t}$ is its full-time equivalent hours in period $t$.
From this formula, to get a total labour volume growth estimate, we simply sum the growth contributions of all relevant ranks: i to N.
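The sketch below implements this calculation for a hypothetical set of ranks, expenditure shares and FTE figures; none of the numbers reflect actual defence data.

```python
# Illustrative sketch of the direct defence labour measure: FTE-hours growth for each rank,
# cost-weighted by that rank's expenditure share, summed to give total labour volume growth.
# Ranks, expenditure shares and FTE figures are hypothetical.

fte_prev = {"officers": 28000.0, "other_ranks": 110000.0, "civilian": 55000.0}
fte_curr = {"officers": 27500.0, "other_ranks": 108000.0, "civilian": 56000.0}
expenditure_share = {"officers": 0.30, "other_ranks": 0.50, "civilian": 0.20}  # sums to 1

labour_volume_growth = sum(
    expenditure_share[r] * fte_curr[r] / fte_prev[r] for r in expenditure_share
)
print(f"Defence labour volume growth: {labour_volume_growth:.4f}")
```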
Data for military personnel strengths and employment rank were sourced from Quarterly service personnel statistics: index. Salary data were sourced from the Armed Forces Pay Review Body. Data for civilian personnel full-time equivalent employees (FTEs), employment grade and salaries were sourced from Civil Service statistics. Civilian personnel salary data were backcast prior to 2007 to maintain consistent pay groups across the time series.
In addition, we transitioned from a general implied deflator for defence to individual bespoke deflators for intermediate consumption and capital. The updated intermediate consumption deflator is constructed from the Online System for Central Accounting and Reporting (OSCAR) dataset and Office for National Statistics (ONS) price indices. The updated capital deflator is an implied deflator derived from ONS capital stocks defence volume and current price estimates, in accordance with the UK National Accounts.
A cost-weighted Laspeyres volume index is then calculated for the volume of defence inputs, using chain-linked expenditure shares, and assumed to equal the volume of defence output.
6. Policing and immigration
Policing and immigration is a service area in which output is indirectly measured. This is because of difficulties in identifying and measuring the collective nature of services delivered.
Inputs
With the exception of local government labour, police inputs are estimated by deflating expenditure on labour, goods and services, and capital.
The volume of local government labour inputs is measured directly from data on full-time equivalent employees (FTEs) and relative salaries for different groups. FTE data are sourced from Police workforce statistics for England and Wales, Workforce statistics for Police Scotland, and directly from the Police Service of Northern Ireland. Additional workforce numbers for the Police Uplift Programme are sourced from Police Uplift Statistics.
The volume of central government labour input is measured indirectly. Expenditure data are deflated by the Average Weekly Earnings (AWE) Index for Public Administration. Our article Improved methods for total public service productivity: total, UK, 2018 details an adjustment made to police expenditure data from 2013 onwards.
The deflator for local government goods and services expenditure is constructed from subjective analysis returns (SAR) within local government financial statistics and Office for National Statistics (ONS) price indices. The deflator for central government goods and services expenditure is constructed from the Online System for Central Accounting and Reporting (OSCAR) dataset and ONS price indices.
Local and central government net expenditure on capital consumption is deflated by the implied local and central government capital deflator for industry O (public administration and defence).
A cost-weighted Laspeyres volume index is then calculated for the volume of police inputs, using chain-linked expenditure shares, and assumed to equal the volume of police output.
7. Public order and safety
Quantity output
Within the public order and safety (POS) service area there are four main components:
fire
courts, which itself has five further sub-components: magistrates' courts, county courts, Crown Courts, Crown Prosecution Service and legal aid
prisons
probation
Police and immigration are measured separately from POS and so are excluded from these measurements.
For each component, a cost-weighted activity index (CWAI) is constructed. We use direct output measures for all components.
A quality adjustment is not applied to fire service, county courts services (civil cases), or the civil component of legal aid. This is because these services are deemed to have different outcomes to the criminal justice elements of POS and have data limitations.
Fire
Fire output activities are categorised into three groups:
fire response (FR)
fire prevention (FP)
fire special services (FS)
These groups all form part of the fire and rescue service (FRS). Activity measures for the FRS are based on the number of incidents attended for fire response and fire special services activities, and staff hours spent on fire prevention activity.
Appropriate cost weights are based on the Economic Cost of Fire estimates for different fire incidents. The output measure combines the different activities into a single cost-weighted activity index (CWAI) using the associated unit costs as their weights, and an overall output index is then constructed as a chain-linked Laspeyres index using the previous year's prices.
Fire response services (quality adjustment is not applied to these services) cover fire response for dwellings, commercial premises, vehicles and chimneys, as well as false alarms; these are measured by the number of incidents attended and sourced from Home Office data.
Special services response (road and non-road) is measured by the number of incidents attended and sourced from Home Office data.
Prevention covers inspections, investigations and community safety activity, for example, fitting fire alarms. It is measured by the number of hours of workload and is sourced from Home Office data.
Courts
Law courts (partially quality adjusted) include the Crown Court, magistrates' courts, county courts and family courts (which cover private, public, divorce and adoption cases). Separate cost-weighted activity indices for different areas of the courts system are constructed and then further aggregated based on expenditure shares.
The output of criminal courts (Crown and magistrates) uses disposal data sourced from Ministry of Justice (MoJ) Criminal Court Statistics.
For the Crown Court, data are provided on the hearing time of Crown Court cases broken down by hearing and plea type.
For triable-either-way trials and indictable-only trials, the plea types are:
guilty plea
not guilty plea
no plea entered
dropped case
For committed for sentence and appeals hearings, the plea type was not applicable.
By calculating the average hearing time for all cases, it is possible to estimate an average hourly cost for each hearing. This can then be combined with the hearing time for each of these categories to weight the different types of Crown Court activity.
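A hedged sketch of this weighting, with invented case counts, hearing times and an assumed hourly cost, is shown below; the cost weight of each category is its hearing time multiplied by the hourly cost.

```python
# Hedged sketch of weighting Crown Court activity by hearing time and an average hourly cost.
# Case counts, hearing times and the hourly cost are hypothetical.

avg_hearing_hours = {"guilty_plea": 1.5, "not_guilty_plea": 9.0, "dropped_case": 0.5}
cases_prev = {"guilty_plea": 40000, "not_guilty_plea": 12000, "dropped_case": 8000}
cases_curr = {"guilty_plea": 41000, "not_guilty_plea": 11500, "dropped_case": 8200}
hourly_cost = 2000.0   # assumed average cost per hearing hour

# Cost weight of each category is hearing time multiplied by the hourly cost
expenditure_prev = {c: cases_prev[c] * avg_hearing_hours[c] * hourly_cost for c in cases_prev}
total = sum(expenditure_prev.values())
weights = {c: expenditure_prev[c] / total for c in expenditure_prev}

crown_court_growth = sum(weights[c] * cases_curr[c] / cases_prev[c] for c in weights)
print(f"Crown Court activity growth: {crown_court_growth:.4f}")
```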
Because of difficulties in measuring hearing time for magistrates' courts (which can hold multiple hearings in a single session, with no indication for average time for each hearing), overall disposals are taken for magistrates' courts activity.
The output for civil courts (family and county courts) is currently measured by caseload with data sourced from the MoJ. Data from the MoJ on applications, hearings and final orders are used to produce a "weighted caseload". Unit costs are periodically sourced from the MoJ and are used as weights for the output index. However, all civil courts outputs have been forecast or estimated since 2019.
Crown Prosecution Service (CPS)
The indices for magistrates' courts and Crown Courts are used to predict activity growth in this area. Both indices are aggregated based on their expenditure shares to approximate the growth in activity for the CPS.
Legal aid
From our 2022 ONS Public service productivity publication, we have expanded the categories used for legal aid, which now cover lower crime, higher crime and civil legal aid.
Lower crime mostly covers work carried out by legal aid providers in magistrates' courts and at police stations in relation to people accused of or charged with criminal offences. On the other hand, higher crime refers to legal representation in the Crown Court and for criminal cases in the higher courts. Civil legal aid refers to legal representation in civil courts.
In 2022, there were a total of 72 categories (31 for lower crime, 18 for higher crime and 23 for civil legal aid), although this number varies slightly over time.
The numbers of cases requiring legal aid, and the associated costs, are taken from Legal aid statistics. Using these data, it is possible to construct a single cost-weighted activity index for legal aid in its entirety.
Prisons
Output for prisons is measured by the average number of prisoners. These data are collected on a monthly basis and coverage is for Great Britain. For England and Wales, the prison population is split by security category.
Prison population statistics
MoJ prison population statistics are cost weighted using prison performance data from the Prison and Probation Performance Statistics publication, which provides the annual expenditure of individual prisons in England and Wales. Expenditure is published by security category, and these categories are used to split the activity data.
For England and Wales, the Office for National Statistics (ONS) uses the categories provided in the annual cost publication to allocate cost-weights for each of nine categories:
- male dispersal (category A)
- male reception (category B)
- male trainer (category B)
- male category C and young offender institution (YOI) trainer and resettlement
- male open (category D)
- female closed
- female local
- female open
- male YOI young people (ages 15 to 17 years)
For Scotland, prison population data are taken from Scottish Prison Service Data. With limited data on Scottish prisons expenditure, it is not possible to split Scottish prison population by security category. As such, Scottish prison population data are included at an aggregate level, weighted using an average unit cost for England and Wales.
Probation
The current probation output measure uses supervisions to capture activity. These data are taken from the MoJ's Offender management statistics publication.
Data are currently available for England and Wales. Supervisions are split into two categories:
- community order and suspended sentence order
- on licence
Relative weights are assigned to these two activity categories using the marginal unit cost of each group of probationers provided by MoJ, allowing the ONS to create a cost-weighted activity index (CWAI) for probation. However, as these data are available only for 2023 to 2024, the ratio of unit costs for the two groups will be fixed over time and uprated using total expenditure per probationer.
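A hedged sketch of this approach is shown below: the 2023 to 2024 ratio of unit costs between the two supervision categories is held fixed and uprated by total expenditure per probationer; all figures are hypothetical.

```python
# Hedged sketch of the probation cost weights described above. The 2023-24
# ratio of marginal unit costs between the two supervision categories is held
# fixed over time and uprated using total expenditure per probationer.
# All figures are hypothetical.

base_unit_costs = {"community_or_suspended": 4_000.0, "on_licence": 6_400.0}  # 2023-24 (illustrative)
base_exp_per_probationer = 4_900.0  # total expenditure / probationers in 2023-24 (illustrative)

def uprated_unit_costs(exp_per_probationer_t: float) -> dict:
    """Scale both unit costs by growth in expenditure per probationer, so the
    ratio between the two categories stays fixed."""
    factor = exp_per_probationer_t / base_exp_per_probationer
    return {category: cost * factor for category, cost in base_unit_costs.items()}

def cwai_growth(acts_prev: dict, acts_curr: dict, costs_prev: dict) -> float:
    """Cost-weighted activity growth using previous-year unit costs."""
    num = sum(acts_curr[c] * costs_prev[c] for c in costs_prev)
    den = sum(acts_prev[c] * costs_prev[c] for c in costs_prev)
    return num / den

costs_2021 = uprated_unit_costs(4_600.0)
acts_2021 = {"community_or_suspended": 98_000, "on_licence": 58_000}
acts_2022 = {"community_or_suspended": 95_000, "on_licence": 61_000}
print(round(cwai_growth(acts_2021, acts_2022, costs_2021), 4))
```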
Quality adjusted output
Full details of the quality adjustments can be found in our article Quality adjustment of public service public order and safety output: current method.
Quality adjustments are not applied to fire services, county courts, or the civil component of legal aid, as these are deemed to have different outcomes to the criminal justice elements of POS.
Table 2: Quality adjustments

Component | Recidivism (applied from 2020) | Prison safety (applied from 1997) | Custody escapes (applied from 1997) | Courts' timeliness (applied from 2011)
---|---|---|---|---
Fire | - | - | - | -
Prisons | 29.20% | 37.50% | 33.30% | -
Probation | 100.00% | - | - | -
Magistrates' courts | 40.00% | - | - | 60.00%
Crown Courts | 40.00% | - | - | 60.00%
County courts | - | - | - | -
Crown Prosecution Service | 100.00% | - | - | -
Legal aid (criminal) | 100.00% | - | - | -
Legal aid (civil) | - | - | - | -

The courts' timeliness adjustment
The timeliness adjustment relates to the average time taken for criminal cases to reach completion, on the basis that the delivery of a sentence in a timely manner is favourable.
For the 2022 publication, we introduced data for the Crown Court from the MoJ Timeliness Tool, which has also allowed us to improve the granularity of our timeliness quality adjustment. The ONS can now ensure the timeliness adjustment reflects the underlying mix of case disposal activity at the Crown Court (that is, aligning with the relative proportions of both case type and plea), and can incorporate triable-either-way, indictable, committed for sentence, and appeals case types.
Average timeliness was used for appeals and those cases where no plea is entered (as these are excluded from the MoJ timeliness calculations).
For magistrates' courts, the measure is based on the mean average number of days between charge and completion.
The recidivism adjustment
The recidivism adjustment approximates the effect the Criminal Justice System (CJS) has on reducing the volume and severity of further crimes being committed by those who have gone through it.
This adjustment is composed of three parts. The first is the change in the number of proven re-offences committed by adult and juvenile offenders, categorised by crime type. The second is an adjustment for adult offenders, to account for differences between cohort characteristics and their likelihood to re-offend. The third provides a weighting by which to aggregate together all re-offences; this weighting is based upon the relative severity of the re-offence and is derived from the ONS's Crime Severity Score for England and Wales.
Data on proven reoffending from the MoJ are used, alongside other measures, to quality adjust output in the criminal justice system.
Because of the disruption to court proceedings during the coronavirus (COVID-19) pandemic, comparable reoffending data were not available. Following discussions with the MoJ, and given that re-offending rates returned to levels within historical ranges, we decided to remove the "covid fix" and resume the use of actual re-offending data. To cover the period affected by the pandemic, we used linear interpolation between the last available unaffected data (July to September 2018) and the point at which we resumed the use of actual data (April to June 2022).
The prisons safety adjustment
The prisons safety adjustment relates to the number of incidents of assaults, self-harm and deaths that occur in prison custody.
We measure the number of incidents per 1,000 prisoners, which are grouped into "those resulting in a death", "severe", and "less severe". These groups are subsequently weighted and aggregated together based on their relative cost. This is achieved by using the total cost to society of workplace injuries as a proxy, taken from the Health and Safety Executive (HSE).
The custody escapes adjustment
The escape adjustment relates to ensuring prisons fulfil the role of public protection and is applied to activities used to measure the output of the prison service.
The measure is based on changes in the difference between the number of escapes and a baseline of 0.05% of the England and Wales prison population, a historical target used by the MoJ. The baseline is used because, as the absolute number of escapes approaches zero, the year-on-year relative change would have a disproportionate effect on a non-baselined quality adjustment index.
Combining the components
For each component, we calculate an overall growth factor to be applied to the basic activity index. For those areas where multiple adjustments are applied, the growth factors are applied on a weighted average basis (Table 2 outlines the weights used). All the components of public order and safety (POS), including non-quality adjusted components, are then cost-weighted together to produce an aggregate index of POS quality adjusted output.
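As an illustration of the weighted-average combination, the sketch below applies the prisons weights from Table 2 to hypothetical quality growth factors; the growth figures are not ONS data.

```python
# Illustrative sketch of combining quality adjustment growth factors on a
# weighted average basis, using the prisons weights from Table 2
# (recidivism 29.2%, prison safety 37.5%, custody escapes 33.3%).
# The year-on-year quality growth factors below are hypothetical.

weights = {"recidivism": 0.292, "prison_safety": 0.375, "custody_escapes": 0.333}
quality_growth = {"recidivism": 1.004, "prison_safety": 0.998, "custody_escapes": 1.000}

# Weighted-average quality growth factor applied to the basic activity index
overall_factor = sum(weights[k] * quality_growth[k] for k in weights)

activity_growth = 1.012  # hypothetical growth in the cost-weighted activity index
quality_adjusted_growth = activity_growth * overall_factor
print(round(quality_adjusted_growth, 4))
```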
Inputs
Inputs estimates are calculated for:
- fire
- courts (including probation)
- prisons
The public order and safety volume of inputs series is a weighted combination of fire, courts (including probation) and prisons, chain-linked using the UK National Accounts expenditure weights.
The inputs are mostly measured indirectly, with the exception of fire (further detail follows), using deflated expenditure to derive the volume of labour, intermediate consumption (goods and services) and capital consumption inputs. An aggregate index is then compiled for total POS inputs, weighted using expenditure levels for fire, prisons, courts and probation.
For 2022, a direct labour input measure for fire and rescue services has been introduced; this is included from 2011 onwards. This relates to local government labour, which accounts for more than four-fifths of fire and rescue service labour expenditure.
Full-time equivalent (FTE) volumes for fire by rank or grade are matched with relevant average salary data from the Annual Survey of Hours and Earnings (ASHE). For the 2022 publication, improvements have also been made to the central and local government deflators used to determine volume growth in the intermediate consumption (IC) measure. Because of the different types of expenditure across service areas, a bespoke composite IC deflator has been generated for each service area, except for the fire service, where IC expenditure is deflated using the headline Consumer Prices Index (CPI).
10. Tax administration
Quantity output
The unit of output for the tax administration service area is the number of taxpayers or registered traders or operators for each tax. These data are sourced from the HM Revenue and Customs (HMRC) Numbers of taxpayers and registered traders publication in addition to specific datasets supplied directly by HMRC for National Insurance and Corporation Tax.
Expenditure data for calculating cost weights are sourced from an annual dataset supplied to the Office for National Statistics (ONS) by HMRC. This dataset includes net expenditure and receipts for each tax collected from tax year ending 2019 to tax year ending 2023.
These unit costs are then used in a Laspeyres index, where output growth from period t-1 to period t is calculated as growth in activity, with each activity weighted by its unit cost from period t-1.
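In formula terms, this Laspeyres growth can be written as:

```latex
\[
  \frac{O_t}{O_{t-1}}
  = \frac{\sum_{i} a_{i,t}\, c_{i,t-1}}{\sum_{i} a_{i,t-1}\, c_{i,t-1}},
\]
% where $a_{i,t}$ is the number of taxpayers or registered traders for tax $i$
% in period $t$, and $c_{i,t-1}$ is the unit cost of administering tax $i$ in
% period $t-1$.
```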
Activity and cost data enable direct measurement of 11 taxes:
- Income Tax
- National Insurance
- Value Added Tax (VAT)
- Corporation Tax
- Capital Gains Tax
- Insurance Premium Tax
- Air Passenger Duty
- Landfill Tax
- Climate Change Levy
- Aggregates Levy
- Inheritance Tax
Revenue adjustment
A "revenue adjustment" is applied that adjusts the cost weights by the revenue raised per £ of administrative cost for different taxes.
Data for revenue adjustments are sourced from the HMRC tax receipts and National Insurance contributions for the UK publication, which provides the receipts by financial year for the 11 taxes covered in this analysis.
An increase in revenue raised does not directly result in output growth. Instead, an increase in the revenue raised by one tax relative to others increases the effect that changes in the number of payers of that tax have on the overall output index.
While this enables efficiency improvements from changes in the number of tax payments made for low-cost taxes relative to high-cost taxes to be represented in the measure, it does not address any aspects of quality in tax administration. As such, the measure is categorised as non-quality-adjusted.
Inputs
Inputs estimates are based on expenditure figures that are calculated by HMRC by allocating their overall expenditure to tax administration functions based on reported staff time spent on those activities. These expenditure values are also used by HMRC in their "Cost of Collection" statistics, which are published annually in the HMRC annual report and accounts.
Expenditure values net of customs have been provided directly to the ONS for the period tax year ending 2019 to tax year ending 2023, and are converted to a calendar-year basis using the same cubic splining method used in other public service productivity (PSP) service areas.
A number of small adjustments are applied to maintain coherence and comparability with the UK National Accounts and other public service productivity service areas.
Detailed Online System for Central Accounting and Reporting (OSCAR) data can be used to estimate the proportions of labour, intermediate consumption and consumption of fixed capital within this expenditure, with deflators being applied in these proportions to estimate the volume of inputs.
The labour deflator is constructed by weighting together changes in salaries of the grades within HMRC by their shares of staff in post. The composite deflator information is sourced from HMRC workforce and salaries data provided directly to the ONS.
A combination of ONS price indices and OSCAR data has been used to construct a composite intermediate consumption deflator, while consumption of fixed capital is deflated using the Public sector finances (central government): depreciation deflator.
11. Other government services
Central and local government expenditure data are obtained for:
- general public services, for example, executive and legislative organs, and basic research
- economic affairs, for example, general economic, commercial and labour affairs, including transport, agriculture, forestry and fishing
- environmental protection, for example, waste management and pollution abatement
- housing and community amenities, for example, housing development, water supply and street lighting
- recreation, culture and religion, for example, recreational and sporting activities, broadcasting and publishing
- other public order and safety services, for example, research and development
Total current expenditure on these categories is deflated using the Consumer Prices Index (CPI) to obtain a constant price expenditure series. This series is then used to generate an index of volume of inputs, which is assumed to equal the volume of output.
Tax administration is removed from the "other" grouping from 2018 onwards. Inputs growth net of tax administration for the period 2018 to 2019 is calculated and chain-linked to the index including tax administration up to 2018. Thus, the removal of tax administration does not affect the growth rate or interpretation of the "other" index when it is removed in 2018.
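A minimal sketch of this chain-linking, with hypothetical figures, shows why the level of the index is unaffected at the join.

```python
# Minimal sketch of the chain-linking described above: from 2018 onwards the
# "other" inputs index grows at the rate calculated net of tax administration,
# linked onto the pre-2018 index that still includes it. Figures are
# hypothetical.

index_incl_tax = {2016: 100.0, 2017: 101.5, 2018: 103.0}   # including tax administration
growth_net_of_tax = {2019: 1.014, 2020: 1.020}              # growth excluding tax administration

index = dict(index_incl_tax)
for year, growth in growth_net_of_tax.items():
    index[year] = index[year - 1] * growth  # chain-link: no break in the level of the index
print(index)
```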
12. Difference between annual and quarterly statistics
Alongside the annual estimate in Public service productivity: total, UK, 2022, which is badged as an accredited official statistic, we also publish quarterly measures of total public service productivity as official statistics in development (also known as experimental statistics).
The quarterly series offers a timelier measure, as the annual series has a significant time lag.
However, compared with the annual estimates, the quarterly statistics:
- do not have quality adjustments on a quarterly basis
- use forecasts of output for some service sectors, such as social protection
- use far less granular data on activity and unit costs for healthcare output
- lack full breakdowns by Classification of the Functions of Government (COFOG) – inputs estimates are calculated on an industry basis instead
The inputs for the quarterly estimates are based on Office for National Statistics (ONS) current price expenditure on labour, intermediate consumption, capital and social transfers in kind (STIK). Appropriate deflators are applied to approximate the volume of inputs from expenditure data. For more recent quarters, full-time equivalent (FTE) data derived from the ONS public sector employment estimates are used instead to calculate labour inputs for health, education, social protection, fire and justice. Deflated expenditure data on NHS bank staff are also introduced for later quarters.
The output only accounts for the volume of activity, not the quality of output, and uses non-seasonally adjusted ONS chained volume measures (CVM).
Inputs and output for the quarterly estimates of productivity are seasonally adjusted at the total level.
Expenditure and CVM data are consistent with non-seasonally adjusted quarterly national accounts (QNA) data as published in ONS breakdowns of general government final consumption expenditure. However, published quarterly productivity estimates use different seasonal adjustment methods and may differ from seasonally adjusted data published in QNA.
We published the quarterly estimates of healthcare inputs, output, and productivity for the first time in February 2025, alongside the estimates of total public service productivity, inputs, and output.
To provide more timely estimates of annual productivity (for total productivity and healthcare) we include quarterly annualised growth rate (QAGR) estimates in our quarterly publication. The QAGR method uses the growth rate in annualised quarterly PSP estimates to produce nowcast annual estimates.
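One reading of the QAGR method (an assumption for illustration; the exact calculation is set out in the quarterly publication) is sketched below with hypothetical figures.

```python
# Hedged sketch of a quarterly annualised growth rate (QAGR) nowcast: the
# latest four quarters are annualised and their growth on the previous four
# quarters is applied to the last published annual estimate. This is one
# reading of the approach; the exact ONS calculation may differ, and the
# figures are hypothetical.

quarterly_index = [99.0, 99.5, 100.1, 100.4,    # four quarters of year t-1
                   100.9, 101.3, 101.8, 102.2]  # four quarters of year t

annualised_prev = sum(quarterly_index[:4]) / 4
annualised_latest = sum(quarterly_index[4:]) / 4
qagr = annualised_latest / annualised_prev

last_annual_estimate = 98.7          # last published annual productivity index (hypothetical)
nowcast_annual = last_annual_estimate * qagr
print(round(nowcast_annual, 2))
```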
13. User and stakeholder needs
The Office for National Statistics (ONS) actively seeks feedback from users of its public service productivity statistics to inform its future work priorities. We are particularly interested in user views on the value of these statistics to inform policy debates and research projects within the academic and national accounts fields. The updated Quality and Methodology Information (QMI) for the total public service productivity article includes further information on user needs and perceptions.
We use various methods to engage with users about our statistics, including regular stakeholder engagement, pre-publication quality assurance from government experts, user consultation meetings and pre-announced methods changes, such as improved methods for total public service productivity: total, UK, 2021. Any feedback or comments are welcome and can be sent to productivity@ons.gov.uk.
14. Cite this methodology
Office for National Statistics (ONS), released 22 April 2025, ONS website, methodology, Public service productivity estimates: sources and methods
9. Social security administration
Social security administration (SSA) is the administration of different types of benefits, including the processing of new benefit claims and the maintenance of the existing benefit caseload.
For the total public service productivity (PSP) annual publications between 2018 and 2021, because of the difficulties in capturing the composition of Universal Credit (UC) cases via traditional unit cost weighting, SSA employed an "output-equals-inputs" approach. As a result, no productivity metrics were calculated for this service area.
Following the PSP Review, the output model has been updated to account for UC cases, and productivity metrics are now available for SSA. Furthermore, following the PSP Review, quality adjustment has been applied to SSA for the first time, which determines the "correctness" of administered benefits based on Department for Work and Pensions (DWP) fraud and error rates.
Quantity output
SSA output is directly measured using data provided by the DWP.
Reforms affecting SSA mean that the output measure has evolved over time, with multiple benefit schemes being replaced by UC, which presents two conceptual challenges to the measurement of UC output.
Firstly, the application of a conventional cost-weighted activity index (CWAI) would prevent the measurement of any productivity change resulting from the transfer to UC; if UC could deliver equivalent benefits at lower cost, then a conventional CWAI would place a lower value on the new UC activity despite an equivalent value of service delivered.
Secondly, UC claims are highly heterogeneous, and simpler claims generally migrated from legacy benefits to UC before those consisting of many entitlement components, so one UC claim administered in a later year may involve more value delivered than a claim administered in an earlier year.
As part of development work undertaken through the PSP Review, for the 2022 release, a method for directly measuring the output of UC has been employed.
To integrate UC into the SSA output index, UC activity has first been adjusted to account for changes over time in the number of entitlements per claim, using data on the proportion of claims with various entitlements and the average cash payment associated with these entitlements. The adjusted UC activity is then weighted together with legacy benefit activity on a benefit-weighted basis. As a result, a benefit payment bundle of equivalent value from UC and its legacy benefits is given equivalent weight in output, enabling any change in the input cost of delivering such broadly equivalent sets of benefits to be accounted for in the productivity measure.
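A stylised sketch of the entitlement adjustment is shown below; it assumes entitlement proportions are weighted by fixed base-period average cash payments so that only the entitlement mix varies, and all entitlement names, proportions and figures are hypothetical.

```python
# Stylised sketch of the Universal Credit (UC) adjustment described above:
# claims are scaled by the average value of entitlements per claim, built from
# the proportion of claims holding each entitlement and a fixed (base-period)
# average cash payment, so that a claim bundling more entitlements counts for
# more output. All names and figures are hypothetical.

def entitlements_per_claim(shares: dict, base_payments: dict) -> float:
    """Weight the proportion of claims holding each entitlement by a fixed
    base-period average cash payment, so only the entitlement mix varies."""
    return sum(shares[e] * base_payments[e] for e in base_payments)

base_payments = {"standard_allowance": 340.0, "child_element": 250.0, "housing_element": 420.0}
shares = {2021: {"standard_allowance": 1.00, "child_element": 0.38, "housing_element": 0.55},
          2022: {"standard_allowance": 1.00, "child_element": 0.41, "housing_element": 0.58}}
claims = {2021: 5_200_000, 2022: 5_600_000}

adjusted = {y: claims[y] * entitlements_per_claim(shares[y], base_payments) for y in claims}
print(round(adjusted[2022] / adjusted[2021], 4))  # adjusted UC activity growth
```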
For benefits other than UC and its legacy benefits, output is calculated using DWP data on benefit claims and caseload weighted by the unit cost of each. The combined UC and legacy benefit output index is integrated with other benefits on a cost-weighted basis.
This approach will remain under review, as the migration from legacy benefits to UC is expected to be completed in 2025 and a decision will be made as to whether or not it is worthwhile continuing to adjust UC claims for the number of entitlements per claim after the transition is complete. The measure currently excludes data on benefits administered by HM Revenue and Customs (HMRC), in particular, Tax Credits and Child Benefit, and further research is intended to extend coverage to these.
Quality adjustment
Following the PSP Review, the SSA service area has implemented a quality adjustment based on fraud and error rates for benefits administered by the DWP. Benefits can be overpaid or underpaid, with errors categorised as one, or a combination of:
- official error (erroneous payments made by the DWP or an authority)
- customer error (a mistake made by the customer)
- customer fraud (deliberate fraud committed by the customer)
For the SSA quality adjustment measure, the total DWP benefit fraud and error rates are used from the financial year ending (FYE) 2008. These refer to the percentage of the total amount of benefit expenditure administered each year that relates to overpayments and underpayments. Statistics on fraud and error are published annually by the DWP and cover England, Scotland and Wales.
Firstly, total fraud and error rates are derived from overpayment and underpayment data. The gross overpayment and underpayment rates per year, not net, are used to inform the measure. For example, if there is a 2% overpayment and 2% underpayment administered for a particular year, then the total fraud and error rate would be 4%. Using the net derived value from overpayment and underpayment data would conceal the total error rate within the system and is thus not appropriate.
Basing the quality adjustment index on movements in total fraud and error rates over time would not have been appropriate, because small variations in the fraud and error rates would cause disproportionate volatility. Therefore, the quality adjustment is based on a "correctness" rate, where the fraud and error rate is subtracted from 100%. For example, a fraud and error rate of 4% would return a "correctness" rate of 96%. The growth rates in "correctness" rates are used to inform the quality adjustment.
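A worked example of this adjustment, with hypothetical fraud and error rates, is shown below.

```python
# Worked example of the "correctness" quality adjustment described above.
# The fraud and error rates and output growth are hypothetical; the method
# follows the text: correctness = 100% minus the gross (overpayment plus
# underpayment) rate, and growth in correctness scales the output index.

fraud_and_error = {2021: 0.038, 2022: 0.042}  # gross overpayment + underpayment rates
correctness = {year: 1.0 - rate for year, rate in fraud_and_error.items()}

quality_factor = correctness[2022] / correctness[2021]   # approximately 0.9958
unadjusted_output_growth = 1.015                          # hypothetical SSA output growth
quality_adjusted_growth = unadjusted_output_growth * quality_factor
print(round(quality_adjusted_growth, 4))
```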
Because of HMRC benefits (Working Tax Credits and Child Benefits) not being currently captured in the SSA output model, the quality adjustment only comprises data from DWP. When HMRC data are included in the output measure, the quality adjustment will be amended to account for the relative "correctness" contributions from HMRC benefits. An aggregate percentage "correctness" rate will be determined by weighting the relative monetary value of "correct" payments from DWP and HMRC in relation to total benefit expenditure administered by both departments.
Inputs
The SSA inputs index consists of UK National Accounts-consistent deflated expenditure on labour, intermediate consumption and consumption of fixed capital under the Classification of the Functions of Government (COFOG) 10N1: Social Protection. Currently, there are no directly measured components for SSA inputs. Current price expenditure is drawn from the UK National Accounts and is deflated to produce a constant price series.
As of the 2022 publication, labour is deflated using average labour compensation per hour (ALCH) for industry O (Public administration and defence; compulsory social security), while a constructed SSA general government capital deflator is provided directly by the ONS Capital Stocks team. For intermediate consumption, a composite deflator is used to reflect price changes in the cost of goods and services used within the SSA service area. The composite deflator is constructed by sourcing data on the prices and quantities of goods and services used in the provision of SSA.
Changes in these prices are then aggregated into a Paasche price index, which weights the changes in prices by their relative volumes in the current year. Weights are determined using expenditure data, sourced from DWP's Annual Report and Accounts.
The labour, goods and services, and capital indices are aggregated together using their respective UK National Accounts general government expenditure shares to form a chain-linked Laspeyres volume index. By 2022, the approximate weights for labour, goods and services, and capital, were 47%, 47%, and 6%, respectively.