1. Overview

This methodology article sets out the sources and methods used to construct estimates of productivity for total public services, most recently presented in our article Public service productivity: total, UK, 2022.

It contains a summary of the data sources used and a breakdown of how the Office for National Statistics (ONS) calculates estimates of productivity in each service area.

In addition to the annual statistics, we also publish quarterly estimates. While this article focuses on the sources and data used for the annual statistics, a summary of the main differences between annual and quarterly is included in Section 12: Difference between annual and quarterly statistics.

For the impact of recent methodological changes, refer to our article Public Services Productivity Review, impact of improved methods on total public service productivity: 1997 to 2021. Further information on the methodology used and details on the strengths and limitations are included in the Public service productivity: total, UK Quality and Methodology Information (QMI).

The main concepts and methods common to all service areas are explained in this section, with specific detail on each service area's output and inputs measures contained in Sections 2 to 11.

Productivity

At the most aggregate level, productivity is the measure of how many units of output are produced from one unit of inputs and is calculated by dividing total output by total inputs. Adopting P, O and I to indicate productivity, output and inputs, respectively, and including a subscript t for time periods:
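\[ P_t = \frac{O_t}{I_t} \]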

Total public service output and inputs indices are calculated by aggregating output and inputs for the following service areas:

  • healthcare
  • education
  • defence
  • adult social care
  • policing and immigration
  • public order and safety
  • children's social care
  • social security administration
  • tax administration
  • other government services

Total public service productivity is then calculated by dividing this index of output by the index of inputs.

Statistics are published on a UK geographic basis from 1997 to the latest available year, usually two years prior to the publication date.

Output and inputs indices for each service area are aggregated together using their relative general government (combined central and local government) expenditure weight, using data from the UK National Accounts on a Classification of the Functions of Government (COFOG) basis.

Quantity output

Different measurement techniques for output are adopted for different service areas.

For most service areas, output is measured in direct volume terms by the number of activities performed by that service area. Activities are weighted together into a cost-weighted activity index (CWAI). The CWAI calculates the change in the number of activities undertaken, weighting each activity by its cost, such that a change of one unit of activity for a high-cost activity has a greater effect on output than a change of one unit of activity for a low-cost activity. Healthcare, education, adult social care, children's social care, social security administration, tax administration and public order and safety all involve some degree of direct volume measurement in the form of a CWAI.
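As a stylised illustration of this cost-weighting (not ONS production code; the activity names, counts and unit costs are hypothetical), the following Python sketch computes one year-on-year step of a CWAI:

```python
def cwai_growth(activities_prev, activities_curr, unit_costs_prev):
    """Year-on-year growth of a cost-weighted activity index (CWAI).

    A minimal sketch: each activity's volume growth is weighted by its share of
    total cost in the base (previous) year, so a one-unit change in a high-cost
    activity moves the index more than the same change in a low-cost activity.
    """
    # Cost (expenditure) shares in the base year: unit cost x activity count.
    base_costs = {k: unit_costs_prev[k] * activities_prev[k] for k in activities_prev}
    total_cost = sum(base_costs.values())

    growth = 0.0
    for k, base_cost in base_costs.items():
        volume_relative = activities_curr[k] / activities_prev[k]
        growth += (base_cost / total_cost) * volume_relative
    return growth


# Hypothetical figures: one high-cost and one low-cost activity.
prev = {"hip_replacement": 1000, "gp_appointment": 50000}
curr = {"hip_replacement": 1050, "gp_appointment": 50000}
unit_costs = {"hip_replacement": 6000.0, "gp_appointment": 30.0}
print(cwai_growth(prev, curr, unit_costs))  # 1.04: growth driven by the high-cost activity
```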

Three service areas (police, defence and other public services) are largely "collective" services, which affect the population as a whole rather than individuals, and therefore output from these sectors is more difficult to measure directly. Instead, an "output-equals-inputs" convention is applied, where output volume is assumed to equal the volume of inputs used to create it. In this case, productivity is constant.

Within healthcare, approximately 13% of output is measured indirectly; this output is services delivered by non-NHS providers. It is worth noting that inputs and output are also equal for GP prescribing; however, in this case we calculate the volume of output directly with cost-weighted activity.

Within children's social care (CSC), 71% of output is directly measured. Since the financial year ending (FYE) 2015, direct measurements have been available for services including safeguarding, non-secure accommodation, secure accommodation, adoptions, and care leavers.

Within UK adult social care (ASC), 34% of output is directly measured; in particular, directly measured output for England is only available for residential and nursing care settings from FYE 2015 (approximately 34% of the England measure). All Wales output is indirectly measured, and Northern Ireland output is indirectly measured after FYE 2020. Approximately 58% of Scotland output is directly measured, and this relates to care home and home care activity.      

Within the tax administration service area, a "revenue adjustment" is applied that adjusts the cost weights by the revenue raised per £ of administrative cost for different taxes. This enables efficiency improvements from changes in the number of tax payments made for low-cost taxes relative to high-cost taxes to be represented in the measure.

In total, approximately 40% of output is measured using the "output-equals-inputs" convention; the other 60% is measured directly. All figures stated here refer to Public service productivity: total, UK, 2022. In our publications, "quantity output" and "non-quality-adjusted output" have the same meaning.

Quality adjustment

Where data are available and relevant, output measures are quality adjusted. Quality adjustments are currently applied to six service areas:

  • healthcare 

  • education

  • children's social care

  • adult social care

  • public order and safety

  • social security administration

Following the Public Services Productivity Review (PSP Review), we have begun to introduce quality adjustment for social security administration.

A quality adjustment is, in its simplest terms, a statistical estimate of the change in the quality of a public service. This provides a more accurate picture of the link between output and the desired outcomes, for example, increased GCSE-level attainment scores for the education service area. In the market sector, quality is accounted for through differences in the prices of goods and services. However, in the non-market sector there is no market price, so prices cannot be used in this way. For more detail on quality adjustments, see A guide to quality adjustment in public service productivity measures.
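As a simplified illustration (rather than the exact production formula), a quality adjustment can be thought of as scaling quantity output growth by the estimated change in quality:

\[ \frac{O^{QA}_t}{O^{QA}_{t-1}} = \frac{O_t}{O_{t-1}} \times \frac{Q_t}{Q_{t-1}} \]

where \( O_t \) is quantity output and \( Q_t \) is the quality index in period \( t \).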

The reasons for quality-adjusting public service output are well-documented and follow from recommendations made in the Atkinson Review (PDF, 1.07MB).

"Quality-adjusted output" describes this concept in our publications.

Inputs

Inputs comprise volume estimates of labour, goods and services (intermediate inputs), and capital assets used in delivering public services. These series are aggregated together to form an overall estimate of the volume of inputs used to provide each of the public services identified in the total public service productivity articles.

For some service areas, inputs are measured indirectly by using current expenditure adjusted by a suitable deflator. In most areas, inputs are measured directly, for example, by the number of full-time equivalent staff.

Deflation

Where direct inputs volume measures are unavailable, or indirect volume measures are more precise, expenditure from the UK National Accounts is deflated by an appropriate price deflator to remove the effect of price inflation. Where a single appropriate deflator is not available, composite deflators are constructed, with broader indices used (such as CPI: All Items) where there is not enough evidence to support greater granularity.

Composite deflators are constructed by sourcing more relevant data on the prices and quantities of specific inputs. The changes in the prices of different inputs are aggregated into a chain-linked Paasche price index, which weights the changes in prices by their relative volumes in the current year.

For example, the growth rates in average gross pay for different prison service staffing groups (a price change in labour) are weighted together using staffing numbers (the quantity) to create a composite labour deflator. This better approximates the overall price changes in labour for a service area with fairly homogeneous labour inputs.

Public sector procurement data for different service areas are used to create composite intermediate consumption deflators that better reflect price changes in the cost of goods and services relevant to specific service areas. Procurement data are sourced from the Online System for Central Accounting and Reporting (OSCAR) dataset for central government expenditure and from the Subjective Analysis Return, which is published as an annex to Local authority revenue expenditure and financing for local government expenditure. Relevant price indices are mapped to each expenditure category, and these are chain-linked according to the expenditure weight of the category they represent.

Therefore, where inputs are measured indirectly, revisions to inputs estimates can result both from changes to expenditure and changes to the deflator used.

Splining

Where data are received on a financial (April to March) or academic (September to August) year basis, a statistical technique known as splining is used to align these data to the calendar year (January to December). We use a cubic spline method, which calculates a quarterly path for the annual data (in the financial or academic year). The method follows a set of constraints to ensure that the quarterly path experiences no artificial changes in the growth or level of the series and that the average or sum of the four quarters for a particular academic or financial year is equal to the annual data used. This quarterly path can then be re-aggregated up to a calendar year by averaging or summing the four quarters of the calendar year.
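The following Python sketch illustrates the general idea under simplifying assumptions: it uses an unconstrained cubic spline on the cumulative series rather than the ONS's constrained spline method, and the function name and figures are illustrative only.

```python
import numpy as np
from scipy.interpolate import CubicSpline


def spline_fy_to_cy(fy_values, fy_start_years):
    """Convert financial-year (April to March) annual totals to calendar-year totals.

    A minimal sketch: interpolate the cumulative series with a cubic spline, difference it
    at quarterly points (so the four quarters of each financial year sum to the annual
    figure), then sum the quarters that fall in each calendar year.
    """
    # Cumulative totals at financial-year boundaries, with time measured in years
    # from 1 April of the first financial year.
    boundaries = np.arange(len(fy_values) + 1, dtype=float)
    cumulative = np.concatenate([[0.0], np.cumsum(fy_values)])
    spline = CubicSpline(boundaries, cumulative)

    # Evaluate the cumulative spline at quarterly points and difference to get quarters.
    quarter_points = np.linspace(0, len(fy_values), 4 * len(fy_values) + 1)
    quarterly = np.diff(spline(quarter_points))

    # Quarters within a financial year run Apr-Jun, Jul-Sep, Oct-Dec, Jan-Mar; the
    # Jan-Mar quarter belongs to the next calendar year. Calendar years at the ends
    # of the series are only partially covered.
    cy_totals = {}
    for k, value in enumerate(quarterly):
        fy = fy_start_years[k // 4]
        calendar_year = fy if k % 4 < 3 else fy + 1
        cy_totals[calendar_year] = cy_totals.get(calendar_year, 0.0) + value
    return cy_totals


# Example: three financial years of activity (hypothetical figures).
print(spline_fy_to_cy([100.0, 104.0, 110.0], fy_start_years=[2020, 2021, 2022]))
```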

Index numbers

Indices can be used to determine how changes in the monetary value of economic transactions can be attributed to changes in price (to measure inflation) and changes in quantity (to measure sales volume or economic output) over time. Different indices are used depending on the data type and purpose. The approach taken is consistent with ONS methodology guidance, the Consumer Prices Indices technical manual and calculations carried out in the UK National Accounts.

Volume activity series are constructed using a cost-weighted Laspeyres index (base year-weighted arithmetic mean).

This method follows the formula:

\[ I^{L}_{t-1,t} = \sum_{i} w_{i,t-1} \, R_{i,t-1,t} \]

where \( I^{L}_{t-1,t} \) is the growth in the index between periods \( t-1 \) and \( t \), \( w_{i,t-1} \) is the value share of item \( i \) in the base period (the first of the two periods), and \( R_{i,t-1,t} \) is the volume relative (the ratio of the quantity of an activity in period \( t \) to the quantity of the same activity in the base period).

In the context of public service output, the weights (\( w_i \)) are indicative of the relative value of different activities. Unit costs can be used to approximate the "price" of an activity (\( p_i \)), given the difficulty of accurately estimating the relative social and economic value of different activities. The weights for different activities are those taken from the first year of each activity pair (the base year). For example, if we were combining activity series for each of the four UK nations for 2010, we would weight the activity growth from 2009 to 2010 for England, Scotland, Wales and Northern Ireland by their respective expenditure shares in 2009.

Where price indices (for example, deflators) are weighted together, these are constructed using Paasche indices (current year-weighted harmonic mean).

This method follows the formula:

\[ I^{P}_{t-1,t} = \left( \sum_{i} \frac{w_{i,t}}{R_{i,t-1,t}} \right)^{-1} \]

where \( I^{P}_{t-1,t} \) is the growth in the price index between periods \( t-1 \) and \( t \), \( w_{i,t} \) is the value share of item \( i \) in the current period \( t \), and \( R_{i,t-1,t} \) is the price relative (the ratio of the price of a good or service in period \( t \) to its price in the base period).

For example, data on price changes in the cost of labour, goods or services and their quantities are used to construct composite price indices. Nominal expenditure data are deflated to real expenditure by dividing by the appropriate price index.
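To illustrate, the following Python sketch (with hypothetical input categories and figures, not ONS production code) combines price relatives into a composite Paasche deflator and uses it to deflate nominal expenditure:

```python
def paasche_deflator_growth(prices_prev, prices_curr, values_curr):
    """Year-on-year growth of a composite Paasche price deflator.

    A minimal sketch: price relatives are combined as a current-period-weighted
    harmonic mean, so each input's price change is weighted by its share of
    expenditure in the current year.
    """
    total_value = sum(values_curr.values())
    inverse_weighted_sum = 0.0
    for k, value in values_curr.items():
        price_relative = prices_curr[k] / prices_prev[k]
        inverse_weighted_sum += (value / total_value) / price_relative
    return 1.0 / inverse_weighted_sum


def deflate(nominal_expenditure, price_index):
    """Convert nominal expenditure to a volume estimate by dividing by the price index."""
    return nominal_expenditure / price_index


# Hypothetical categories for a service area's intermediate consumption.
prices_2021 = {"energy": 1.00, "medical_supplies": 1.00}
prices_2022 = {"energy": 1.20, "medical_supplies": 1.03}
spend_2022 = {"energy": 30.0, "medical_supplies": 70.0}

growth = paasche_deflator_growth(prices_2021, prices_2022, spend_2022)
print(growth)                  # composite price growth between the two years
print(deflate(104.0, growth))  # nominal spend of 104 expressed in previous-year prices
```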

Further guidance on indices methodology can be found at ONS's index numbers guidance.

Comparability

Unlike other measures of productivity produced by the ONS, public service productivity estimates include goods and services, as well as labour and capital, as inputs. This is because public service output measures are gross output (total output) measures, rather than the value added measures used in labour productivity and multi-factor productivity, meaning that the estimates are not directly comparable. For more information on how to compare the three measures of productivity, see our article How to compare and interpret ONS productivity measures.


2. Healthcare

Quantity output

The quantity of healthcare is estimated using data on a range of healthcare services provided within:

  • hospital and community health services (HCHS); this includes hospital services, community care, mental health and ambulance services

  • primary care and preventive health services, formerly known as family health services (FHS); this includes publicly funded general practice, dentistry and ophthalmic services, services provided via NHS phonelines and websites, and preventive care services commissioned by local authorities that are not provided by NHS trusts or primary care providers

  • community prescribing; this represents prescribed medicines and other medical goods that are dispensed in the community

  • non-NHS provision; this represents healthcare funded by the government but provided by the private or third sector (outside of primary care providers previously referenced) and is indirectly measured using the "output-equals-inputs" approach

  • COVID-19-related testing, tracing and vaccinations; this represents specific services established during the coronavirus (COVID-19) pandemic to mitigate the effects of the disease as well as ongoing vaccination campaigns

With the exception of non-NHS provision, non-quality-adjusted (or "quantity") output growth for all of these components is measured by cost-weighting different types of activities undertaken using a Laspeyres index. Healthcare quantity output is first compiled at a nation level by aggregating different activities into a cost-weighted activity index (CWAI) using unit costs at the most granular level available. UK-level aggregation is then achieved by combining the nation-level output indices using expenditure weights taken from HM Treasury's Country and regional analysis.

HCHS and primary care services output is adjusted to account for differences in the number of days in a given year. If services are not provided on every day of the year we apply a working-days adjustment to account for changes in the number of weekends and bank holidays in a given financial year. Otherwise a total days adjustment is applied to account for leap years.

In some instances, changes to the data sources used to capture activity growth for different services can make the data incomparable from year to year, for example, because of changes to the definition of an activity or data collections halting. Depending on the data issue, we use different approaches to continue measuring the activity in question wherever possible. These approaches include estimating activity growth at a higher level of aggregation or using growth rates from proxy data sources.

Where possible, we look to liaise with data producers to identify the most suitable methods. When it is not possible to reconcile data differences, we exclude the activity from the overall measure for that year. When we do adopt alternative methods for estimating output growth, these are detailed in the methods articles accompanying a release.

As in previous publications, non-NHS provision is calculated by deflating expenditure data, using the same deflator that is used in the inputs. Therefore, non-NHS provision is an "output-equals-inputs" component. Similarly, the output measure produced for community prescribing is also used in the inputs on an "inputs-equals-output" basis. Given output and inputs measures are the same, no productivity growth can be observed for these components.

Data sources and geographic coverage for healthcare quantity output

For England, we mostly collect a variety of open data published by the Department of Health and Social Care or NHS England, with some unpublished data also provided. For Scotland, Wales and Northern Ireland, equivalent data are provided in the form of unpublished direct data submissions by the devolved health administrations to the Office for National Statistics (ONS).

We typically measure admitted patient care activity as finished consultant episodes, with different activity measures used for other services (for example, number of examinations for diagnostic imaging, number of critical care bed-days, or number of tests for pathology services).

While not an exhaustive source list, our main data sources used to measure health activity in England are:

Quality adjustment

The Quality adjustment of public service health output: current method (PDF, 152KB) provides a detailed description of the quality adjustment methodology.

We apply a series of quality adjustments to quantity output to account for the change in quality of a service provided in instances where this cannot be ascertained from the activity and unit cost data. A positive quality adjustment indicates that the quality of healthcare services provided, as defined by the selection of indicators used in the quality adjustment, has improved. The quality adjustment is applied to UK output on a calendar-year basis, but also to our England-only financial year statistics. Currently, the quality adjustment is produced from England-only data.

The health quality adjustment has three components. The first two are related to achieving outcomes, and the third relates to meeting user needs:

  • health gain for elective and non-elective hospital procedures

  • the degree to which GPs are following best practice in the treatment of certain ongoing conditions

  • the quality of the patient experience for various primary and secondary care services

Our quality adjustment for hospital elective and non-elective procedures uses a dataset provided by the Centre for Health Economics at the University of York. The adjustment includes:

  • short-term post-operative survival rates, derived from Hospital Episode Statistics (HES); short-term survival is used to adjust day cases, elective inpatients and non-elective inpatients

  • estimates of health benefit from procedures, derived from research studies, ONS Life Tables and Patient Reported Outcome Measures; we apply this adjustment to day cases, elective inpatients and non-elective inpatients

  • waiting times from HES; waiting times are used as a quality adjustment for day cases and elective inpatients

Further information on how this quality adjustment is applied is available in our Quality adjustment of public service health output: current method (PDF, 152KB).

Elements within the hospital and community healthcare sector include:

  • national patient experience surveys, from NHS England, used as an adjustment for day cases, elective inpatients, non-elective inpatients, emergency care and mental health

Our quality adjustment accounting for outcomes in general practice relies on aggregate data on clinical measures recorded on GP practice computers, from the Quality and Outcomes Framework (QOF). We use the change in achievement scores from a selection of clinical indicators that relate most closely to actual health outcomes. Indicators relate to changes in outcomes for patients with the following common health conditions:

  • diabetes

  • asthma

  • chronic heart disease

  • hypertension

  • stroke and transient ischaemic attack

  • chronic kidney disease

The quality adjustment assesses the change in the achievement rate for the chosen indicators, which is subsequently weighted to account for the share of patients with the given health condition.

Changes in patients' experience of care are measured through responses collected in a selection of national surveys that are part of the Care Quality Commission's NHS patient survey programme, as well as the GP Patient Survey.

Currently, we apply patient satisfaction measures for:

  • inpatient care (elective inpatient, day cases and non-elective)

  • A&E

  • mental health services

  • general practice

  • dentistry

Changes in patient experience are determined by changes in the average patient experience scores and are weighted according to the relative value of patient satisfaction for each service sector.

Hospital and community health services (HCHS) and primary care services are quality adjusted, but no quality adjustment is applied to community prescribing, COVID-19 testing, tracing and vaccinations, or non-NHS provided services.

Inputs

Labour inputs are mainly measured through a Laspeyres cost-weighted labour index (CWLI), which uses administrative data on the health service's workforce to measure growth in full-time equivalent staff numbers weighted by their cost, in a similar manner to the cost-weighted activity index used for quantity output. However, it should be noted that agency staff are included in intermediate consumption inputs because they are not employed by the NHS, while NHS bank staff are included in labour inputs, because they are NHS employees.

The intermediate consumption of goods and services used in the provision of healthcare is also calculated using expenditure data deflated by relevant deflators to account for the cost inflation faced by the health service. From our Public service productivity, healthcare, UK: 2017 onwards, many of the deflators used are taken from the NHS Cost Inflation Index (NHSCII), which is produced by the Department of Health and Social Care (DHSC). This includes the overall NHSCII, sector-specific components of the NHSCII and a version specific to NHS providers' intermediate consumption produced by the ONS. A change to the methodology for deflating agency staff expenditure, which makes use of mandatory data collections undertaken by NHS England and NHS Improvement on agency staff spending, was incorporated in the data for financial year ending (FYE) 2019 onwards.

The volume of capital inputs is measured by consumption of fixed capital, which covers the cost of depreciation of capital goods (items that are anticipated to be in use over several years, such as buildings and vehicles) over time. Data used for this element are estimated in the UK National Accounts using the perpetual inventory method.

The total inputs index is created by weighting the three components of healthcare input together according to their share of total healthcare expenditure recorded in the UK National Accounts. Where data are not provided by a country, it is assumed that this component grows in line with the rest of the UK.

Geographical coverage of inputs data varies across the countries of the UK.

For labour inputs, we include information for hospital and community health services (HCHS), GP services and bank staff. Data from England, Wales, Scotland and Northern Ireland are available for HCHS and GP services, however, bank staff are only included for England.

The sources for HCHS and GP services are:

  • NHS England (NHSE) for England

  • Welsh Government for Wales

  • Scottish Government for Scotland

  • Department of Health Northern Ireland (DH NI) for Northern Ireland

The sources for bank staff are:

  • NHS England (NHSE) for England

For goods and services inputs, we include information on HCHS, dental services, ophthalmic services, pharmaceutical services, GP services, community health and miscellaneous services (CHMS), GP drugs, non-NHS provision, agency staff expenditure, welfare food and health administration.

Sources for HCHS and dental services are:

  • NHSE for England

  • Welsh Government for Wales

  • Scottish Government for Scotland

Sources for non-NHS provision, ophthalmic and pharmaceutical services are:

  • NHSE for England

  • Welsh Government for Wales

  • Scottish Government for Scotland

Sources for GP services are:

  • NHSE for England

  • Welsh Government for Wales

  • Scottish Government for Scotland

Sources for CHMS are:

  • DHSC for England

  • Welsh Government for Wales

Sources for GP drugs are:

  • prescription cost analysis (PCA) for England

  • Welsh Government analysis for Wales

  • Scottish Government analysis for Scotland

  • DH NI analysis for Northern Ireland

Sources for agency staff expenditure and welfare food are:

  • DHSC for England

  • Welsh Government for Wales

Sources for health administration are:

  • DHSC for England

Capital inputs include information on UK capital consumption, for which data are available from the UK National Accounts.

The deflators used are either UK-wide or England-only. Where deflators are available for England only, the same rate of price increase is assumed for the other countries of the UK.

Intermediate consumption other than that specified (includes NHS providers) in:

  • FYE 2015 to FYE 2023 is deflated using an ONS intermediate consumption-specific version of the NHS Cost Inflation Index (NHSCII) NHS providers non-pay deflator

  • FYE 1996 to FYE 2015 is deflated using an ONS intermediate consumption-specific version of the Health Service Cost Index (HSCI)

Non-NHS provided services in:

  • FYE 2015 to FYE 2023 are deflated using the NHSCII NHS providers deflator including both pay and non-pay elements

  • FYE 1996 to FYE 2015 are deflated using a non-NHS deflator combining an ONS intermediate consumption-specific version of the Health Service Cost Index (HSCI) and ONS pay cost index covering Hospital and Community Health Services (HCHS) staff

NHS bank staff costs for FYE 2016 to FYE 2023 are deflated using the NHSCII pay cost deflator for NHS providers.

Agency staff costs in:

  • FYE 2018 to FYE 2023 are deflated using NHSCII agency cost deflator

  • FYE 2015 to FYE 2018 are deflated using NHSCII pay cost deflator for NHS providers

  • FYE 1996 to FYE 2015 are deflated using ONS pay cost index covering HCHS staff

General practice intermediate consumption for FYE 1996 to FYE 2023 is deflated using the Consumer Price Index including owner occupiers' housing costs (CPIH).

Dental services in:

  • FYE 2008 to FYE 2023 are deflated using NHSCII dental cost deflator and an ONS equivalent for earlier years

  • FYE 1996 to FYE 2008 are deflated using a non-NHS deflator combining an ONS intermediate consumption-specific version of the HSCI and an ONS pay cost index covering HCHS staff

Pharmaceutical services (excluding drug costs) in:

  • FYE 2015 to FYE 2023 are deflated using overall NHSCII

  • FYE 1996 to FYE 2015 are deflated using a non-NHS deflator combining an ONS intermediate consumption-specific version of the HSCI and an ONS pay cost index covering HCHS staff

General ophthalmic services in:

  • FYE 2015 to FYE 2023 are deflated using the overall NHSCII

  • FYE 1996 to FYE 2015 are deflated using a non-NHS deflator combining an ONS intermediate consumption-specific version of the HSCI and an ONS pay cost index covering HCHS staff

The inputs for hospital and community health service employees (other than bank staff), staff working in general practice, and GP-prescribed drugs are not deflated, as these inputs are directly measured using a cost-weighted labour index or cost-weighted drug index.

Capital consumption inputs are obtained from the national accounts in volume terms and so need no further deflation.


3. Education

Quantity output

Education quantity output is the sum of full-time equivalent (FTE) publicly funded student numbers within the following sectors across the UK:

  • pre-school education, which is composed of students in local authority (LA)-maintained pre-primary schools and places funded in the private, voluntary and independent sector (PVI)

  • LA-maintained primary, secondary and special schools

  • for England – primary, secondary and special academies

  • for England – alternative provision (AP) for LA-maintained schools

  • for England – AP for academies

  • further education (FE), composed of adult learners aged 16 to 19 years

  • further education training of healthcare professionals from 1997 to 2011

Healthcare training is only included up to 2011 because, from this point, these measures are captured in the non-profit institutions serving households (NPISH) sector in the UK National Accounts, as healthcare training moved into the higher education (HE) category.

Enrolment figures for primary, secondary, special schools and AP (composed of separate figures for LA-maintained schools and academies) are adjusted for attendance to produce activity metrics for these school phases. From 2020 onwards, additional adjustments are made to account for the impact of the coronavirus (COVID-19) pandemic on education activity, considering the delivery of remote learning, absence rates and attendance during periods of in-person teaching. More information on this adjustment can be found in our articles Coronavirus and the impact on measures of UK government education output: March 2020 to February 2021 and Remote schooling through the coronavirus (COVID-19) pandemic, England: April 2020 to June 2021.

Activity and expenditure data are splined into calendar year, then activity figures are weighted according to the cost of providing education to each school phase for each individual nation to produce a cost-weighted activity index (CWAI). The CWAI is produced using the Laspeyres approach.

By 2022, the following phases had the corresponding weights:

  • pre-school: 5.5%

  • primary (LA-maintained and academies): 41.9%

  • secondary (LA-maintained and academies): 37.0%

  • special (LA-maintained and academies): 6.7%

  • alternative provision (including pupil referral units): 0.9%

  • FE: 7.9%

Sources of education output data

Data for quantity and expenditure in schools are:

  • England: Department for Education (DfE), RO1 local authority outturn – Ministry of Housing, Communities and Local Government (MHCLG)

  • Wales: Welsh Government

  • Scotland: Scottish Government

  • Northern Ireland: HM Treasury

Data for quantity and expenditure in FE are:

  • England: DfE (quantity), Education and Skills Funding Agency (expenditure)

  • Wales: Welsh Government (quantity and expenditure)

  • Scotland: Scottish Funding Council (quantity and expenditure)

  • Northern Ireland: Department for Economy Northern Ireland (quantity, Welsh unit costs are used for expenditure)

Quality adjustment

Attainment for primary and secondary schools, and further education

The education quality adjustment is made up of several components, but attainment carries the most weight in the model, given that exam performance is the most important outcome of schools. Output in primary and secondary schools (including LA-maintained schools and academies) across the UK is quality adjusted according to attainment measures. FE output in England is also quality adjusted according to attainment measures.

Because education is a devolved policy area with courses and syllabi specific to each nation, different data sources are used to inform attainment according to school phase and nation. FE attainment in England is a new quality adjustment measure for education following the PSP Review and is based on the percentage of students meeting the minimum requirement for Level 2 and Level 3 qualifications by age 19 years. Separate attainment indices for Level 2 and Level 3 are prepared (these are processed by the "cohort-split" model, discussed later in this section), and these are weighted by the percentage of students completing each qualification.

During the coronavirus (COVID-19) pandemic, historical data sources were no longer valid, either because data were not published or because of concerns around grade inflation arising from teacher-assessed grading practices. Therefore, the National Reference Test (NRT) is used to inform attainment from 2019 to 2020 onwards for primary and secondary schools.

The NRT is independent of teacher-assessed grades, and a robust indicator of GCSE-level performance. Although the NRT takes place in England only, it is used to inform attainment for schools across the UK because of the absence of similar metrics specific to each of the devolved administrations.

The NRT is used to inform attainment for primary schools in the absence of alternative data. Because primary schools carry significant weight, there was a concern that leaving data gaps untreated would underestimate the broader effects of the pandemic on attainment. In addition, the NRT represents performance in academic subjects, which are a focus in primary schools. This is unlike FE, where there is a shift towards technical qualifications in addition to academic ones; therefore, the NRT is not used to inform FE attainment from 2019 to 2020.

A summary of historical data sources and how the NRT is used to treat each school phase is now outlined. The Office for National Statistics (ONS) will keep under review the application of the NRT and the point from which historical attainment data sources, once they are published again and are independent of the marking practices that arose during the pandemic, can be reintroduced.

England

For the primary school phase, the data source up to 2018 to 2019 was the percentage of pupils meeting the expected standard in reading, writing and maths. Following the coronavirus (COVID-19) pandemic, the NRT was used to inform a data gap for 2019 to 2020 before resuming original data from 2020 to 2021.

For the secondary school phase, the data source up to 2018 to 2019 was the average attainment 8 score. The NRT has informed this measure from 2019 to 2020.

For FE, the data source up to 2018 to 2019 was the percentage of pupils meeting the minimum requirement for Level 2 and Level 3, respectively. There were no alternative data to inform FE attainment during the coronavirus pandemic; therefore, the index is held constant from 2019 to 2020.

Wales

For the primary school phase, the data source up to 2018 to 2019 was the percentage of pupils reaching expected level in English, Welsh and maths. The NRT has informed this measure from 2019 to 2020.

For the secondary school phase, the data source up to 2018 to 2019 was the average capped 9 score per pupil. The NRT has informed this measure from 2019 to 2020.

Scotland

For the primary school phase, the data source up to 2018 to 2019 was the percentage of pupils achieving the Curriculum for Excellence (CfE) level in reading, writing and numeracy. The NRT was used to inform a data gap for 2019 to 2020 before resuming original data from 2020 to 2021.

For the secondary school phase, the data source up to 2018 to 2019 was a composite measure of Level 2, skills for work and personal development, and National 5 attainment. The NRT has informed this measure from 2019 to 2020.

Northern Ireland

For the primary school phase, the data source up to 2018 to 2019 was the percentage of pupils achieving Level 4 or above in communication, maths, and information and communication technology (ICT). The NRT has informed this measure from 2019 to 2020.

For the secondary school phase, the data source up to 2018 to 2019 was the percentage of students achieving five or more GCSEs at grades A* to C including English and maths. The NRT has informed this measure from 2019 to 2020.

Processing attainment – the "cohort-split" model

The ONS applies a "cohort-split" model to account for the cumulative nature of education when processing attainment for each school phase. The model is applied to the primary, secondary and FE phases within each nation separately.

The model considers exam performance achieved at the end of each school phase (Year 6 for primary, Year 11 for secondary, age 19 years for FE) and retrospectively apportions contributions to individual year groups within the cohort that build up towards the exam outcome.

For FE, attainment data are published according to age group, not academic year group, therefore retrospective contributions are assigned to age groups for FE. For a given academic year, weighted contributions are obtained from individual year groups from across separate cohorts to produce an attainment value that accounts for academic performance across the educational journey. Were this approach not applied, the model would assume that exam outcomes are solely attributed to performance at the end of the school journey (for example, Year 11 for secondary schools), which is not conceptually accurate. The weighted contributions assigned to year groups within each school phase are outlined in Table 1.

These weights were reviewed and revised following the PSP Review. For example, for attainment data released for secondary schools in 2018 to 2019, 20% of the score is applied to 2018 to 2019 to represent the contributions from Year 11. Likewise, 20% of the 2018 to 2019 score is retrospectively applied to 2017 to 2018, when that cohort was in Year 10. The remaining 60% is retrospectively applied to the previous years.

In the latest years, contributions from some year groups will be incomplete because those students have not yet sat their examinations and so have no scores to back-cast. In such circumstances, the available year group contributions are re-scaled to total 100%, and these are re-weighted and updated as additional year groups feed into the model each production round.
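The following Python sketch illustrates the basic apportioning and rescaling logic only; the weights and scores are hypothetical (the production model uses the Table 1 weights and the pandemic interventions described in the following paragraphs).

```python
from collections import defaultdict


def cohort_split(attainment_by_exam_year, weights):
    """Apportion end-of-phase attainment back across the year groups of each cohort.

    A minimal sketch: attainment_by_exam_year maps the academic year in which a cohort
    sat its exams to that cohort's attainment score, and weights gives the share of the
    score attributed to each year group, ordered from the final year group backwards
    (for example, Year 11, Year 10, and so on).
    """
    contributions = defaultdict(list)  # academic year -> list of (weight, score)
    for exam_year, score in attainment_by_exam_year.items():
        for years_back, weight in enumerate(weights):
            contributions[exam_year - years_back].append((weight, score))

    attainment_by_academic_year = {}
    for year, parts in sorted(contributions.items()):
        # Years with incomplete contributions (cohorts that have not yet sat exams, or
        # fall outside the series) are re-scaled so the available weights total 100%.
        total_weight = sum(w for w, _ in parts)
        attainment_by_academic_year[year] = sum(w * s for w, s in parts) / total_weight
    return attainment_by_academic_year


# Hypothetical secondary-phase scores with equal 20% weights across Years 7 to 11.
scores = {2017: 46.3, 2018: 46.5, 2019: 46.7}
print(cohort_split(scores, weights=[0.2] * 5))
```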

The coronavirus (COVID-19) pandemic violated the typical assumptions of the "cohort-split" model; in particular, it would not be conceptually correct to retrospectively apportion exam outcomes achieved during the pandemic to previous years.

Attainment did fall during the pandemic (as informed by the NRT and available primary school data), therefore the previous model would have assigned more negative contributions to previous year groups leading up to the pandemic. This would have been unwarranted, as the pandemic, rather than performance before and leading up to the pandemic, was predominantly responsible for exam outcomes during this time. Ultimately, this would have meant that the quality-adjusted output back-series would have been unfairly penalised because of false assumptions. To navigate this challenge, we introduced two main interventions to the model.

Firstly, instead of retrospectively apportioning coronavirus-affected exam scores to year groups before the pandemic, these groups' scores would be informed by taking the average performance of similar year groups over the previous five years.

Secondly, residual adjustments are applied to year groups directly affected by the pandemic (2020 to 2021 and 2021 to 2022) so that the sum of contributions from within a cohort equals the final achieved attainment value. For example, for the first cohort affected by the pandemic (Year 11 in 2020 to 2021) the Year 11 score is informed by 20% of the final attainment value but is adjusted so that the Year 7 to Year 11 contributions equal the final attainment value. The Year 11 residual adjustment is then applied to other year groups within that academic year to avoid arbitrary adjustments to other year groups. This process continues for the following years, considering residual adjustments applied to previous year groups. These adjustments will automatically drop out of the model when the last cohort who attended school during the pandemic take their exams.

In essence, these adjustments confine the effects of the pandemic to year groups that were directly impacted by the pandemic and allow the cumulative nature of education to be accounted for in ongoing PSP estimates.  

Student well-being

Primary and secondary schools are also quality adjusted according to student well-being, which is based on data from Understanding Society: the UK Household Longitudinal Study (UKHLS).

Student well-being is determined from weighted responses to the questions "How do you feel about your school?", and "How do you feel about your schoolwork?". Responses are derived from a Likert scale, ranging from 1 (Completely happy) to 7 (Completely unhappy). The total positive responses to both questions (1 to 3 on the Likert scale) are compiled and the proportion of positive responses in relation to neutral and negative responses are determined.

From this, an index is derived based on growth in the proportion of positive responses. The well-being index is weighted based on the percentage of expenditure allocated to addressing pupil deprivation, as declared in the Department for Education's (DfE's) National Funding Formula.
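As an illustration of this calculation (hypothetical response counts, not ONS production code), a Python sketch:

```python
def wellbeing_index(responses_by_year):
    """Index of growth in the share of positive well-being responses.

    A minimal sketch: responses are counts on the 1 (completely happy) to
    7 (completely unhappy) Likert scale, pooled across the two survey questions.
    Scores of 1 to 3 are treated as positive, and the index tracks growth in the
    positive share from the first year.
    """
    years = sorted(responses_by_year)
    shares = {}
    for year in years:
        counts = responses_by_year[year]
        positive = sum(counts.get(score, 0) for score in (1, 2, 3))
        shares[year] = positive / sum(counts.values())

    index = {years[0]: 100.0}
    for prev, curr in zip(years, years[1:]):
        index[curr] = index[prev] * shares[curr] / shares[prev]
    return index


# Hypothetical pooled response counts by Likert score.
responses = {
    2021: {1: 200, 2: 300, 3: 250, 4: 150, 5: 60, 6: 25, 7: 15},
    2022: {1: 210, 2: 310, 3: 260, 4: 130, 5: 55, 6: 20, 7: 15},
}
print(wellbeing_index(responses))  # 2022 value is approximately 104
```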

Key stage 2 disadvantaged gap index

Disadvantaged pupils are defined by the DfE as those who attend primary school and have been eligible for free school meals at any point in the last six years, children looked after by a local authority, and children who left local authority care in England and Wales. Equity of attainment is an important priority for the UK education system.

This quality adjustment is based on data for England, as no equivalent measures are available covering other parts of the UK. As such, we have applied it to the output measure for all parts of the UK.

The disadvantaged attainment gap index is published by the DfE. As the index approaches 0, this reflects the gap between disadvantaged pupils and their peers being closed. Therefore, to be consistent with quality adjustments in public service productivity, the index growth is inverted such that a fall in the index (as it gets closer to 0) reflects an improvement in quality.
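For example (as an illustration of this inversion rather than the exact production formula), if the gap index moves from \( g_{t-1} \) to \( g_t \), the quality growth factor applied is \( g_{t-1} / g_t \), so a fall in the gap index produces a factor greater than one, reflecting an improvement in quality.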

Data on the disadvantaged gap index were not published for the 2019 to 2020 and 2020 to 2021 academic years because of the coronavirus (COVID-19) pandemic. Therefore, the index has been held constant over these years; however, data were available again from 2021 to 2022.

To weight the disadvantaged attainment gap quality series together with the other quality metrics in the education service area, the proportion of primary school funding that is specifically pupil premium funding is used. The pupil premium is defined as "funding to improve education outcomes for disadvantaged pupils in schools in England".

Inputs

The ONS publishes estimates of publicly funded education inputs in the UK from 1997 onwards. The components that make up inputs (labour, intermediate consumption and capital consumption) are organised in education as:

  • local authority (LA) direct labour

  • central government indirect labour

  • goods and services (provision)

  • goods and services (administration)

  • consumption of fixed capital

A direct measurement of labour within schools (including LA-maintained schools and academies in England) is based on full-time equivalent (FTE) teacher and support staff numbers, which are weighted using salary data. Various sources are used to inform salary data for occupations across the devolved administrations, including the ONS's Annual Survey of Hours and Earnings (ASHE).

Numbers of FTE staff are initially gathered on an academic year basis, and these are splined to calendar year before subsequent aggregation steps. Numbers of FTE staff are then aggregated by staff type and school phase and are adjusted by their hours worked overtime.

Currently, data on hours worked overtime are obtained from the Labour Force Survey (LFS). Country-specific FTE numbers adjusted by hours worked overtime are weighted by salary data and aggregated into a direct labour index that represents the UK. During the PSP Review, an exercise was undertaken to update salary data informed by ASHE to ensure salary weights were as accurate as possible. As a result, salary weights were updated, and the impact of the changes can be observed in our article Public Services Productivity Review, impact of improved methods on total public service productivity: 1997 to 2021. An indirectly measured labour index is prepared for central government labour, which is informed by deflating central government labour expenditure according to the Classification of the Functions of Government (COFOG 9).

The indirect labour measure also includes FE inputs, as these are not currently captured in the direct measure, and academy expenditure is transferred to local government expenditure. This is because academies receive funding and support from central government but are being accounted for in the direct labour measure, which is weighted by local government shares. The average labour compensation per hour (ALCH) industry O (public administration and defence; compulsory social security) is used to deflate central government expenditure, and this replaces the Average Weekly Earnings (AWE): Public Administration Index. This is because the ALCH covers all costs of labour such as pension and National Insurance contributions, whereas these are not reflected in the AWE, which is more of a pure price measure.

The direct and indirect labour indices are weighted and aggregated into a total labour index based on their general government expenditure shares. By 2022, direct and indirect labour weights were approximately 92% and 8%, respectively.

While goods and services (provision) relate to intermediate consumption within schools, goods and services (administration) refers to intermediate consumption within the public administration component of education.

Goods and services expenditure data according to provision and administration are determined from national accounts expenditure (COFOG 9) and are deflated by separate deflators.

Provision is deflated using a composite Paasche intermediate consumption deflator, which is constructed from the relevant CPIs, SPPIs and PPIs for each area of expenditure, weighted according to expenditure shares declared in the DfE's annual reports and accounts.

Administration is deflated using the gross domestic product-implied deflator.

Provision and administration are weighted based on their relative expenditure shares to produce an intermediate consumption index. By 2022, the weights for provision and administration were approximately 82% and 18%, respectively.

Consumption of fixed capital national accounts expenditure data is deflated using a constructed education general government capital deflator to produce a consumption of fixed capital index.

The labour, goods and services, and capital indices are aggregated together using their respective UK National Accounts general government expenditure shares to form a chain-linked Laspeyres volume index. By 2022, the approximate weights for labour, goods and services, and capital, were 66%, 26%, and 8%, respectively.
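As an illustration of this final aggregation step (hypothetical growth figures; for simplicity, the approximate 2022 weights quoted above are applied to a single year rather than chain-linked across many years), a Python sketch:

```python
def chained_laspeyres(component_growth, expenditure_shares, base=100.0):
    """Aggregate component volume growth into a chain-linked Laspeyres index.

    A minimal sketch: component_growth[t][c] is the year-on-year volume growth factor
    of component c (labour, goods and services, capital) in year t, and
    expenditure_shares[t][c] is its expenditure share in year t. Each year's aggregate
    growth uses the previous year's shares, and the index is chained forwards.
    """
    index = [base]
    for t in sorted(component_growth):
        growth = sum(expenditure_shares[t - 1][c] * component_growth[t][c]
                     for c in component_growth[t])
        index.append(index[-1] * growth)
    return index


# Approximate 2022 weights from the text: labour 66%, goods and services 26%, capital 8%.
shares = {2021: {"labour": 0.66, "goods_and_services": 0.26, "capital": 0.08}}
growth = {2022: {"labour": 1.01, "goods_and_services": 1.02, "capital": 1.00}}
print(chained_laspeyres(growth, shares))  # approximately [100.0, 101.2]
```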

Sources of input data

Direct labour – school staff numbers:

  • England: DfE

  • Wales: Welsh Government

  • Scotland: Scottish Government

  • Northern Ireland (teaching only, no support staff): Department for Education Northern Ireland

Direct labour – salary data:

  • England: DfE for teaching staff, ASHE for support staff

  • Wales: Welsh Government for teaching staff, ASHE for support staff

  • Scotland: ASHE for teaching and support staff

  • Northern Ireland: ASHE for teaching and support staff

Direct labour – hours worked overtime:

  • LFS

Indirect labour:

  • expenditure: ONS UK National Accounts

  • deflator: ALCH industry O

Goods and services (provision):

  • expenditure: ONS UK National Accounts

  • deflator: constructed composite Paasche intermediate consumption deflator

Goods and services (administration):

  • expenditure: ONS UK National Accounts

  • deflator: GDP-implied deflator

Capital expenditure:

  • expenditure: ONS UK National Accounts

  • deflator: constructed education general government capital deflator


4. Defence

Defence is a service area in which output is indirectly measured. This is because of difficulties in identifying and measuring the collective nature of services delivered. There are also significant data barriers to estimating output. We therefore apply the "output-equals-inputs" convention and assume productivity growth to be zero. Defence was prioritised for improvements by the National Statistician's Independent Review of the Measurement of Public Services Productivity resulting in new adjustments applied to our inputs measure.

Inputs

Previously, total defence inputs were measured indirectly by deflating current price expenditure of all inputs components, in accordance with the Classification of the Functions of Government (COFOG 2), by an implied deflator reflecting the whole industry.

In 2022, we improved our defence labour measurement by transitioning from an indirect to a direct labour measurement approach, a methodological development implemented based on existing available data.

The direct measure is based on the number of staff employed, adjusted for hours worked (given by full-time equivalent hours), and grouped by grade, military rank or skill level. Full-time equivalent (FTE) hours growth per employment rank is cost-weighted by rank-specific implied expenditure shares, using departmental personnel statistics pay data. This produces a volume series, which is weighted alongside other inputs components by labour current price expenditure shares, in accordance with COFOG 2. This general formula outlines the direct labour estimation approach for an individual rank, i:
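\[ c_{i,t} = w_{i,t-1} \times \frac{\mathrm{FTE}_{i,t}}{\mathrm{FTE}_{i,t-1}} \]

where \( \mathrm{FTE}_{i,t} \) is the full-time equivalent hours for rank \( i \) in year \( t \), \( w_{i,t-1} \) is that rank's share of labour expenditure in the previous year, and \( c_{i,t} \) is the rank's contribution to labour volume growth.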

From this formula, to get a total labour volume growth estimate, we sum the growth contributions \( c_{i,t} \) of all relevant ranks, \( i = 1 \) to \( N \).

Data for military personnel strengths and employment rank were sourced from Quarterly service personnel statistics: index. Salary data were sourced from the Armed Forces Pay Review Body. Data for civilian personnel full-time equivalent employees (FTEs), employment grade and salaries were sourced from Civil Service statistics. Civilian personnel salary data were backcast prior to 2007 to maintain consistent pay groups across the time series.

In addition, we transitioned from a general implied deflator for defence to individual bespoke deflators for intermediate consumption and capital. The updated intermediate consumption deflator is constructed from the Online System for Central Accounting and Reporting (OSCAR) dataset and Office for National Statistics (ONS) price indices. The updated capital deflator is an implied deflator derived from ONS capital stocks defence volume and current price estimates, in accordance with the UK National Accounts.

A cost-weighted Laspeyres volume index is then calculated for the volume of defence inputs, using chain-linked expenditure shares, and assumed to equal the volume of defence output.


5. Adult social care

Adult social care (ASC) services provide care and support to older people, adults with learning or physical disabilities, adults with mental health problems, drug and alcohol misusers, and carers. Provision of ASC is the responsibility of local authorities in the UK.

ASC services include:

  • placements in residential and nursing care
  • provision of home care services
  • day care services
  • supported living and accommodation
  • "meals on wheels"
  • equipment and home adaptations
  • care assessments and support services

Local authorities can provide ASC services themselves or contract ASC services from independent sector providers. Our estimates cover both forms of provision.

The Office for National Statistics (ONS) publishes two separate estimates for ASC: England-only estimates on a financial year basis, and UK estimates on a calendar year basis. The latest publication of the England estimates was for financial year ending (FYE) 2023 (Public service productivity, adult social care, England: financial year ending 2023). The latest estimates for England (up to FYE 2023) are published as an ad hoc dataset in association with our article Public service productivity: total, UK, 2022.

Quantity output

The ASC quantity output is produced via a cost-weighted activity index where activity data are available (such as the number of weeks of care provided), and on an "output-equals-inputs" basis where they are not.

Activity data are available for England and are informed by NHS Digital's Adult Social Care Activity and Finance Report and its predecessors. These data cover output measures from FYE 2015; however, directly measured output data are only available for residential and nursing care, and the remaining care services are measured on an "output-equals-inputs" basis.

Activity data are not available for Wales, nor for Northern Ireland from FYE 2021, therefore these are also measured on an "output-equals-inputs" basis. Direct measures are available for Scotland. More information on output data for each of the devolved nations can be found in our article Improvements to non-market adult social care output in the National Accounts.

For England, a change in data collection between FYE 2014 and FYE 2015 resulted in fewer activities being measured. Until FYE 2014, activity data were available for residential care, nursing care, assessments of need, day care, home care, provision of meals and provision of equipment. However, from FYE 2015 onwards, only residential care and nursing care activity data were available to be included in the output index. The proportion of output which was measured on an output-equals-inputs basis increased in FYE 2015 because of these changes.

Furthermore, where the data are available, services are cost-weighted separately for different client groups. These are split by age (people aged 65 years and over, and working age adults aged 18 to 64 years) and by primary support reasons, such as physical disability, learning disability or mental health needs.

Services for which activity data are not available are measured using deflated expenditure (including funding from both local authorities and NHS income from FYE 2005 onwards) on an "output-equals-inputs" basis.

However, one spending element that is not captured in the indirectly measured output component of ASC is "commissioning and service delivery" (CSD). This area of expenditure relates to overhead costs for the provision of care services such as business planning, strategic business direction, and communications and personal protective equipment (PPE). As these are not related to provision of care-related activities, increased spending within CSD does not correlate with a corresponding volume of care activities being delivered, and thus the "output-equals-inputs" assumption cannot be applied here.

Furthermore, local authorities received grant support during the coronavirus (COVID-19) pandemic to support and sustain ASC provision during this period, and these were recorded within the CSD category. Such costs were used to sustain the ASC sector through the pandemic, covering areas such as PPE and hospital discharges, however, this expenditure did not deliver additional volumes of care activities.

It must also be acknowledged that coronavirus-related grant support has likely spilled over into other cost areas, as not all local authorities may have recorded these costs in CSD. However, by excluding CSD from indirectly measured output, we have taken steps to reflect the impact of coronavirus-related expenditure on ASC productivity as accurately as possible.

Activity data for Scotland are sourced from Public Health Scotland (PHS) and cost-weights are sourced from local government finance return (LGFR) data. Adjustments are made to remove client-funded activity.

There is no ASC activity data to measure output directly in Wales. However, expenditure data are available from financial year ending (FYE) 2002, and this has been used to calculate output according to the "output-equals-inputs" assumption, in a similar manner to the indirect output components for the other three nations.

Our latest ASC output for Northern Ireland includes a combination of direct and indirect output measurement approaches, in line with the national accounts.

The direct volume output approach makes use of activity and expenditure data provided by the Department of Health Northern Ireland for the period FYE 2007 to FYE 2020. This data collection covers a wide range of ASC services, including residential and nursing care, domiciliary care and day care, with services split by client group where appropriate. From FYE 2021, direct measures are no longer available, therefore the "output-equals-inputs" assumption is applied. We intend to reintroduce the direct component when activity data become available in future.

The indirectly measured output component is also included from FYE 2013 for Northern Ireland using the residual expenditure not accounted for in the direct volume output measure. This approach uses HM Treasury's Public Expenditure Statistical Analysis: Country and Regional Analysis to define total ASC expenditure. As with Scotland and Wales, to deflate this expenditure, the same deflators are used as for England.

To produce the indirect output measure for the devolved administrations, similar deflators are used as for England, although the weights applied to the local authority (LA) and independent sector components differ to reflect data on the expenditure split by provider sector in Wales.

The expenditure data used for the devolved administrations include NHS funding where available and exclude client funding in a similar manner to England, and a similar adjustment is applied to remove coronavirus-support funding, as this does not result in an increase in the quantity of services provided.

Quality adjustment

The quality adjustment for ASC output is based on the concept of adjusted social care-related quality of life from the Adult Social Care Outcomes Framework (ASCOF), the main source of outcomes data for ASC services in England.

Separate quality adjustments have been developed for community care, and residential and nursing care, both using data from NHS Digital's Adult Social Care Survey (ASCS) from FYE 2011 onwards. The ASCS is a sample survey of clients in LA-supported care in England. Coverage includes clients whose care is partly or entirely funded by a LA, including those in receipt of direct payments, or clients in LA-organised care who are fully self-funding.

Using the data from the ASCS, it is possible to calculate how well clients' needs are met (on a scale from no needs met to no unmet needs) across eight domains:

  • control
  • personal care
  • food and nutrition
  • accommodation
  • safety
  • social participation
  • occupation
  • dignity

Each level of response on care needs across each of the eight domains is then weighted to account for its importance in affecting quality of life, using weights developed from a separate survey of community care users.

In addition, factors predominantly outside the influence of ASC services, but which affect the likelihood of needs being met, are controlled for to derive the change in social care-related quality of life resulting from changes in ASC service quality.

For community care, factors from the calculations used in the ASCOF are applied to the person-level data in the ASCS to remove the influence on care-related quality of life of clients' age, health status, suitability of clients' home for meeting their needs, and clients' ease of travelling around outside in their local environment. As the factors used in ASCOF only relate to community care users, for residential and nursing care, a regression model is used to calculate the impact of ASC services on care-related quality of life, controlling for these external factors.

As a result of the coronavirus pandemic, participation in the ASCS was voluntary, and 18 councils with Adult Social Services Responsibilities (CASSRs) provided data for FYE 2021, compared with 151 CASSRs in the previous years. Separate adjustments for residential and nursing care, and community care are applied in response to this.

For community care, growth rates in weighted 1J scores for FYE 2021 were determined from the 18 CASSRs that provided data, and applied between the full list of CASSRs for FYE 2020 and FYE 2022. Data are processed as normal from FYE 2022.

For residential and nursing care, the change in adjusted social care-related quality of life compared with a predicted change in adjusted social care-related quality of life drives the measure. The predicted change is based on coefficients produced from the regression model, and the predicted coefficients for FYE 2020 were used to inform FYE 2021. Data are processed as normal from FYE 2022. More information on the adjustments applied for ASC quality adjustment in response to coronavirus can be found in our article Public service productivity, adult social care, England: financial year ending 2021.

Inputs

Of the components of inputs, goods and services is the largest for ASC (80.1% weight in 2022). It includes all services contracted from independent sector providers, and services purchased by clients using direct payments. LA intermediate consumption of goods and services is also included.

The quantity of ASC inputs is estimated by deflating expenditure using appropriate deflators.

For UK ASC estimates, inputs expenditure growth is taken from the national accounts. For the England-only estimates, expenditure is informed by NHS Digital's Adult Social Care Activity and Finance Report and its predecessors.

Local authority inputs expenditure for the UK

Public service ASC is primarily funded by LAs in the UK. LA social protection expenditure data from the national accounts is used to measure public service expenditure on ASC. This is produced from local authority revenue expenditure and financing data from the Ministry of Housing, Communities and Local Government (MHCLG) for England, and equivalent data for Wales.

LA capital consumption is also measured using data from the national accounts and is estimated using the perpetual inventory method.

There are several adjustments made to the expenditure data from the UK National Accounts to maintain a consistent time series to cover ASC services specifically. The most substantial of these adjustments is to remove housing services expenditure.

Non-local authority ASC expenditure in inputs

LA-organised ASC services are also partly funded by care clients themselves and by transfers from the NHS. Because our measures cover only publicly funded services, client contributions that fund ASC services are excluded from the ASC inputs, and output is also adjusted to remove activity funded by client contributions. LAs also receive funding for ASC services from the NHS.

NHS transfers to local authorities are measured using the same data source as inputs expenditure for England from FYE 2005 onwards. Because of data availability, NHS transfers are not included in the measure in the years before FYE 2005. Symmetrical adjustments are made to the output calculations to remove activity funded by client contributions and include activity funded by the NHS.

Accounting for cost inflation

The following list describes the deflators we use for various components of expenditure. A similar approach to deflation is taken for both the UK and England productivity measures.

  • Skills for Care (SfC) National Minimum Dataset for Social Care (NMDS-SC), provided by the Department of Health and Social Care (DHSC), is used to deflate LA and independent sector labour; the source of price data for this deflator is the SfC since FYE 2014, and the Annual Survey of Hours and Earnings (ASHE) used prior to this.
  • The Subjective Analysis Return (SAR) Annex A -- part of local authority revenue expenditure and financing -- produced by MHCLG, is used to produce a basket of goods representing LA intermediate consumption; this is then deflated by relevant component indices taken from the ONS's Consumer Prices Index (CPI), Services Producer Price Index (SPPI), Producer Price Index (PPI), Retail Prices Index (RPI) and Average Weekly Earnings (AWE).
  • The care cost benchmark data produced by Laing Buisson is used to deflate intermediate consumption for independent sector residential and nursing care.
  • Home care costs data, produced by the UK Homecare Association (UKHCA), are used to deflate intermediate consumption for independent sector home care.
  • A composite deflator constructed from relevant component indices of the CPI (CPI Social Protection: home care, CPI Index 12.4: Social Protection, CPI Index 00: All Items) weighted using data collected from the London Association of Directors of Adult Social Services (ADASS) improvement programme is used to deflate direct payments.

Data on the proportion of expenditure deflated by each of these sources are derived from NHS Digital's Adult Social Care Activity and Finance Report and its predecessors for England, and from equivalent data sources for the devolved administrations. For Scotland and Northern Ireland, detailed data on the proportion of services provided by the independent sector and government providers are not available, so these proportions are based on the proportions for Wales.

To calculate the final overall inputs index, growth rates from each of the three indices (labour, intermediate consumption including direct payments and capital consumption) are weighted by their respective expenditure shares. This is then splined from financial year to calendar year for reporting at the UK level (estimates at the England level are reported on a financial year basis).
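
For illustration only, the following minimal Python sketch shows how component growth rates can be combined using expenditure-share weights in the way described above; the component names, growth rates and shares are hypothetical rather than actual ASC figures.

    # Illustrative sketch: weighting component growth rates by expenditure shares.
    # All figures are hypothetical, not actual ASC data.
    components = {
        # name: (growth rate on previous year, expenditure share in previous year)
        "labour": (0.02, 0.15),
        "intermediate_consumption": (0.03, 0.80),   # includes direct payments
        "capital_consumption": (0.01, 0.05),
    }

    # Overall inputs growth is the expenditure-share-weighted sum of component growth rates.
    overall_growth = sum(g * w for g, w in components.values())
    print(f"Overall inputs growth: {overall_growth:.3%}")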

Back to table of contents

6. Policing and immigration

Policing and immigration is a service area in which output is indirectly measured. This is because of difficulties in identifying and measuring the collective nature of services delivered.

Inputs

With the exception of local government labour, police inputs are estimated by deflating expenditure on labour, goods and services, and capital.

The volume of local government labour inputs is measured directly from data on full-time equivalent employees (FTEs) and relative salaries for different groups. FTE data are sourced from Police workforce statistics for England and Wales, Workforce statistics for Police Scotland, and directly from the Police Service of Northern Ireland. Additional workforce numbers for the Police Uplift Programme are sourced from Police Uplift Statistics.

The volume of central government labour input is measured indirectly. Expenditure data are deflated by the Average Weekly Earnings (AWE) Index for Public Administration. Our article Improved methods for total public service productivity: total, UK, 2018 details an adjustment made to police expenditure data from 2013 onwards.

The deflator for local government goods and services expenditure is constructed from subjective analysis returns (SAR) within local government financial statistics and Office for National Statistics (ONS) price indices. The deflator for central government goods and services expenditure is constructed from the Online System for Central Accounting and Reporting (OSCAR) dataset and ONS price indices.

Local and central government net expenditure on capital consumption is deflated by the implied local and central government capital deflator for industry O (public administration and defence).

A cost-weighted Laspeyres volume index is then calculated for the volume of police inputs, using chain-linked expenditure shares, and assumed to equal the volume of police output.

Back to table of contents

7. Public order and safety

Quantity output

Within the public order and safety (POS) service area there are four main components:

  • fire

  • courts, which itself has five further sub-components: magistrates' courts, county courts, Crown Courts, Crown Prosecution Service and legal aid

  • prisons

  • probation

Policing and immigration are measured separately from POS, so are excluded from these measurements.

For each component, a cost-weighted activity index (CWAI) is constructed. We use direct output measures for all components.

A quality adjustment is not applied to fire service, county courts services (civil cases), or the civil component of legal aid. This is because these services are deemed to have different outcomes to the criminal justice elements of POS and have data limitations.

Fire

Fire output activities are categorised into three groups:

  • fire response (FR)

  • fire prevention (FP)

  • fire special services (FS)

These groups all form part of the fire and rescue service (FRS). Activity measures for the FRS are based on the number of incidents attended for fire response and fire special services activities, and staff hours spent on fire prevention activity.

Appropriate cost weights are based on the Economic Cost of Fire estimates for different fire incidents. The output measure combines the different activities into a single cost-weighted activity index (CWAI) using the associated unit costs as their weights, and an overall output index is then constructed as a chain-linked Laspeyres index using the previous year's prices.
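
As an illustration of this calculation, the short Python sketch below builds a one-step chain-linked Laspeyres cost-weighted activity index from hypothetical activity counts and unit costs; none of the figures are actual fire and rescue service data.

    # Illustrative cost-weighted activity index (CWAI), chain-linked Laspeyres step.
    # Activity counts and unit costs are hypothetical, not actual FRS data.
    activity = {              # units of activity per year
        2021: {"fire_response": 1000, "special_services": 400, "prevention_hours": 5000},
        2022: {"fire_response": 950, "special_services": 420, "prevention_hours": 5200},
    }
    unit_cost = {             # previous year's unit costs are used as weights
        2021: {"fire_response": 8000.0, "special_services": 3000.0, "prevention_hours": 50.0},
    }

    def laspeyres_growth(prev_year: int, year: int) -> float:
        """Output growth from prev_year to year, weighting activities by prev_year unit costs."""
        base = sum(activity[prev_year][k] * unit_cost[prev_year][k] for k in activity[prev_year])
        current = sum(activity[year][k] * unit_cost[prev_year][k] for k in activity[year])
        return current / base - 1

    print(f"Cost-weighted output growth, 2021 to 2022: {laspeyres_growth(2021, 2022):.2%}")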

Fire response services (quality adjustment is not applied to these services) include responses to fires in dwellings, commercial premises, vehicles and chimneys, as well as false alarms; these are measured by the number of incidents attended and sourced from Home Office data.

Special services response (road and non-road) is measured by the number of incidents attended and sourced from Home Office data.

Prevention covers inspections, investigations and community safety work, for example fitting fire alarms. It is measured by the number of hours of workload and is sourced from Home Office data.

Courts

Law courts (partially quality adjusted) include the Crown Court, magistrates' courts, county courts and family courts (which cover private, public, divorce and adoption cases). Separate cost-weighted activity indices for different areas of the courts system are constructed and then further aggregated based on expenditure shares.

The output of criminal courts (Crown and magistrates) uses disposal data sourced from Ministry of Justice (MoJ) Criminal Court Statistics.

For the Crown Court, data are provided on the hearing time of Crown Court cases broken down by hearing and plea type.

For triable-either-way trials and indictable-only trials, the plea types are:

  • guilty plea

  • not guilty plea

  • no plea entered

  • dropped case

For committed for sentence and appeals hearings, the plea type was not applicable.

By calculating the average hearing time across all cases, it is possible to estimate an average hourly cost for hearings. This is multiplied by the hearing time for each of these categories to weight the different types of Crown Court activity.

Because of difficulties in measuring hearing time for magistrates' courts (which can hold multiple hearings in a single session, with no indication for average time for each hearing), overall disposals are taken for magistrates' courts activity.

The output for civil courts (family and county courts) is currently measured by caseload with data sourced from the MoJ. Data from the MoJ on applications, hearings and final orders are used to produce a "weighted caseload". Unit costs are periodically sourced from the MoJ and are used as weights for the output index. However, all civil courts outputs have been forecast or estimated since 2019.

Crown Prosecution Service (CPS)

The indices for magistrates' courts and Crown Courts are used to predict activity growth in this area. Both indices are aggregated based on their expenditure shares to approximate the growth in activity for the CPS.

Legal aid

From our 2022 ONS Public service productivity publication, we have expanded the categories used for legal aid, which now captures lower crime, higher crime and civil legal aid.

Lower crime mostly covers work carried out by legal aid providers in magistrates' courts and at police stations in relation to people accused of or charged with criminal offences. On the other hand, higher crime refers to legal representation in the Crown Court and for criminal cases in the higher courts. Civil legal aid refers to legal representation in civil courts.

In 2022, there were a total of 72 categories (31 for lower crime, 18 for higher crime and 23 for civil legal aid), although this number varies slightly over time.

The number of cases requiring legal aid, as well as the associated costs, are taken from Legal aid statistics. Using these data, it is possible to construct a single cost-weighted activity index for legal aid in its entirety.

Prisons

Output for prisons is measured by the average number of prisoners in UK prisons. These data are collected on a monthly basis and coverage is for Great Britain. For England and Wales, the prison population has been split by security category.

Prison population statistics

MoJ Prison population statistics are cost-weighted using prison performance data from the Prison and Probation Performance Statistics, which provide the expenditure of individual prisons annually for England and Wales. These are published by security category, which is used to split the activity data.

For England and Wales, the Office for National Statistics (ONS) uses the categories provided in the annual cost publication to allocate cost-weights for each of nine categories: 

  • male dispersal (category A)

  • male reception (category B)

  • male trainer (category B)

  • male category C and young offender institution (YOI) trainer and resettlement

  • male open (category D)

  • female closed

  • female local

  • female open

  • male YOI young people (ages 15 to 17 years)

For Scotland, prison population data are taken from Scottish Prison Service Data. With limited data on Scottish prisons expenditure, it is not possible to split Scottish prison population by security category. As such, Scottish prison population data are included at an aggregate level, weighted using an average unit cost for England and Wales.

Probation

The current probation output measure uses supervisions to capture activity. This is taken from data published in the MoJ's Offender management statistics.

Data are currently available for England and Wales. Supervisions are split into two categories:

  • community order and suspended sentence order

  • on license

Relative weights are assigned to these two activity categories using the marginal unit cost of each group of probationers provided by MoJ, allowing the ONS to create a cost-weighted activity index (CWAI) for probation. However, as these data are available only for 2023 to 2024, the ratio of unit costs for the two groups will be fixed over time and uprated using total expenditure per probationer.

Quality adjusted output

Full details of the quality adjustments can be found in our article Quality adjustment of public service public order and safety output: current method.

Quality adjustments are not applied to fire protection services, county courts, or the civil component of legal aid. As these components do not handle criminal cases, they are deemed to have different outcomes to the criminal justice elements.

The courts' timeliness adjustment

The timeliness adjustment relates to the average time taken for criminal cases to reach completion, on the basis that the delivery of a sentence in a timely manner is favourable.

In 2022, for the Crown Court, we introduced data from the MoJ Timeliness Tool, which has also allowed us to improve the granularity of our timeliness quality adjustment. The ONS can now ensure the timeliness adjustment reflects the underlying mix of case disposal activity at the Crown Court (that is, aligning with the relative proportions of both case type and plea), and incorporate triable-either-way, indictable, committed for sentence, and appeals case types.

Average timeliness was used for appeals and those cases where no plea is entered (as these are excluded from the MoJ timeliness calculations).

For magistrates' courts, the measure is based on the mean average number of days between charge and completion.

The recidivism adjustment

The recidivism adjustment approximates the effect the Criminal Justice System (CJS) has on reducing the volume and severity of further crimes being committed by those who have gone through it.

This adjustment is composed of three parts, the first being the change in the number of proven re-offences committed by adults and juvenile offenders categorised between crime types. An adjustment is made to adult offenders, to account for differences between cohort characteristics and their likelihood to re-offend. The final adjustment made provides a weighting by which to aggregate together all re-offences. This weighting is based upon the relative severity of the re-offence and is derived from the ONS's Crime Severity Score for England and Wales.

Data on proven reoffending from the MoJ is used, alongside other measures, to quality adjust output in the criminal justice system.

Because of the disruption to court proceedings during the coronavirus (COVID-19) pandemic, comparable reoffending data were not available. Following discussions with the MoJ, and because re-offending rates returned to levels within historical ranges, we decided to remove the "covid fix" and resume the use of actual re-offending data. To cover the period affected by coronavirus, we used linear interpolation from the last available unaffected data (July to September 2018) up to the point at which we resumed the use of real-time data (April to June 2022).

The prisons safety adjustment

The prisons safety adjustment relates to the number of incidents of assaults, self-harm and deaths that occur in prison custody.

We measure the number of incidents per 1,000 prisoners, which are grouped into "those resulting in a death", "severe", and "less severe". These groups are subsequently weighted and aggregated together based on their relative cost. This is achieved by using the total cost to society of workplace injuries as a proxy, taken from the Health and Safety Executive (HSE).
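
The following Python sketch illustrates this calculation with hypothetical incident counts, prisoner numbers and cost weights; it is a minimal worked example of a rate per 1,000 prisoners aggregated with cost weights, not the production method.

    # Illustrative sketch of the prisons safety adjustment calculation.
    # Incident counts, prisoner numbers and cost weights are hypothetical.
    prisoners = 80_000
    incidents = {"death": 60, "severe": 3_000, "less_severe": 50_000}
    cost_weight = {"death": 1.0, "severe": 0.1, "less_severe": 0.01}  # relative cost to society (proxy)

    # Rate per 1,000 prisoners for each severity group, then a cost-weighted aggregate.
    rate_per_1000 = {k: v / prisoners * 1000 for k, v in incidents.items()}
    weighted_rate = sum(rate_per_1000[k] * cost_weight[k] for k in incidents)
    print(f"Cost-weighted safety incident rate per 1,000 prisoners: {weighted_rate:.2f}")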

The custody escapes adjustment

The escape adjustment relates to ensuring prisons fulfil the role of public protection and is applied to activities used to measure the output of the prison service.

The measure is based on changes in the difference between the number of escapes and a baseline of 0.05% of the England and Wales prison population – a historical target used by the MoJ. The baseline is used because, as the absolute number of escapes approaches zero, the year-on-year relative change would otherwise have a disproportionate effect on a non-baselined quality adjustment index.
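
For illustration, the minimal Python sketch below computes the difference between the number of escapes and the 0.05% baseline for two hypothetical years; the escape counts and prison populations are invented, and how the resulting gap feeds into the quality index is not reproduced here.

    # Illustrative sketch of the baselined escapes measure. Figures are hypothetical,
    # except for the 0.05% baseline rate described above.
    def baselined_gap(escapes: int, prison_population: int) -> float:
        """Difference between actual escapes and the 0.05% baseline."""
        baseline = 0.0005 * prison_population
        return escapes - baseline

    gap_prev = baselined_gap(escapes=5, prison_population=80_000)   # hypothetical year t-1
    gap_curr = baselined_gap(escapes=3, prison_population=81_000)   # hypothetical year t
    change = gap_curr - gap_prev
    print(f"Change in the escapes gap relative to the baseline: {change:.1f}")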

Combining the components

For each component, we calculate an overall growth factor to be applied to the basic activity index. For those areas where multiple adjustments are applied, the growth factors are applied on a weighted average basis (Table 2 outlines the weights used). All the components of public order and safety (POS), including non-quality adjusted components, are then cost-weighted together to produce an aggregate index of POS quality adjusted output.

Inputs

Inputs estimates are calculated for:

  • fire

  • courts (including probation)

  • prisons

The public order and safety volume of inputs series is a weighted combination of fire, courts (including probation) and prison (chain-linked using the UK National Accounts expenditure weights).

The inputs are mostly measured indirectly, with the exception of fire (further detail follows), using deflated expenditure to derive the volume of labour, intermediate consumption (goods and services) and capital consumption inputs. An aggregate index is then compiled for total POS inputs and weighted using expenditure levels for fire, prison, courts and probation.

For 2022, a direct labour input measure for fire and rescue services has been introduced; this is included from 2011 onwards. This relates to local government labour, which accounts for more than four-fifths of fire and rescue service labour expenditure.

Full-time equivalent (FTE) volumes for fire by rank or grade are matched with relevant average salary data from ASHE. For the 2022 publication, improvements have also been made to the central and local government deflators used to determine volume growth in the intermediate consumption (IC) measure. Because of the different types of expenditure across service areas, a bespoke composite IC deflator has been generated for each service area, except for the fire service, where IC expenditure is deflated using the headline CPI deflator.

Back to table of contents

8. Children’s social care

Children's social care (CSC) covers a range of services including the provision of social work, personal care, protection or social support services to children in need or at risk. CSC output includes activities such as safeguarding, non-secure accommodation, secure accommodation, adoptions and care leavers.

Main concepts in CSC productivity measurement

CSC productivity is guided by four core concepts:

  • inputs

  • outputs

  • quality adjustments

  • casemix

We provide a detailed explanation of how each concept is measured in this section, along with the data sources used for their calculation.

Quantity output

Two types of quantity outputs are constructed:

  • directly measured outputs

  • indirectly measured outputs

The directly measured outputs are adjusted for quality (quality adjusted outputs) to account for changes in the value of services; this is in the context of social care-related quality of life.

In 2022, 71% of CSC output was measured directly, while 29% was measured indirectly.

Directly measured outputs

CSC output comprises directly measured services, where activity data are available. However, since CSC activity data do not cover all services, adjustments are made to account for services without direct activity data. These are measured using the "output-equals-inputs" approach, which has been applied retrospectively to 1997 – this is referred to as indirectly measured output.

As a result, quantity output is composed of:

  • directly measured output, based on available activity data
  • indirectly measured output estimated using the "output-equals-inputs" approach

Some of the activity datasets are obtained directly from government sources and others downloaded from the relevant government websites for Scotland, Wales, England and Northern Ireland. The directly sourced data from these four nations includes:

  • fostering services
  • secure accommodation
  • children's homes
  • other looked after children

Additional downloaded datasets cover activities such as adoptions, safeguarding, special guardianship and care leavers. These datasets complement the directly sourced data, providing a more comprehensive view of CSC services.

To calculate an index for looked after children (LAC), activity in fostering, children's homes and other looked after children is aggregated into a single series of non-secure accommodation, with secure accommodation left separate. More information on this approach can be found in Measuring the output of children's social care: an alternative method for looked after children (PDF, 121KB). These two series, together with the care leavers, adoptions, special guardianship and safeguarding activities, are weighted by their expenditure shares to produce a direct measure of CSC output on a financial year basis. These estimates are splined, lagged and backcast to produce a series of appropriate length.

To derive the country-level direct output index, expenditure weights – based on direct expenditure data – are applied at the country-services level to the growth factor of the activity data for each CSC service.

The resulting cost-weighted activity data, referred to as the contribution to growth, are aggregated across services to calculate the direct output percentage growth for each UK nation. In essence, the direct output percentage growth for each nation is obtained as the sum of cost-weighted CSC activities.

At the UK level, the direct output is determined by aggregating each country's direct output, weighted by their respective country expenditure shares. This ensures that the overall UK estimate reflects the relative contribution of each nation.
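
The two-level aggregation described above can be illustrated with the following minimal Python sketch; all service growth rates, within-nation weights and country expenditure shares are hypothetical.

    # Illustrative two-level aggregation of direct CSC output growth.
    # All growth rates and expenditure weights are hypothetical.
    services = {
        # nation: {service: (activity growth, within-nation expenditure weight)}
        "England": {"non_secure": (0.02, 0.7), "secure": (-0.01, 0.1), "safeguarding": (0.01, 0.2)},
        "Wales":   {"non_secure": (0.03, 0.8), "secure": (0.00, 0.2)},
    }
    country_weight = {"England": 0.9, "Wales": 0.1}   # shares of direct CSC expenditure

    # Contributions to growth: service growth times its expenditure weight, summed within each nation.
    country_growth = {
        nation: sum(g * w for g, w in svc.values())
        for nation, svc in services.items()
    }
    # UK direct output growth: country growth weighted by country expenditure shares.
    uk_growth = sum(country_growth[n] * country_weight[n] for n in country_growth)
    print(f"UK direct output growth: {uk_growth:.2%}")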

Definition of the activities used in CSC processing

Secure accommodation

  • A count of the total number of days during the financial year that LAC spend in placements, including days spent in short-term placement, and excluding unaccompanied asylum-seeking children.

Non-secure accommodation

  • Fostering services: activities data on children placed for adoption and foster placements.

  • Children's homes: activities data on residential schools and children's homes.

  • Other LAC activities data: these are calculated by subtracting the sum of measured LAC activity from total LAC activities.

Adoptions

  • A count of the number of LAC who were adopted in the year; children who are adopted cease to be reported in the data collection for looked-after children.

Special guardianship orders

  • The number of LAC who were the subject of a special guardianship order during the year.

Care leavers

  • The number of LAC care leavers in the last year eligible for, and/or receiving services.

Safeguarding

  • A count of the total number of children in need (CIN) and number of children on a child protection plan (CPP); these series are summed together to estimate the total number of children receiving safeguarding services.

The expenditure data used to cost-weight the direct output measures cover the following services:

  • fostering services

  • secure accommodation

  • children's homes

  • care leavers

  • adoptions

  • special guardianships

  • other looked after children

  • safeguarding

  • other non-LAC

  • total expenditure on children's and families' services

Adoptions, special guardianship orders, care leavers and safeguarding activities data for Scotland and Northern Ireland are currently not included in the direct activity measurement as corresponding granular expenditure data are not available to create a cost-weighted activity index (CWAI). Therefore, only secure accommodation and non-secure accommodation are measured directly for Scotland and Northern Ireland.

Indirect outputs

CSC quantity output includes certain services for which activity data are unavailable. To estimate these services indirectly, we rely on expenditure data from two sources:

  • total current expenditure (which represents Online System for Central Accounting and Reporting (OSCAR) expenditure data for CSC)

  • total direct expenditure (which covers specific CSC services at the country level)

Given the differing structures of these two expenditure datasets, we aggregated direct expenditure at the country-services level and treated the OSCAR data as a total. To estimate indirect output, we subtracted total direct expenditure from total current expenditure. This process isolates the portion of OSCAR spending associated with services lacking activity data, which are measured using the "output-equals-inputs" approach.
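
For illustration, the short Python sketch below shows the residual calculation and deflation described above, using hypothetical expenditure values and a hypothetical deflator.

    # Illustrative residual calculation for indirectly measured CSC output.
    # Expenditure values and the deflator are hypothetical.
    total_current_expenditure = 11_000.0   # OSCAR total for CSC, £ million
    total_direct_expenditure = 8_000.0     # sum of country-service direct expenditure, £ million
    deflator = 1.04                        # price index for the year (previous year = 1.00)

    residual_expenditure = total_current_expenditure - total_direct_expenditure
    indirect_output_volume = residual_expenditure / deflator
    print(f"Indirect output volume (constant prices, £ million): {indirect_output_volume:.0f}")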

The indirectly measured part of the output refers to non-looked-after children (non-LAC) outside of safeguarding services. Children are defined as LAC if a court has granted an order placing them in the care of a local authority, or if a council's children's services department has cared for them for more than 24 hours. Non-LAC are classified as such if they are not taken out of their home environment but are being monitored.

Expenditure on other non-LAC is taken as a residual following the direct output processing, split into components, deflated and aggregated to an indirect output index using the same method as is used in the inputs' calculations.

Casemix

The casemix adjustment for CSC was introduced in the Improved methods for public service productivity: total, UK, 2019. The casemix adjustment starts in 2012.

In public service productivity, activity measures are combined using expenditure weights to create a CWAI (see Section 1: Overview for more information). This assumes that the value of an activity carried out by the service is reflected in the cost. However, this is not always the case, as some services are costlier but associated with lesser outcomes compared with cheaper services that lead to better outcomes. This necessitates the casemix adjustment because, in the absence of granular unit cost data, weighting activities according to their expenditure could overstate the value associated with them.

Unit cost data are not available for CSC, and there is a known issue in the service area whereby some services (such as residential LAC) are more costly than others but are not associated with better child outcomes (see Residential care in England, the place of residential care in the English child welfare system, and The lifelong health and wellbeing trajectories of people who have been in care (PDF, 2.14MB)). An adjustment is therefore made to the expenditure weighting of different output series. This captures changes in the cost of delivery that are associated with casemix characteristics.

For example, one casemix characteristic used in the adjustment for LAC is the age of the child. Age is an important factor affecting CSC LAC expenditure because the older a child is, the more costly their care is. Therefore, without adjusting for casemix, if a simple count of the number of fostering care days stays the same but more inputs are needed to provide these activities because the children in foster care are older, productivity will fall.

However, if the same fostering activities are measured in a more granular way, and we are able to adjust the cost weights to reflect the age-expenditure relationship, productivity may not be observed to have fallen at all.

The casemix adjustment is estimated using data on the selected casemix characteristics at a local authority level for England only. The final casemix deflator is applied to the directly measured activity categories of safeguarding, secure and non-secure accommodation, adoptions and special guardianships for each of the devolved administrations.

To estimate the casemix deflator for a service, a regression is run using this specification:

C_{LA,t,s} = \beta_{n,t,s} \, n_{LA,t,s} + \beta_{f1,t,s} \, f1_{LA,t,s} + \beta_{f2,t,s} \, f2_{LA,t,s} + e_{LA,t,s}

where:

 LA is the local authority
 t is the year considered
 s is the service considered
 C is the deflated expenditure
 n is the number of children (safeguarding or LAC)
 f1 is the volume output with casemix factor 1
 f2 is the volume output with casemix factor 2
 e is the error term

Observations are weighted by the volume of outputs, n_{LA,t,s}.

\beta_{n,t,s} is the incremental expenditure associated with one unit of output that has no casemix factors (n_{LA,t,s}). \beta_{f1,t,s} is the incremental spend associated with one unit of output that has casemix factor 1, relative to a unit of output that does not have casemix factor 1. It is important to note that these coefficients do not have the same interpretation as a unit cost.

From the regression, the expected spend on casemix across local authorities in year t (\hat{c}_{England,t|t,s}) can be estimated, as well as the expected spend in year t if the casemix was instead the casemix of the following year, t+1 (\hat{c}_{England,t+1|t,s}). The year-on-year change in expenditure attributable to the change in casemix is then calculated as the ratio \hat{c}_{England,t+1|t,s} / \hat{c}_{England,t|t,s}.
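
For illustration only, the following minimal Python sketch fits the weighted specification above for a single service and year using synthetic data, and derives the implied casemix change; all figures are invented and carry no real-world meaning.

    # Illustrative weighted least squares fit of the casemix specification above,
    # for a single service s and year t. All data are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    n_LA = 150                                  # number of local authorities
    n = rng.integers(100, 2000, n_LA)           # number of children per LA
    f1 = n * rng.uniform(0.2, 0.6, n_LA)        # volume of output with casemix factor 1
    f2 = n * rng.uniform(0.0, 0.3, n_LA)        # volume of output with casemix factor 2
    C = 5.0 * n + 3.0 * f1 + 8.0 * f2 + rng.normal(0, 200, n_LA)  # deflated expenditure

    # Weight observations by the volume of outputs, n, as described above.
    X = np.column_stack([n, f1, f2]).astype(float)
    w = np.sqrt(n.astype(float))
    beta, *_ = np.linalg.lstsq(X * w[:, None], C * w, rcond=None)

    # Expected spend with this year's casemix versus next year's casemix (synthetic t+1 factors).
    f1_next, f2_next = f1 * 1.05, f2 * 1.10
    c_t = X @ beta
    c_t_plus_1 = np.column_stack([n, f1_next, f2_next]) @ beta
    casemix_change = c_t_plus_1.sum() / c_t.sum()
    print(f"Change in expenditure attributable to casemix: {casemix_change:.4f}")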

The casemix adjustment is estimated using the same methodology but different data according to whether the service is LAC services or safeguarding. The data that feed into the adjustment are as follows:

LAC casemix data

  • Number of LAC aged 10 years or over, published by the Department for Education (DfE).

  • The number of LAC on 31 March, published by the DfE.

Safeguarding casemix data

  • Children in need primary need code at referral, published by the DfE.

  • The percentage of children in need on 31 March with any disability, or by specific disability, published by the DfE.

  • The percentage of children in need with a child protection plan, published by the DfE.

To reach a single index of directly measured casemix-adjusted and quality-adjusted activity, the growth rate of the casemix-adjusted and quality-adjusted series is calculated for each service area. The contribution to growth of direct output of each CSC service is calculated by multiplying this growth by the corresponding expenditure weight to create a chain-linked Laspeyres volume index at the UK level.

The contributions to growth are also used to combine the indirect output with direct output. The growth in indirect output is multiplied by the share of expenditure spent on the indirect portion of total CSC spend, and growth in direct output (casemix-adjusted and quality-adjusted) is multiplied by the share of expenditure spent on direct output to produce the final output series for CSC.

Quality adjustment

Quality adjustment is applied to the quantity output index with a positive quality adjustment indicating that the quality of children's social care (CSC) services provided has improved. The quality adjustment series included in the CSC service area is applied to the corresponding directly measured activities.

Quality adjustment in CSC was first introduced in Improved methods for total public service productivity: total, UK, 2019.

Quality adjustment data for England come from the DfE, and for Wales, quality data come from the Welsh Government. Where quality adjustment data are not available for the devolved administrations, England data are used instead. This particularly affects Scotland and Northern Ireland where suitable quality adjustment data are not available.

Quality output measures used in CSC

Re-referrals and re-registrations, which start in 2012:

  • these data are applied to the safeguarding activity of CSC

  • children in need re-referral rate, measured as the percentage of children referred within 12 months of a previous referral using data from the DfE in England and Welsh Government data for Wales

  • child protection plans starting in the year which were a second or subsequent plan, measured as the percentage of child protection plans that were repeat plans using data from the DfE in England; up-to-date data are not available for Wales, and therefore Wales safeguarding activity uses the England re-registrations quality adjustment

  • since safeguarding activity is not measured for Scotland and Northern Ireland, no re-referrals and re-registrations quality adjustment is applied to them

  • this measure is lagged by one year on the basis that the quality of the service in a particular year may be lower if in the following year there is a higher rate of re-referrals and re-registrations

Placement stability, which starts in 2010:

  • these data are applied to the LAC activity of CSC

  • number of placements a child has been placed in in the last year; measured as the percentage of LAC with two or more placements during the year using data from the DfE in England and Welsh Government data in Wales; no suitable data are available for Scotland and Northern Ireland, and therefore the England placement stability quality adjustment is applied to these countries

  • an increase in placement stability is treated as an increase in quality

Care leavers, which starts in 2014:

  • these data are applied to the care leavers activity of CSC

  • the percentage of care leavers living in suitable accommodation; these data are only available for England using data from the DfE

  • percentage of care leavers that are not in employment, education or training (NEET); these data are available from the DfE for England and the Welsh Government for Wales

  • since care leavers' activity is not measured for Scotland and Northern Ireland, the care leavers quality adjustment is not applied

For series whereby an increase in the measure reflects worse outcomes, the inverse is taken so that the quality adjustment index reflects a fall in quality.

Quality adjustment is applied to each individual area of CSC activity (for safeguarding, non-secure accommodation, secure accommodation, adoptions, special guardianships and care leavers).

Safeguarding and care leavers in England each have two indicators of quality, which need to be combined into single indices of safeguarding quality and care leavers quality. An equal weight is attributed to the suitable accommodation measure and the NEET measure for care leavers. For safeguarding, the weights for re-referrals and re-registrations correspond to the percentage of safeguarding expenditure on children in need and children on child protection plans, respectively (84% and 16% in 2019).

A chain-linked Laspeyres volume index of quality-adjusted output is produced for safeguarding, care leavers, and secure and non-secure accommodation by country. No quality adjustment is applied to adoptions or special guardianship orders; these are chain-linked Laspeyres volume indices.

Inputs

Our inputs are based on OSCAR expenditure data, which provide the levels of expenditure associated with CSC in the UK.

Input data are categorised into three main components:

  • compensation of employees (labour)

  • intermediate consumption

  • capital

Intermediate consumption is further disaggregated into:

  • goods and services – local authority

  • other provision – wages and salaries

  • other provision – goods and services

using central shared database (CSDB) procurement weight averages from the period 2000 to 2006.

Different deflators are applied to convert expenditures from current prices to volume terms. For example, labour input volume is estimated by deflating labour expenditure using a constructed pay deflator, ensuring an accurate representation of real input changes over time.

Before 2011, labour expenditure is deflated using salary data from the Annual Survey of Hours and Earnings (ASHE), mapped by Standard Occupational Classification (SOC) codes. From 2011 to 2021, the Index of Labour Costs per Hour (ILCH) deflator is used, and from 2022, the Average Labour Compensation per Hour (ALCH) is used.

The volume of goods and services inputs is derived by deflating expenditure using the Social Care Local Government Intermediate Consumption deflator. The wages and salaries component is adjusted using the ALCH labour deflator, consistent with the approach used in labour input estimates.

The volume of capital inputs is calculated by deflating consumption of fixed capital using a constructed social protection local government capital deflator, based on ONS price indices.

Finally, these three input components are aggregated using their relative expenditure weights, producing a comprehensive UK estimate of children's social care inputs.

Back to table of contents

9. Social security administration

Social security administration (SSA) is the administration of different types of benefits including the processing of new benefit claims and maintaining existing benefit load.

For the total public service productivity (PSP) annual publications between 2018 and 2021, because of the difficulties in capturing the composition of Universal Credit (UC) cases via traditional unit cost weighting, SSA had employed an "output-equals-inputs" approach. As a result, no productivity metrics were calculated from this service area.

Following the PSP Review, the output model has been updated to account for UC cases, and productivity metrics are now available for SSA. Furthermore, following the PSP Review, quality adjustment has been applied to SSA for the first time, which determines the "correctness" of administered benefits based on Department for Work and Pensions (DWP) fraud and error rates.

Quantity output

SSA output is directly measured using data provided by the DWP.

Reforms affecting SSA mean that the output measure has evolved over time, with multiple benefit schemes being replaced by UC, which presents two conceptual challenges to the measurement of UC output.

Firstly, the application of a conventional cost-weighted activity index (CWAI) would prevent the measurement of any productivity change resulting from the transfer to UC; if UC could deliver equivalent benefits at lower cost, then a conventional CWAI would place a lower value on the new UC activity despite an equivalent value of service delivered.

Secondly, UC claims contain a high degree of heterogeneity and, generally, simpler claims migrated from legacy benefits to UC before those consisting of many entitlement components; as a result, one UC claim administered in a later year may involve more value delivered than a claim administered in an earlier year.

As part of development work undertaken through the PSP Review, for the 2022 release, a method for directly measuring the output of UC has been employed.

To integrate UC into the SSA output index, UC activity has firstly been adjusted to account for changes over time in the number of entitlements per claim using data on the proportion of claims with various entitlements and the average cash payment associated with these entitlements. The adjusted UC activity is then weighted together with legacy benefit activity on a benefit-weighted basis. As a result, a benefit payment bundle of equivalent value from UC and its legacy benefits are given equivalent weight in output, enabling any change in the input cost of delivering such broadly equivalent sets of benefits to be accounted for in the productivity measure.
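
The exact form of the entitlement adjustment is not set out here, so the following Python sketch is only one plausible reading: UC claim volumes are scaled by the change in the average entitlement value per claim, calculated from the proportion of claims with each entitlement and its average cash payment. All figures and entitlement names are hypothetical.

    # Illustrative adjustment of UC claim volumes for entitlements per claim.
    # Proportions, payments and claim counts are hypothetical; this is one plausible
    # reading of the adjustment described above, not the published method.
    entitlements_curr = {
        # entitlement: (proportion of claims with it in year t, average cash payment)
        "standard_allowance": (1.00, 350.0),
        "child_element":      (0.40, 280.0),
        "housing_element":    (0.55, 450.0),
    }
    entitlements_prev = {
        "standard_allowance": (1.00, 350.0),
        "child_element":      (0.35, 280.0),
        "housing_element":    (0.50, 450.0),
    }

    def value_per_claim(ents: dict) -> float:
        """Average entitlement value per claim: sum of proportion x average payment."""
        return sum(p * pay for p, pay in ents.values())

    claims_prev, claims_curr = 4_000_000, 4_500_000      # hypothetical UC claim counts
    # Scale current claims by the change in average entitlement value per claim.
    adjustment = value_per_claim(entitlements_curr) / value_per_claim(entitlements_prev)
    adjusted_growth = (claims_curr * adjustment) / claims_prev - 1
    print(f"Entitlement-adjusted UC activity growth: {adjusted_growth:.2%}")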

For benefits other than UC and its legacy benefits, output is calculated using DWP data on benefit claims and caseload weighted by the unit cost of each. The combined UC and legacy benefit output index is integrated with other benefits on a cost-weighted basis.

This approach will remain under review, as the migration from legacy benefits to UC is expected to be completed in 2025 and a decision will be made as to whether or not it is worthwhile continuing to adjust UC claims for the number of entitlements per claim after the transition is complete. The measure currently excludes data on benefits administered by HM Revenue and Customs (HMRC), in particular, Tax Credits and Child Benefit, and further research is intended to extend coverage to these.

Quality adjustment

Following the PSP Review, the SSA service area has implemented a quality adjustment based on fraud and error rates for benefits administered by the DWP. Benefits can be overpaid or underpaid, with errors categorised as one, or a combination of:

  • official error (erroneous payments made by DWP or authority)

  • customer error (mistake made by the customer)

  • customer fraud (deliberate fraud made by the customer)

For the SSA quality adjustment measure, the total DWP benefit fraud and error rates are used from FYE 2008. These refer to the percentage of the total amount of benefit expenditure administered each year that relates to overpayments and underpayments. Statistics on fraud and error are published annually by DWP and cover England, Scotland and Wales.

Firstly, total fraud and error rates are derived from overpayment and underpayment data. The gross overpayment and underpayment rates per year, not net, are used to inform the measure. For example, if there is a 2% overpayment and 2% underpayment administered for a particular year, then the total fraud and error rate would be 4%. Using the net derived value from overpayment and underpayment data would conceal the total error rate within the system and is thus not appropriate.

Basing the quality adjustment index on movements in total fraud and error rates over time would not have been appropriate because of small variations in the fraud and error rates causing disproportionate volatility. Therefore, quality adjustment is based on "correctness" rate, where the fraud and error rates are deducted from 100%. For example, a fraud and error rate of 4% would return a "correctness" rate of 96%. The growth rates in "correctness" rates are used to inform quality adjustment.
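
For illustration, the short Python sketch below derives a "correctness" rate and its growth from hypothetical gross overpayment and underpayment rates.

    # Illustrative "correctness" quality adjustment. Rates are hypothetical.
    overpayment = {2021: 0.039, 2022: 0.036}   # gross overpayment rates
    underpayment = {2021: 0.012, 2022: 0.011}  # gross underpayment rates

    # Correctness rate: 100% minus the total fraud and error rate for each year.
    correctness = {y: 1.0 - (overpayment[y] + underpayment[y]) for y in overpayment}
    quality_growth = correctness[2022] / correctness[2021] - 1
    print(f"Quality adjustment (growth in correctness rate): {quality_growth:.3%}")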

Because HMRC benefits (Working Tax Credit and Child Benefit) are not currently captured in the SSA output model, the quality adjustment only comprises data from the DWP. When HMRC data are included in the output measure, the quality adjustment will be amended to account for the relative "correctness" contributions from HMRC benefits. An aggregate percentage "correctness" rate will be determined by weighting the relative monetary value of "correct" payments from DWP and HMRC in relation to total benefit expenditure administered by both departments.

Inputs

The SSA inputs index consists of UK National Accounts-consistent deflated expenditure on labour, intermediate consumption and consumption of fixed capital under the Classification of the Functions of Government (COFOG) 10N1: Social Protection. Currently, there are no directly measured components for SSA inputs. Current price expenditure is drawn from the UK National Accounts and is deflated to produce a constant price series.

As of the 2022 publication, labour is deflated using the ALCH for industry O (Public administration and defence; compulsory social security), while a constructed SSA general government capital deflator is provided directly by the ONS Capital Stocks team. For intermediate consumption, a composite deflator is used to reflect price changes in the cost of goods and services used within the SSA service area. The composite deflator is constructed by sourcing data on the prices and quantities of goods and services used in the provision of SSA.

Changes in these prices are then aggregated into a Paasche price index, which weights the changes in prices by their relative volumes in the current year. Weights are determined using expenditure data, sourced from DWP's Annual Report and Accounts.
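
The following minimal Python sketch shows a Paasche price index of the kind described, weighting price changes by current-year volumes; the items, prices and volumes are hypothetical.

    # Illustrative Paasche price index for an intermediate consumption deflator.
    # Item names, prices and current-year volumes are hypothetical.
    items = {
        # item: (price in year t-1, price in year t, volume in year t)
        "it_services": (100.0, 106.0, 50.0),
        "estates":     (200.0, 210.0, 20.0),
        "postage":     (10.0, 10.5, 300.0),
    }

    # Paasche: current-period volumes weight the price change.
    numerator = sum(p1 * q1 for _, p1, q1 in items.values())
    denominator = sum(p0 * q1 for p0, _, q1 in items.values())
    paasche_index = numerator / denominator
    print(f"Paasche price index (previous year = 1.00): {paasche_index:.4f}")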

The labour, goods and services, and capital indices are aggregated together using their respective UK National Accounts general government expenditure shares to form a chain-linked Laspeyres volume index. By 2022, the approximate weights for labour, goods and services, and capital, were 47%, 47%, and 6%, respectively.  

Back to table of contents

10. Tax administration

Quantity output

The unit of output for the tax administration service area is the number of taxpayers or registered traders or operators for each tax. These data are sourced from the HM Revenue and Customs (HMRC) Numbers of taxpayers and registered traders publication in addition to specific datasets supplied directly by HMRC for National Insurance and Corporation Tax.

Expenditure data for calculating cost weights are sourced from an annual dataset supplied to the Office for National Statistics (ONS) by HMRC. This dataset includes net expenditure and receipts for each tax collected from tax year ending 2019 to tax year ending 2023.

These unit costs are then used in a Laspeyres index, where output growth for periods t-1 to t is calculated as activity weighted by the unit costs from period t-1.

Activity and cost data enable direct measurement of 11 taxes:

  • Income Tax

  • National Insurance

  • Value Added Tax (VAT)

  • Corporation Tax

  • Capital Gains Tax

  • Insurance Premium Tax

  • Air Passenger Duty

  • Landfill Tax

  • Climate Change Levy

  • Aggregates Levy

  • Inheritance Tax

Revenue adjustment

A "revenue adjustment" is applied that adjusts the cost weights by the revenue raised per £ of administrative cost for different taxes.

Data for revenue adjustments are sourced from the HMRC tax receipts and National Insurance contributions for the UK publication, which provides the receipts by financial year for the 11 taxes covered in this analysis.

An increase in revenue raised does not directly result in output growth. Rather, an increase in the revenue raised by one tax relative to others increases the effect that changes in the number of payers of that tax have on the overall output index.

While this enables efficiency improvements from changes in the number of tax payments made for low-cost taxes relative to high-cost taxes to be represented in the measure, it does not address any aspects of quality in tax administration. As such, the measure is categorised as non-quality-adjusted.    
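
The precise formula for the revenue adjustment is not reproduced above, so the following Python sketch is only one plausible reading, in which each tax's cost weight is scaled by its revenue raised per £ of administrative cost; all figures are hypothetical.

    # Illustrative revenue adjustment of cost weights. One plausible reading of the
    # adjustment described above, not the published method; all figures are hypothetical.
    taxes = {
        # tax: (administrative cost, £ million; revenue raised, £ million)
        "income_tax": (800.0, 250_000.0),
        "vat":        (500.0, 160_000.0),
        "landfill":   (10.0, 600.0),
    }

    # Revenue raised per £ of administrative cost for each tax.
    revenue_per_pound = {t: rev / cost for t, (cost, rev) in taxes.items()}
    # Scale each tax's cost weight by its revenue per £ of cost, then renormalise.
    scaled = {t: taxes[t][0] * revenue_per_pound[t] for t in taxes}
    total = sum(scaled.values())
    adjusted_weight = {t: v / total for t, v in scaled.items()}
    print(adjusted_weight)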

Inputs

Inputs estimates are based on expenditure figures that are calculated by HMRC by allocating their overall expenditure to tax administration functions based on reported staff time spent on those activities. These expenditure values are also used by HMRC in their "Cost of Collection" statistics, which are published annually in the HMRC annual report and accounts.

Expenditure values net of customs have been provided directly to the ONS for the period tax year ending 2019 to tax year ending 2023, which are converted to calendar year by the same cubic splining method used in other PSP areas.

A number of small adjustments are applied to maintain coherence and comparability with the UK National Accounts and other public service productivity service areas.

Detailed Online System for Central Accounting and Reporting (OSCAR) data can be used to estimate the proportions of labour, intermediate consumption and consumption of fixed capital within this expenditure, with deflators being applied in these proportions to estimate the volume of inputs.

The labour deflator is constructed by weighting together changes in salaries of the grades within HMRC by their shares of staff in post. The composite deflator information is sourced from HMRC workforce and salaries data provided directly to the ONS.

A combination of ONS price indices and OSCAR data has been used to construct a composite intermediate consumption deflator, and consumption of fixed capital is deflated using the Public sector finances (central government): depreciation deflator.

Back to table of contents

11. Other government services

Central and local government expenditure data are obtained for:

  • general public services, for example, executive and legislative organs, basic research

  • economic affairs, for example, general economic, commercial and labour affairs, including transport, agriculture, forestry and fishing

  • environmental protection, for example, waste management, pollution abatement

  • housing and community amenities, for example, housing development, water supply and street lighting

  • recreation, culture and religion, for example, recreational and sporting activities, broadcasting and publishing

  • other public order and safety services, for example, research and development

Total current expenditure on these categories is deflated using the Consumer Prices Index (CPI) to obtain a constant price expenditure series. This series is then used to generate an index of volume of inputs, which is assumed to equal the volume of output.

Tax administration is removed from the "other" grouping from 2018 onwards. Inputs growth net of tax administration for the period 2018 to 2019 is calculated and chain-linked to the index including tax administration up to 2018. Thus, the removal of tax administration does not affect the growth rate or interpretation of the "other" index when it is removed in 2018.

Back to table of contents

12. Difference between annual and quarterly statistics

Alongside the annual estimate of Public service productivity: total, UK, 2022, which is badged as accredited official statistics, we also publish quarterly measures of total public service productivity, which are official statistics in development (also known as experimental statistics).

The quarterly series offers a timelier measure, as the annual series has a significant time lag.

However, the quarterly statistics differ from the annual estimates in several ways, which are set out in the rest of this section.

The inputs for the quarterly estimates are based on Office for National Statistics (ONS) current price expenditure on labour, intermediate consumption, capital and social transfers in kind (STIK). Appropriate deflators are applied to approximate the volume of inputs from expenditure data. For more recent quarters, full-time equivalent (FTE) data derived from the ONS public sector employment estimates are used instead to calculate labour inputs for health, education, social protection, fire and justice. Deflated expenditure data on NHS bank staff are also introduced for later quarters.

The output only accounts for the volume of activity, not the quality of output, and uses non-seasonally adjusted ONS chained volume measures (CVM).

Inputs and output of experimental quarterly estimates of productivity are seasonally adjusted at the total level.

Expenditure and CVM data are consistent with non-seasonally adjusted quarterly national accounts (QNA) data as published in ONS breakdowns of general government final consumption expenditure. However, published quarterly productivity estimates use different seasonal adjustment methods and may differ from seasonally adjusted data published in QNA.

We published the quarterly estimates of healthcare inputs, output, and productivity for the first time in February 2025, alongside the estimates of total public service productivity, inputs, and output.

To provide more timely estimates of annual productivity (for total productivity and healthcare) we include quarterly annualised growth rate (QAGR) estimates in our quarterly publication. The QAGR method uses the growth rate in annualised quarterly PSP estimates to produce nowcast annual estimates.
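
The exact QAGR formula is not reproduced in this section, so the following Python sketch is an illustrative nowcast only, using hypothetical quarterly index values: the growth in the annual average of the quarterly index is applied to the latest published annual estimate.

    # Illustrative quarterly annualised growth rate (QAGR) nowcast.
    # Index values and the latest annual estimate are hypothetical.
    quarterly_index = {
        "2023Q1": 100.2, "2023Q2": 100.5, "2023Q3": 100.9, "2023Q4": 101.1,
        "2024Q1": 101.4, "2024Q2": 101.8, "2024Q3": 102.0, "2024Q4": 102.3,
    }

    def annual_average(year: str) -> float:
        """Average of the four quarterly index values for the given year."""
        return sum(v for q, v in quarterly_index.items() if q.startswith(year)) / 4

    annualised_growth = annual_average("2024") / annual_average("2023") - 1
    last_published_annual = 100.0            # hypothetical latest annual estimate (2023 = 100)
    nowcast = last_published_annual * (1 + annualised_growth)
    print(f"Nowcast annual estimate for 2024: {nowcast:.1f}")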

Back to table of contents

13. User and stakeholder needs

The Office for National Statistics (ONS) actively seeks feedback from users of its public service productivity statistics to inform its future work priorities. We are particularly interested in user views on the value of these statistics to inform policy debates and research projects within the academic and national accounts fields. The updated Quality and Methodology Information (QMI) for the total public service productivity article includes further information on user needs and perceptions.

We use various methods to engage with users about our statistics, including regular stakeholder engagement, pre-publication quality assurance from government experts, user consultation meetings and pre-announced methods changes, such as improved methods for total public service productivity: total, UK, 2021. Any feedback or comments are welcome and can be sent to productivity@ons.gov.uk.

Back to table of contents

14. Cite this methodology

Office for National Statistics (ONS), released 22 April 2025, ONS website, methodology, Public service productivity estimates: sources and methods

Back to table of contents

Contact details for this Methodology

Public Service Productivity team
psp.review@ons.gov.uk