
Index of Services: Quality assurance


This page is currently under review as part of the GDP Output Improvement Programme (PDF, 226.7 KB).

Estimation by survey cannot be 100 per cent accurate; there is always a margin of error.

For many well-established statistics, ONS is able to measure and publish the sampling error associated with the estimate, using this as an indicator of reliability.
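As an illustration of what measuring a sampling error involves, the sketch below computes the mean, standard error, and 95 per cent margin of error for a simple random sample. The function name and figures are invented for illustration; estimators for real ONS surveys are weighted and stratified, and considerably more involved.

```python
import math

def estimate_with_sampling_error(sample, confidence_z=1.96):
    """Estimate a population mean from a simple random sample,
    returning the estimate, its standard error and the margin of
    error at roughly 95 per cent confidence (z = 1.96)."""
    n = len(sample)
    mean = sum(sample) / n
    # Sample variance (n - 1 denominator), then standard error of the mean
    variance = sum((x - mean) ** 2 for x in sample) / (n - 1)
    std_error = math.sqrt(variance / n)
    return mean, std_error, confidence_z * std_error

# Illustrative figures only, not ONS data
mean, se, moe = estimate_with_sampling_error([10.2, 9.8, 10.5, 10.1, 9.9, 10.4])
```

Published alongside the estimate, the margin of error gives users a direct indicator of reliability: the true value is likely to lie within `mean ± moe`.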

Data sources

The Index of Services, however, is currently constructed from a wide variety of data sources, some of which are not based on random samples.

Because of this, it is not yet possible to measure the sampling error.

The ONS's IoS Production team uses a variety of procedures to help ensure that the estimate is of high quality.

This section describes the procedures for examining the data and identifying suspected errors. How those errors are treated is described in the section on Quality Adjustments.

The Index of Services is part of the UK National Accounts, and this is an important factor in assuring its quality.

The National Accounts framework binds the IoS into a coherent and consistent set of accounts covering the whole economy, based on many diverse data sources.

How this framework affects the IoS is described in the section on National Accounts.

The general quality of the data used in the IoS is an issue being addressed by the ONS in a programme of industry reviews, which will seek to improve the data and methods used to measure short-term changes in gross value added across all services.

More information about these reviews can be found in the section on Future Improvements.


Data analysis

An initial quality assurance of the monthly Index of Services data was carried out during 2000, as part of the run-up to launching the experimental index in December of that year. In this process all of the input time series from 1995 were examined graphically to identify any anomalies, e.g. step changes in the data or unusually extreme peaks and troughs. These anomalies were investigated with data suppliers and the reasons, where found, were documented. Where appropriate, e.g. where the anomaly was found to be caused by an error or by a change in survey methodology, quality adjustments were made to the data.
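A step change or unusually extreme peak of the kind described above can also be screened for automatically. The sketch below is an illustration, not the ONS procedure; the function name, threshold and data are assumptions. It flags periods whose period-on-period change is large relative to the spread of all changes in the series:

```python
def flag_anomalies(series, threshold=2.0):
    """Flag possible step changes and extreme peaks/troughs by scoring
    each period-on-period change against the mean and standard deviation
    of all changes. Returns the indices of the flagged observations."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = (sum((d - mean) ** 2 for d in diffs) / (n - 1)) ** 0.5
    # diffs[i] is the change into series[i + 1], hence the i + 1 below
    return [i + 1 for i, d in enumerate(diffs)
            if sd > 0 and abs(d - mean) > threshold * sd]

# Illustrative index series with one extreme spike at position 6
flags = flag_anomalies([100, 101, 100, 102, 101, 100,
                        120, 101, 100, 102, 101, 100])
```

A peak typically triggers two flags (the jump up and the fall back), while a step change triggers one. In either case the flagged points would be investigated with the data supplier, not adjusted automatically.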

Since that time, ONS has put in place regular monthly quality assurance procedures to augment the quarterly quality assurance procedures already in use. The purpose of these procedures is to understand and be able to explain movements in the data, to allow quality adjustments to be made in an informed manner and to check that the computer system is calculating the published indices correctly.

All of the time series that are inputs to the system are examined graphically to identify anomalies, as are higher-level aggregates up to and including the headline IoS. Tables of data and revisions to data, showing index numbers and growth rates, help to identify unusual behaviour. Briefing information is supplied each month by the ONS's Monthly Inquiry into the Distribution and Service Sector (MIDSS), whose data are used in the Index of Services. Similar, but less detailed, information is available on request from other data suppliers. In addition, a monthly economic briefing is provided by an ONS economist, giving information about movements in comparable data sources and major news stories that might affect services' output.
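The growth rates shown in such tables follow directly from the index numbers. A minimal sketch (the function name and figures are invented for illustration):

```python
def growth_rates(index_values):
    """Period-on-period growth rates (per cent, one decimal place)
    computed from a series of index numbers."""
    return [round(100.0 * (curr / prev - 1.0), 1)
            for prev, curr in zip(index_values, index_values[1:])]

# Illustrative index numbers only
rates = growth_rates([100.0, 102.0, 102.0, 99.96])
```

Recomputing growth rates like this after each data delivery, and comparing them with the previous month's table, is what makes revisions and unusual movements stand out.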

The monthly Index of Services and the equivalent quarterly indicator of services' output within the output measure of Gross Domestic Product, GDP(O), use a consistent approach to making quality adjustments. Criteria for making adjustments have been agreed by the IoS and GDP(O) Statisticians and regular meetings are held to agree on specific quality adjustments for both monthly and quarterly data, drawing on the information sources listed above. These criteria are described briefly in the section on Quality Adjustments and in more detail in Annex E.

The monthly IoS is benchmarked to quarterly GDP(O) for services to ensure consistency. Comparisons are made between the data before and after benchmarking, in order to check that the monthly path has not been significantly distorted. Any significant differences are investigated, since they could be the result of a processing error in one or other of the (independent) systems. Benchmarking is explained in more detail in the section on Time Series Methods. There is also a series of automatic system checks carried out each month to ensure that all procedures are functioning correctly.
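Benchmarking can be illustrated with a pro-rata sketch: each quarter's three monthly values are scaled so that their quarterly average matches the benchmark, leaving the within-quarter movement intact. This is a deliberate simplification with invented names and figures; production benchmarking typically uses a Denton-type method to avoid steps at quarter boundaries, as covered in the section on Time Series Methods.

```python
def prorata_benchmark(monthly, quarterly_benchmarks):
    """Scale each quarter's three monthly index values so their average
    equals the quarterly benchmark, preserving within-quarter movement.
    A pro-rata sketch, not the method used in production systems."""
    out = []
    for q, bench in enumerate(quarterly_benchmarks):
        months = monthly[3 * q: 3 * q + 3]
        factor = bench / (sum(months) / 3.0)
        out.extend(m * factor for m in months)
    return out

# Illustrative monthly indices and quarterly benchmarks
benchmarked = prorata_benchmark([99.0, 100.0, 101.0, 100.0, 102.0, 104.0],
                                [100.0, 103.0])
```

Comparing month-on-month growth rates computed before and after such a scaling is exactly the kind of before-and-after check described above: large differences would point to a possible processing error in one of the systems.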


Content from the Office for National Statistics.
© Crown Copyright applies unless otherwise stated.