'Chapter 10: Methodology' of the Occupational Pension Schemes Survey (OPSS) annual report 2011 covers the survey methodology, including estimates of standard errors.
The aim of the Occupational Pension Schemes Survey (OPSS) is to provide a picture of occupational pension provision in the UK. The reference date for the 2011 survey was 6 April 2011. All schemes were asked for their status (open, closed, frozen or winding up) as at this date. Questionnaires were sent out to schemes in September 2011. Schemes were asked to provide information on membership from the latest available trustees’ report and accounts.
The 2011 questionnaire was based on the one used for the 2010 survey. However, to improve coverage of scheme sections in the public sector, an additional form type was introduced for public sector schemes with multiple sections. This form was based on the existing form for public sector schemes.
Experience from previous surveys also led to improvements to some questions and changes to others which were no longer appropriate, for example because of changes to pensions legislation.
In 2011, eight types of questionnaires were sent out. These were for:
Private sector single section defined benefit schemes with 12 members or more (127 questions)
Private sector multiple section defined benefit schemes with 12 members or more (127 questions)
Private sector single section defined contribution schemes with 12 members or more (107 questions)
Private sector multiple section defined contribution schemes with 12 members or more (107 questions)
Public sector single section schemes (see Chapter 1 for definition) (42 questions)
Public sector multiple section schemes (see Chapter 1 for definition) (42 questions)
Schemes with between 2 and 11 members, regardless of whether they were defined benefit or defined contribution schemes (33 questions)
Schemes which were winding up, regardless of size or whether they were defined benefit or defined contribution schemes (15 questions)
All public sector schemes in the survey were defined benefit. These schemes were asked a reduced set of questions based on those on the private sector defined benefit questionnaire.
For the 2011 survey, multiple section forms were introduced for all public sector schemes in the sample. In the 2010 survey, only the key public sector schemes in the sample were asked if they had multiple sections. Extra questionnaires were then issued to those schemes and they provided section-level information where previously they had provided information at the scheme level. This affected results where members would be classified differently depending on whether classification was at the level of the scheme or the section.
In the 2011 questionnaires, schemes were asked to provide details of the total numbers of pensioner members and members with preserved pensions. Prior to 2008 these totals were derived by summing their constituent components. Where any components were missing, the total of the available components was used. The approach used since 2008 has led to an improvement in the estimation of total membership figures. It may also be responsible for part of the increase in numbers of pensions in payment and preserved pension entitlements between 2007 and 2008 (see Chapter 3).
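The effect of the pre-2008 derivation can be sketched as follows (hypothetical figures; the component names are illustrative only):

```python
# Hypothetical example: pre-2008, totals were derived by summing whatever
# component counts a scheme supplied, so a missing component silently
# reduced the derived total; since 2008 the total has been asked directly.
components = {"pensions_in_payment": 1200, "dependants": None, "preserved": 800}

# Pre-2008 approach: sum only the components that were reported
derived_total = sum(v for v in components.values() if v is not None)

# Approach since 2008: the scheme reports its total figure directly
reported_total = 2150

print(derived_total)   # 2000 -- understates the total when a component is missing
print(reported_total)  # 2150
```

This illustrates why the derived totals could understate membership, and hence part of the increase seen between 2007 and 2008.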
As in previous surveys, ONS used the internet to capture responses electronically. Respondents could complete the questionnaire on paper or online; the online questionnaires were replicas of the paper questionnaires sent to each scheme.
A new online system was developed in 2010 to make accessing the questionnaires and capturing responses easier, and this system was also used for the 2011 survey. The online questionnaires contained hover-over help for individual questions and, as with the paper copies, provided contact details in case of any problems.
The online questionnaires went live at the same time as the paper questionnaires were despatched. As a result, the number of schemes responding online increased compared with 2010: 60 per cent of all responses in 2011 were received via the online questionnaires, compared with 34 per cent in 2010, with electronic responses more common among schemes with more than 100 members.
The online questionnaire was also used for data entry of responses received in paper form, and proved useful in highlighting invalid or inconsistent answers from these responses during the data entry process.
To be eligible for selection for the survey, schemes needed to have ‘live’ status on the Pension Scheme Register (see The Register). In this context, ‘live’ status meant that a scheme was either open, closed, frozen or winding up; that is, it had undischarged liabilities to pay pension benefits. Schemes might have ceased being ‘live’ if they had completed the winding up process. Alternatively, they might have been removed from the register if they had ceased to have more than one member. All the schemes were selected at random from the register.
The Building and Civil Engineering Retirement Pension Benefit Scheme (closed to further accruals from April 2001) was excluded from the 2011 survey to maintain consistency of coverage with earlier surveys.
Schemes on the Pension Scheme Register can be classified by their total number of members, which makes it possible to stratify the sample by scheme size. Six size bands were used to stratify the sample, as shown in Table 10.1.
In the private sector, the vast majority of schemes are very small with fewer than 12 members, and most of the remainder have fewer than 100 members. However, the bulk of scheme membership is concentrated in a much smaller number of very large schemes. In creating the stratified sample to measure variables associated with membership, a higher sampling fraction was therefore used for the largest schemes: 100 per cent of schemes with 5,000 or more members were included to ensure greater coverage of these schemes as they have the bulk of members.
In the public sector, which comprises mainly central and local government schemes, the number of schemes is far smaller than in the private sector. All public sector administrative units with 5,000 or more members were included in the survey. On the Pension Scheme Register, public sector schemes are self-classified.
After consultation with the Methodology department at ONS, the number of private sector forms sent to schemes with between 2 and 11 members was increased to improve the reliability of the estimates of scheme numbers. Although the 2011 sample for very small schemes was slightly larger, which has in turn affected estimates of scheme numbers, the sample size remains smaller than that required to produce reliable estimates of scheme numbers (see Chapter 2).
The new proportions sampled from each size band are shown in Table 10.1.
The Pension Scheme Register is a register of all occupational pension schemes in the UK which is maintained by the Pensions Regulator. It includes information about the scheme’s status (open, closed, frozen or winding up), total membership, benefit structure, date of registration and contact details.
For the public sector, the Pension Scheme Register has a list of ‘administrative units’ rather than a list of schemes. These units may be schemes, sections or other administrative units. For example, there are many entries for the Local Government Pension Scheme as this is administered at a local level. This means that although the survey reports membership for the public sector, it is not possible to report on the number of schemes in the public sector.
Although the register is a good source of information about occupational pension schemes, the information it holds may not have fully reflected the situation at April 2011. This is because while schemes are able to amend their data at any time, they are only required by the Pensions Regulator to review and amend their registered data annually, when receiving their scheme return notification, or triennially for defined contribution schemes with 2 to 11 members.
Table 10.1: Proportion of schemes sampled, by size band

Private sector

| Size band (members) | Total on register | Proportion sampled (per cent) | Total in sample |
| --- | --- | --- | --- |
| 5,000 to 9,999 | 218 | 100.0 | 218 |
| 1,000 to 4,999 | 1,049 | 22.5 | 237 |
| 100 to 999 | 3,743 | 8.5 | 318 |
| 12 to 99 | 4,561 | 2.0 | 90 |
| 2 to 11 | 45,401 | 0.7 | 300 |

Public sector

| Size band (members) | Total on register | Proportion sampled (per cent) | Total in sample |
| --- | --- | --- | --- |
| 5,000 to 9,999 | 24 | 100.0 | 24 |
| 1,000 to 4,999 | 69 | 40.0 | 28 |
| 100 to 999 | 63 | 33.3 | 22 |
| 12 to 99 | 37 | 33.3 | 12 |
| 2 to 11 | 30 | 33.3 | 9 |
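As a rough sketch, the sample sizes in Table 10.1 follow from multiplying the register count in each band by its sampling fraction (the issued counts can differ slightly because of rounding and the subsequent sample-cleaning exercise):

```python
# Expected sample size per stratum = total on register x sampling fraction.
# Private sector figures from Table 10.1; issued counts may differ slightly.
strata = {
    "5,000 to 9,999": (218, 1.000),    # fully enumerated band
    "1,000 to 4,999": (1_049, 0.225),
    "100 to 999":     (3_743, 0.085),
}

for band, (on_register, fraction) in strata.items():
    print(band, round(on_register * fraction))
```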
Once the sample had been selected, a cleaning exercise was carried out to confirm schemes’ details prior to the forms being sent out. During this exercise, schemes were asked to confirm their benefit structure (defined benefit or defined contribution), the size band appropriate to their scheme, their contact details and their status.
The benefit structure of all schemes, as shown on the Pension Scheme Register, was checked to ensure they received the appropriate form. Larger private sector schemes were also asked whether the scheme was sectionalised. If they were, they were given the appropriate form for that section’s benefit structure, with up to a maximum of four questionnaires for any one scheme.
During the course of this exercise, some of the private sector schemes in the sample reported that they had already wound up, that they now had just one member, or that they were contract-based pensions (personal or stakeholder pensions - see Chapter 1) rather than an occupational scheme. A small number of schemes indicated that they had merged into other schemes. None of these schemes were sent forms but their information was used in subsequent calculations (see Weighting of the results).
Responses received in paper form were checked on arrival. Electronic responses were reviewed shortly after they were posted via the website. Where the answers given were unclear, either on the initial scan at receipt or on subsequent data entry, the respondent was asked to clarify the information provided.
The quality of the responses was very good: almost all respondents were able to complete the questionnaire fully. The most frequent exceptions were the questions on the status of the scheme at different dates and on the number of exits from the scheme.
The questionnaires were despatched to all schemes in the sample in September 2011. The target date for responses was 30 September 2011. Reminder letters were sent out in October 2011, followed by a telephone contact exercise. Responses continued to come in until 22 December 2011 when fieldwork was formally closed. Once this phase of the project was completed, the achieved response rate was assessed and analysis of the data began.
The overall response rate from private sector schemes was 90 per cent (Table 10.2), which was higher than that for the 2010 survey. In terms of their membership, responding private sector schemes covered 75 per cent of the total membership of all private sector occupational pension schemes on the register.
The overall response rate of 90 per cent masks variation between schemes of different sizes. Response rates were highest from the largest schemes.
Table 10.2: Response rates by size band, private sector schemes

| Size band (members) | Total sample | Responding sample | Response rate 2009 (per cent) | Response rate 2010 (per cent) | Response rate 2011 (per cent) |
| --- | --- | --- | --- | --- | --- |
| 5,000 to 9,999 | 218 | 198 | 90 | 82 | 91 |
| 1,000 to 4,999 | 237 | 216 | 91 | 93 | 91 |
| 100 to 999 | 318 | 287 | 87 | 87 | 90 |
| 12 to 99 | 90 | 75 | 74 | 81 | 83 |
| 2 to 11 | 300 | 254 | 72 | 76 | 85 |
For the public sector, responses were received from 98 per cent of administrative units (Table 10.3); these responses covered 99 per cent of members of public sector occupational pension schemes on the Pension Scheme Register.
Table 10.3: Response rates by size band, public sector administrative units

| Size band (members) | Total sample | Responding sample | Response rate 2009 (per cent) | Response rate 2010 (per cent) | Response rate 2011 (per cent) |
| --- | --- | --- | --- | --- | --- |
| 5,000 to 9,999 | 24 | 23 | 96 | 96 | 96 |
| 1,000 to 4,999 | 28 | 28 | 95 | 100 | 100 |
| 100 to 999 | 22 | 21 | 94 | 100 | 95 |
| 12 to 99 | 12 | 11 | 100 | 75 | 92 |
| 2 to 11 | 9 | 9 | 81 | 82 | 100 |
As the OPSS is a sample of pension schemes, any estimates produced from the survey responses are weighted in order to produce estimates that are valid for the whole population of schemes. Weighting is also used to account for those schemes in the sample which did not respond.
The method used for weighting the data on membership is explained in section a). The method used for weighting the data on scheme numbers (which changed in 2010) is explained in section b).
Weights were calculated in three steps. The first step set the weights to the inverse of the sampling fractions, which allowed for different sampling fractions in the different size bands and between the public and private sectors (Table 10.1).
The second step made use of information on total scheme membership from the Pension Scheme Register. While this membership was similar to the total membership collected by the OPSS, the two differ because of delays in the reporting of membership to the Pensions Regulator. The correlation between the two membership measures was exploited to improve the precision of the survey estimates. Scheme rating-up factors were calculated by multiplying the inverse of the sampling fractions by the ratio, for those schemes that responded to the survey, of the total membership from the register to the total membership from the survey. The resulting weighting factors are shown in Tables 10.4 and 10.5.
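As a sketch of this calculation, using the 1,000 to 4,999 band from Table 10.4 (the membership figures are taken from that table):

```python
# Rating-up factor for one band: the inverse of the sampling fraction,
# multiplied by the ratio of band membership to responding membership
# (figures from the 1,000 to 4,999 row of Table 10.4).
inverse_sampling_fraction = 1 / 0.225   # 22.5 per cent of the band was sampled
membership_in_band = 508_202
membership_responding = 460_721

weight = inverse_sampling_fraction * (membership_in_band / membership_responding)
print(round(weight, 2))  # 4.9 -- the weight factor shown in Table 10.4
```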
Table 10.4: Weighting factors by size band, private sector schemes

| Size band (members) | Membership of responding schemes | Membership of all schemes in levy band | Response (per cent) | Inverse sampling fraction | Weight factor |
| --- | --- | --- | --- | --- | --- |
| 5,000 to 9,999 | 1,383,221 | 1,527,832 | 91 | 1.00 | 1.10 |
| 1,000 to 4,999 | 460,721 | 508,202 | 91 | 4.44 | 4.90 |
| 100 to 999 | 93,570 | 106,633 | 88 | 11.76 | 13.41 |
| 12 to 99 | 3,018 | 3,585 | 84 | 50.00 | 59.39 |
| 2 to 11 | 746 | 874 | 85 | 151.52 | 177.51 |
Table 10.5: Weighting factors by size band, public sector administrative units

| Size band (members) | Membership of responding units | Membership of all units in levy band | Response (per cent) | Inverse sampling fraction | Weight factor |
| --- | --- | --- | --- | --- | --- |
| 5,000 to 9,999 | 175,955 | 184,549 | 95 | 1.00 | 1.05 |
| 1,000 to 4,999 | 74,266 | 74,266 | 100 | 2.50 | 2.50 |
| 100 to 999 | 10,904 | 11,307 | 96 | 3.00 | 3.11 |
| 12 to 99 | 353 | 397 | 89 | 3.00 | 3.38 |
| 2 to 11 | 43 | 43 | 100 | 3.00 | 3.00 |
These factors related to returns as a whole. In some cases, there had to be further adjustments to these factors for individual questions where there was item non-response. For example, if response in a fully enumerated stratum had been 90 per cent, then the resulting weighting factor would, in theory, be applied to all questions. In some cases, however, a question might not have been answered by all respondents.
If only 98 per cent of the respondents had answered a particular question, then the grossing factor for that question would not be the inverse of 90 per cent applied to the rest of the form but the inverse of 98 per cent times 90 per cent (that is, the inverse of 88.2 per cent). Separate weights were calculated for key variables of interest; for example, active members, pensioner members and ‘deferred members’ (former employees with preserved pension entitlements who are not yet receiving a pension).
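The combined grossing factor described above can be computed directly:

```python
# Item non-response: gross up by the inverse of the product of the unit
# response rate (90 per cent) and the item response rate (98 per cent).
unit_response = 0.90
item_response = 0.98

combined_rate = unit_response * item_response   # 88.2 per cent
grossing_factor = 1 / combined_rate             # the inverse of 88.2 per cent
print(round(combined_rate, 3), round(grossing_factor, 3))
```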
The methodology used to produce weights for the 2011 survey was similar to that used for previous surveys. However, it should be noted that during work on the 2007 survey some adjustments were made to the 2006 and 2007 survey results. This was partly to do with adjustments relating to late returns for the 2006 survey, and partly to do with a review of the estimation methods applied to the survey, in particular the treatment of non response and schemes that were found to be out of scope. Due to these developments in the estimation methods, when comparing results from 2006 onwards with previous years, caution should be exercised.
Following a robust methodological review, scheme numbers were re-introduced into the OPSS annual report in 2010, for the first time since 2007. Figures for scheme numbers had always been weaker than other OPSS estimates as the survey is designed primarily to measure membership numbers.
The 2011 scheme numbers were calculated using the new methodology introduced in a methodology paper published in May 2011. This new methodology attempts to correct for the identification of 'out of scope' schemes at different stages of survey processing. Examples of out-of-scope schemes are a scheme with fewer than two members, or a scheme that has completed the winding up process.
The review has improved the methodology for weighting estimates of scheme numbers, but the new methodology has not solved the problem of sampling variability which produced a set of unusual results in 2008. The only way to solve this problem would be to allocate additional resources to the survey so that the sample size could be increased, particularly for very small schemes. However, ONS does not consider this to be a priority in terms of resource allocation at a time of tight budgets. Although the 2011 sample for very small schemes was slightly larger, which has in turn affected estimates of scheme numbers, the sample size remains smaller than that required to produce reliable estimates of scheme numbers.
The standard error measures the accuracy with which a sample represents a population. A sample mean will generally deviate from the true mean of the population; the standard error measures the typical size of this deviation. The smaller the standard error, the more representative the sample is of the overall population. Tables 10.6 and 10.7 show estimated standard errors for three key numerical variables in the survey – the total number of active members, the number of pensions in payment and the number of preserved pension entitlements – as reported by respondents. Estimates are shown for each of the six size bands, together with the overall standard error for all six size bands combined.
Table 10.6: Estimated standard errors, private sector schemes

| Size band (members) | Active members | Pensions in payment | Preserved entitlements |
| --- | --- | --- | --- |
| 10,000 + | +/- 58,000 | +/- 68,000 | +/- 44,000 |
| 5,000 to 9,999 | +/- 16,000 | +/- 8,000 | +/- 7,000 |
| 1,000 to 4,999 | +/- 64,000 | +/- 42,000 | +/- 47,000 |
| 100 to 999 | +/- 32,000 | +/- 23,000 | +/- 33,000 |
| 12 to 99 | +/- 9,000 | +/- 8,000 | +/- 15,000 |
| 2 to 11 | +/- 5,000 | +/- 3,000 | +/- 5,000 |
| Overall | +/- 94,000 | +/- 84,000 | +/- 74,000 |
Table 10.7: Estimated standard errors, public sector schemes

| Size band (members) | Active members | Pensions in payment | Preserved entitlements |
| --- | --- | --- | --- |
| 10,000 + | +/- 61,000 | +/- 38,000 | +/- 38,000 |
| 5,000 to 9,999 | +/- 1,000 | +/- 1,000 | +/- 2,000 |
| 1,000 to 4,999 | +/- 8,000 | +/- 5,000 | +/- 3,000 |
| 100 to 999 | +/- 3,000 | +/- 3,000 | +/- 1,000 |
| 12 to 99 | +/- 0 | +/- 0 | +/- 0 |
| 2 to 11 | +/- 0 | +/- 0 | +/- 0 |
| Overall | +/- 61,000 | +/- 38,000 | +/- 38,000 |
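On the assumption that the size-band estimates are independent, the overall standard error is the square root of the sum of the squared band-level standard errors. The active members figures from Table 10.6 illustrate this:

```python
import math

# Band-level standard errors for active members, private sector (Table 10.6).
# Treating the strata as independent, the overall standard error is the
# square root of the sum of the squared band-level standard errors.
band_standard_errors = [58_000, 16_000, 64_000, 32_000, 9_000, 5_000]

overall = math.sqrt(sum(se ** 2 for se in band_standard_errors))
print(round(overall, -3))  # 94000.0 -- the overall figure in Table 10.6
```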
Some of the tables that appear in this report have had individual cells suppressed in order to protect the confidentiality of respondents. The confidentiality of respondent information is protected by suppressing cells that might directly reveal results for individual schemes, known as primary suppressions. Other cells must also be suppressed to prevent their values being calculated by subtraction from the marginal totals of the table. These are known as secondary suppressions.
Estimates have also been rounded to further protect confidentiality. To avoid bias, controlled rounding was used: in this case, values exactly halfway between two whole numbers are rounded to the nearest even number, so, for example, 23.5 and 24.5 may both be rounded to 24. Values may be rounded down to zero, so not all zeros are necessarily true zeros, and table totals may not equal the sum of individual cells.
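For illustration, Python's built-in round applies the same half-to-even rule:

```python
# Round-half-to-even: values exactly halfway between two integers are
# rounded to the nearest even number, avoiding systematic upward bias.
print(round(23.5))  # 24
print(round(24.5))  # 24 -- both halves round to the even neighbour
```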
Details of the policy governing the release of new data are available by visiting www.statisticsauthority.gov.uk/assessment/code-of-practice/index.html or from the Media Relations Office email: email@example.com
These National Statistics are produced to high professional standards and released according to the arrangements approved by the UK Statistics Authority.