1. Overview of the Labour Market Survey

As part of the Office for National Statistics (ONS) and UK Statistics Authority Business Plan for April 2019 to March 2022 (PDF, 745KB), the Census and Data Collection Transformation Programme (CDCTP) is leading an ambitious programme of work to put administrative data at the centre of the population, migration, and household statistical systems. This programme of research is underpinned by the Digital Economy Act 2017, which allows the ONS to access data directly from administrative and commercial sources for research and statistical purposes for the public good.

In addition to administrative data, however, there will also remain a need for some residual survey data collection, which ONS intends to be digital by default. In the context of household surveys, this means providing an online self-completion mode to respondents (as well as face-to-face and telephone collection). An online mode will:

  • enable respondents to provide data at their own convenience

  • reduce respondent burden

  • reduce operational costs

To collect labour market data via an online-first collection design, a new prototype product is being developed called the Labour Market Survey (LMS). The LMS is a mixed-mode survey that offers an online collection mode by default and focuses on the core data collection requirements needed to produce labour market estimates. It is anticipated that, after the integration of administrative data into the estimation system for labour market statistics, the LMS will replace the Labour Force Survey (LFS) and be the instrument used to collect any residual survey data requirements.

The LMS would also be used to collect socio-demographic variables, which would allow the survey data to be linked to the administrative data sources. The current design of the LMS is still in development and as such is currently a prototype. It is based upon the design of the LFS but there are some differences between the two products, which this report will detail.

The LMS Attrition Test detailed in this report builds upon the research and learning from a series of previous iterative tests as well as a suite of qualitative and user research. This test marks the first instance of testing the LMS longitudinally across multiple waves, investigating attrition and retention rates for sampled households. The test is online only for each wave and has not been designed to produce statistical estimates of employment-related data. Instead, the objective is to look at the uptake and response rates, and the characteristics of the households that complete at each wave.

Back to table of contents

2. Approach to questionnaire design

The existing Labour Force Survey (LFS) is designed for face-to-face and telephone capture modes. With the introduction of an online mode for the Labour Market Survey (LMS), it is not possible to simply translate the existing LFS questionnaire design and flow to this new mode. Initial internal research demonstrated that the content is not suitable for self-completion as it was developed for interviewer-led collection. As a result, a transformative approach has been taken to the development of the Labour Market Survey, which makes it significantly different to the Labour Force Survey.

The questions included in the prototype have been designed in a respondent-centric way. This is a departure from the way that questionnaires have been traditionally designed at the Office for National Statistics (ONS). Previously, questionnaire content has been designed and developed using terms, concepts and flows that satisfy the data user requirements. Respondent-centred design shifts the focus of the design effort to the respondent user. It ensures that the terms are respondent friendly (for example, recycling the language that they use to describe their circumstances in the question wording) and adapts the questionnaire flow to meet their mental model of a concept. These designs create a questionnaire that the respondent can identify with while still gathering accurate data to satisfy the analysts' needs.

To design and deliver in this way required an extensive qualitative design research programme, involving over 1,000 members of the public (to date) to iteratively design the questions to deliver the specified data user needs for the core LMS. This process is described in brief in this section.

There are four phases to the qualitative redesign:

  • Discovery

  • Alpha

  • Beta

  • Live

These phases are the design phases recommended in the GOV.UK service manual.

Discovery phase

The Discovery phase was the first step in the redesign process, and it involved gathering the data user needs for each variable. The researchers put the existing question set to one side and explored with the data user each data point, aiming to determine the purpose of the analysis and original question. Once this was established it enabled the researchers to develop testing guides and research plans to explore the core of these concepts with the public and ONS field interviewers.

The next step was to conduct insight sessions with ONS interviewers to discuss the current data collection process, with the main aim of learning what was and was not working well in terms of questionnaire wording and flow. Observations of live data collection with interviewers also took place for existing questions, to see first-hand how they performed in the field.

The third step was to interrogate data already available such as the current LFS questionnaire. This enabled inefficiencies in the questionnaire flow to be identified as well as opportunities to modify routing to improve the respondent experience. For example, questions that LFS data demonstrates are only applicable to a small number of respondents can be moved to more appropriate sections of the questionnaire.

Finally, in-depth interviews were conducted with members of the public on certain topics to learn about their mental model for particular employment statuses. Once this was completed, user stories that documented the user needs were developed along with an Alpha phase research plan.

In the Alpha phase, a series of prototype questions was developed. In the at-desk design stage, the prototypes were developed primarily for mobile devices (smartphones or tablets) to accommodate the smaller screen sizes of such devices. This was done to encourage the researcher to be strict with content because of the limited space; it helped to create leaner questions and reduced the opportunity to add on-screen information, in turn forcing the design challenge to be addressed in an innovative way. Each prototype question was optimised by mode (online, face-to-face, telephone) and developed for both in-person and proxy completion.

Each question was designed for online first and tested until the research demonstrated that it met the respondent and data user needs. The ONS interviewers were also given sight of the questions before testing with the public, to get their professional opinion of the redraft. Once the online mode was completed, this design was then used as the base question for researching the other modes. This approach was taken because a question that has been fully researched and designed in a user-centred way to be understood in a self-completion mode should also be understood in a mode with an interviewer present.

The research work then focused on testing that question in the alternative mode to discover where adaptations were required to optimise the question for the mode while still meeting the data need. Each prototype was tested for readability using an online tool, which checks the reading age of the wording. The overall question look, or "pattern", is designed to be accessible to all users based on Government Digital Service (GDS) standards.
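
The report does not name the readability tool used, so the sketch below simply illustrates one common reading-age check (the Flesch-Kincaid grade level) that such a tool might apply to draft question wording. The scoring constants and the syllable counter are standard approximations, not the actual ONS check.

```python
# Illustrative readability check only; the real tool used for the LMS is not named in the report.
import re

def count_syllables(word):
    """Rough syllable count based on vowel groups (an approximation)."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text):
    """Estimate the reading grade needed for the text (Flesch-Kincaid formula)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

# A short, plainly-worded question scores a low reading age
print(flesch_kincaid_grade("Last week, were you doing any paid work?"))
```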

Alpha phase

The Alpha phase consists of iterative testing: rounds of testing multiple versions of the questions, edited based on insights from the research activity. The LMS prototypes were tested qualitatively with members of the public via in-depth interviews and cognitive tests.

The samples for these rounds were recruited through a recruitment agency that is widely used across UK government in the design of services. The samples were targeted based on the research plan and required learning. The interviews covered both cognition and usability in the same session, as question layout can impact upon comprehension. Up to 25 questions were included in a round to ensure that questions were asked in context, which influences comprehension.

The "mental model" concept was explored further in the Alpha phase to validate whether the changes made using the Discovery insights were accurate. Respondents were interviewed to learn about how they understood, processed and responded to the questions they were being asked to consider. Interviews were transcribed and analysed using thematic analysis techniques to identify common themes and issues.

The prototypes were then iteratively redesigned, and the testing cycle was repeated at least three times:

  • test one aims to test the initial draft

  • test two aims to test the changes from test one insights

  • test three explores whether the changes have fixed the issue

Where an issue still existed after test three, the questions were integrated into subsequent rounds to refine them further.

Once this process was complete, the question design moved into the Beta phase.

Beta phase

The Beta phase involves quantitative testing using a large sample, incorporating the operational design for communications with users. Several quantitative tests of the Labour Market Survey have taken place, including this current test. The large-scale data obtained from quantitative tests is analysed to obtain information on:

  • data quality

  • routing issues

  • drop-offs

  • paradata

If any issues are identified, then further rounds of Alpha and Beta testing are conducted.

A minimal set of questionnaire checks were included in this prototype LMS questionnaire. Logic checks such as invalid dates and preventing alpha characters being input into numeric fields were included, but more detailed and specific checks such as ensuring consistency in the household relationships (for example, a grandfather and granddaughter are coded correctly) were not included. There were no questions that were "hard checked" and had to be completed; any question could be bypassed without answering. This was by design to determine how a minimal set of checks and skippable questions would influence completion rates and respondent journeys.
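
To illustrate the distinction between these light-touch checks and hard checks, the sketch below shows the kind of soft validation described. The field names and rules are assumptions for illustration, not the actual Blaise 5 checks: invalid dates and alphabetic characters in numeric fields are rejected, but a skipped (empty) answer is always accepted.

```python
# A minimal sketch of "soft" validation: any question can be left blank, but
# answers that are supplied must pass basic logic checks.
from datetime import date

def check_date_of_birth(answer):
    if answer == "":            # question skipped - allowed by design in this test
        return True
    try:
        year, month, day = (int(part) for part in answer.split("-"))
        date(year, month, day)  # raises ValueError for impossible dates, e.g. 30 February
        return True
    except ValueError:
        return False

def check_hours_worked(answer):
    if answer == "":            # question skipped - allowed by design in this test
        return True
    return answer.isdigit()     # alpha characters are rejected in numeric fields

print(check_date_of_birth("1985-02-30"))  # False - invalid date
print(check_hours_worked(""))             # True - skipped questions are accepted
```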

Back to table of contents

3. Quantitative testing to date

In addition to the qualitative research outlined, a series of iterative quantitative tests have also taken place to provide evidence to inform the transformation of the Labour Force Survey (LFS).

Test one: effectiveness of communication strategies

The first test in this series was conducted in July 2017 and was a single wave, online-only, response rate test of an initial prototype Labour Market Survey (LMS). The test aimed to determine the effectiveness of different communication strategies (letter content, envelope colour, postal days, and envelope branding). The outcomes of the test provided a baseline online response rate of 19.5%, an indication of the most appropriate communications strategy to use in future tests and validated the approach that was being taken. Further details on this test can be found in the Labour Market Survey response rate experiments report for test one.

Test two: incentivisation strategies

Subsequent to this, a second test conducted between September and October 2017 tested different incentivisation strategies (£5 or £10 unconditional vouchers, £5 or £10 conditional vouchers, reusable canvas carrier bag). This test demonstrated that the cost-effective canvas bag incentive could produce a Wave 1 online response rate of 27.5%, and further reinforced the effectiveness of the engagement strategy. Further details from this test can be found in the Labour Market Survey response rate experiments report for test two.

Test three: LMS statistical test

The third iterative test, the LMS Statistical Test, was designed to produce important labour market statistical estimates, which could be compared with the LFS over a similar time period. This was the first instance in which the prototype LMS survey had been used to produce such estimates and provided an initial basis for quantifying the similarities and differences between the statistical outputs from each survey.

The test used a mixed-mode approach (online and face-to-face) and marked the first time the ONS has tested such a mode combination at scale for a household survey. Only wave 1 collection was included in the LMS Statistical Test rather than the longitudinal model used for the LFS. There were three reports published from this test:

  • a technical report

  • a characteristics report

  • a comparative estimates report

All of these reports can be accessed from the Labour Market Survey technical report.

Back to table of contents

4. Objectives of the test

The objectives of the test were:

  • to obtain further evidence of online uptake and response rates at wave 1

  • to measure online uptake, response and attrition rates between waves 1 and 2, and between waves 2 and 3

  • to test different incentivisation strategies at waves 2 and 3

  • to test the effect of between wave engagement (BWE) on attrition rates between waves 1 and 2, and between waves 2 and 3

  • to measure the effect of reducing the number of communications sent to respondents at waves 2 and 3

Data collection was performed by the Office for National Statistics (ONS) using Blaise 5 online data collection software. The infrastructure for this was hosted by Northern Ireland Statistics and Research Agency (NISRA).

This report details the technical design of the Labour Market Survey (LMS) attrition test and provides detail on the design of this test and the differences between the designs of the LMS and the Labour Force Survey (LFS), the current source of labour market data. Socio-demographic data can be found in the LMS attrition test characteristics report, which should be read in conjunction with this report to provide context to the results.

Back to table of contents

5. Sample design

The sample for addresses in England and Wales was drawn from AddressBase, an Ordnance Survey and GeoPlace product comprising local authority, Royal Mail and Council Tax data, available to the Office for National Statistics (ONS) under the Public Sector Mapping Agreement. This product will have future use in sampling ONS address-level surveys such as the census or social surveys.

Currently, the Postcode Address File is used as the sampling frame for social surveys - this is a list of all addresses to which Royal Mail deliver mail. At the time of the test, AddressBase did not contain data on addresses in Scotland, so the Postcode Address File was used as the sampling frame for Scottish addresses.

A proportional sample of 50,000 addresses was drawn across England (43,376), Scotland (4,490) and Wales (2,134) using a stratified simple random selection process, similar to the process used for the Labour Force Survey (LFS). Any households that had been sampled for another ONS household survey less than two years before the sample for this test was drawn were excluded from the sampling frame.
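
As an illustration of the proportional allocation described above, the sketch below apportions the 50,000 addresses across countries in proportion to assumed frame sizes. The frame counts are invented to reproduce the published split, and the real LFS-style stratification is more detailed than this.

```python
# A minimal sketch, assuming simple proportional allocation across countries
# followed by simple random selection within each country.
import random

TOTAL_SAMPLE = 50_000

# Illustrative frame counts only (not the real frame sizes), chosen in
# proportion to the published sample split of 43,376 / 4,490 / 2,134.
frame = {"England": 867_520, "Scotland": 89_800, "Wales": 42_680}

frame_total = sum(frame.values())
allocation = {country: round(TOTAL_SAMPLE * size / frame_total)
              for country, size in frame.items()}
print(allocation)  # {'England': 43376, 'Scotland': 4490, 'Wales': 2134}

# Within each country, addresses would then be drawn by simple random sampling
# from the frame (AddressBase, or the Postcode Address File for Scotland),
# excluding addresses sampled for another ONS household survey in the last two years.
addresses_in_wales = [f"address_{i}" for i in range(frame["Wales"])]  # placeholder frame
wales_sample = random.sample(addresses_in_wales, allocation["Wales"])
```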

The target population of the Labour Market Survey (LMS) was based upon the population of Great Britain who are resident in private households. Unlike the LFS, residents in National Health Service (NHS) accommodation, young people living away from home in student halls of residence or other similar institutions, and addresses north of the Caledonian Canal were excluded from the sample for this test of the LMS. This was for the purposes of this particular test, rather than a design feature of the LMS. In the longer term, the LMS would include these addresses in its sample.

Table 1 demonstrates the composition of the LMS test sample by country and English region. Addresses in England comprised 86.8% of the sample; 4.3% of sampled addresses were in Wales, and 9.0% were in Scotland.

The definition of an "ineligible" address for this LMS test was consistent with the definition used for the LFS. Properties that were vacant, demolished or under construction were not eligible for the test, nor were communal establishments or institutions, holiday or second homes and non-residential addresses such as businesses.

It is difficult for many of these categories to be assessed using an online-only survey - information on second homes, for example, relies on respondents contacting the ONS to inform them of the status of the property. As a result, the proportion of ineligible cases measured by online surveys is very low, requiring an estimate of eligibility to be made for response rates to be more reflective of the actual measure. Based on previous surveys, in particular the LMS Statistical Test, the proportion of eligible households was estimated to be 95% (47,500 of the 50,000 households sampled).
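
The eligibility adjustment is simple arithmetic: the denominator for response and engagement rates is the estimated eligible sample rather than all sampled addresses. A minimal sketch is shown below; the count of responding households is illustrative (chosen so that it reproduces the wave 1 response rate reported later).

```python
# A minimal sketch of the eligibility-adjusted response rate calculation.
SAMPLED_ADDRESSES = 50_000
ESTIMATED_ELIGIBLE_PROPORTION = 0.95

eligible_base = SAMPLED_ADDRESSES * ESTIMATED_ELIGIBLE_PROPORTION   # 47,500 households

responding_households = 13_490   # hypothetical count, for illustration only
response_rate = responding_households / eligible_base
print(f"{response_rate:.1%}")    # 28.4% with this illustrative count
```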

Back to table of contents

6. Questionnaire content

The content of the prototype (or "Beta") Labour Market Survey questionnaire has been developed from the core requirements specified by Labour Market and Households Division in the Office for National Statistics (ONS). This is based on the ONS framework for labour market statistics, which has been developed in accordance with International Labour Organisation (ILO) principles.

The framework is based upon the concepts of supply and demand in the labour market. The "supply" aspect consists of those people defined as being employed, or those who are unemployed or economically inactive but can be considered as potential labour supply. The "demand" aspect relates to employers who require the work to be done. The prototype LMS has content based on the supply side of the framework, and this test covers the "core" requirements only.

The core requirements collected as part of this test related to:

  • demographic information for all household members

  • individual demographics such as date of birth, nationality, marital status

  • individual employment such as employment or unemployment status and hours worked

  • highest educational qualification

For the purposes of this test, all core requirements have been collected using the survey instrument - the future vision for social surveys is that administrative data will form the basis of the data source, and the survey element will capture residual requirements and data linkage variables. Research is continuing into the use of administrative data sources, and this test does not incorporate any administrative data - this will form the basis of future testing and research.

It should be noted that the questionnaire content for this test does not represent the final content of the proposed Labour Market Survey. The content is a prototype that has been, and continues to be, developed iteratively based on the evaluation of testing and further research. The survey design used in this test will be re-evaluated based on the outcomes of the test and will be iteratively improved upon.

The ONS Labour Market Framework covers additional labour market requirements, which this test did not capture - for example, temporary work, guaranteed minimum hours, and expanded self-employment concepts. This is called the "expanded core". As per the ONS strategy, work is ongoing to determine if administrative data sources can be used to provide these data and, where it is not possible, residual survey data collection will be investigated.

Back to table of contents

7. Rolling reference week

To reduce potential recall bias, the Labour Market Survey (LMS) test uses a rolling reference week for the labour market content, compared with the fixed reference week used on the Labour Force Survey (LFS).

The rolling reference week is defined as the week prior to the date on which a household started to complete either the online or face-to-face survey; this is an automatic process performed by the collection instrument. Once the rolling reference week for a household has been defined it remains static; if a household returns to the survey at a later date to enter further data, the reference week remains unchanged.

A fixed reference week is a reference week for which the date has been pre-determined prior to the start of the data collection. The reference week is again determined by the collection instrument, but it will be to a fixed week within the collection month. This is the type of reference week used in the LFS.

The use of a rolling reference week was put in place to aid respondent recall. This was based on evidence from the qualitative testing process, which supported the use of the rolling week to aid the ability of respondents to provide the correct information. If the reference week is closer to the collection date then respondents are more likely to recall the information they are being asked to provide, thereby reducing recall bias and error. This would be particularly applicable for the labour market statistics collected by the LMS, which asks respondents to recall labour market information from certain dates, but this test is not looking to produce such labour market estimates. The explanation of the use of the rolling reference week is provided to describe the survey design, rather than as a factor which influences response rates.
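
A minimal sketch of the two reference-week rules is shown below. The real logic sits inside the collection instrument, and the exact week boundaries (Monday to Sunday) and the choice of fixed week are assumptions made for illustration.

```python
# Illustrative only: how a rolling and a fixed reference week could be derived.
from datetime import date, timedelta

def rolling_reference_week(first_access):
    """The week prior to the date the household first starts the survey,
    taken here as the Monday-to-Sunday week before first access."""
    last_sunday = first_access - timedelta(days=first_access.weekday() + 1)
    monday = last_sunday - timedelta(days=6)
    return monday, last_sunday

def fixed_reference_week(collection_month_start):
    """A pre-determined week within the collection month, fixed before
    collection begins (illustrative: the first full Monday-to-Sunday week)."""
    days_to_monday = (7 - collection_month_start.weekday()) % 7
    monday = collection_month_start + timedelta(days=days_to_monday)
    return monday, monday + timedelta(days=6)

# Once set, the rolling reference week stays the same for the household,
# even if they return to the survey on a later date.
print(rolling_reference_week(date(2019, 3, 13)))   # the week before first access
print(fixed_reference_week(date(2019, 3, 1)))
```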

Back to table of contents

8. Engagement strategies

The previous Labour Market Survey (LMS) tests have provided evidence towards the optimal engagement strategy for an online-first survey, and this evidence-based approach was used to define the materials, content, and mailing strategy for this online-only test. As the attrition test marked the first instance of longitudinal collection, there were some experimental conditions included at waves 2 and 3 to test how different communications and incentives could affect response rates.

At wave 1, all sampled addresses were sent a pre-notification letter that included details informing respondents that they had been sampled to take part in a social survey, and that an invite letter would be arriving in the coming days. This initial communication also included information on how to find out more about the survey by going online or contacting the survey helpline (the ONS Survey Enquiry Line). Letters were sent by second-class post and dispatched on a Wednesday, with the expectation that they would be delivered either on the Friday or the Saturday of that week.

The invite letters were sent one week after the pre-notification letter and included instructions for respondents on how to complete the survey. This involved going to the URL www.ons.gov.uk/takepart (the landing page) and clicking a "start now" button. Respondents were then directed to a website where they could enter a 12-digit numeric unique access code (UAC) to access the survey. Each invite letter contained a UAC, which was associated with the sampled household only. The invite letter also contained an unconditional incentive for each sampled address - the incentive was a tote bag: a reusable canvas bag with a graphic representing statistics produced by the ONS printed in colour on one side. This type of incentive is unique to the LMS and is part of the test. It is not currently in use for the LFS or any other social survey.

A reminder letter was sent to all addresses that had not accessed the online survey after five days (the Monday after the invite letter was sent). This letter was sent on a Tuesday, second class, to arrive on the Thursday of that week. Respondents were informed that the survey would close 11 days after data collection started, but the survey remained open for another week, as previous tests had demonstrated that there are households that will attempt to access and complete the survey after the stated deadline. The timeline of the engagement strategy in operation was as follows (a short date sketch is given after the list):

  • T minus nine days - pre-notification letter for the online survey is dispatched (Wednesday)

  • T minus two days - invite letter for the online survey, including the UAC, is dispatched (Wednesday)

  • T - online data collection starts (Friday)

  • T plus four days - reminder letter is dispatched (Tuesday)

  • T plus 11 days - date of survey closure as stated on the invite and reminder letters

  • T plus 18 days - actual closure of survey
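
Expressed as offsets from T (the Friday on which online data collection starts), the timetable above can be sketched as follows; the example start date is illustrative.

```python
# A minimal sketch of the wave 1 mailing timetable, expressed as offsets from T.
from datetime import date, timedelta

def mailing_schedule(collection_start):
    """collection_start (T) is assumed to be a Friday."""
    return {
        "pre-notification letter dispatched (Wednesday)": collection_start - timedelta(days=9),
        "invite letter with UAC dispatched (Wednesday)":  collection_start - timedelta(days=2),
        "online data collection starts (Friday)":         collection_start,
        "reminder letter dispatched (Tuesday)":           collection_start + timedelta(days=4),
        "stated survey closure":                          collection_start + timedelta(days=11),
        "actual survey closure":                          collection_start + timedelta(days=18),
    }

for event, day in mailing_schedule(date(2019, 3, 15)).items():  # 15 March 2019 was a Friday
    print(f"{day:%a %d %b}: {event}")
```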

Back to table of contents

9. Data collection operations

The Labour Market Survey (LMS) attrition test was online only - there was no interviewer-led data collection for this test. The online data collection hosting was provided by the Northern Ireland Statistics and Research Agency (NISRA), with the questionnaire instrument programmed by the Office for National Statistics (ONS). The Survey Enquiry Line (SEL), a help and advice service for respondents, was provided by the ONS.

At each of the three waves, all eligible addresses were issued at the same time - this means that at wave 1, all 50,000 sampled addresses were sent their initial communications at the same time. At waves 2 and 3, all eligible addresses were also issued their next set of communications at the same time. This was done to reduce the overall data collection period for each wave of the test.

The online data collection period was three weeks in total at each of the three waves of the survey. As detailed in the Engagement strategy section, a series of letters were sent to each sampled address to inform them they had been sampled to take part in a social survey, and to provide them with the information they required to access and complete the survey online.

There were no restrictions on how often a household could use their unique access code (UAC) to access their online questionnaire during the data collection period. For security and data protection, the questionnaire was "locked" when a household exited the survey either upon completion or when they closed their internet browser and their session ended. If the UAC was then used to access the survey again, the respondent would be taken to the last question answered and could not return to previous answers - this was to prevent data disclosure.
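
The following is a conceptual sketch of the resume behaviour described above, not the Blaise 5 implementation: on re-entry with the UAC the household lands on the first unanswered question and cannot navigate back to earlier answers.

```python
# Conceptual sketch only of the locked, forward-only questionnaire session.
class HouseholdSession:
    def __init__(self, questions):
        self.questions = questions
        self.next_index = 0              # first unanswered question
        self.answers = {}

    def resume(self):
        """Re-entering the UAC returns the next unanswered question;
        earlier answers stay locked and are not redisplayed."""
        return self.questions[self.next_index]

    def answer(self, value):
        self.answers[self.questions[self.next_index]] = value
        self.next_index += 1             # the journey only moves forward, never back

session = HouseholdSession(["date_of_birth", "nationality", "employment_status"])
session.answer("1985-06-01")             # household answers, then closes the browser
print(session.resume())                  # 'nationality' - cannot return to date_of_birth
```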

Back to table of contents

10. Experimental groups and conditions

This test included eight experimental groups, which aimed to determine how different between wave engagement (BWE) strategies, different incentivisation strategies, and a reduced communications strategy at waves 2 and 3, would impact upon response rates.

In an attempt to increase engagement with respondents, different BWE strategies were tested. Two of the experimental groups received an email six weeks before the start of wave 2 (the approximate mid-point between the end of wave 1 and the start of wave 2). The email thanked the respondent for their participation in the survey and explained how their data could be used to produce labour market outputs. Two separate experimental groups received the same content but in the form of an A5 postcard, which was mailed to their sampled address. If households then responded at wave 2, they were again sent an email or a postcard six weeks prior to the start of wave 3, and again this thanked the respondent for their participation and contained further information about how their data could be used. The remaining four experimental groups did not receive any form of between wave engagement.

At wave 2, four of the eight experimental groups were given an unconditional incentive in the form of a £5 gift voucher. The other four experimental groups received no incentive. The purpose of this was to determine the effect that a monetary incentive would have on the attrition rate between waves 1 and 2, with the expectation that the incentive would increase response and reduce attrition, but it was unknown by how much.

At wave 3 only one of the eight experimental groups received an unconditional incentive, which was again a £5 gift voucher. This group also received a voucher at wave 2. The purpose of this group was to test the effect on the attrition rate of giving an incentive at each wave, as opposed to giving an incentive at wave 1 or at both waves 1 and 2, but then providing no incentive at wave 3.

The final experimental group received fewer communications than the other experimental groups. While all the other groups received a pre-notification letter at waves 2 and 3, this experimental group did not and instead their first piece of communication at waves 2 and 3 was the invite letter containing their unique access code (UAC) for accessing the survey. This experimental group also received no incentive at waves 2 or 3. The purpose of this group was to measure the effect of sending fewer letters to respondents and how this would impact upon the attrition rate.

The experimental conditions outlined were combined to form the experimental groups, and their composition can be seen in Table 2. It should be noted that the sample sizes of the test do not allow for response and attrition comparisons to be made between the groups, rather the comparisons are to be made between the experimental conditions. For example, the effect of an email BWE with a £5 gift voucher cannot be compared with the effect of a postcard BWE with no incentive - rather, the comparison would be between all email engagement groups against all postcard groups. The sample sizes required to allow comparisons between individual groups would have been significantly larger than the 50,000 used for this test.

The allocation of the sample for the experimental groups was performed after the completion of wave 1. This was because only those households that provided an email address could be allocated to the email BWE groups, and this information was provided as part of the wave 1 survey. Allocation after wave 1 also meant that each experimental group could be equal in size. Any households that had fully or partially completed the survey at wave 1 were allocated to an experimental group. Households that did not participate in wave 1 (non-contacts, refusals or ineligibles) were not allocated at wave 2. Similarly, those households that were non-contacts, refusals or ineligibles at wave 2 were not issued to wave 3.
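
A minimal sketch of these allocation rules is given below; the field names, group labels and the assumption that groups 0 and 1 are the email BWE groups are illustrative only.

```python
# A minimal sketch: wave 1 responders only, equal group sizes, and email BWE
# groups restricted to households that supplied an email address at wave 1.
import random

def allocate_to_groups(wave1_responders, n_groups=8):
    """wave1_responders: households that fully or partially completed wave 1
    (non-contacts, refusals and ineligibles are excluded before this point).
    Groups 0 and 1 are treated as the email BWE groups - an assumed labelling."""
    random.shuffle(wave1_responders)
    group_size = len(wave1_responders) // n_groups

    with_email = [h for h in wave1_responders if h.get("email")]
    without_email = [h for h in wave1_responders if not h.get("email")]

    groups = {}
    # Email BWE groups can only contain households that provided an email address.
    groups[0] = with_email[:group_size]
    groups[1] = with_email[group_size:2 * group_size]

    # Remaining households (including unused email providers) fill the other groups equally.
    remainder = with_email[2 * group_size:] + without_email
    random.shuffle(remainder)
    for g in range(2, n_groups):
        groups[g] = remainder[(g - 2) * group_size:(g - 1) * group_size]
    return groups

# Illustrative usage with invented households
households = [{"id": i, "email": f"hh{i}@example.com" if i % 3 else ""} for i in range(800)]
groups = allocate_to_groups(households)
print({g: len(members) for g, members in groups.items()})  # eight groups of 100
```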

Back to table of contents

11. Response and engagement rates - wave 1

The engagement rate is an important metric for online surveys as it provides a measure of the effectiveness of the engagement strategy. If households read the materials they are sent, access the website and enter their unique access code (UAC) but proceed no further, it suggests that the communication strategy has been effective in getting such a household engaged to the point of viewing the survey. Further research will be performed to determine how households can be encouraged further to provide their data.

All engagement rates and response rates are based upon an estimated eligible sample proportion of 95% based on previous research using the same sampling frame and sampling method; it is estimated that of the 50,000 households sampled, 47,500 were eligible for the Labour Market Survey (LMS) test.

The overall wave 1 engagement rate, defined as the proportion of households who either visited the survey website and entered their unique access code to access the survey, or provided any degree of information via the online questionnaire, was 29.4% (Table 3).

The overall wave 1 response rate, defined as the proportion of households that provided full data for at least one person resident in that household, was 28.4%. The proportion of fully completing households (those that provided full details for all residents) was 25%.

Partial completions were categorised in two ways: usable partial data, where at least one member of a household has completed all of their survey sections, and unusable partial data, where no household members fully completed. This can range from households that entered their unique access codes online but entered no data, through to households where an individual member started to complete the survey but did not reach the end of the interview.

The definition of a usable partial is different on the LMS to the Labour Force Survey (LFS). The LFS defines this as a household where at least one question block (that is, a series of related questions) has been completed. This definition for the LMS is not fixed and may change in future. Usable partial response was 3.5% with unusable partials being 1.0%.

Table 4 shows the proportion of individuals that completed, partially completed or did not start the survey. Of individuals that were added to the household grid of the LMS, 87.5% completed, 4.0% partially completed (did not supply all data required to close their case) and 8.6% did not start or access their individual survey.

Proxy response data are data that are supplied on behalf of someone else, usually because the person in question is absent or unavailable to provide their data. Proxy responses only cover factual data; any opinion-based data are not collected. All data for individuals aged 16 years or under are collected by proxy. The LMS definition of a proxy response was different to the definition used for the LFS. The LMS did not place any restrictions on who could provide answers on behalf of another person. In comparison, the LFS requires proxy data to be provided by another person who is a member of their household, a carer or an English-speaking relative for non-English speakers.

Excluding those aged under 16 years, the proportion of wave 1 data from individuals provided by proxy was 21.0% overall (Table 5); this finding is consistent with previous LMS online tests, which have demonstrated a proxy rate of approximately 20%.

Back to table of contents

12. Response and engagement rates - wave 2

There were 12,342 households eligible for participation at wave 2. This was the responding sample for wave 1 minus any households that did not consent to follow-up or subsequently refused participation via the Survey Enquiry Line. The 12,342 eligible households were assigned to one of the eight experimental groups and so experienced different communication and/or incentivisation strategies.

The attrition test was designed to look at the response rates, engagement rates, and experimental conditions at the Great Britain level from wave 2 onwards. As such the results for waves 2 and 3 will not be provided at a lower level than Great Britain. Significance testing has been performed to compare individual engagement rates for the different experimental conditions against one another; this took the form of one-way ANOVA statistical tests and these are reported on where appropriate.

Table 6 shows that the overall wave 2 engagement rate for Great Britain, using the same definition as that for wave 1, was 62.5% across all experimental groups. The response rate was 61.4%, and the proportion of fully completing households was 56.4%. The proportions of partial responses were 1.1% for usable data, and 0.1% for unusable data.
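
One simple way to express wave-on-wave retention and attrition from these figures is sketched below; the precise definitions used elsewhere in the reporting may differ, so this is illustrative arithmetic only.

```python
# Illustrative arithmetic using the published wave 2 figures.
eligible_at_wave2 = 12_342            # wave 1 responders issued to wave 2
wave2_response_rate = 0.614           # across all experimental groups

responding_at_wave2 = round(eligible_at_wave2 * wave2_response_rate)   # roughly 7,578
retention = responding_at_wave2 / eligible_at_wave2                    # roughly 61.4%
attrition = 1 - retention                                              # roughly 38.6%
print(f"retention {retention:.1%}, attrition {attrition:.1%}")
```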

The engagement rate for those households that were given an unconditional £5 voucher at wave 2 was 69.7%, compared with an engagement rate of 55.4% for households that received no incentive, which was a statistically significant difference (Table 7). This provides evidence that offering an incentive at wave 2 can increase the retention rate in a longitudinal online survey.
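
To illustrate the kind of one-way ANOVA comparison referred to in this section, the sketch below compares simulated 0/1 engagement indicators for the incentivised and non-incentivised conditions. The group sizes and simulated outcomes are invented for the example; only the target rates come from the published figures.

```python
# Illustrative one-way ANOVA on simulated engagement indicators (1 = engaged, 0 = not).
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

incentive_group = rng.binomial(1, 0.697, size=3000)      # simulated, targeting ~69.7% engaged
no_incentive_group = rng.binomial(1, 0.554, size=3000)   # simulated, targeting ~55.4% engaged

f_stat, p_value = f_oneway(incentive_group, no_incentive_group)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # a small p-value indicates a significant difference
```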

For the between wave engagement trial there were three conditions:

  • an email

  • a postcard

  • no engagement

Table 8 shows the engagement rates for these conditions at wave 2. It can be seen that the email engagement groups produced the highest engagement rate at 64.5%, with the postcard group engagement rate being 60.2%. This difference was found to be statistically significant and suggests that email should be used as a way to improve engagement rates at wave 2. The "no between wave engagement" groups had an engagement rate of 62.6%, which was higher than the postcard group - one possible explanation could be that households received too many postal communications with the introduction of the postcard and this may have had a negative effect on response. These findings will need to be investigated further through qualitative testing.

As there was only one experimental group for the reduced communications experimental condition, it is appropriate to compare this group with a control group, namely the group of households that received no incentives nor any between wave engagement (BWE).

Table 9 shows the engagement rates for both of these groups. For the reduced communications group the engagement rate was 56.6%, compared with an engagement rate of 54.7% for the control group. This was not found to be a statistically significant difference, suggesting that having fewer communications at wave 2 does not negatively affect engagement rates.

Back to table of contents

13. Response and engagement rates - wave 3

There were 6,924 households that were eligible for participation in wave 3 of the Labour Market Survey (LMS) attrition test, comprising the households that responded at wave 2 minus any subsequent refusals or ineligibles. The 6,924 eligible households remained assigned to the experimental groups they were allocated to at the start of wave 2.

Table 10 shows that the overall wave 3 engagement rate, using the same definition as that for waves 1 and 2, was 73.8% across all of the experimental groups. The response rate was 73.6%, and the proportion of fully completing households was 69.6%. The proportions of partial responses were 4.0% for usable data, and 0.1% for unusable data.

At wave 3, only one of the experimental groups received an unconditional £5 voucher as an incentive. This was to test the effect of continual incentivisation for a longitudinal online survey against receiving no incentive at waves 2 and 3, and against being incentivised at wave 2 but having the incentive removed at wave 3.

The engagement rate for those households that were given an unconditional £5 voucher at both waves 2 and 3 was 84.1%, compared with 72.7% for those who received an incentive at wave 2 only, and 71.6% for those who received no incentive at waves 2 or 3 (Table 11). The difference in engagement rates between incentivising at both waves and the other two conditions was found to be statistically significant. These results suggest that, at later waves, there is no significant difference between having had no incentive and having had an incentive at a previous wave only, but there is a significant increase in the engagement rate if the incentivisation is maintained throughout the waves.

Table 12 shows the engagement rates for the three between wave engagement (BWE) experimental conditions. Those who received BWE at wave 2 also received the same form of BWE at wave 3, and those who received no BWE at wave 2 did not receive BWE at wave 3 either.

As with wave 2, the postcard experimental group had the lowest engagement rate at 70.2%. This was significantly different to the engagement rates for the email group (74.0%) and the group that received no BWE (75.3%). The difference between the email and no BWE groups was not significant. These findings suggest that the postcard approach for between wave engagement has a negative effect upon engagement.

For the reduced communications condition, the engagement rate was again compared with that of the control group (Table 13). The engagement rates were significantly different with the reduced communications group achieving a rate of 69.9% compared with 73.6% for the control group. This difference suggests that, despite there being no significant difference between engagement at wave 2, the reduction of communications will have a negative impact upon engagement at later waves.

Back to table of contents

14. Conclusions

The Labour Market Survey (LMS) attrition test wave 1 results reinforced previous testing, which demonstrated that the communication and engagement strategies, materials used, and survey design for the LMS can achieve an online engagement rate approaching 30.0%.

The wave 2 results have provided evidence that sending an email as a form of between wave engagement (BWE) can increase response, as can the use of an unconditional £5 voucher. It also provided evidence that an additional postal communication in the form of a postcard for between wave engagement could negatively impact upon response.

At wave 3, it was again found that emails can boost response when compared with postcards, but the effect of an email against no between wave engagement was not significant. An unconditional voucher again increased response, implying that continued incentivisation for a longitudinal survey will continue to increase response, but this would need to be weighed against the cost of such an approach. The wave 3 results also demonstrated that, although reducing the number of communications at wave 2 had no significant impact upon response, this did not hold at wave 3: the reduction caused response to drop, and so this approach is not recommended.

Back to table of contents

Contact details for this Methodology

Colin Beavan-Seymour
lms.transformation@ons.gov.uk
Telephone: +44 (0)1633 455536