During the coronavirus (COVID-19) pandemic, the Office for National Statistics has published estimates of personal well-being using both the Annual Population Survey (APS) and the coronavirus module of the Opinions and Lifestyle Survey (OPN). This methodology article considers the impact the pandemic has had on data collection, examines the extent to which it has influenced estimates compared with the pre-pandemic period, and reviews the comparability of estimates between the APS and the OPN.
Both the OPN and the APS showed a substantial worsening of personal well-being at the start of the pandemic. Both surveys have since shown significant improvements in average scores of happiness and anxiety. The same significant improvements have not been seen in life satisfaction and in feeling that things done in life are worthwhile; data from the OPN suggest that both of these measures have worsened throughout the pandemic.
Personal well-being is reported slightly more favourably on the APS than on the OPN. Estimates of personal well-being from these two surveys are difficult to directly compare because of differences in data collection, such as differences to mode of interview, geographic coverage, sample size and contextual effects.
Whilst it is not possible to quantify the influence of the differing modes of data collection between the APS (collected via telephone interviews) and the OPN (collected via online self-completion), it is believed that mode of collection is the main driver behind the differences in the estimates of personal well-being between the two surveys.
Results from an analysis in this article, using APS data from the calendar year 2019, found significant differences between average well-being scores when comparing interviews conducted face-to-face and by telephone. This supports previous research into mode effects. The largest differences in scores by mode of collection were seen for anxiety, with the greatest significant difference found for those in the oldest age groups.
The weights applied to both the APS and the OPN reduce much of the bias caused by differing characteristics of the underlying sample. For this reason, it is unlikely that varying sample compositions play a role in the differences found in personal well-being estimates between the two surveys.
Because of the pandemic, it was necessary for the Labour Force Survey (LFS), from which APS data are derived, to shift from face-to-face and telephone interviewing to solely telephone interviewing from March 2020. This was associated with a shift in sample composition - including a notable decrease in the proportion of respondents living in rented accommodation. The Office for National Statistics (ONS) added housing tenure to the weighting process to mitigate the impact of potential non-response bias caused from this operational change. This new weighting has been applied to personal well-being estimates from the second quarter (April to June) of 2020.
Results from analysis in this article suggest that the shift in mode of collection from face-to-face and telephone to telephone only may have slightly exaggerated average anxiety scores on the APS, particularly for the oldest age groups. Users should be mindful of this when comparing estimates from before and during the pandemic.
The APS will continue to be published as our National Statistic of well-being on an annual basis. In addition, estimates of personal well-being will continue to be updated on a quarterly basis.
Updates to estimates of personal well-being will also continue to be monitored throughout the pandemic on a weekly basis through the OPN survey, with findings available in Coronavirus and the social impacts on Great Britain.
Annual Population Survey
The Office for National Statistics (ONS) has published estimates of personal well-being from the Annual Population Survey (APS) on an annual basis since 2011; these estimates are accredited as National Statistics. Since November 2019, the ONS has also published quarterly estimates of personal well-being using the APS as Experimental Statistics and these have been published from Quarter 2 (April to June) of 2011 through to the latest available estimates.
The APS is the UK's largest continuous household survey. It is not a standalone survey but uses data combined from wave 1 and wave 5 (the first and last wave) of the main Labour Force Survey (LFS) plus a boost from the Local Level Labour Force Survey for England, Wales and Scotland.
The LFS uses a rotational sampling design, whereby a household, once initially selected for interview, is retained in the sample for five consecutive quarters. These five quarters during which a household remains in the sample are referred to as waves 1 to 5. For further information on the construction of the LFS, please see Volume 1: background and methodology (PDF, 1.56MB).
In March 2020, as a result of the coronavirus (COVID-19) pandemic, the way in which people are contacted for initial LFS interviews had to change, moving from face-to-face to telephone-based interviewing. The following timeline summarises how the pandemic affected data collection methods.
9 March 2020
The public's heightened awareness of the coronavirus started to affect participation in the LFS.
17 March 2020
Wave 1 face-to-face data collection was suspended, in line with government guidelines, while systems were developed to allow interviewers to conduct telephone interviewing from their homes.
Telematching, which uses lookup information on telephone numbers associated with addresses (already used for respondents north of the Caledonian Canal), was extended to the rest of Great Britain to obtain additional telephone numbers for addresses in wave 1.
23 March 2020
Commencement of official UK lockdown measures.
The transition to telephone interviews from face-to-face interviews for wave 1 respondents started.
Where possible, wave 1 interviews were collected via telephone.
Additional advance materials were prepared to allow respondents to contact interviewers.
30 March 2020
Telephone interviewing was rolled out fully to face-to-face interviewers.
Opinions and Lifestyle Survey
The Opinions and Lifestyle Survey (OPN) is a well-established omnibus survey that is conducted eight months of the year (January, February, April, May, July, August, October, November). This survey covers residents of Great Britain who are aged 16 years and over.
To understand how the coronavirus (COVID-19) pandemic is affecting life in Great Britain, the ONS adapted the monthly omnibus survey to a weekly survey. The weekly survey was created to collect timely information on people's experiences and opinions related to the pandemic. Each week, some of the survey questions change to reflect changing circumstances and priorities during the pandemic.
The weekly survey, like the OPN survey prior to the pandemic, has primarily been collected through a self-completion online questionnaire; however, some responses have been collected with the help of telephone interviewers, for example, when respondents indicate they are unable to complete the online survey themselves.
A table summarising the data collection differences between the OPN and the APS can be found in the Data sources and quality section, Table 3.
Analysis run on the Annual Population Survey (APS) for the calendar year 2019 reinforces previous findings that personal well-being estimates are sensitive to the mode of data collection. It is not possible to precisely quantify the influence of the differences in mode of collection between the APS (telephone interviews) and the Opinions and Lifestyle Survey (OPN) (online self-completion questionnaire). However, the survey weighting process is deemed to have controlled for much of the difference in sample composition across the sources, so mode of data collection is believed to be the biggest driver of differences in estimates of personal well-being between the two surveys.
Annual Population Survey
As described in Section 2, the Labour Force Survey (LFS) uses a rotational sampling design, whereby a household, once initially selected for interview, is retained in the sample for five consecutive quarters. These five quarters during which a household remains in the sample are referred to as waves 1 to 5.
The Annual Population Survey (APS) uses data from waves 1 and 5 plus a boost from the Local Level Labour Force Survey for England, Wales and Scotland. Prior to the pandemic, face-to-face interviews were used for households in wave 1. For the following waves 2 to 5, interviews were via telephone, where possible.
Considering data from the 2019 calendar year, 62.3% of eligible respondents to the personal well-being questions were interviewed face-to-face and the remaining 37.7% were interviewed via telephone. This distribution differs across the population; for example, greater proportions of 16- to 24-year-olds and of those renting their accommodation were interviewed face-to-face (75.7% and 73.2% respectively) compared with older age groups and those who own their accommodation.
In March 2020, as a result of the coronavirus (COVID-19) pandemic, the LFS had to change the way in which it contacted people for initial interviews, moving from face-to-face to telephone interviewing, causing a considerable shift in data collection.
Previous research shows that mode of data collection influences responses to the personal well-being questions. The research found that, on average, respondents provide more favourable responses to the personal well-being questions when asked over the telephone compared with face-to-face interviews.
Table 1 presents estimates of personal well-being by mode from the APS for the 2019 calendar year.
Life satisfaction, feeling that things done in life are worthwhile and happiness were all reported slightly more favourably on average when collected via telephone compared with face-to-face. By contrast, average anxiety scores were better (5.0 percentage points lower) for those interviewed face-to-face than for those interviewed via telephone.
Table 1: Average (mean) personal well-being, by mode of interview, January 2019 to December 2019
Results found differences linked to data collection mode in how respondents reported their well-being across a wide range of population groups. This was the case for each of the four measures of well-being, with the greatest differences identified between groups in responses to the anxiety question. Please refer to the Data sources and quality section of this article to see how these tests were run, including information on adjustments made due to the quantity of tests which were run.
See Table 6.1 through Table 6.4 in the underlying datasets to see specific information on which parts of the population were found to have significant mode effects for each of the four well-being measures.
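As an illustration, the kind of mode comparison described above can be sketched as a large-sample z-test on subgroup means, with a Bonferroni correction for the number of tests run (the actual adjustment used is described in the Data sources and quality section). All scores and the number of tests below are synthetic, invented for illustration, and do not reproduce the APS analysis:

```python
import math
import random

def two_sample_z(mean_a, se_a, mean_b, se_b):
    """Large-sample z-test for a difference between two group means."""
    z = (mean_a - mean_b) / math.sqrt(se_a ** 2 + se_b ** 2)
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p

def mean_se(xs):
    """Sample mean and its standard error."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return m, math.sqrt(var / len(xs))

# Synthetic 0-10 anxiety scores for two interview modes (illustrative only).
random.seed(1)
telephone = [min(10, max(0, random.gauss(3.3, 2.8))) for _ in range(2000)]
face_to_face = [min(10, max(0, random.gauss(2.9, 2.7))) for _ in range(2000)]

m_tel, se_tel = mean_se(telephone)
m_f2f, se_f2f = mean_se(face_to_face)
z, p = two_sample_z(m_tel, se_tel, m_f2f, se_f2f)

# Bonferroni adjustment: with many subgroup comparisons, each p-value is
# judged against alpha / n_tests rather than alpha itself.
n_tests = 40  # e.g. four measures across ten population breakdowns
alpha = 0.05
print(f"z = {z:.2f}, p = {p:.4f}, "
      f"significant after Bonferroni: {p < alpha / n_tests}")
```

The Bonferroni step matters here because running dozens of subgroup tests at an unadjusted 5% level would be expected to produce spurious "significant" mode effects by chance alone.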
The results show that the main differences in scores by mode were in anxiety. The largest differences in the anxiety measure were found between people of different ages. For example, among those aged 75 years and over, anxiety scores were 20.5 percentage points higher for those responding via telephone compared with face-to-face.
Large and significant differences in anxiety scores were also found in other parts of the population, including those in the Asian ethnicity group, those living in the West Midlands and those who report that their illness restricts daily activities (anxiety scores 18.4, 15.6 and 10.8 percentage points higher through telephone collection compared with face-to-face, respectively). Further detail on how score differs by mode of collection for different population groups and for each of the personal well-being measures, can be found in Table 6.1 through to Table 6.4 in the underlying datasets.
The results suggest that the shift in data collection mode on the APS to solely telephone interviews does influence estimates of personal well-being and may affect scores reported by some population groups more than others. In particular, the increase in anxiety scores may slightly exaggerate anxiety levels compared with the pre-pandemic estimates obtained via both face-to-face and telephone interviewing, especially for those in the oldest age groups, those of Asian ethnicity and those living in the West Midlands.
To explore this further, Figure 6 shows a comparison of initial well-being estimates for Quarter 1 of 2020 (January to March), where collection was both face-to-face and via telephone, compared with estimates that have been weighted to telephone respondents only. The findings show that the move to interviewing only by telephone does not significantly affect average scores of happiness, life satisfaction or feeling that things done in life are worthwhile. However, there is a slight but significant increase in average anxiety ratings.
Opinions and Lifestyle Survey
As with the monthly version of the Opinions and Lifestyle Survey (OPN) prior to the pandemic, responses were predominantly collected through an online self-completion questionnaire (92.1%) and, where necessary, via telephone (7.9%).
A pooled dataset combining the weekly data covering Quarter 2 of 2020 (3 April to 28 June 2020) is used to consider how the method of data collection affects personal well-being scores. Overall, no significant differences were found in average scores of personal well-being between telephone and online self-completion.
Direct conclusions about the mode effect between telephone and online self-completion cannot be made from this analysis because of the limitations of a small sample size for those responding via telephone.
One of the most researched mode effects on survey responses is social desirability bias. Much of the research finds that the presence of an interviewer increases vulnerability to social desirability bias. For example, research considering "survey method matters: how online and offline, and face-to-face or telephone interviews differ" found statistically significant differences between telephone and online surveys for various positive mental health scales, including subjective happiness and life satisfaction, with people responding more favourably in telephone interviews compared with online self-completion.
Furthermore, participants responded less favourably to negative mental health scales online, with anxiety and stress scales scored more favourably in telephone and face-to-face interviews.
Whilst it is not possible to quantify the social desirability bias that may be caused by differences in mode of data collection, it is believed to be the main factor explaining the more favourable scores of personal well-being on the APS compared with the OPN. This conclusion is based on analysis of mode effects on the APS as well as external research showing that subjective well-being measures are sensitive to mode effects.
As previously stated, the Office for National Statistics (ONS) weights social survey data using the most up-to-date official population data so that analysis produced is not biased and results are representative of the UK. For this reason, differing sample compositions are unlikely to be driving much of the difference in personal well-being estimates between the APS and the OPN. This section outlines how operational changes to the survey have influenced sample composition and how the ONS has adapted its weighting process to account for these compositional changes.
Annual Population Survey
The Annual Population Survey (APS) datasets are weighted by age, sex and local authority to reflect the size and composition of the general population using the most up-to-date official population data.
Previous analysis of the most important factors affecting life satisfaction and anxiety found that characteristics such as age, housing tenure, self-perceived health, employment status and marital status can influence personal well-being (a summary of these analyses can be found in Table 4 in the Data sources and quality section).
We also explored whether the change in mode of data collection from face-to-face to telephone interviews was associated with a change in those participating in the survey in terms of respondent characteristics.
The most notable difference in the sample composition between Quarter 1 (January to March) and Quarter 2 (April to June) of 2020 when the change of data collection mode took effect, was in relation to respondents' housing tenure. The proportion of respondents in rented accommodation decreased from 27.2% in Quarter 1 to 20.6% in Quarter 2. As fewer people renting their home participated in the survey in Quarter 2, the proportion of respondents who said they own their home outright increased from 44% in Quarter 1 to 49.6% in Quarter 2.
It is important to note that housing tenure is strongly associated with household income. The APS does not collect information on household income so it is not possible to show how the mode effect has changed the structure of the sample by income, but it is possible that having fewer people who rent their home in the sample could imply a reduction in the sample of those who are less financially secure. These may be people who are more susceptible to lower levels of personal well-being.
Results from regression analysis undertaken in 2019 using APS data found that renters were more likely to report lower levels of life satisfaction, feeling that things done in life are worthwhile and happiness, and more likely to report high levels of anxiety compared with homeowners. This supports findings from previous research conducted by the ONS that discovered that homeowners were more likely to have a higher life satisfaction than private or social renters. Previous analysis has found renters to report feeling lonely more often than homeowners, and it is known that feeling lonely is the most significant factor associated with reporting high anxiety during the pandemic.
A notable difference can also be seen in the age distribution of the sample in Quarter 2. In Quarter 1 those aged 16 to 39 years made up 22.1% of respondents while in Quarter 2, the proportion decreased to 16.9% of respondents. This was accompanied by an increase in the proportion of older respondents. For example, in Quarter 1, 38.3% of respondents were aged 60 to 79 years, which increased to 42.4% of respondents in Quarter 2. People aged 65 to 74 years and 75 years and older were more likely to report higher anxiety than any other age group since the beginning of the pandemic. This differs from the period prior to the pandemic, when the oldest age groups had lower levels of anxiety on average.
The shift in sample composition can also be identified when looking at marital status. In Quarter 1, 52.7% of respondents reported that they were married and living with their spouse, while 24% reported that they were single, never married. In Quarter 2, the proportion who reported being married and living with their spouse was 55.4%, while the proportion who reported being single, never married had fallen to 20.5%. Before the pandemic, people who were married or in a civil partnership were more likely to report higher life satisfaction than people who were not. Conversely, since the start of the pandemic people who are married or in a civil partnership were more likely to report higher anxiety than any other marital status group. It is suggested that this could be because of the demands of home-schooling.
The three characteristics mentioned previously can easily influence one another, with a range of possible implications. For example, younger people are less likely to own a landline, which means that fewer people from the younger age groups responded. As younger people are less likely to own their home and more likely to be renting, this caused a skew in the sample composition relating to housing tenure. This also affected marital status, as younger people are more likely to be single and less likely to be married. Employment status and self-perceived health were also affected.
Some of the changes will be accounted for in the usual weighting process, however, this only covers age, sex and location. Housing tenure was not previously accounted for, so it was decided to add tenure to the weighting to mitigate the impact of this potential non-response bias.
At the time of this article, the adjustment to the weighting calibration has been incorporated into the APS datasets and estimates for all quarters from Quarter 2 of 2020 onwards.
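The calibration step described above can be illustrated with a small raking (iterative proportional fitting) sketch, in which unit weights are repeatedly scaled until the weighted sample matches known population margins, here age band and housing tenure. The categories and margin totals below are invented for illustration and do not reflect the actual APS calibration groups or population figures:

```python
# Hypothetical raking (iterative proportional fitting) sketch: adjust unit
# weights so the weighted sample matches known population margins for two
# dimensions (age band and housing tenure). All figures are illustrative.
respondents = [
    {"age": "16-39", "tenure": "rent", "w": 1.0},
    {"age": "16-39", "tenure": "own",  "w": 1.0},
    {"age": "40+",   "tenure": "rent", "w": 1.0},
    {"age": "40+",   "tenure": "own",  "w": 1.0},
    {"age": "40+",   "tenure": "own",  "w": 1.0},
]

# Known population totals for each margin (illustrative figures).
margins = {
    "age":    {"16-39": 40.0, "40+": 60.0},
    "tenure": {"rent": 35.0, "own": 65.0},
}

def rake(rows, margins, iters=50):
    """Iteratively scale weights to match each margin in turn."""
    for _ in range(iters):
        for dim, targets in margins.items():
            # current weighted total in each category of this dimension
            totals = {}
            for r in rows:
                totals[r[dim]] = totals.get(r[dim], 0.0) + r["w"]
            # scale weights so each category hits its population target
            for r in rows:
                r["w"] *= targets[r[dim]] / totals[r[dim]]
    return rows

rake(respondents, margins)
for dim, targets in margins.items():
    for cat, target in targets.items():
        got = sum(r["w"] for r in respondents if r[dim] == cat)
        print(f"{dim}={cat}: weighted total {got:.1f} (target {target})")
```

Adding housing tenure to the calibration, as the ONS did, corresponds to adding a "tenure" margin like the one above: respondents in under-represented tenure groups receive larger weights, offsetting the non-response bias from the mode change.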
Opinions and Lifestyle Survey
When the Office for National Statistics (ONS) first set up the coronavirus module of the Opinions and Lifestyle Survey (OPN), the survey was sent out to approximately 2,000 to 2,500 people. Respondents consisted of people who had previously responded to an ONS social survey and indicated that they were willing to be contacted again to take part in future research.
In order to respond to information needs associated with the pandemic in a timely way, various sampling frames were used to ensure a consistently achieved sample size of at least 1,000 respondents. This included inviting people who had previously responded to the OPN, LFS, and the Labour Market Surveys to participate in the OPN. For detailed information on data collection metrics, including sample frame, fieldwork length and dates, please see Table 5 in the Data sources and quality section.
The use of different sampling frames resulted in differences in sample composition wave on wave; in particular, there have been differences in sample age structure, housing tenure, marital status, employment status and highest level of education achieved. Table 2 provides a high-level summary of the most substantial variations in sample characteristics between waves.
| Characteristic | Minimum (%) | Maximum (%) |
| --- | --- | --- |
| Age: 16 to 24 years | 0.9 | 9.1 |
| Marital status: Married | 51.0 | 62.1 |
| Employment status: Employed or self-employed | 36.6 | 50.7 |
| Higher education: Degree or equivalent | 29.0 | 38.2 |
Table 2: Range in the most variable sample characteristics, 3 April to 28 June 2020
Different population groups are known to report differing levels of personal well-being. For example, regression analysis run on the APS for the calendar year of 2019, found that those aged 16 to 24 years are almost three times more likely to report high levels of anxiety when compared with those aged 75 years and over. Regression outputs for all four measures of personal well-being can be found in Tables 7.1 through 7.4 for the APS and Tables 4.1 through 4.4 for the OPN.
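The phrase "almost three times more likely" comes from the regression outputs referenced above; the odds-ratio arithmetic behind such a statement can be sketched as follows. The counts here are invented for illustration, and a regression-based odds ratio would additionally adjust for other characteristics:

```python
import math

# Hypothetical weighted counts of high anxiety (score 6 to 10 out of 10)
# versus not, by age group -- illustrative figures, not real survey data.
counts = {
    "16-24": {"high": 300.0, "not_high": 700.0},
    "75+":   {"high": 130.0, "not_high": 870.0},
}

def odds(group):
    """Odds of reporting high anxiety within a group."""
    c = counts[group]
    return c["high"] / c["not_high"]

odds_ratio = odds("16-24") / odds("75+")

# Approximate standard error of the log odds ratio (Woolf's method)
se_log_or = math.sqrt(sum(1.0 / v for g in counts.values() for v in g.values()))
ci = (math.exp(math.log(odds_ratio) - 1.96 * se_log_or),
      math.exp(math.log(odds_ratio) + 1.96 * se_log_or))
print(f"odds ratio = {odds_ratio:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```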
Similarly to the APS, weighting is also applied to the OPN data and this has the effect of reducing much of the bias that may come from differences in sample size and composition wave on wave. Using the most up-to-date official data, weighting is applied to each wave so that the data represent both the size and composition of Great Britain's population.
Weighting the Opinions and Lifestyle Survey also includes calibrating by other factors including region, qualification, housing tenure, employment, three interactions of sex and age. Further information on the sampling frame and calibrations used at each wave can be found in the Data sources and quality section.
However, even after weighting is applied, there is still potential for some bias arising from under-representation of certain population groups. The weighting relies on the assumption that respondents from under-represented parts of the population hold views representative of that whole group, which may not be the case.
Since 24 September 2020, the sampling frame has been consistently drawn from previous respondents to the Labour Market Survey (LMS). When the samples are drawn from a consistent sampling frame there is less variation in the sample composition week on week.
The size of the underlying sample can influence the accuracy of personal well-being estimates, with larger sample sizes reducing the variability of the estimates. The Annual Population Survey (APS) has a larger underlying sample than the Opinions and Lifestyle Survey (OPN), and this is reflected in the variability of the estimates. Increased accuracy, through reduced variability in the estimates, has been observed on both the OPN and the APS when the samples received a boost. The following section examines the sample size of the APS and the OPN throughout the coronavirus (COVID-19) pandemic to consider how it may have affected the quality of the well-being estimates during that time.
Annual Population Survey
Prior to the pandemic, the achieved sample size of the Annual Population Survey (APS) was just over 277,100 individuals for the year January 2019 to December 2019. The personal well-being questions are only asked of those answering on their own behalf, as they relate to subjective assessments of an individual's own personal well-being; proxy responses are not allowed, and only answers given in person are considered "valid responses". Over half of the respondents (156,900) to the APS in 2019 provided valid responses to the personal well-being questions.
Each quarter of 2019, the achieved sample size was approximately 70,000, with just under 40,000 providing a valid personal well-being response.
During the first quarter of 2020 (January to March), growing awareness of the coronavirus pandemic among the public alongside the suspension of face-to-face interviewing meant that the number of survey responses fell. Between January and March 2020, the achieved sample size and valid (non-proxy) responses declined to 59,500 and 34,300, respectively.
In the second quarter of 2020 (April to June 2020) there was a further decline in number of responses, with the achieved sample size and valid responses falling to 47,800 and 28,200, respectively.
To address this, in the period from July 2020 to March 2021, the wave 1 Labour Force Survey (LFS) sample size was doubled in order to improve achieved sample sizes while response rates were lower.
It is important to note that, as a result of the decreasing sample size, there has been greater sampling variability in the personal well-being estimates. This is reflected in larger standard errors around estimates and thus wider confidence intervals. In particular, standard errors have increased for those reporting high anxiety (scoring between 6 and 10 out of 10) and high levels of happiness (scoring between 7 and 8 out of 10).
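The link between sample size and standard error can be sketched directly: the standard error of a mean shrinks roughly with the square root of the sample size, so a smaller achieved sample widens the confidence interval. The scores below are synthetic, and the two sample sizes only loosely echo the valid-response counts quoted above:

```python
import math
import random

def mean_ci(xs, z=1.96):
    """Sample mean, standard error and approximate 95% confidence interval."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    se = math.sqrt(var / len(xs))
    return m, se, (m - z * se, m + z * se)

# Synthetic 0-10 well-being scores (illustrative only, not APS data).
random.seed(42)
population = [min(10, max(0, random.gauss(7.0, 2.0))) for _ in range(100_000)]

results = {}
for n in (28_200, 40_000):  # roughly Q2 2020 vs a typical 2019 quarter
    m, se, (lo, hi) = mean_ci(random.sample(population, n))
    results[n] = se
    print(f"n={n}: mean={m:.2f}, se={se:.4f}, 95% CI=({lo:.2f}, {hi:.2f})")
```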
The Opinions and Lifestyle Survey
Initially the Opinions and Lifestyle Survey (OPN) was sent out to between 2,000 and 2,500 adults across Great Britain, aged 16 years and over, each week.
With a response rate of approximately 60% or above, the achieved sample size reached approximately 1,500 respondents each week. Most respondents (approximately 98%) provided a response to the four personal well-being questions.
To respond to the increasing need for more granular analysis, from 21 October 2020 the OPN weekly sample size was trebled to just over 6,000 adults each week. From this point forward, the average response rate was approximately 70%, meaning an achieved sample of approximately 4,000 respondents each week. Note that this boost applied to the sample in England only; this is accounted for once the data have been weighted.
The increase in sample size has benefited the weekly personal well-being estimates by increasing their precision, as can be seen in the reduced standard errors and thus narrower confidence intervals.
This section considers how the context of a survey can influence estimates of personal well-being. Some contextual effects have been controlled for; others may in part be playing a role in the differing estimates we see between the two surveys. From previous cognitive testing, it is known that personal well-being estimates are influenced by contextual effects. For this reason, the Annual Population Survey (APS) and the Opinions and Lifestyle Survey (OPN) follow best practice guidance by positioning the personal well-being questions early in the survey, straight after demographic questions, and in a specific order: positive questions followed by negative. It is not possible, however, to control for contextual effects such as the survey title and pre-text, which in part helps to explain some of the differences in personal well-being scores between the APS and the OPN.
In the development of the Office for National Statistics (ONS) personal well-being questions, cognitive testing (PDF, 328KB) was undertaken to consider how respondents came to their scores of personal well-being. The research found that placement of the questions influences respondents' scores. For example, placing the well-being questions after questions relating to health or labour market may affect respondents' answers.
Therefore, the four personal well-being questions in both the Labour Force Survey (LFS) and Opinions and Lifestyle Survey (OPN) are placed as early as possible, straight after any demographic questions. This follows the harmonised standard that has been applied when the questions are used in many other surveys as well.
In line with best practice guidance internationally, the ONS takes a holistic approach to measuring personal well-being, using three measures: evaluative (life satisfaction), eudemonic (feeling that things done in life are worthwhile), and experience (happiness and anxiety). Please see Glossary for further explanation of evaluative, eudemonic and experience measures.
Further information on best practice guidelines for asking questions on personal well-being is given in Personal well-being in the UK Quality and Methodology Information and also the OECD Guidelines on Measuring Subjective Well-being.
Results from the cognitive testing also showed that respondents preferred the positive well-being questions first, and they are consequently always used in the same order: life satisfaction, feeling that things done in life are worthwhile, happiness and then anxiety.
Additional contextual influences
As questions preceding the personal well-being questions are known to influence respondents' answers, it is also important to consider other possible contextual influences. Respondents are made aware of the name of the survey, and OPN respondents receive a letter about the survey; both of these may also influence responses.
Personal well-being has been found to be reported more favourably on the Annual Population Survey (APS) when compared with the Opinions and Lifestyle Survey (OPN). In addition, there is increased variability in estimates in the OPN when compared with the APS.
It is believed the driving factor behind differences in personal well-being estimates between the surveys is the mode in which the data are collected: telephone interviews for the APS and online self-completion questionnaires for the OPN. Other factors influencing the differing estimates between the two surveys include geographic coverage, sample size and contextual effects.
The weights applied to both the APS and the OPN reduce much of the bias caused by differing characteristics of the underlying sample. For this reason, it is unlikely that varying sample compositions cause much of the difference found in personal well-being estimates between the two surveys.
Annual Population Survey
In March 2020, as a result of the coronavirus (COVID-19) pandemic, the Labour Force Survey (LFS), from which data for the Annual Population Survey (APS) are derived, changed the way in which it contacted people for initial interviews, shifting from a combination of face-to-face and telephone interviews to telephone interviews only. Naturally, this resulted in a large proportional increase in the numbers responding via telephone.
Findings reported in this release suggest that the shift in data collection mode on the APS to solely telephone interviews is likely to have influenced estimates of personal well-being and may influence scores for some population groups more than others. In particular, the observed increase in anxiety scores may slightly exaggerate anxiety levels compared with the pre-pandemic estimates obtained via both face-to-face and telephone interviewing, especially for those in the oldest age groups.
With the change in data collection mode, there was also a change in the characteristics of respondents to the LFS, from which the APS is derived. The most notable change was in the housing tenure of respondents. Although most of the changes would have been accounted for by existing weights, housing tenure was not previously included in the weighting process. For this reason, the ONS added housing tenure to the weight calibration, enabling the responses to better represent the true composition of the UK population and allowing greater comparability with estimates from the APS prior to the pandemic.
Housing tenure weights have been applied to estimates of personal well-being from the second quarter (April to June) of 2020 onwards, which means estimates following the operational changes can be confidently compared with estimates prior to these changes.
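The calibration idea can be illustrated with a minimal post-stratification sketch. This is not the ONS weighting system, which calibrates to several population characteristics at once; the tenure categories and population shares below are hypothetical, chosen only to show how weighting corrects for an over-represented group.

```python
from collections import Counter

def post_stratify(sample_groups, population_props):
    """Return one weight per respondent so that the weighted group
    distribution matches the assumed population distribution."""
    counts = Counter(sample_groups)
    n = len(sample_groups)
    # weight = population share / sample share for the respondent's group
    return [population_props[g] / (counts[g] / n) for g in sample_groups]

# Hypothetical sample that over-represents owner-occupiers
sample = ["owner"] * 6 + ["renter"] * 4
population = {"owner": 0.5, "renter": 0.5}  # assumed population shares

weights = post_stratify(sample, population)
owner_share = sum(w for g, w in zip(sample, weights) if g == "owner") / sum(weights)
print(round(owner_share, 2))  # 0.5 -- owners down-weighted to their population share
```

In practice the weights are calibrated to several characteristics simultaneously (such as age, sex and region, with housing tenure added from Quarter 2 2020), but the principle of scaling each group by its population-to-sample ratio is the same.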
The number of responses to the LFS also declined during the pandemic, which subsequently affected the sample size of the APS. To address this, the ONS doubled the sample size of wave 1 of the LFS between July 2020 and March 2021, with the aim of improving achieved sample sizes. As a result of the decreased sample size prior to the boost in July, there has been an increase in sampling variability, which is reflected in increased standard errors.
Opinions and Lifestyle Survey
The weekly version of the Opinions and Lifestyle Survey (OPN) was created to help understand how the pandemic is affecting people in Great Britain. The survey is primarily collected through an online self-completion questionnaire, and via telephone where necessary.
When the OPN was initially set up, varying sampling frames were used, with correspondingly varying sample compositions. Weighting of the data reduces much of the bias from this sampling variation; however, there is still potential for some bias arising from the under-representation of certain population groups in the sample. From 24 September 2020, the OPN sample has been drawn from a consistent sampling frame, the Labour Market Survey (LMS), which has reduced sampling variation. Further information on the LMS is summarised in the notes of Table 5 in the Data sources and quality section.
The OPN was initially achieving sample sizes of approximately 1,000 respondents per week. To allow for more granular analysis, this has since been boosted to approximately 4,000 respondents per week, which has in turn increased the precision of the personal well-being estimates, as seen by decreased standard errors.
The "eudemonic" approach is sometimes referred to as the psychology of functioning and flourishing approach, which draws on self-determination theory and tends to measure such things as people's sense of meaning and purpose in life, connections with family and friends, a sense of control and whether they feel part of something bigger than themselves. "Overall, to what extent do you feel things you do in your life are worthwhile?" is the eudemonic question used.
The "evaluative" approach asks individuals to step back and reflect on their life and make a cognitive assessment of how their life is going overall, or on certain aspects of their life. "Overall, how satisfied are you with your life nowadays?" is the evaluative question used.
The "experience" approach seeks to measure people's positive and negative experiences (or affect) over a short timeframe to capture people's personal well-being on a day-to-day basis. There are two experience questions: one positive, "Overall, how happy did you feel yesterday?", and one negative, "Overall, how anxious did you feel yesterday?".
On the Annual Population Survey, the well-being questions are only asked of persons aged 16 years and over who gave a personal interview, as proxy answers are not accepted. Proxy answers are those given by someone in the household answering on another person's behalf. The personal well-being questions involve subjective assessments of one's own life, so they are not suitable to be answered by someone else on a respondent's behalf.
The data published for our quarterly personal well-being figures are all seasonally adjusted (although non-seasonally adjusted estimates are also available). This aids interpretation by removing recurring fluctuations caused, for example, by holidays or other seasonal patterns.
The regARIMA model, used to pre-adjust the series before the moving average filters of the seasonal adjustment are applied, was reviewed at the end of 2020. There was a slight change to the model, which will be reflected in the Personal well-being quarterly estimates technical report in due course.
From reviewing the model, two series, both in the happiness sub-group, were identified as having an Easter effect. The effect was negative for the mean and positive for the low happiness threshold series, implying that happiness seems to decrease in the period immediately before Easter. All the seasonally adjusted series were identified as having outliers or level shifts in Quarter 1 (January to March) 2020, Quarter 2 (April to June) 2020, or both. More information on this modelling can be found in the Seasonal adjustment methodological note.
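The published series are adjusted using regARIMA modelling with moving average filters rather than anything this simple, but the basic idea of estimating and removing a recurring quarterly pattern can be sketched as follows; the quarterly scores here are synthetic.

```python
def seasonal_factors(series, period=4):
    """Average deviation from the overall mean at each position in the
    seasonal cycle (each quarter of the year for quarterly data)."""
    overall = sum(series) / len(series)
    return [sum(series[q::period]) / len(series[q::period]) - overall
            for q in range(period)]

def seasonally_adjust(series, period=4):
    """Subtract each quarter's estimated seasonal factor from the series."""
    factors = seasonal_factors(series, period)
    return [x - factors[i % period] for i, x in enumerate(series)]

# Synthetic quarterly scores with a recurring dip in the first quarter
observed = [2.8, 3.0, 3.1, 3.0, 2.9, 3.1, 3.2, 3.1]
adjusted = seasonally_adjust(observed)
# After adjustment the Q1 dip disappears, leaving the underlying level change
```

Removing the recurring pattern in this way makes a genuine change between quarters, such as a level shift, easier to distinguish from routine seasonal movement.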
Conducting multiple t-tests
To understand whether the mode effect is consistent or varies across population groups in the UK, a series of t-tests was carried out. For each population sub-group assessed, the analysis tested whether average well-being scores were the same under either method of data collection. A wide range of population groups (57 in total) was assessed, including those with differing socio-economic circumstances, those in better or worse health, and those with differing demographic characteristics and housing tenures.
Initial t-tests found that 28, 30, 22 and 41 of the 57 population groups considered showed significant differences in scores by mode of interview for life satisfaction, worthwhile, happiness and anxiety, respectively.
It is important to consider that, with multiple tests being run, there is a possibility that some of these differences occurred by chance. Therefore, the Holm method of adjustment was applied to these tests to reduce the chance of reporting a difference in scores by mode in error. The number of groups showing significant differences then decreased to 11, 18, 8 and 30 for each well-being measure, respectively.
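The Holm step-down adjustment itself is straightforward to apply to a set of p-values. The sketch below uses illustrative p-values rather than the survey results; note how a raw p-value of 0.04, nominally significant at the 5% level, is no longer significant after adjustment.

```python
def holm_adjust(pvalues):
    """Holm step-down adjustment: the smallest p-value is multiplied by m,
    the next smallest by m - 1, and so on, enforcing monotonicity."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        running_max = max(running_max, min(1.0, (m - rank) * pvalues[i]))
        adjusted[i] = running_max
    return adjusted

# Illustrative p-values from four group-level t-tests (not the survey results)
pvals = [0.001, 0.04, 0.03, 0.2]
adj = holm_adjust(pvals)          # approximately [0.004, 0.09, 0.09, 0.2]
significant = [p < 0.05 for p in adj]
print(significant)                # [True, False, False, False]
```

The Holm method controls the family-wise error rate while being uniformly more powerful than the simpler Bonferroni correction, which would multiply every p-value by m.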
Data collection differences between the surveys
| | Opinions and Lifestyle Survey – COVID-19 module | Labour Force Survey |
| --- | --- | --- |
| Mode of collection | Mixed mode collection using an online self-completion questionnaire; alternatively, if required, the interview can be conducted by telephone | Telephone interviews |
| Geographic coverage | Great Britain | United Kingdom |
| Frequency | The survey is in the field approximately weekly, see Annex Table 6 for fieldwork periods | Quarterly |
| Achieved sample size | Approximately 1,000, reaching approximately 4,000 following the sample boost from 25 November 2020 | Approximately 50,000 (based on Q2 and Q3 of 2020) |
| Accredited status | Official Statistics | Annual estimates – National Statistics; Quarterly estimates – Experimental Statistics |
Download this table Table 3: Summary of data collection differences between the Opinions and Lifestyle Survey and the Annual Population Survey during the coronavirus (COVID-19) pandemic.xls .csv
Known factors to influence personal well-being
| Factor | Aspect of PWB | Strength of the relationship pre-COVID-19 | COVID-19 relationship (3 April to 10 May 2020) |
| --- | --- | --- | --- |
| Self-reported health | Life satisfaction | People reporting very good or good health were more likely to have higher life satisfaction than people who reported poor or very poor health | |
| Age | Life satisfaction and anxiety | Younger people aged 16 to 39 years and older people aged 60 years and over were more likely to report higher life satisfaction than those aged 40 to 59 years | People aged 65 to 74 years and 75 years and over were more likely to report higher anxiety than any other age group. This is significantly higher than pre-COVID-19, as anxiety had tended to reduce to a lower level compared with middle age groups (strong association) |
| Marital status | Life satisfaction and anxiety | People who are married or in a civil partnership were more likely to report higher life satisfaction than people who are not | People who are married or in a civil partnership were more likely to report higher anxiety than any other marital status group, an interesting inversion compared with before the pandemic. It is suggested that this could be due to home-schooling (strong association) |
| Employment status | Life satisfaction | People who are retired were more likely to report higher life satisfaction than people who are employed, and people who are unemployed or economically inactive were less likely to report high life satisfaction than employed people | |
| Housing tenure | Life satisfaction | Homeowners were more likely to have higher life satisfaction than private or social renters | |
| Household spending | Life satisfaction | Fairly strong positive relationship; stronger than household income | |
| Household income | Life satisfaction | Fairly strong positive relationship; weaker than household spending | |
| Dependent children | Life satisfaction and anxiety | Having dependent children is related to higher life satisfaction compared with having no dependent children | See marital status |
| Sex | Life satisfaction and anxiety | Females were more likely to have higher life satisfaction than males | Females were more likely to have higher anxiety than males (strong association) |
| Loneliness | Anxiety | | People who feel lonely frequently are more likely to report high anxiety than people who feel lonely less frequently or who never feel lonely (strong association) |
| How safe or unsafe do you feel in your home? | Anxiety | | People who reported feeling unsafe or very unsafe in their homes were more likely to report higher anxiety than those who reported feeling safe or very safe in their homes (strong association) |
| In which ways is COVID-19 affecting your life? | Anxiety | | People who reported that COVID-19 was affecting their work were more likely to report higher anxiety than people reporting that COVID-19 has not affected their work (strong association) |
| Disability | Anxiety | | Disabled people were more likely to report higher anxiety than people who are not disabled (strong association) |
| Country of residence | Anxiety | | People in England were more likely to report higher anxiety than people in Scotland. Further regional variation of anxiety was identified across the UK |
Download this table Table 4: Known factors to influence personal well-being from previous research.xls .csv
| Fieldwork period | Fieldwork length | Sample frame | Achieved sample size |
| --- | --- | --- | --- |
| 20 March to 30 March 2020 | 10 days | Labour Market Survey | 1,588 |
| 27 March to 6 April 2020 | 10 days | Labour Market Survey | 1,581 |
| 3 April to 13 April 2020 | 10 days | Opinions and Lifestyle Survey | 1,203 |
| 9 April to 20 April 2020 | 11 days | Opinions and Lifestyle Survey | 1,431 |
| 17 April to 27 April 2020 | 10 days | Labour Force Survey | 1,327 |
| 24 April to 4 May 2020 | 10 days | Labour Force Survey | 1,360 |
| 30 April to 11 May 2020 | 11 days | Labour Force Survey | 1,108 |
| 7 May to 17 May 2020 | 11 days | Labour Force Survey | 1,181 |
| 14 May to 17 May 2020 | 4 days | Labour Force Survey | 995 |
| 21 May to 24 May 2020 | 4 days | Labour Force Survey | 1,028 |
| 28 May to 31 May 2020 | 4 days | Opinions and Lifestyle Survey | 1,247 |
| 4 June to 7 June 2020 | 4 days | Wave 6 of the Labour Force Survey | 1,914 |
| 11 June to 14 June 2020 | 4 days | Wave 6 of the Labour Force Survey | 1,896 |
| 18 June to 21 June 2020 | 4 days | Wave 6 of the Labour Force Survey | 1,920 |
| 25 June to 28 June 2020 | 4 days | Wave 6 of the Labour Force Survey | 1,994 |
| 2 July to 5 July 2020 | 4 days | Remainder Labour Force Survey, Remainder Wave 6 of the Labour Force Survey, Opinions and Lifestyle Survey | 1,788 |
| 8 July to 12 July 2020 | 5 days | Remainder Labour Force Survey, Remainder Wave 6 of the Labour Force Survey, Opinions and Lifestyle Survey | 1,743 |
| 15 July to 19 July 2020 | 5 days | Labour Force Survey | 1,606 |
| 22 July to 26 July 2020 | 5 days | Labour Force Survey | 1,564 |
| 29 July to 2 August 2020 | 5 days | Labour Market Survey | 1,235 |
| 5 August to 9 August 2020 | 5 days | Living Costs and Food Survey | 1,424 |
| 12 August to 16 August 2020 | 5 days | Living Costs and Food Survey | 1,533 |
| 26 August to 30 August 2020 | 5 days | Labour Force Survey | 1,644 |
| 9 September to 13 September 2020 | 5 days | Labour Force Survey | 1,694 |
| 16 September to 20 September 2020 | 5 days | Labour Force Survey | 1,689 |
| 24 September to 27 September 2020 | 4 days | Labour Market Online | 1,587 |
| 30 September to 4 October 2020 | 5 days | Labour Market Online | 1,573 |
| 7 October to 11 October 2020 | 5 days | Labour Market Online | 1,663 |
| 14 October to 18 October 2020 | 5 days | Labour Market Online | 1,653 |
| 21 October to 25 October 2020 | 5 days | Labour Market Online | 4,226 |
| 28 October to 1 November 2020 | 5 days | Labour Market Online | 4,111 |
| 4 November to 8 November 2020 | 5 days | Labour Market Online and Labour Market Covid | 4,378 |
| 11 November to 15 November 2020 | 5 days | Labour Market Online | 4,400 |
| 18 November to 22 November 2020 | 5 days | Labour Force Survey and Labour Market Covid | 3,631 |
| 25 November to 29 November 2020 | 5 days | Labour Market Survey and Labour Market Covid | 4,395 |
| 2 December to 6 December 2020 | 5 days | Labour Market Survey | 4,151 |
| 10 December to 13 December 2020 | 4 days | Labour Market Survey | 3,214 |
| 16 December to 20 December 2020 | 5 days | Labour Market Survey | 3,330 |
Download this table Table 5: Opinions and Lifestyle Survey response metrics, 20 March to 20 December 2020.xls .csv
Updates to estimates of personal well-being will continue to be monitored weekly throughout the coronavirus (COVID-19) pandemic using the Opinions and Lifestyle Survey (OPN), with findings reported in Coronavirus and the social impacts on Great Britain.
The Annual Population Survey (APS) well-being estimates will also continue to be published, both annually and quarterly, with the annual estimates accredited as National Statistics.
Because of differences in data collection, mainly mode of collection, but also geographic coverage, context of survey and sample size, estimates between the OPN and the APS should not be directly compared.
It is preferable to compare estimates of well-being from the APS over time, and we now have a 10-year time series for this purpose. Much of the bias that may have been introduced by changes in mode of data collection, and subsequent changes in sample composition and response rates, has been accounted for with additional weights and sample boosts. However, it is not possible to account for social desirability biases caused by operational changes in mode of collection. For this reason, it is possible that anxiety is slightly overestimated compared with pre-pandemic levels, particularly for those in the older age groups.
It is recommended that quality information is considered when making comparisons. Alongside estimates of personal well-being, the Office for National Statistics (ONS) produces quality information on the estimates. This includes sample sizes and confidence intervals, which help users to understand the sampling variability around the estimates.
Also included in estimates from the APS is information on the coefficient of variation (CV). Estimates with a CV of 20% or greater are suppressed for quality reasons. The CV indicates the relative precision of the average (arithmetic mean): the higher the CV, the greater the uncertainty around the estimate. Estimates of personal well-being are colour-coded in the data tables to indicate the quality of the data based on the CV values.
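The suppression rule described above can be expressed as a minimal sketch, assuming the CV is computed as the standard error expressed as a percentage of the estimate; the figures below are illustrative, not published values.

```python
def coefficient_of_variation(estimate, standard_error):
    """CV: the standard error as a percentage of the estimate."""
    return 100 * standard_error / estimate

def publishable(estimate, standard_error, threshold=20.0):
    """Apply the quality rule: suppress estimates whose CV is 20% or more."""
    return coefficient_of_variation(estimate, standard_error) < threshold

# Illustrative mean score with two hypothetical standard errors
print(publishable(3.1, 0.05))  # True: CV is about 1.6%
print(publishable(3.1, 0.70))  # False: CV is about 22.6%
```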
Contact details for this Methodology
Telephone: +44 (0)1329 444256 or +44 (0)2071 120107