The national accounts provide a conceptual framework in which to record and reconcile economic flows, and they provide a lens through which the economic impact of the coronavirus (COVID-19) pandemic might be measured. In general, transactions in goods and services in the marketplace are well served by this framework. Aggregate values of production and expenditure are usually well-aligned in these markets, as is the value of the income generated by these activities. However, it has long been recognised that non-market output is a measurement challenge. It is also likely to have been significantly impacted by the COVID-19 pandemic.
Non-market output comprises the production of goods and services by the government or non-profit institutions serving households (NPISH), either supplied for free or at prices that are not economically significant. In the UK, this includes most of healthcare and education provision as well as other public services such as the courts and criminal justice systems, public order and defence, social protection, and other administrative activities.
The policy response to the COVID-19 pandemic – including the closure of schools and the switch to remote learning – has had a particular impact on the education system. It has also required an evolution of our approach to measurement.
Consistent with international guidance, the Office for National Statistics (ONS) uses separate approaches to estimate the current price value of education output and the volume of education output. The former is a component of nominal gross domestic product (GDP), while the latter contributes to the widely quoted chained volume measure (CVM) estimates. Both approaches reflect the absence of a market-based price mechanism for education services, which prevents the application of methods of valuation used in other settings.
In current price terms, the output of these activities is estimated through the "sum of costs" approach: adding together the intermediate consumption, labour costs and depreciation of fixed assets associated with these activities. This covers the value of goods and services consumed in the production process as well as the costs of the factors of production. These data are generally available for central government through the Online System for Central Accounting and Reporting (OSCAR) and from local government data collections, although estimates for depreciation are calculated through the ONS' perpetual inventory model.
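The "sum of costs" approach can be sketched in a few lines. This is a minimal illustration, not actual OSCAR or local government data; the figures and the `sum_of_costs` helper are invented for the example.

```python
# Sketch of the "sum of costs" valuation of non-market output:
# the current price value is the sum of the input costs.

def sum_of_costs(intermediate_consumption, labour_costs, depreciation):
    """Current price value of non-market output as the sum of input costs."""
    return intermediate_consumption + labour_costs + depreciation

# Illustrative values in £ million
value = sum_of_costs(intermediate_consumption=400.0,
                     labour_costs=1100.0,
                     depreciation=150.0)
print(value)  # 1650.0
```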
In volume terms, the measurement of education output in the UK is based on cost-weighted activity indices. This involves gathering data on changes in the number of students in different educational settings and weighting them together according to their relative unit costs of production. Increases in the number of students in a relatively high (low) weight activity consequently increase measured education output by a relatively large (small) amount.
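A cost-weighted activity index can be illustrated as follows. The settings, student counts and unit costs below are invented for the sketch; the point is that a change in student numbers moves the index in proportion to that setting's share of total costs.

```python
# Minimal sketch of a cost-weighted activity index: student numbers in
# each setting are weighted together by their relative unit costs.

def cost_weighted_index(students_base, students_current, unit_costs):
    """Ratio of cost-weighted activity in the current period to the base."""
    base = sum(students_base[s] * unit_costs[s] for s in unit_costs)
    current = sum(students_current[s] * unit_costs[s] for s in unit_costs)
    return current / base

# Illustrative unit costs (£ per student per year) and student counts (thousands)
unit_costs = {"primary": 5_000, "secondary": 6_200, "special": 24_000}
base = {"primary": 100, "secondary": 80, "special": 5}
current = {"primary": 102, "secondary": 80, "special": 5}  # growth only in primary

index = cost_weighted_index(base, current, unit_costs)
print(round(index, 4))
```

Because primary provision carries a moderate cost weight here, a 2% rise in primary student numbers raises the overall index by less than 2%; a rise in the high-weight special schools setting would move it proportionally more.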
To implement this approach, we gather information on the number of students in eight different educational settings – ranging from nursery schools to secondary schools, special schools to teacher training courses – from England and the devolved administrations. To measure school activity – which is the largest part of education output – we have generally relied on annual estimates of student numbers provided by the school censuses carried out across the UK.
To meet these measurement challenges, we have considered a range of different options, settling on a novel approach designed to address them.
In this approach, we have been guided by consultation with national accounting experts, both from inside and outside the Office for National Statistics (ONS), and by the existing international guidance on measurement. However, as formal guidance on some of the methodological points involved is not yet available, further iterations of this approach may be required. This in turn generates the possibility for larger than usual revisions.
Our approach broadly involves two steps: first, an effort to calculate accurate numbers of students actually attending school; and secondly, a method for assessing the contribution of "remote learning" to student volumes.
The first stage of our response has been to incorporate newly available data on the number of students attending school. As educational provision has continued for the children of key workers and for vulnerable children, these data provide a helpful baseline for our output estimates. These series are available for England and Wales, providing daily returns from shortly after the school closures.
These data enable us to estimate the number of student attendances in the first quarter of 2020 – including both the period of normal provision and that which followed the school closures. They have also led us to consider our activity metrics for different settings in some detail. Estimated student numbers in the first quarter of 2020 are now based on the following calculations:
We projected annual estimates of student populations in each setting using growth rates from the Department for Education (DfE); these were for overall growth in student numbers of 1.5% and 1.2% in 2019 and 2020 respectively.
We used our usual cubic splining methodology to calculate a quarterly path through these student projections, giving us our quarterly student numbers starting point.
We divided total attendance in the first quarter of 2020 by the total number of teaching days (62), multiplying by the number of these for which the schools were open as normal (55); this gave us a measure of normal provision up to the point of the school closures.
We calculated an average attendance rate for the remaining seven days of provision using data on attendances from the DfE to weight the remaining amount of "normal" provision; we added this to the "pre-closure" level of output, assuming that every student still attending school is equal to one full-time equivalent (FTE).
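The steps above can be sketched as a single calculation: full attendance for the 55 "normal" teaching days, then the observed attendance rate for the remaining seven. The student total and the post-closure attendance rate below are illustrative assumptions; only the 62 teaching days and 55 pre-closure days come from the text.

```python
# Sketch of the first-quarter FTE attendance calculation. The quarterly
# student number (from the cubic spline through annual projections) and
# the post-closure attendance rate are invented for illustration.

def q1_fte(students, teaching_days=62, normal_days=55, attendance_rate=0.02):
    """Average FTE attendance over the quarter: full attendance for the
    normal days, then the observed attendance rate for the remainder."""
    closure_days = teaching_days - normal_days
    return students * (normal_days + closure_days * attendance_rate) / teaching_days

fte = q1_fte(students=8_000_000, attendance_rate=0.02)  # 2% is an assumed rate
print(round(fte))
```

With these assumed inputs, the sharp fall in attendance over the final seven teaching days reduces quarterly FTEs by only around 11%, which is why the effect on the first quarter is muted.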
This approach carries several limitations. Some of these we will revisit in coming months, while others are intrinsic to our approach. In particular:
our pupil projections and absence rates, and our attendance data, are an average across the range of school settings in England; we will try and include as much information from the devolved administrations as possible in the coming months
we have not sought to adjust pupil projections for different settings to reflect policy developments; in particular, attendance at academy schools has been growing strongly in recent history – reflecting the conversion of primary and secondary schools – and we will consider this in the coming months as well
we have assumed that educational instruction currently provided in schools is "equivalent" to normal, even though this is unlikely to be the case given, for example, the suspension of the National Curriculum; however, reflecting the underlying ambiguity of the measures, we have not attempted to adjust for the relative value of "childcare" and "education" services
we have assumed that educational instruction is at a constant rate through the available school days in the quarter
The result of this approach is a considerable fall in the number of FTE attendances in schools in the first quarter of 2020. This effect has been muted by the "normal" operation of schools for much of this period.
Counting "remote learners"
Simply counting the number of attendances in schools is likely to understate the "true" value of education services provided because it does not consider remote learning. However, estimating the contribution of remote learning precisely is difficult. Here, there is a spectrum of options ranging from counting each remote learning FTE as equivalent to an attending FTE (making the assumption that instruction in school and at home is equivalent), to discounting these "remote learners" entirely (assuming no educational provision in the home). Approaches closer to the first option are likely to overstate provided education, while options closer to the latter are likely to understate it.
To produce an appropriate "weight" for remote learners, we have considered two factors: the change in average teacher input and the fraction of learning that is dependent on parents and guardians.
For the former, we propose to adjust the number of "remote learning" FTEs by the change in average teacher input. Here, we argue that if teachers are working fewer hours than normal, then this reflects a reduction in the amount of input they are providing to school children compared with normal. We assume that this reduced input has a proportional impact on the volume of output produced.
For the latter, we propose a further adjustment to capture the dependence of remote learning on instruction and support in the home. Here, we argue that some fraction of the remaining provision is being provided by parents – and consequently accrues as household production, rather than contributing to gross domestic product (GDP).
To implement these adjustments, we have delivered a survey to teachers via Teacher Tapp, a survey app run by Educational Intelligence Limited. This company was established by a team with experience from the academic sector, schooling and education journalism. Teacher Tapp maintains a sample frame of teachers across England in a range of different educational settings. Their smartphone app poses questions to teachers on a daily basis. The results are weighted using the English School Workforce Census (on the basis of the sex, age and leadership status of teachers and on the region and setting of the school) and the resulting intelligence is used to inform policy debates.
To inform our proposed adjustments, Teacher Tapp posed three questions to teachers on Sunday 26 April 2020. These and the results they provided are shown in this section. We propose to base our adjustments on this evidence, backdated to the first quarter of 2020.
This question received 6,436 responses from teachers in state-funded primary and secondary schools. It enables us to estimate average hours worked over the reference period (week commencing 20 April 2020) in these settings – using mid-points for the brackets provided. This suggests that there is relatively little difference between the hours worked by primary and secondary school teachers over this period: 26.3 hours on average in primary schools and 27.3 hours on average in secondary schools.
This question received 6,434 responses and enables us to ask about the extent to which teacher hours have changed compared with the pre-closure position. For both state-funded primary and secondary schools, a large fraction of teachers report working fewer hours during the reference week than during the pre-closure period, although some teachers in both settings report working longer working weeks.
The difference here is more notable for primary than for secondary schools: in primary schools, the average working week was 11.0 hours shorter during our survey week than in normal times, whereas in secondary schools, it was just 9.0 hours shorter.
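The survey figures above imply the change in average teacher input directly: current average hours divided by the pre-closure average (current hours plus the reported reduction). This sketch simply reproduces that arithmetic from the quoted figures.

```python
# Change in average teacher input implied by the survey: the ratio of
# current average hours to the pre-closure average.

def input_ratio(current_hours, hours_reduction):
    """Current average weekly hours as a share of pre-closure hours."""
    return current_hours / (current_hours + hours_reduction)

primary = input_ratio(26.3, 11.0)    # 26.3 / 37.3, approximately 0.71
secondary = input_ratio(27.3, 9.0)   # 27.3 / 36.3, approximately 0.75
print(round(primary, 2), round(secondary, 2))
```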
This question received 6,434 responses across state-funded primary and secondary school teachers and gives us some sense of the importance of delivery through parents and guardians in the home. Using these data, we can estimate the extent to which learning was supported by household members in primary and secondary settings.
Among primary school teachers, the largest weight of the responses fell on 0%, 50% and 100%. This indicates considerable division among teachers in this group about the extent to which parents were important in supporting instruction. Among secondary school teachers, however, the majority reported that instruction was not at all dependent on parents. These results align with our expectations: the stronger reading, IT and independent learning skills of older students make it more likely that they will be able to work through instruction provided directly by teachers than primary school children. These results indicate that 40% of learning outcomes in primary schools are dependent on parents, whereas 9% of learning in secondary schools is dependent on parents.
Using the data provided, we proceeded to estimate the "discount factor" for remote learning full-time equivalents (FTEs). We first estimated the change in average hours worked for primary and secondary school teachers separately. These ratios are 71% and 75% respectively. We then further discount by the share of learning outcomes that are dependent on teachers. This implies that a primary school student who is learning from home is worth around 42% of an FTE (1 FTE multiplied by 71% multiplied by 60%, allowing for rounding), and a secondary school student who is remote learning is worth around 68% of an FTE (1 FTE multiplied by 75% multiplied by 91%, allowing for rounding).
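The discount factor is simply the product of the two adjustments: the change in teacher input, multiplied by the share of learning not dependent on parents. The sketch below uses the rounded percentages quoted above, so the primary figure comes out at 43% rather than the 42% the article derives from unrounded data.

```python
# The remote-learning discount factor: teacher input ratio multiplied
# by the share of learning that remains dependent on teachers.

def remote_fte_weight(teacher_input_ratio, parent_share):
    """Weight of one remote-learning student relative to one attending FTE."""
    return teacher_input_ratio * (1 - parent_share)

primary = remote_fte_weight(0.71, 0.40)    # 0.71 * 0.60 = 0.426 with rounded inputs
secondary = remote_fte_weight(0.75, 0.09)  # 0.75 * 0.91 = 0.6825
print(round(primary, 2), round(secondary, 2))
```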
These factors give us a basis for estimating FTEs among remote learners, which we add to the FTE numbers who are attending in each setting. These adjusted FTE data form the basis for the education output data that we contribute to the compilation of gross domestic product (GDP).
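The final combining step is a weighted sum: attending students count as one FTE each, while remote learners count at their discounted weight. All the numbers below are illustrative.

```python
# Combining attending and remote-learning FTEs for one setting.
# Student counts and the weight are invented for illustration.

def adjusted_fte(attending, remote, remote_weight):
    """Total FTEs: attendances at full weight plus discounted remote learners."""
    return attending + remote * remote_weight

total = adjusted_fte(attending=100_000, remote=4_000_000, remote_weight=0.42)
print(round(total))  # 1780000
```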
Finally, it is worth noting two further things that we do not do.
We have not, at this stage, considered any changes to the cost weightings that are applied to different forms of education. Arguably, if the costs of provision have changed more notably for some forms of activity than others, then we might need to consider these changes in the future.
We have not taken account of how much (or not) students are participating in the learning provided. This is consistent with draft international guidance and is reflective of our 'normal approach' to measurement. We do not, in normal times, adjust for the level of student engagement with classes – and so on this basis we have not sought to further discount remote learning FTEs by any factor to represent this consideration.
The coronavirus (COVID-19) pandemic has had a significant impact on our measures of government education output. While our current price metrics are largely unaffected to date, our estimates of education volumes have been impacted – in particular, by the closure of schools at the end of March 2020.
To account for this, consistent with international guidance, we have approached the measurement of education output in two stages for the first quarter of 2020. First, we have incorporated estimates of the number of children actually attending school in England, counting these attendances as equivalent to the period prior to the closure of schools. Secondly, we have chosen to "count" remote learning towards education output, with two adjustments: one for the change in the amount of teacher input applied over this period and a second for the amount of delivered education that is dependent on instruction in the home.
Our practical implementation of this conceptual approach has involved a number of assumptions. Some of these are a matter of expediency and will be reviewed in the coming months. Others are more intrinsic to our approach. We await finalised international guidance on these matters, which may alter our approach; in the meantime, it reflects our best interpretation of the available guidance and our understanding of the UK context. In view of both these features of our approach, we expect our initial estimates to be subject to larger revisions than normal.
Contact details for this Article
Telephone: +44 (0)1633 651609