1. Executive summary

The census is a once-in-a-decade, compulsory survey that gives us the most accurate estimate of all the people and households in England and Wales.

The data collected help central and local government, health authorities, and many other organisations to target their resources more effectively and to plan services such as housing, education, health and transport.

In December 2018, the government presented a White Paper to Parliament: Help Shape Our Future: The 2021 Census of Population and Housing in England and Wales. The White Paper detailed the topics the Office for National Statistics (ONS) recommended for inclusion in Census 2021. It also set out that, for the first time, it would be an online-first census.

An online-first census

Census 2021 has been developed so the electronic questionnaire will be the primary mode of completion. The ONS has a target of 75% of census returns to be submitted online.

Paper forms will still be available on request.

Welsh language versions of both the electronic and paper questionnaires will be available in Wales.

Extensive stakeholder engagement, research and testing over more than three years have informed the development of these questionnaires.

Back to table of contents

2. Aim of the Census 2021 question and questionnaire development overview

The Office for National Statistics (ONS) is publishing a series of Census 2021 question development topic reports. These will provide a detailed explanation of the research, testing and evaluation we have carried out to arrive at our question designs for Census 2021.

This report explains:

  • our approach to question development
  • the design process of the online and paper census questionnaires
  • how we ordered the census questions
  • the criteria we used to evaluate the final question designs
  • our next steps as we prepare for Census 2021

This report complements the question development reports on the topics that make up Census 2021:

  • armed forces community (veterans)
  • communal establishments
  • counting residents and visitors
  • demography (including household relationships and marital and civil partnership status)
  • ethnic group, national identity, language and religion
  • gender identity
  • health and unpaid care
  • housing
  • labour market: economic activity and hours worked
  • labour market: occupation, industry and travel
  • qualifications
  • second address and migration
  • sexual orientation

We will also publish the following reports that cover areas of development not related to the final question designs: Output and enumeration bases: residential address and population definitions for Census 2021 and Volunteering question -- not recommended for inclusion in Census 2021.

3. Approach to question development

We have conducted extensive stakeholder engagement, research and testing to inform the design of the questions on the questionnaires for Census 2021 in England and Wales. A comprehensive list of tests carried out by the Office for National Statistics (ONS) between 2016 and 2020 that have informed the design of the Census 2021 questions is provided in the Summary of testing for Census 2021.

Each question's development was unique, based on our findings at each stage. However, all testing followed a basic structure, beginning with engagement with data users to understand their requirements, followed by a programme of qualitative, quantitative and user experience (UX) testing.

The same data are collected on both the electronic and paper questionnaires. However, we have optimised the question designs separately for each version of the questionnaire to ensure that we collect the best quality data.

Stakeholder engagement

The starting point for our engagement, research and testing programme was the 2015 public consultation, The 2021 Census -- Initial view on content for England and Wales (PDF, 3.6MB). In 2016, we published our consultation response (PDF, 796KB). This detailed the scoring mechanism we used to evaluate users' data needs, provided the rationale for our decisions, and outlined our proposals and research plans.

Since the initial public consultation in 2015, the ONS has continued to work with stakeholders, topic experts and other interested parties to better understand the detailed needs for specific questions and inform our final questionnaire design. We did this through a range of forums, including:

  • meetings and correspondence with stakeholder organisations
  • consulting experts on specific topics for their independent, specialist knowledge
  • presenting and seeking feedback at census events
  • seeking advice from Census Advisory Groups (CAGs), which represent the interests of local authorities, central government departments, academics, third sector bodies, business and professional bodies
  • collaborating with the Welsh Government and the Welsh Language Commissioner
  • establishing topic-specific working groups with representation from topic experts from the ONS, Welsh Government, National Records of Scotland (NRS) and Northern Ireland Statistics and Research Agency (NISRA)
  • conducting stakeholder surveys on specific topics

Following the public consultation, we have published information on our research and testing, including two census topic research updates: 2021 Census topic research: December 2017 and 2021 Census topic research update: December 2018. These reports focused on decisions around which topics to include rather than the exact wording of the questions.

In December 2018, the government presented a White Paper to Parliament: Help Shape Our Future: The 2021 Census of Population and Housing in England and Wales. The White Paper detailed the topics the ONS recommended for inclusion in Census 2021.

Following the publication of the White Paper, we continued our programme of research and testing into the final recommended question designs for Census 2021. The question development topic reports and the Summary of testing for Census 2021 provide details of this programme.

Research methods

We used a variety of qualitative and quantitative research methods to develop the questions for Census 2021. Table 1 provides an overview of the qualitative methods used, and Table 2 provides an overview of the quantitative methods used.

Most qualitative testing was conducted in either English or Welsh, while most quantitative testing was conducted in English and Welsh.

Sampling methods

When carrying out research, it is rarely feasible or economical to study every member of the population or sub-population being researched. Sampling involves selecting a segment of the population to take part in the study so that you may make inferences about the whole population.

Throughout the census research and testing programme, we have used several different sampling methods, depending on the aims of the test.

The sampling method used should reflect the aims of the research. The sample should be representative of the population being studied and large enough that findings can be generalised to the whole population where appropriate. However, if the aim is to infer the extent of views on a topic, a different approach to sampling may be required.

Developing questions in the Welsh language

The Welsh language versions of the census questions are not simply a translation of the English questions. New and amended questions have been developed in both Welsh and English.

To ensure questions adhere to Cymraeg Clir guidelines, changes to the question text were translated by our contracted specialist Welsh language translation service provider. These changes were quality assured by the Welsh Language Census Question Assurance Group. This group was convened to give advice on the accuracy, clarity and acceptability of the language, as well as on other policy issues pertaining to the Welsh language and bilingual design. It includes Welsh language and policy experts from the Welsh Language Commissioner and the Welsh Government.

We have completed dedicated question development testing in the Welsh language to optimise the question wording. You can find details of this testing in the Summary of testing for Census 2021.

Further information relating to Welsh language question development and testing can be found in the relevant question development topic reports.

Censuses in Scotland and Northern Ireland

The questions for England and Wales have been developed through close collaboration with the National Records of Scotland (NRS) and Northern Ireland Statistics and Research Agency (NISRA), which are responsible for developing the censuses in Scotland and Northern Ireland respectively.

A statement of agreement (PDF, 165KB) between the National Statistician and the Registrars General for Scotland and Northern Ireland was published in October 2015. The agreement details the conduct of the census and states that, while each country will be autonomous in their decision-making on the three separate censuses, they will aim to work collaboratively to meet the needs of users and provide harmonisation across outputs and procedures. In November 2016, an update on progress (PDF, 321KB) was published. In November 2019, we published a further progress update.

4. Electronic questionnaire design

In March 2014, following an extensive programme of research and public consultation, the National Statistician recommended an online-first census for 2021.

The Office for National Statistics' (ONS') target for Census 2021 is for 75% of all household responses to be completed online. This figure is based on previous and international experience of online collection and predictions of the digital take-up of services.

The aims of the online-first approach are to:

  • improve data quality
  • ease respondent burden
  • enable responses to be processed faster
  • reduce costs

The electronic questionnaire has been developed according to Government Digital Service (GDS) standards. We have worked closely with the GDS to ensure that Census 2021 meets the required standards. As part of this work, we assessed the electronic questionnaire used for the 2017 Census Test, and the findings were published in the Census Test 2017 - Beta Assessment.

When we refer to the Census 2021 electronic questionnaire, this includes several different types of questionnaire. These are:

  • household questionnaire
  • individual questionnaire
  • communal establishment questionnaire

The electronic questionnaire design elements described in this report apply to all versions of the electronic questionnaire, unless stated otherwise.

User experience (UX) testing

As Census 2021 will be online-first, a central component of our research was to redesign the questionnaire to optimise for online response. This research focused on:

  • problems preventing successful completion of the online questionnaire
  • respondent ability to navigate around the questionnaire
  • respondent understanding of the questions
  • overall respondent burden

We have conducted UX testing on a rolling basis since November 2017. As of October 2019, 458 interviews had been conducted at 99 events. Participants were purposively selected to cover a wide range of ages and digital abilities, and we included participants with physical and mental health conditions or illnesses. The research took place using a range of devices and assistive technologies.

UX testing is ongoing and contributes to the decisions on the design of the electronic questionnaire.

Responsive design and accessibility

The electronic questionnaire features a responsive design. This means the layout of questions and web pages will adapt to be optimised for different devices, such as mobile phones, laptops and tablets. We have tested the electronic questionnaire on all web browsers that are used by more than 2% of the population.

The design is inclusive, accessible and works with a range of assistive technologies. The design conforms to level AA of the Web Content Accessibility Guidelines (WCAG) 2.1.

Electronic questionnaire structure

All households will receive a household access code by post. This provides the household with access to the electronic questionnaire.

When a respondent first accesses the electronic questionnaire, they will need to confirm that their address is correct and complete the "People who live here" section. This will generate the remaining sections of the questionnaire.

The sections that appear in the electronic questionnaire are:

  • People who live here
  • Household accommodation
  • Individual questions (each household member has a separate section)
  • Visitor questions (each visitor has a separate section)

Section and information pages

Each section of the electronic questionnaire begins with a page describing what the respondent will be asked and what information they will need in order to answer the questions to follow.

Additional information pages appear before the individual questions on qualifications, main job and last main job. These provide information that will help respondents answer the questions that will follow.

Hub pages

The hub page shows a list of questionnaire sections and their completion status. Respondents can use these pages to:

  • select a new section to answer
  • navigate to the summary page for a completed section
  • review their progress through the questionnaire

Questions

Questions consist of, at minimum, a question stem and one or more response options. Where possible, we have sought to minimise the amount of additional information shown on screen for each question. However, testing has shown that some additional information can help respondents provide more accurate answers, leading to better-quality data.

This additional information can be presented in different ways; we refer to each way as a design element. No single question includes all available question design elements.

Question stem

The main question text is displayed first. The content of the question stem can change based on previously provided responses. For example, the first line of the household address might appear in the question stem for questions about the household.

Question description and include panel

Some questions are followed by additional lines of guidance to inform respondents of what should be included in their response. The question description is displayed as plain text, and the include panel appears in a box.

Accordion guidance

On the electronic questionnaire, we have included additional detailed guidance in an accordion. The respondent can click on the title to reveal the more detailed guidance. The accordion might appear above or below the response options, depending on its purpose.

An upper accordion defines words or acronyms used within the question, such as what we mean by "main language" or "national identity". This provides additional context to the question stem.

A lower accordion explains why we are asking the question, such as why we collect workplace address. This helps to explain to respondents why the information is being collected, particularly where there may be concerns over privacy.

Response options

Response options are the possible answers to the question. Response options appear as interactive elements and fall into one of three categories:

  • tick-box: respondents can select one or more responses to the question
  • radio button: respondents can only select one possible response to the question
  • write-in: respondents can provide a written response to the question

Response options may also include additional information as labels. These provide more context to help respondents answer the question. For example, the central heating "solid fuel" response option includes the label "For example, coal" on a second line.

Summary pages

At the end of each section, respondents will be returned to the hub page. From there, they can choose to review a summary of their responses on the summary page. If any questions have not been completed, the respondent can click "Change" to navigate back to the question and change their response.

Every section must be completed before the questionnaire can be submitted.

Electronic questionnaire features

We have incorporated several features that will help respondents to complete the electronic questionnaire. These features aim to reduce the respondent burden and improve the quality of the data collected. All electronic questionnaire features have resulted from a respondent need and have been developed iteratively through UX and other testing.

Clear navigation

The electronic questionnaires have been designed so that respondents can easily:

  • navigate to the next question
  • review previous answers
  • return to the hub page to complete the next section

Online help web page

The questionnaire will contain links to the online help web page, which provides a comprehensive source of census information and guidance.

Welsh language switch

Respondents in Wales will be able to switch between Welsh and English language forms at any stage of the electronic questionnaire using a link located in the top-right corner of the screen. On Welsh language screens, this will read "English", and on English language screens this will read "Cymraeg".

Automatic routing

Respondents will only be presented with questions that are relevant to them, based on the answers previously given. For example, respondents who have never worked will not be presented with the questions on their most recent place of employment.

On the paper questionnaire, all respondents can see every question, whether or not it is relevant to them. Routing instructions are provided to allow respondents to skip to the next relevant question or section. This increases the burden on respondents, who have to read and correctly follow the routing instructions to avoid answering questions that do not apply to them.

Automatic routing on the electronic questionnaire allows for additional routing that is not possible on the paper questionnaire. For example, the respondent's date of birth can be used throughout the questionnaire to skip over certain questions.

Automatic routing also allows us to present respondents with different versions of a question depending on the answers they have previously given. For example, only respondents aged 16 years and over will see the guidance note on the sex question that "A question about gender will follow". Respondents aged 15 years and under will not see this guidance note, as they will not be asked the gender identity question.
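The age-based routing described above can be sketched in code. This is a minimal illustration only, with hypothetical question identifiers and a simplified rule set; it is not the ONS implementation.

```python
from datetime import date

def age_on(census_day: date, date_of_birth: date) -> int:
    """Whole years of age on census day."""
    before_birthday = (census_day.month, census_day.day) < (date_of_birth.month, date_of_birth.day)
    return census_day.year - date_of_birth.year - before_birthday

def route(answers: dict, census_day: date) -> list:
    """Select only the questions relevant to this respondent."""
    questions = ["name", "date_of_birth", "sex"]
    if age_on(census_day, answers["date_of_birth"]) >= 16:
        # These questions are asked of respondents aged 16 years and over only.
        questions += ["sexual_orientation", "gender_identity", "armed_forces"]
        if answers.get("ever_worked"):
            # Respondents who have never worked skip the employment detail questions.
            questions += ["job_title", "workplace_address"]
    return questions
```

Because routing is computed from stored answers, changing an earlier response (such as the date of birth) automatically changes which later questions are presented.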

Automatic text fill using previous responses

Automatic text fill is the process of using a respondent's previous responses, such as a name or address, elsewhere in the questionnaire. This reduces respondent burden by making questions clearer and reducing the number of times that respondents need to enter the same information.

For example, at the start of the individual questions, respondents are asked if they are answering about themselves or on behalf of another household member. If they are answering on behalf of someone else (that is, providing a proxy response), that person's name will be included in the question stem.
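The piping of a previous response into a question stem can be sketched as simple template substitution. The template syntax and wording here are illustrative assumptions, not the actual Census 2021 text.

```python
def fill_stem(template: str, values: dict) -> str:
    """Substitute previously given responses into a question stem."""
    return template.format(**values)

# Proxy response: the household member's name is piped into the stem.
stem = fill_stem("What is {name}'s date of birth?", {"name": "Ankit"})

# For a respondent answering about themselves, a different template
# ("What is your date of birth?") would be selected instead.
```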

Search-as-you-type

For some questions, when a respondent starts typing a write-in response, a list of suggested answers will appear. This search-as-you-type functionality is included on the following questions:

  • country of birth
  • national identity
  • ethnic group
  • religion
  • main language
  • passports held

Respondents will still be able to write in a response that does not appear as a suggested answer, if they choose to do so.
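The search-as-you-type behaviour can be sketched as prefix matching against a reference list, while always accepting free text. The country list below is a tiny illustrative sample, and the function names are hypothetical.

```python
COUNTRIES = ["France", "Finland", "Fiji", "Germany", "Ghana"]

def suggest(partial: str, options=COUNTRIES, limit: int = 5) -> list:
    """Case-insensitive prefix matches for the text typed so far."""
    p = partial.strip().lower()
    if not p:
        return []
    return [o for o in options if o.lower().startswith(p)][:limit]

def record_answer(typed: str, chosen=None) -> str:
    """A suggestion may be selected, but the typed text is always accepted."""
    return chosen if chosen is not None else typed
```

A production implementation would match against the full coding frame and may also match on synonyms or substrings, but the principle is the same: suggestions guide respondents towards codable answers without restricting what they can write.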

Mutually exclusive response options

Where two or more responses to a single question cannot both be true at the same time, the electronic questionnaire includes functionality to de-select previously ticked responses.

For example, if a respondent selects one or more responses and then selects "None of these apply", then any previously selected responses will be de-selected automatically.

This functionality removes the possibility of a multi-tick error when processing the online returns.
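The de-selection logic can be sketched as follows, assuming a single exclusive option per question; the option wording is taken from the example above and the function name is hypothetical.

```python
EXCLUSIVE = "None of these apply"

def toggle(selected: set, option: str) -> set:
    """Return the new selection after the respondent ticks `option`."""
    if option == EXCLUSIVE:
        # Ticking the exclusive option clears all other selections.
        return {EXCLUSIVE}
    # Ticking any other option clears the exclusive one.
    return (selected - {EXCLUSIVE}) | {option}
```

Applying this rule on every tick means the stored responses can never contain a contradictory combination, so no multi-tick error needs to be resolved downstream.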

Saved responses

Responses are saved after every question. If a respondent closes or logs out of a partially completed questionnaire to complete it at another time, their responses will not be lost.

In order to resume a partially completed questionnaire, the respondent will need to input their household access code or, if they have requested an individual questionnaire, individual access code.
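The save-and-resume behaviour can be sketched as a store of answers keyed by access code. This is a minimal in-memory illustration; the real service would persist responses securely server-side.

```python
class QuestionnaireStore:
    def __init__(self):
        self._saved = {}  # access code -> {question id: answer}

    def save(self, access_code: str, question_id: str, answer):
        """Persist each answer as soon as the question is completed."""
        self._saved.setdefault(access_code, {})[question_id] = answer

    def resume(self, access_code: str) -> dict:
        """Return previously saved answers for this access code."""
        return dict(self._saved.get(access_code, {}))
```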

Validating responses

The electronic questionnaire allows us to build in checks to improve the quality of the data collected. This is done using error messages and additional questions.

Error messages

If a respondent provides an unexpected response, such as an invalid date (for example, "02 13 2020"), then tries to submit their answer, they will be presented with an error message that explains how they should correct their answer.

Some questions must be answered before the respondent can proceed to the next question. For example, the response to the date of birth question will determine which questions the respondent will see later in the questionnaire. If a respondent tries to advance without answering a required question, they will receive an error message asking them to provide an answer in order to proceed.
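The validation of a required date question can be sketched as below. The error message wording is illustrative, not the actual Census 2021 text.

```python
from datetime import date

def validate_date_of_birth(day: str, month: str, year: str):
    """Return (parsed date, None) on success or (None, error message)."""
    if not (day and month and year):
        # Required question: block progression until answered.
        return None, "Enter your date of birth to continue"
    try:
        dob = date(int(year), int(month), int(day))
    except ValueError:
        # Rejects impossible dates such as "02 13 2020".
        return None, "Enter a real date, for example 12 6 1982"
    if dob > date.today():
        return None, "Date of birth must be in the past"
    return dob, None
```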

Additional questions

Some additional questions will be used on the electronic questionnaire to prompt the respondent to check their answer before submitting it. For example, after a respondent answers the date of birth question, they will be shown an additional question displaying their age and asking if this is correct. This helps to ensure that we collect a vital demographic variable as accurately as possible.
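The age confirmation step can be sketched as deriving the age from the stated date of birth and playing it back to the respondent. The prompt wording here is illustrative.

```python
from datetime import date

def confirmation_prompt(date_of_birth: date, census_day: date) -> str:
    """Derive age from the response and play it back for checking."""
    before_birthday = (census_day.month, census_day.day) < (date_of_birth.month, date_of_birth.day)
    age = census_day.year - date_of_birth.year - before_birthday
    return f"You are {age} years old. Is this correct?"
```

If the respondent answers "No", they can be routed back to the date of birth question to correct their entry.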

5. Paper questionnaire design

Some people will need, or prefer, to respond to the census using a paper questionnaire. The Office for National Statistics (ONS) will provide paper forms to any respondent who asks for them through our dedicated contact centre.

When we refer to the Census 2021 paper questionnaire, this includes several different types of questionnaire. These are:

  • household questionnaire
  • household continuation questionnaire
  • individual questionnaire
  • communal establishment questionnaire

The paper questionnaire design elements described in this report apply to all versions of the paper questionnaire, unless stated otherwise.

Changing the structure of the household form

The starting point for the design of the Census 2021 paper questionnaires was the design used in the 2011 Census. However, the addition of three new questions for Census 2021 required us to include an extra page per person on the paper questionnaire. To offset the increased cost of printing longer forms, we reduced the number of respondents on the household form from six to five. Any households with more than five members can request a continuation form so that all household members can be recorded.

In the 2011 Census, the household questionnaire included a page of information for respondents at the end. In Census 2021, this information will be provided on a separate information leaflet. This will make the guidance more obvious to respondents.

Improving readability and accessibility

We have changed the colours on the paper forms to increase contrast, improving the readability of the questionnaires:

  • question numbers have been changed to purple text on a white background
  • question stems appear on a white background, rather than a purple background
  • instructions are written in purple text on a white background
  • routing instructions have a white background to make them stand out

The changes in design on paper from the 2011 Census to Census 2021 are illustrated in Figures 5 and 6.

6. Order of the census questions

Where possible, we have sought to keep the Census 2021 questions in the same order as the 2011 Census for consistency. However, some changes to question order have been necessary.

The factors leading to a change in question order for Census 2021 are:

  • the addition of new questions
  • interaction between questions
  • grouping of similar questions

The household questions remain in the same order as in the 2011 Census, but we have made alterations to the order of the individual questions.

Changes to the 2011 Census question order

This subsection summarises the list of changes to the question order between the 2011 Census and Census 2021. More details on these changes can be found in the relevant question development topic reports. The changes include:

  • the date of birth question has been moved before the sex question, as the respondent's date of birth is used to change how the sex question is displayed depending on whether the respondent is aged under 16 years or aged 16 years and over

  • the new questions on sexual orientation, gender identity and armed forces leavers will only be answered by respondents aged 16 years or over; the questions on sexual orientation and gender identity are asked before the qualifications questions, and the question on armed forces leavers will be asked after the qualifications questions

  • the questions on general health and inactivity owing to health problems or disability have been moved together to help the flow of the questionnaire and meet the Government Statistical Service (GSS) harmonisation recommendation on how to order these questions

  • the question on address one year ago has been moved before the national identity question owing to space considerations on the paper questionnaire

7. Question design evaluation

In May 2016, we published our response to the 2021 Census user requirements consultation (PDF, 796KB). In this report, we evaluated each census topic against five criteria to ensure good quality data could be collected in a manner proportional to users' needs. Each topic was rated as having a low, medium or high impact on each of the five criteria.

This evaluation informed our research and testing programme. For example, where including a topic presented a potential negative impact on one or more criteria, we tested alternative question designs that could reduce this impact.

The final recommended question designs have been re-evaluated against the same criteria. Our revised evaluations for the final recommended questions will be published in the relevant question development topic reports.

Evaluation criteria

We evaluated each census question for its potential impact on data quality, public acceptability, respondent burden, financial concerns and questionnaire mode.

Each question is scored against a number of subjective and objective factors that impact each of the five criteria. These factors include question design considerations and evidence collected from testing. The details of these factors can be found in Annex 2.

Questions receive an overall evaluation of low, medium or high potential for impact on each of the five criteria. We refer to the "potential for impact" as we are unable to replicate the context of Census 2021 in testing.

Data quality

The data collected in the census should be of sufficient quality for outputs to be useful. The census should not seek to collect information that is not readily known or remembered accurately.

Public acceptability

The census should not ask sensitive or intrusive questions that have a negative impact on response or lead to respondents giving socially acceptable (rather than accurate) answers. The census should not inquire about opinions or attitudes. The census is carried out for statistical purposes. It should not collect data that would deliberately promote political or sectarian groups or sponsor particular causes.

Respondent burden

The inclusion of questions on a topic should not impose an excessive burden on respondents. Burden could result from lengthy instructions or explanations, large numbers of response categories, or large numbers of questions on a single topic.

Financial concerns

Questions should not present major coding problems or require extensive processing. Questions should not significantly add to the overall cost of the census by causing high levels of field follow-up owing to non-response.

Questionnaire mode

The move to predominantly online data collection creates new opportunities as well as challenges. Although the primary mode of data collection will be online, there will also be a paper questionnaire. Questions should be designed so that respondents interpret the questions and answer consistently across all modes, allowing consistent data to be collected.

8. Equality impact assessment (EIA)

Under the Equality Act 2010, public sector bodies must have due regard to the need to:

  • eliminate discrimination, harassment, victimisation and any other conduct that is prohibited by or under the Act
  • advance equality of opportunity between persons who share a relevant protected characteristic and persons who do not share it
  • foster good relations between persons who share a relevant protected characteristic and persons who do not share it

An EIA is used to examine whether a policy or project has the potential to affect communities or people differently.

By its nature, the census is designed to be completed by everyone in England and Wales. In December 2018, we published the EIA for the 2021 Census. This assessment was based on the design as set out in the White Paper.

9. Next steps

A draft Order in Council in respect of the census in England and Wales has been laid before Parliament. This sets the date of the census and prescribes the particulars to be stated in census returns. This will be followed by Census Regulations, laid separately for England before Parliament and for Wales before the National Assembly for Wales. These set out how the census will be run and the final questions to be used.

We are further investigating the online questionnaire structure and functionality to be used for Census 2021. We will evaluate the results of the 2019 Rehearsal to inform our decisions. This includes:

  • the position of summary pages
  • the user journey for accessing online help
  • exploring search-as-you-type functionality
  • evaluating the online guidance

Our work in the lead up to Census 2021 will focus on understanding and meeting user needs for outputs and processing.

10. Annex 1: Summary of question order on the household questionnaire

Household questions

  • Who usually lives here
  • Number of people
  • Names of people in household
  • Type of visitors
  • Number of visitors
  • Household relationships
  • Type of accommodation
  • Self-containment of accommodation
  • Number of bedrooms
  • Central heating
  • Tenure
  • Type of landlord (if renting)
  • Number of cars and vans

Individual questions (before the "aged 16 years and over" routing)

  • Name
  • Date of birth
  • Sex
  • Legal marital or civil partnership status
  • Type of marital or civil partnership
  • Second address
  • Second address type
  • Full-time education
  • Term-time address
  • Country of birth
  • Date of arrival in the UK
  • Intention to stay in the UK
  • Address one year ago
  • National identity
  • Ethnic group
  • Religion
  • Welsh language skills (asked in Wales only)
  • Main language
  • English language proficiency
  • Passports held
  • General health
  • Long-term health problems or disability
  • Inactivity owing to health problems or disability
  • Unpaid care

Individual questions (after the "aged 16 years and over" routing)

  • Sexual orientation
  • Gender identity
  • Apprenticeship
  • Degree
  • Any other qualifications
  • Armed forces leavers
  • Employment status last week
  • Reasons for not working
  • Looking for work
  • Availability to work
  • Waiting to start a job already accepted
  • Ever worked
  • Employer's organisation name
  • Job title
  • Job description
  • Employer's main activity
  • Supervisory status
  • Number of hours worked
  • Method of travel to work
  • Type of workplace
  • Workplace address

Visitor questions

  • Visitor's name
  • Visitor's date of birth
  • Visitor's sex
  • Visitor's usual address

Back to table of contents

11. Annex 2: Evaluation criteria

Data quality

Question design factors that could impact data quality

  • Have new response options been added to the question since 2011?
  • Does this question have a write-in response option?
  • Is there potential for respondents to remember details inaccurately?
  • Does the question ask for information that a proxy respondent might not know?
  • Does the question ask for sensitive or subjective information that a proxy respondent may not be able to answer?

Evidence from testing used to inform our data quality evaluation

  • How many respondents did not answer this question?
  • How many respondents stopped completing the questionnaire at this question?
  • How many respondents selected mutually exclusive responses?
  • How many respondents clicked on the guidance accordion?
  • How many respondents used the "Previous" button to return to an earlier question?
  • How long on average did respondents spend on this question?
  • Did respondents provide negative feedback about finding the question difficult to answer?

Public acceptability

Question design factors that could impact public acceptability

  • Is there potential for respondents to provide a socially acceptable answer, rather than an accurate answer?
  • Is this question new for Census 2021?
  • Does the question ask for information that a person might not want a proxy respondent to answer on their behalf?

Evidence from testing used to inform our evaluation

  • How did the question perform in public acceptability testing?
  • How many respondents did not answer this question?
  • How many respondents stopped completing the questionnaire at this question?
  • How many respondents clicked on the guidance accordion?
  • Did respondents provide negative feedback about finding the question unacceptable?

Respondent burden

Question design factors that could impact respondent burden

  • How many words make up the question stem, instructions and response options?
  • Are there more response options than in the 2011 Census?
  • Does the question include a write-in response option?
  • Does the question ask respondents to provide a subjective answer?
  • Is there potential for respondents to remember details inaccurately?
  • Does the question ask for information that a proxy respondent might not know?

Evidence from testing used to inform our evaluation

  • How many respondents did not answer the question?
  • How many respondents stopped completing the questionnaire at this question?
  • How many respondents selected mutually exclusive responses?
  • How many respondents clicked on the guidance accordion?
  • How many respondents used the "Previous" button to return to an earlier question?
  • How long on average did respondents spend on this question?
  • Did respondents provide negative feedback about finding the question difficult to answer?

Financial concerns

Question design factors that could raise financial concerns

  • Does the question have mutually exclusive response options?
  • Does the question ask for any potentially sensitive information that could cause a respondent to request an individual form?
  • Does the question require any additional processing?

Evidence from testing used to inform our evaluation

  • How many write-in responses required manual coding?
  • How many respondents did not answer the question?
  • How many respondents stopped completing the questionnaire at this question?
  • How many respondents selected mutually exclusive responses?

Questionnaire mode

Question design factors that could impact questionnaire mode

  • Does the question ask for any potentially sensitive information?
  • Is there potential for respondents to provide a socially acceptable answer, rather than an accurate answer?
  • Are the question stems different between the paper and electronic questionnaires?
  • Are the response options different between the paper and electronic questionnaires?
  • Does the question use radio buttons to limit respondents to selecting a single response option?
  • Do the response options include a write-in box?
  • Do the write-in response options use search-as-you-type functionality?
  • Do the online responses use mutually exclusive functionality?
  • Are the instructions different between the paper and electronic questionnaires?
  • Does the online question include additional guidance in an accordion?
  • Are there any visual differences between the paper and electronic questionnaires?
  • Has the online question been broken up into multiple questions?
  • Can the respondent proceed without answering the question online?

Evidence from testing used to inform our evaluation

  • Did any respondents provide negative feedback about questions with automatic text fill being confusing?
  • How many respondents opened the additional guidance accordion?
  • How many respondents selected multiple response options for a paper question that uses radio buttons in the online question?
  • How many respondents selected mutually exclusive response options on the paper questionnaire?
  • How many respondents did not answer the question on the paper questionnaire?
  • Is the average length of responses to a write-in response option longer than the space available on the paper questionnaire?

Back to table of contents