NHS patient surveys: introducing online data collection

Page last updated: 12 May 2022

We are always looking to improve the accessibility, quality and value of our research. As part of this, we are testing how we can introduce optional online data collection to our NHS patient experience surveys.

Until recently, our surveys were entirely paper-based and, as with other national surveys, we have seen a gradual decline in response rates. We have also noticed that some demographic groups are more likely to respond than others. This suggests a need to explore other ways of providing feedback that do not involve paper.

In 2017, the Office for National Statistics announced its intention to move to electronic data collection. Researchers in the wider survey industry were also experimenting with the best ways of encouraging people to complete questionnaires online, using methods known as ‘push-to-web’ or ‘online first’. These methods mean people are encouraged to complete online questionnaires, but when this does not work, they are offered more traditional paper-based options.

We wanted to test these emerging methods in the survey programme and explore whether using SMS mobile reminders encouraged a broader range of people to give feedback. Our aim was to provide easy access to our surveys for more groups of people and reduce survey burden.

The programme includes 5 different surveys, each with its own unique population characteristics. This means online options may not be practical or achievable for every survey.

Because of this, we are testing the feasibility of adopting a mixed-mode, online first method for each survey.

By testing one survey at a time, lessons from each pilot will feed into the next test.

We have carried out extensive development work and consulted with experts, stakeholders and people who use our data. The aim of this has been to make sure we maintain a high standard of quality, and that any changes do not have a negative impact on those who use our data.

If you would like to contact the NHS Patient Survey Team about this work, email patient.survey@cqc.org.uk.

Phase 1: Scoping

We carried out a scoping exercise to check whether mixed-method, online first data collection was appropriate for the programme and all those connected to it. People we consulted with included our stakeholders, experts, users of our data and other organisations.

Our NHS patient experience surveys have National Statistics accreditation and meet the UK Statistics Authority's Code of Practice for Statistics. Any development work we undertake must meet the standards of the code and serve the public good by being trustworthy, of high quality and of value.


It was important that we carried out extensive consultation with users and stakeholders.

This is because we know that any changes will have an impact on our stakeholders and those who engage with our data. This could include changes to data quality, costs and the usability of data.

Some of the consultation work we carried out during scoping is described in our digital engagement report.

Our consultation work included:

  • online engagement with the general public
  • focus groups and interviews
  • attending engagement events and holding discussions at conferences
  • conducting research with users of our data, including the public and colleagues from trusts

We also sought advice from the Head of Profession, noting that potential changes could impact on comparability with previous surveys.

The survey programme produces a newsletter which goes out to around 13,000 subscribers.

We used the newsletter to tell people about our plans to adopt a ‘push-to-web’/’online first’ methodology.

We asked people receiving the newsletter to complete a survey about the proposed changes. The survey was confidential and asked people to share their thoughts and concerns.

The survey was also sent to:

  • people who had accessed our surveys using the UK Data Service
  • survey leads at all acute and mental health NHS trusts
  • staff at NHS England and NHS Improvement (NHSE&I)


We also collaborated and built relationships with experts and key stakeholders.

For instance, we worked with the GP Patient Survey (GPPS) team, meeting on an ad-hoc basis to share pilot reports.

The NHS Patient Survey team input into GPPS pilot decisions and formed part of the GPPS advisory group.

We also attended and shared findings with the Cross-Government Survey Advisory Group chaired by the Office for National Statistics (ONS).

To lead our work, we were keen to work with a research partner with proven expert knowledge of mixed-mode methodology, particularly incorporating online surveys.

We also needed access to an online survey platform that was:

  • capable of handling the volume of data we would be collecting
  • able to maintain high data quality
  • easy to use for all respondents
  • accessible for participants with additional needs

We carried out a procurement exercise with potential suppliers and appointed Ipsos MORI as our survey coordination centre for mixed-method surveys.

We have been able to access best practice drawn from our contractor's work with other clients, using this knowledge to develop similar approaches. We have also had access to the leading work of expert survey methodologists in this field.


This scoping exercise highlighted key issues for us to consider.

These include:

Impact on our partners

Approved contractors and survey coordination centres carry out key survey tasks on our behalf. For example, they collect and process sample and response data.

The approved contractors provide a direct service for NHS trusts, reducing the time and resource burden on trusts.

We found that collating data from paper surveys alongside online surveys could increase costs for NHS trusts if not managed appropriately.

Approved contractors carried out a costing exercise for potential methodological changes for each pilot. We factored these costs in when making final decisions about feasibility and methodology.

Break in trends

Many of the surveys in the programme have a wealth of trend data at the national level, spanning up to 10 years. This data is valuable when monitoring changes in patient experience over time.

Changing the mode of data collection and the respondent profile can have methodological implications. This can introduce issues when comparing findings to earlier surveys.

We consulted with the CQC Statistics Team, the Head of Profession and the Government Statistical Service Good Practice Team.

After consultation, we took the decision that trends may be broken, but this should be assessed on a survey-by-survey basis. Although a break in time series data can be disappointing, it offers an opportunity to review the entirety of a survey’s methods and the scheduling of the whole programme.

Surveys are carried out during the same time period for each iteration. Breaking trends gives us the opportunity to see whether we could change scheduled dates to make better use of resources and staff time for all partners.

Resourcing and programme planning

The NHS Patient Survey Programme is a demanding and busy programme.

We carry out up to 5 surveys a year. Each survey takes around 15 months to complete from development to publication.

We use a programme plan for our mainstage surveys which helps us to plan our surveys, and required resource, years in advance.

We regularly review this plan. This helps us identify the most appropriate times to carry out feasibility testing, especially around staff availability and other programme demands.

Challenges of diverse populations

The survey programme consists of 5 surveys, each monitoring the experiences of a different NHS service. As a result, each survey population is different.

For example, the adult inpatient survey has the oldest population, while the children and young people survey has a population entirely under the age of 16.

This means we need to carry out feasibility testing for each survey. While some lessons can be passed on from survey to survey, in-depth research and development will be required for each subject area.

We also need to consider how people with limited internet access or skills can access an online survey.

Professional capability

Our team is made up of experts in all aspects of survey methodology, as well as specialists in the policies covering the surveys they lead.

But we recognised early in our exploratory work that, as mixed-methodology is a new and emerging area, we would benefit from working with organisations who were already experts in this area.

We carried out a procurement exercise with potential suppliers and appointed Ipsos MORI as our survey coordination centre for mixed-method surveys.

This has allowed our team members to gain a depth of knowledge about ‘push-to-web’ methodology, while working alongside experts in the field. Additional training in online methodology was also offered to members of the team.

Next steps

We found that the majority of our users and stakeholders are in support of the ambition to introduce an online option for data collection.

Many said it reflected the greater use of online methods in day-to-day life, as well as how NHS trusts are starting to engage with service users online.

But there were also concerns about maintaining access for those who may have difficulties using an online approach.

Following the engagement work and considerations detailed above, we decided to pilot a mixed-mode adult inpatient survey, as well as the maternity survey and children and young people’s survey.

We will follow these with a second wave of pilots covering the community mental health survey and the urgent and emergency care survey.

After looking at our work plan over the coming years, Ipsos MORI have devised a road map that will help us deliver these pilots at a suitable pace.

We will continue to work with our stakeholders, service users, NHS trusts and survey data users, exploring concerns and challenges together as they arise.

We will also continue to maintain a good working relationship with the Head of Profession, as well as the GSS and the CQC Statistics Team. This will help us make sure our decisions:

  • are based on sound evidence
  • are impartial
  • meet the standards set by the UK Statistics Authority Code of Practice

Phase 2: Adult inpatient survey

The adult inpatient survey is the largest scale and longest running survey within the NHS Patient Survey Programme.

The survey was introduced in 2002 and has been running annually since 2004.

It has been entirely paper-based since its inception.

In 2019, 143 NHS trusts took part in the survey, with 76,915 people sharing their experiences of adult inpatient services, a response rate of 45%.

About the adult inpatient survey pilot

This pilot tested the feasibility of conducting the adult inpatient survey using a ‘push-to-web’ method, designed to encourage online questionnaire completion.

This included:

  • sending letters to participants asking them to complete the questionnaire online
  • administering the questionnaire online
  • sending SMS invitations and reminders to respondents

On the recommendation of our Head of Profession, we compared the findings of the ‘push-to-web’ and the postal method to assess the levels of non-response bias.

Development – Subject area

The consultation work during the scoping phase highlighted many of the challenges and key concerns that we would face when adopting a new methodology.

As well as these issues, we also need to understand the challenges and concerns specific to this subject area.


Each research and surveys officer in the surveys team has ownership over a single survey.

As a result, they develop expert knowledge of the subject area over a number of years. They are familiar with policy and standards relevant to their survey, as well as the respondent profile and challenges the profile may present.

The adult inpatient survey has historically had lower response rates from people in minority ethnic groups and younger people. We carried out focus groups with these groups, as well as people who may be negatively impacted by a move to ‘push-to-web’ (for example, people for whom internet access and use is low).

Stakeholder interviews

We carried out interviews with 5 stakeholders to gain a full understanding of:

  • how any policy changes could impact the pilot questionnaire (including horizon scanning)
  • how survey data is used
  • concerns about method changes

These interviews also strengthened our relationships with stakeholders, helping to develop an environment where feedback on the pilot could be shared easily.

Overwhelmingly, stakeholders agreed that a change to a push-to-web approach would be welcomed. However, concerns were raised about reaching populations without internet access, specifically older people, and whether they would respond to the SMS contact.

Interviews also discussed questionnaire design and a potential reduction in the number of questions. Interviewees were generally positive about a reduction in questions; however, they also noted that the end result must be meaningful and cover issues that are important to patients.

Development - Methodology

Methodology design

Based on the recommendation of our contractor, two variants of the ‘push-to-web’ approach were tested during the pilot. We also included a control group, using the standard postal methodology, to allow for comparisons.

Experiment 1 – 7 contacts in total: 4 paper mailings and 3 SMS mailings, including 1 paper questionnaire at the fifth contact.

Experiment 2 – 6 contacts in total: 4 paper mailings and 2 SMS mailings, including a paper questionnaire at the fifth and sixth contacts.

Control – 3 paper-based contacts, with a questionnaire at the first and third contacts.

Trust recruitment

We were aware of the burden that the pilot could place on trusts, in addition to the standard mainstage surveys.

Because of this, based on the recommendation from Ipsos MORI, we designed the pilot to achieve a sample size of 4,410 responses across 10 trusts only. This sample size was large enough to enable comparison between the experimental methodologies and the control group with reasonable statistical power.
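The report does not publish the power calculation behind this sample size. As a rough, illustrative sketch only (all inputs here are assumptions taken from figures elsewhere on this page, such as the 45% response rate from 2019), a standard normal approximation suggests the scale of difference a design of this size could detect between two arms:

```python
# Illustrative only: not the programme's actual calculation.
# Smallest detectable difference in response rate between two pilot arms
# of roughly 1,470 people each (4,410 responses split across three groups),
# using a two-proportion normal approximation.
import math

Z_ALPHA = 1.96    # two-sided test at the 5% significance level
Z_POWER = 0.8416  # 80% power
n_per_arm = 4410 // 3
p_baseline = 0.45  # assumed from the 2019 mainstage response rate

# Minimum detectable effect for two proportions (equal arms)
se = math.sqrt(2 * p_baseline * (1 - p_baseline) / n_per_arm)
mde = (Z_ALPHA + Z_POWER) * se

print(f"Detectable difference: ~{mde:.1%} at 80% power")
```

On these assumptions, arms of this size can distinguish response-rate differences of around five percentage points, which is consistent with the pilot's aim of comparing methodologies rather than producing trust-level estimates.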

To ensure a good spread of trust types, trusts were selected using quotas based on:

  • trust size
  • response rate to previous Adult Inpatient Surveys
  • deprivation level (based on IMD area)
  • previous CQC service ratings

It was also important to allocate the sample to new and old methodologies within trusts to control for variability in trust characteristics.

We explained the purpose and requirements of the pilot via email and through a webinar session. It was also important for us to make trusts aware that participation was voluntary and not connected to CQC’s regulatory activity.

Questionnaire and supporting materials

The 2019 Adult Inpatient Survey questionnaire contained 82 questions over 12 pages, covering topics such as admission to hospital, the hospital and ward, as well as discharge from services.

Following discussions with Ipsos MORI, we came to understand that respondents will typically dedicate less time to online completion (approximately 10 to 12 minutes) than to paper-based alternatives. Because of this, best practice for online surveys required us to significantly reduce the questionnaire length.

A review of the questionnaire was undertaken by Ipsos MORI and the surveys team. We removed a number of questions and adapted others to a format more suitable for online completion. For example, vertical scales work better online than horizontal scales. The final result was a survey 50 questions in length.

Supporting materials, such as invitations to participate and reminder letters, were also adapted. This was done to reflect industry best practice and make sure they were appropriate for the pilot methodologies.

A copy of the questionnaire and other materials can be found in the appendices section of the Adult Inpatient 2019 Mixed-mode pilot results report.

All materials were cognitively tested with 9 service users, varying by region, sex, age, socio-economic background and internet usage. Interviews lasted between 45 and 60 minutes and covered:

  • The extent to which materials were engaging, persuasive and likely to secure participation
  • The extent to which the documents were comprehensive
  • Understanding of the language used
  • Layout of materials
  • Question validity and comprehension

Fieldwork and Analysis


Fieldwork ran for 12 weeks, from 3 October 2019 to 20 December 2019. This is 6 weeks shorter than the 18 weeks adopted by the 2019 mainstage survey.

10 trusts volunteered to draw samples following the mainstage survey sampling instructions.

For further details on the sampling process during the pilot, see the Adult Inpatient 2019 Mixed-mode pilot results report.


Data cleaning and analysis largely followed that of our mainstage adult inpatient surveys. We looked at differences for all questions (including demographic questions) to see where there were differences between the control group and experimental groups. We also carried out regression analysis and controlled for demographics to fully assess the differences in response rates.
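The report does not include the analysis code. As a hedged sketch of the simplest element of the comparison described above, a two-proportion z-test can flag where a question's results differ between the control group and an experimental group (the counts below are invented; the actual analysis also used regression to control for demographics):

```python
# Illustrative sketch only, with invented counts.
# Two-proportion z-test of the kind used to compare a single question's
# results between the control group and an experimental group.
import math

def two_prop_z(x1, n1, x2, n2):
    """z statistic for the difference between two proportions (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts: positive ratings in control vs. experiment 1
z = two_prop_z(520, 800, 470, 790)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests a difference at the 5% level
```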

Any other deviations from our standard processes are discussed in the Adult Inpatient 2019 Mixed-mode pilot results report. This report discusses both national and trust level results, as well as additional analysis, such as the fourth mailing analysis.

Quality Assurance

There is an institution-wide quality assurance policy in CQC, as well as a five-stage quality assurance process. While the number and type of outputs differed from our mainstage surveys, this policy and process were adopted for the Adult Inpatient Survey pilot and will be adopted across all future pilots.

For details on our QA processes, refer to the survey programme's quality statement.

For details about how the data we use is quality assured, refer to the Survey programme's statement of administrative sources.


The results from the Adult Inpatient pilot were analysed against the following elements:

  • Response rate and online response rate of the experimental groups
  • The costs associated with the revised response rates
  • The impact on comparability of results between trusts
  • Demographic profile differences
  • Question response differences
  • Impact on trends

Key results

  • Overall response rates of the experimental groups were similar to the control group
  • When comparing the experimental methodologies, experiment 1 (in which patients were provided with one paper questionnaire) was more effective at moving people online
  • Experiment 2 (with two paper questionnaires) secured a higher response rate
  • Experimental groups are either as representative as, or more representative than, the control group
  • Those in experimental groups reported slightly but statistically significantly lower levels of long-term conditions
  • Some key questions show responses to be more negative in the experiment groups. This implies that a shift to mixed-mode for this survey would affect trends, causing a break in time series data
  • Around 20% of people who logged in dropped out before completing the first question
  • Analysis was consistent at trust level as well as national level, implying a shift to mixed-mode methods would not impact trust comparability

The experimental groups included a fourth mailing, an additional postal mailing not currently included in the mainstage survey. Analysis was conducted to determine the impact on response rate and demographics. The response rate would be negatively impacted if the fourth mailing was removed; however, the extra mailing would come with additional costs for trusts. We consulted with NHS trusts on the increase in costs and they opted for a three-mailing approach.

Next steps

Considering all findings and feedback on pilot results, we decided it was feasible to adopt a mixed-mode approach to data collection for the Adult Inpatient Survey 2020. Following discussions with the Lead Official (Helen Louwrens – Director of Intelligence) and the Intelligence Senior Leadership Team, we decided to adopt the methodology from experiment one, with respondents receiving one paper questionnaire at the fifth contact. Following consultation with NHS trusts and approved contractors, we have also decided to adopt a three-mailing approach to reduce costs.

After determining that a break in time series would be necessary, other aspects of the Adult Inpatient Survey methodology have been reviewed (for example, the sample period) and the questionnaire will be redeveloped in collaboration with stakeholders and data users.

Reports and sharing findings

All findings from the pilot are shared in the Adult Inpatient 2019 Mixed-mode pilot results report.

We are keen to share this experience and our learning with the wider research community and add to the growing knowledge of online survey methodology. To help achieve this, we spoke about the pilot and overall findings at the General Online Research Conference in 2020 alongside Ipsos MORI, and presented to the Cross-Government Surveys Group. We are also recording this journey online as a source of information for others who are exploring transitioning to mixed-method research.

In addition to this, we maintained our close working relationship with NHSE&I. As NHSE&I is one of our main data users, it was important to include them in the development of the pilot, and to share results as examples of methodological development.

We also sought advice from the Head of Profession regarding how we engage with our stakeholders regarding the break in time series data, ensuring that our approach to sharing findings adhered to the Code of Practice.

About the mainstage adult inpatient survey 2020

In line with the strategic direction of the NHS Patient Survey Programme, we piloted this survey using a mixed-mode methodology, introducing an option for online questionnaire completion. The pilot found that this survey would be suitable for a switch to mixed mode as there was no negative impact on response rate and overall, the new methodology improved representation of key demographic groups.

However, we did find some key questions showed responses to be slightly more negative when people completed questionnaires online. Given this subtle but consistent shift in the way people responded using our new method, we decided it would not be appropriate to compare results with previous surveys. We accepted a break in time series data was necessary and appropriate, and took the opportunity to update a number of facets of the method and questionnaire content.


As with every mainstage survey, we carry out extensive development work to make sure that the survey we create is as refined and relevant as possible. Details of the development of the survey can be found in the adult inpatient survey 2020 development report.


Learning from the Adult Inpatient pilot, as well as consultation with NHS trusts and approved contractors about costs, we have adopted the following mailing schedule for the Adult Inpatient Survey 2020:

  • Mailing 1 (week 1) - Letter with URL directing to online questionnaire
  • SMS1 (+3 days) - SMS reminder
  • Mailing 2 (week 2) - Letter with URL directing to online questionnaire
  • SMS2 (+3 days) - SMS reminder
  • Mailing 3 (Week 4) - Letter with URL and a paper questionnaire
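The schedule above can be sketched as a simple date calculation. The start date below is hypothetical; real mailout dates are set for each survey:

```python
# Illustrative sketch of the Adult Inpatient Survey 2020 contact schedule.
# The fieldwork start date is a hypothetical placeholder.
from datetime import date, timedelta

start = date(2021, 10, 4)  # hypothetical mailing 1 date

schedule = [
    ("Mailing 1 (letter, online URL)", start),
    ("SMS 1 (reminder)", start + timedelta(days=3)),
    ("Mailing 2 (letter, online URL)", start + timedelta(weeks=1)),
    ("SMS 2 (reminder)", start + timedelta(weeks=1, days=3)),
    ("Mailing 3 (letter + paper questionnaire)", start + timedelta(weeks=3)),
]

for contact, when in schedule:
    print(f"{when.isoformat()}  {contact}")
```

Note that the paper questionnaire only arrives at the final contact, which is what makes this an ‘online first’ design: the first four contacts all point people to the online questionnaire.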

We also made changes to the length of the fieldwork period. Year on year, we actively look for time savings to reduce the length of time between questionnaire mailout and close of fieldwork. As the 12-week fieldwork period for the pilot had no impact on data quality, we decided to reduce the fieldwork period from 18 to 15 weeks for this survey, with a view to reducing it further for future surveys.


Given the break in time series for this survey, we decided to review all aspects of survey methodology, including sample period and sample criteria. We conducted an online survey with NHS trusts to review potential alternative sample months and note concerns about moving the timeline for the Adult Inpatient Survey going forward. We considered factors such as other competing trust activities and levels of inpatient activity.

We consulted HES data to make sure that there would still be a sufficient number of patients in the new sample month. We also assessed the proportions of emergency and elective admissions in the new sample month, as past research has found that patient experiences vary between these two admission types.

In addition to the online survey, we also interviewed staff from eight NHS trusts who had experience of drawing samples for earlier iterations of the Adult Inpatient Survey. The interviews covered topics such as the sampling process, materials and other concerns.

We undertook consultation with a number of stakeholders with regard to the additional variables which would be required by the new methodology. Furthermore, we discussed the impact of the COVID-19 pandemic on the sampling process and reporting requirements. We wanted to ensure that a potential change of sampling month would not have a detrimental impact on reporting requirements for survey data users.

As a result of this engagement work, we made four key changes to the sampling process for this survey:

  • The sampling month was moved from July to November. This allowed more time to set up the 2020 survey, while providing enough patient throughput to reach the sample size requirements
  • We are now collecting a mobile phone number variable from trusts. Instructions for the inclusion of the mobile phone number variable were tested during the pilot and will be adopted for this survey.
  • In addition, it became vital that the survey measured the impact of the COVID-19 pandemic on healthcare services. Two COVID-19-specific variables were requested: the first covers whether or not a patient was diagnosed with the virus, and the second whether or not a patient was treated for it.


A substantial amount of work on questionnaire content was undertaken during redevelopment and it is discussed in detail, along with the influences and rationale for change, in section 5 of the adult inpatient survey 2020 development report.

The questionnaire was reviewed with the dual aims of:

  • reducing the length of the questionnaire to lessen the burden on participants and meet best practice guidelines for online surveys
  • ensuring the questionnaire content is reflective of the way in which inpatient services are delivered; and horizon scanning to make sure the questionnaire is future proof where possible

As healthcare services are adapting and changing, we wanted to make sure that the questionnaire correctly reflected a patient’s journey through inpatient services, especially noting the critical elements of this journey and affording them the appropriate weight in the questionnaire. To do this, we conducted a patient journey mapping exercise which was informed by depth interviews and focus groups with service users, NHS trusts and stakeholders.

A large number of stakeholders specialising in patient care were consulted on an ongoing basis throughout questionnaire redevelopment. We held a one-day workshop aimed at beginning to build consensus around changes to the questionnaire. We also conducted depth interviews and had follow-up discussions via phone and email, with feedback incorporated into future drafts.

NHS trusts were invited to provide feedback on the questionnaire via an online survey. They told us the questions that they would find most and least useful, along with their thoughts about the ‘push-to-web’ method.

In addition to our engagement work, we also carried out analysis of the 2019 survey data to identify which questions appeared to be ‘working well’ and which were less efficient. We looked at ceiling and floor effects, as well as the correlation between questions.
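A minimal sketch of the kind of ceiling/floor check described above, assuming simple ordinal response data (all answers and the 80% threshold below are invented for illustration):

```python
# Illustrative only: flag questions where most respondents pile into the
# top (ceiling) or bottom (floor) response category, limiting the
# question's ability to discriminate between experiences.
def ceiling_floor(responses, threshold=0.8):
    """Return 'ceiling', 'floor' or None for a list of ordinal answers."""
    n = len(responses)
    top, bottom = max(responses), min(responses)
    if responses.count(top) / n >= threshold:
        return "ceiling"
    if responses.count(bottom) / n >= threshold:
        return "floor"
    return None

# Hypothetical answers on a 1-5 scale for two questions
q_cleanliness = [5, 5, 5, 5, 5, 4, 5, 5, 5, 5]   # little variation
q_discharge   = [1, 3, 4, 2, 5, 3, 2, 4, 3, 1]   # good spread

print(ceiling_floor(q_cleanliness))  # 'ceiling'
print(ceiling_floor(q_discharge))    # None
```

A question with a strong ceiling effect tells you little about differences between trusts or over time, which is one reason such questions were candidates for removal during redevelopment.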

All revisions were subjected to four rounds of cognitive testing to confirm comprehension and relevance. A detailed overview of testing can be found in section 5 of the adult inpatient survey 2020 development report.

Overall, there were changes to most questions and a reduction in length from 82 questions to 58.

Gender and Sex

Over the past few years, survey methodologists have been working on developing inclusive gender and sex questions. ONS developed new questions for the 2021 Census based on a significant programme of development work. Mindful of this, we considered use of this question, alongside other appropriate candidate questions. We selected four different options for cognitive testing. The first option was based on the Census question, but adapted by CQC and Ipsos MORI, the second was developed by the ONS, the third by NHS England and the fourth option was developed by Stonewall. The details of these questions can be found in the adult inpatient survey 2020 development report.

Based on cognitive testing, a hybrid of option one and option two was adopted:

Q1. At birth, were you registered as…

  • Male
  • Female
  • Intersex
  • I would prefer not to say

Q2. Is your gender the same as the sex you were assigned at birth?

  • Yes
  • No (write in gender)
  • Prefer not to say

Supporting materials

During the pilot, we updated the survey covering letter, reminder letters and dissent posters to bring them in line with industry best practice. We also developed three versions of text for SMS reminders, which are sent alternately with the letters. As is standard practice across the programme, all materials were cognitively tested with service users. A total of 9 interviews were conducted with service users, with each interview lasting between 45 and 60 minutes. Participants varied by region, age, sex, socio-economic background and internet usage.

For the 2020 survey iteration, the main focus of cognitive testing was the survey questionnaire; however, the materials were tested in the third round of interviews with six service users. This specifically explored their understanding of data protection and confidentiality.

Testing resulted in the following changes:

  • Updated messaging in letters to reflect the option to complete the survey online
  • Updated motivational messaging in letters
  • Improved the visual appearance of the letters and made them easier to read
  • Highlighted relevant information on data protection and confidentiality
  • Met accessibility guidelines
  • SMS reminders now come from a named sender
  • The online survey is available in 9 non-English languages and British Sign Language
  • The dissent poster was updated to make it clear that participants do not opt in to the survey
  • The multilanguage sheet was updated to include links for online completion in 9 non-English languages


To ensure our work meets the Public Sector Bodies Accessibility Regulations, we carried out desk-based research to review best practice guidelines and review the approaches taken by other national surveys. The research informed the following changes to accessibility:

  • Participants can change the font size and background colour on the online survey
  • The survey is compatible with screen readers and other accessibility tools
  • The online survey is translated into 9 non-English languages as well as British Sign Language
  • Participants can request a telephone-assisted completion in English or non-English languages using services such as Language Line
  • Large print and Braille questionnaires are available upon request

Throughout fieldwork we will monitor the number of requests we have for additional accessible options. Results will be reviewed to inform whether any additional options are required in future.

Fieldwork and Analysis

Coming soon.


Statistical release

The statistical release is expected during Autumn 2021.

Future development

Phase 3: Maternity survey

In progress.

Phase 4: Children and young people's survey

In progress.

Phase 5: Community mental health survey

Second wave.

Phase 6: Urgent and emergency care survey

Second wave.