Shelter | Shipped March 2025

New feedback survey to improve UX and data

Summary

  • I work on the housing advice website for Shelter, a housing and homelessness organisation

  • Our feedback survey on the site was too long, clunky, and did not provide enough useful data

  • I collaborated with a user researcher to redesign the survey

  • The user journey for the survey is clearer and quicker, and the data gathered is more useful

  • Survey data now directly informs annual reviews of content

A good user feedback survey is vital to improve content and measure success

Our team works on advice content, so we cover common user needs and also edge cases.

This means we cannot improve content or measure success purely by looking at analytics such as users, sessions or engagement time.

Some advice pages may have low traffic but are important for the users that view them.

To measure success, we therefore need quality feedback from users who visit the site.

The old survey did not work for users or for us

The old feedback survey on the site had a number of problems:

  • It was complicated and not user friendly, with 5 separate questions for users to answer

  • The questions alternated between multiple choice and text input

  • There was a high drop-off rate - 54% of respondents did not answer question 2 after answering question 1

  • Much of the data was not actionable - for example, the final question was ‘would you recommend Shelter to another person’, which is too broad when gathering data about a specific piece of content

Designing a new feedback survey

Working with my content design team, a user researcher and a UX designer, we planned workshops covering discovery, design and testing.

Part of discovery was competitor research. We paid particular attention to GOV.UK, whose content design often represents the gold standard in the industry.

The key findings from competitor research were:

  • Feedback surveys should be short, taking only a few seconds to complete 

  • They should be easy to follow

  • There should be no repetition

  • Any binary (yes/no) data gathered should be genuinely useful

I ran a workshop to design the new survey, tapping into my colleagues’ experience and expertise. Using a Miro whiteboard, we each drafted the steps and copy for the new survey. Then we talked through each team member’s idea, giving and receiving feedback in a critique session.

Once we agreed on the new survey, we presented it to senior stakeholders, who signed off on implementing it immediately.

A clear, clean and useful new survey

The new survey is super simple. 

It just asks ‘Was this advice useful?’

Users choose one of the following options via radio buttons:

  • Yes

  • Yes, but…

  • No

After that, no matter what the user answers, they are asked: ‘Help us improve our advice. Tell us what worked and what could be better.’

And that’s it. The survey takes seconds to complete, while giving us clear, actionable data.
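The two-step survey above can be sketched as a simple data structure. This is purely illustrative - the field names are hypothetical and not part of Shelter's actual CMS or survey tooling:

```python
# Illustrative sketch of the new two-step survey.
# Field names are hypothetical, not Shelter's real schema.
SURVEY = {
    # Step 1: a single radio-button question
    "question": "Was this advice useful?",
    "options": ["Yes", "Yes, but…", "No"],
    # Step 2: a free-text prompt, shown regardless of the answer above
    "follow_up": (
        "Help us improve our advice. "
        "Tell us what worked and what could be better."
    ),
}
```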

Impact of the new survey

This shorter, simpler survey means that:

  • We can see instantly if there is a problem with a page - if, for example, more than 20% of respondents say the page was not useful

  • We know a page is working if at least 80% of users answer ‘Yes’ or ‘Yes, but…’

  • We get qualitative feedback that forms part of an annual review of a page - around 10% of pages under review have had new content added or content changed based on this feedback

  • We have a scalable feedback loop that helps us to continually improve the user experience
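The two thresholds above can be expressed as one simple check. Note that they are complementary: because every answer other than ‘No’ counts as positive, more than 20% ‘No’ is the same thing as fewer than 80% ‘Yes’ or ‘Yes, but…’. The function and its labels below are a hypothetical sketch, not Shelter's actual reporting tooling:

```python
# Hypothetical sketch of the 80% / 20% rule described above.
# Names and labels are illustrative, not Shelter's real tooling.

def page_status(yes: int, yes_but: int, no: int) -> str:
    """Classify a page from its survey response counts."""
    total = yes + yes_but + no
    if total == 0:
        return "no data"
    if no / total > 0.20:
        # More than 20% of respondents said the page was not useful
        return "needs review"
    # Equivalently: at least 80% answered 'Yes' or 'Yes, but…'
    return "working"
```

For example, `page_status(80, 10, 10)` returns `"working"` (10% ‘No’), while `page_status(50, 20, 30)` returns `"needs review"` (30% ‘No’).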