Boosting Engagement and Reducing Completion Time: Leveraging UX to Transform an Employee Total Rewards Survey
Reynolds Consumer Products
RCP is launching its first-ever Total Rewards Survey, which will be sent to over 5,000 employees to gather feedback on benefits, compensation, employee programs, and every other aspect of our Total Rewards package. It's the first time we've done anything at this scale, so it is crucial that it goes smoothly.
Team:
Collaborated with Benefits Team, HR Leadership, and 3rd Party Survey Vendor
Role:
User Experience Consultant
Timeframe:
July 2023
Contributions:
Challenge
The first version of the Total Rewards Survey is too long, and if it takes too much time to complete, employees are unlikely to finish it. We may also have only one shot at collecting this kind of information, so we need to gather a substantial amount of data in this single survey.
Solution
Applied UX best practices to better align the survey with participants' mental models: I changed button and answer types to be more intuitive and less tedious, prioritized questions and content, and removed unnecessary or confusing information.
Outcome
Reduced mental load and survey fatigue, increasing engagement and getting the majority of participants to complete the entire survey. Satisfied leadership with the smooth implementation and excellent results, and drove adoption of UX practices in this non-UX setting.
Time on task reduced by:
33%
Details
Setting the scene:
We are at our desks, preparing to launch our first-ever Total Rewards Survey. The pressure is on, but we're already facing a problem: the survey is way too long. It took a lot of encouragement to get leadership to agree to run this survey, and we aren't sure we'll ever be able to do it again, so it really needs to go well; we need a high response rate, and we need to collect a lot of information.
The survey is being conducted by a third-party vendor, so we are limited by their technical constraints: there is very little flexibility for visual changes, and the question and answer types have limited options.
My Process
Evaluate
Review
Prioritize
Make it feel better
How might we decrease the time it takes to complete? Let's look at the survey section by section.
Currently, the survey takes an average of about 18 minutes to complete, and we'd like to cut that down to 15, or ideally even less. Studies show that people are only willing to spend about 10-15 minutes on a survey, and major drop-offs begin around the 8-minute mark.
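To frame that goal in numbers, here's a quick sketch of the minimum reduction required, using only the figures from the paragraph above:

```typescript
// Quick framing of the time goal: cutting an 18-minute average down
// to 15 minutes requires at least a 17% reduction in time on task.
// (The final result, a 33% reduction, roughly doubled that target.)
const currentAvgMinutes = 18;
const goalMinutes = 15;
const requiredReduction = (currentAvgMinutes - goalMinutes) / currentAvgMinutes;

console.log(`Minimum reduction needed: ${Math.round(requiredReduction * 100)}%`); // 17%
```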
The survey is divided into 4 sections, but Sections 2 and 3 take the vast majority of the time, so that’s where we’ll dive in.
Section 1: Demographics
Section 2: Rating Most & Least Preferred Benefits
Section 3: Ranking Satisfaction & Importance
Section 4: Open-Ended Questions
Section 2: Rating Most & Least Preferred Benefits
In this section, participants select which benefit from the list is their most preferred and which is their least preferred. Only one list is presented per page, and the exercise is repeated across 12 pages.
Having the “most preferred” answers on the left and the “least preferred” on the right is counterintuitive and does not align with the typical mental model.
Although not inherently wrong, this approach can lead to survey fatigue and contribute to increased response times.
Fun fact: due to psychological biases (in cultures that read left to right), we tend to select options on the left side more often. So having the higher option ("most preferred") on the left-hand side can lead to more biased survey answers.
We are asking participants to do a lot in this one section; it's a huge mental load. Let's dive in and see if we can eliminate some of it.
Many of these steps are required, so our opportunity lies here, in the hovering and highlighting of new benefits.
Decisions for Section 2:
Reverse the order of the most and least preferred answers
Keep the detailed hover descriptions
Remove the blue highlights
Look for opportunities to combine or eliminate questions
These improvements alone reduced the time to complete this section by almost 2 minutes!
Section 3: Ranking Satisfaction & Importance
In this section, we ask two questions per benefit: How satisfied are you with this benefit? And how important is this benefit to you?
Participants select from a dropdown for each benefit, twice: once for satisfaction and once for importance. This is repeated for 10 benefits on each of 5 pages, which is hugely time-consuming and leads to major survey fatigue.
Constraint: unfortunately, we don't have much design flexibility in this section, and whatever answer type we choose must be the same for both satisfaction AND importance.
How can we make this easier?
Option 1: Keep the Dropdowns
Dropdowns require a lot of clicks.
Research also shows that best practice is to use dropdowns only when there are six or more options to select from, and that they become very tedious when overused.
Option 2: Use Radio Buttons or Sliders
Using radio buttons or sliders would cut the clicks in half. That's a big improvement!
Which is better, sliders or radio buttons?
I did some research on best practices and found that they provide very similar experiences for participants and similar results. Some people have a preference, but in general there isn’t a huge difference for this use case.
Option 3: Put Satisfaction and Importance on Separate Pages
While this would cut the time on each page, it would double the pages needed for this section. Plus, participants would be asked about the same benefit multiple times, which would be confusing.
Decision: Radio Buttons
Option 2 was the only one that actually cut down time. While we could have selected a slider, we felt that the repeated drag movement of a slider would get tedious, so we opted for radio buttons.
This improvement cut the clicks in this section by half, ultimately reducing time on task by 23%, about 1.5 minutes of total time saved just by changing the answer type.
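Here's a rough back-of-envelope check on that click math, assuming a dropdown costs two clicks per answer (open the menu, then pick a value) while a radio button costs one; the page, benefit, and question counts come from the section structure described above:

```typescript
// Back-of-envelope click counts for Section 3. The two-clicks-per-
// dropdown figure is an assumption, not vendor data; the structure
// (10 benefits per page, 5 pages, 2 questions each) is from above.
const benefitsPerPage = 10;
const pages = 5;
const questionsPerBenefit = 2; // satisfaction + importance

const totalAnswers = benefitsPerPage * pages * questionsPerBenefit; // 100
const dropdownClicks = totalAnswers * 2; // 200 clicks
const radioClicks = totalAnswers * 1;    // 100 clicks, half as many

console.log({ totalAnswers, dropdownClicks, radioClicks });
```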
How might we improve and clarify the content within the survey?
Prioritizing questions and making their meaning clear
I partnered with our HR Leadership to review every single question of the survey. We prioritized questions, combined similar questions, and eliminated redundancies. We also clarified the wording on some questions where it was unclear what exactly we were asking.
Overall, we reduced the total number of questions by about 10%, shortening multiple sections by whole pages.
Improving the instructions
The original instructions were very long on the section introduction slides, then repeated in a similar but slightly different form on each page within that section, which was confusing for participants.
By shortening the instructions and using more direct language, we made it faster to complete and easier to comprehend.
How might we improve the overall experience?
Adding a progress bar to show people that they really are almost done
Research shows that adding a progress bar to surveys helps increase the number of completions. When participants get tired and think "how much longer is this thing?", they can glance at the progress bar and see that they only have one section left.
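For illustration, here's a minimal sketch of the math behind a page-based progress bar; the function name and example page counts are assumptions, since the vendor's platform handles this internally:

```typescript
// Minimal sketch of a page-based progress calculation. The function
// name and the 9-of-12 example are hypothetical, for illustration.
function progressPercent(currentPage: number, totalPages: number): number {
  return Math.round((currentPage / totalPages) * 100);
}

// A participant deep into the survey sees concrete reassurance that
// only a little remains.
console.log(`${progressPercent(9, 12)}% complete`); // "75% complete"
```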
Ending the survey on a positive note
The current survey ends abruptly: it doesn't let participants know they're on the last question, and there's no opportunity for them to go back or change any answers before submitting.
Though they may be happy it's done, having that choice taken away still leaves participants with a negative feeling. And due to recency bias, that feeling is likely to be what they remember from the survey.
To fix this, we added a warning screen at the end of the survey, giving participants the choice to submit or go back. This leaves them feeling more satisfied and in control at the end of the experience.
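A minimal sketch of that final confirmation step (the handler name, types, and dialog copy are hypothetical; the vendor's platform provides the real mechanism, but the flow matches the screen described above):

```typescript
// Sketch of the end-of-survey confirmation step, runnable in a browser.
type FinalAction = "submit" | "back";

function confirmBeforeSubmit(): FinalAction {
  // Last-chance screen: submit now, or go back and review answers.
  const submitNow = window.confirm(
    "You've reached the end of the survey. Click OK to submit, " +
      "or Cancel to go back and review your answers."
  );
  return submitNow ? "submit" : "back";
}
```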
Results
Great employee engagement and time goals exceeded!
Time to complete reduced by:
33%
Roughly a 5-6 minute reduction in completion time, bringing the average from about 18 minutes down to 12-13 and comfortably beating our goal of keeping the survey under 15 minutes.
We saw great employee engagement: the majority of participants who started the survey completed it.
The smooth implementation and great results made leadership happy, which opens the door to more surveys in the future.
Research, recommendations, and updates were all completed on a quick timeline: only about one week.
Driving adoption of UX practices in a non-UX setting
Lessons Learned
Even seemingly minor changes to an interface, such as swapping out button types, can lead to a significant increase in usability.
You cannot ask every question in one survey. We did our best to get everything we could out of this survey, but if we had truly asked every question we wanted answered, participants wouldn't have completed it; it would simply have been too long.
Getting stakeholders to accept that these "small" changes would help was difficult in this very non-UX, non-design setting. With limited time, I couldn't conduct extensive research or testing to back up my claims; ultimately, the stakeholders had to trust my judgment. In this case, what I truly leveraged was my relationships with the stakeholders.