Who are the only people that know what they need from a website?
Visitors to corporate websites encounter great content assembled by skilled communications and web teams. And, to keep everything running smoothly, those teams carefully track visitor website activity and on-page behaviour.
But Google Analytics (or its equivalents) can’t tell you what visitors actually needed to do when they landed on your site, only what they did and where they went.
To understand visitor intent, you need to collect high-quality feedback.
In the measurement phase of preparing our Top 20 Most Effective Corporate Digital Estates report, we found every website had deployed analytics software (Google Analytics, Matomo, etc.) to record visitor behaviour and activity.
However, few sites had active user surveys running alongside their web analytics software. And, the surveys we did encounter were often not set up to collect data that could feed into decision making.
Collecting high-quality feedback
Highly effective websites are designed, from the start, around meeting their audiences' needs. Designers collaborate with prospective end users in discovery exercises to identify (and prioritise) those needs. Then they build using regular feedback from their collaborators.
Over time, uncertainty can creep in about a website's effectiveness. Perhaps it becomes less clear that the original design accurately identified audience needs, or perhaps those needs, or the audiences themselves, have changed.
When this happens, it makes sense to collect data and recalibrate everyone’s understanding. User surveys are a good tool for measuring audiences and confirming, or re-establishing, their needs. However, to add value a survey must collect data that answers the ‘right’ questions, and that means following a coherent process.
Setting up for success
Data collection or measurement needs to be driven by a decision. For example, let's say visitor behaviour flow analysis suggests site navigation is confusing, as visitors seem to ping back and forth between pages. Our visitor mix may have changed since our last refresh, and we may no longer be meeting our audiences' needs appropriately. If so, should we invest in updating or changing the website's navigation?
We need to establish a baseline by understanding what we already know. In this case, web analytics on the most frequently visited pages or website sections likely provide an acceptable baseline assumption about the current audience mix. The data might show that 40% of visitors end up in our careers section, or that 10% go straight to our sustainability reporting.
Next, we identify what new information would challenge our baseline assumptions and answer the question our decision poses. A survey should feed directly into the decision by collecting data with high information value. In this hypothetical case, the survey results will reduce our uncertainty about the visitor mix and show us how it aligns with the observed behaviour flow analysis.
We could then choose to poll website visitors to see how they self-identify on arrival. For example, we could ask: in what role are you visiting this website? And offer a checklist of options. We can analyse our web analytics data to determine how many responses we need to collect to meet our measurement reliability threshold.
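How many responses is "enough" can be estimated with the standard sample-size formula for a proportion. The sketch below is illustrative, not prescriptive: it assumes a 95% confidence level and a ±5% margin of error, and uses the conservative assumption that the proportion being estimated is 0.5.

```python
import math

def sample_size(margin_of_error: float, z: float = 1.96, p: float = 0.5) -> int:
    """Minimum responses needed to estimate a proportion within +/- margin_of_error.

    z: z-score for the chosen confidence level (1.96 corresponds to ~95%).
    p: assumed underlying proportion; 0.5 maximises the required sample size.
    """
    return math.ceil(z**2 * p * (1 - p) / margin_of_error**2)

# At 95% confidence and a +/-5% margin of error:
print(sample_size(0.05))  # 385 responses
```

For a busy corporate site this is usually a small fraction of monthly traffic; a finite-population correction only matters when the visitor population itself is small.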
Once we've collected enough survey data, we can see if it materially changes our baseline audience composition assumptions. If it does, we can make updates knowing that we are responding to an actual shift. If the data fails to challenge our baseline assumptions, we can avoid the investment, for the time being.
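One way to judge whether the survey data "materially" challenges the baseline is a chi-square goodness-of-fit test against the analytics-derived proportions. The figures below are hypothetical, continuing the 40% careers / 10% sustainability example from earlier.

```python
def chi_square_stat(observed: dict[str, int], baseline: dict[str, float]) -> float:
    """Chi-square goodness-of-fit statistic: observed counts vs baseline shares."""
    total = sum(observed.values())
    return sum(
        (observed[k] - total * baseline[k]) ** 2 / (total * baseline[k])
        for k in baseline
    )

# Baseline audience mix assumed from web analytics (hypothetical figures).
baseline = {"careers": 0.40, "sustainability": 0.10, "other": 0.50}
# Self-identification counts from 400 survey responses (hypothetical figures).
observed = {"careers": 120, "sustainability": 60, "other": 220}

stat = chi_square_stat(observed, baseline)
# With 3 categories there are 2 degrees of freedom; the 5% critical value is 5.991.
print(f"chi-square = {stat:.1f}")  # 22.0 here, well above 5.991: the mix has shifted
```

If the statistic fell below the critical value, the survey would not have challenged the baseline, and the navigation investment could be deferred.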
If we make navigation changes in response to our new understanding, we should periodically re-survey to ensure we keep up with changes in our audiences.
Use more surveys. Regular website user surveys are one of the only ways of understanding if audience needs continue to be met. Short, focused surveys can rapidly provide actionable insights, and website updates can be responsive to changing needs.
Keep surveys short. We often encounter verbose user surveys that impose an undue burden on users. In the absence of incentives, it is difficult to know why website visitors complete surveys. Surveys with too many questions risk abandonment or inaccurate answers as participants lose focus. Neither outcome helps with decision making.
Keep surveys focused. We also encounter surveys that are clearly (even to the participant) trying to cover too much ground, often asking specific questions alongside 'sentiment' questions, for example: how likely are you to recommend this website to others? Answering these questions usually involves choosing a point on a 10-point Likert scale, a scale that may have little context for the participant. Sentiment surveys have a place, but keep them separate from surveys gathering data to support quantifiable decisions.