Most of your digital presences need performance improvements

Relative performance of website homepages across the digital estates of six global consumer goods companies

Can’t see the forest for the trees

Almost every global organization we encounter has a digital estate comprising hundreds or thousands of websites, along with social media presences and content hosted on third-party platforms.

And, most of those organizations lack accurate digital asset inventories. So, they don’t know how many active websites they’ve got or how well those sites work for their audiences.

Without up-to-date data on key metrics such as website performance, accessibility, data privacy or marketing technologies in use, it is very difficult to make informed decisions.

And, things don’t stand still. Digital estates change regularly. New sites come online, old ones are retired and visitors are redirected to more relevant content. Many of the changes are made in isolation, without organizations being aware of the wider situation or impacts.

User-centred performance

To illustrate the issues, we ran a simple user experience test. We checked the homepage optimization of each website we had found in an earlier survey of six global consumer goods companies and their digital estates: a total of 7,150 individual websites.

We used the Chrome Lighthouse performance audit tool to measure performance, for several reasons:

  • Lighthouse tries to measure how users experience web page performance
  • Lighthouse offers a single performance score benchmarked against a large dataset
  • The tests can be readily reproduced and offer diagnostics to improve performance
  • Substantial documentation explains the rationale behind the tests and the scoring methodology

There are different ways to run the tests, but given the number of sites we planned to test, we ran them as a Node.js batch process on an AWS Lightsail instance. As AWS data centres have fast internet connections, the performance results we recorded are likely to exceed those a typical desktop or mobile user would see.
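
As an illustration, a batch run along these lines can be scripted with the `lighthouse` and `chrome-launcher` npm packages. The file names and URL list below are placeholders, not the sites from our survey:

```javascript
// Minimal batch-audit sketch using the lighthouse and chrome-launcher
// npm packages; 'sites.json' and 'scores.json' are placeholder files.
import fs from 'node:fs';
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

const urls = JSON.parse(fs.readFileSync('sites.json', 'utf8'));
const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });

const results = [];
for (const url of urls) {
  const { lhr } = await lighthouse(url, {
    port: chrome.port,               // reuse one headless Chrome instance
    onlyCategories: ['performance'], // skip accessibility, SEO, etc.
  });
  // Lighthouse reports the category score as 0–1; scale it to 0–100.
  results.push({ url, score: Math.round(lhr.categories.performance.score * 100) });
}

await chrome.kill();
fs.writeFileSync('scores.json', JSON.stringify(results, null, 2));
```

Reusing a single headless Chrome instance via `port` keeps a large batch run manageable; auditing thousands of sites this way is mostly a matter of patience and error handling.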

Green is good - everything else needs attention

The following doughnut charts summarize the results across the websites in each digital estate. The Lighthouse scoring methodology uses the following classifications:

  • 0 to 49 (red): Poor
  • 50 to 89 (orange): Needs Improvement
  • 90 to 100 (green): Good

We recorded the performance score from each test and grouped them by category: poor (red), needs improvement (orange) and good (green). The doughnut charts show the proportion of each digital estate (set of websites) in each category. Hovering a cursor over a chart segment will reveal the percentage of sites in that category.
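
In code, that grouping step amounts to a simple mapping from scores to Lighthouse's published bands. The function names here are our own, not part of Lighthouse:

```javascript
// Map a 0–100 Lighthouse performance score to its published band.
function classify(score) {
  if (score >= 90) return 'good';
  if (score >= 50) return 'needs improvement';
  return 'poor';
}

// Tally the share of an estate's sites in each band, as percentages
// (one decimal place) suitable for doughnut chart segments.
function bandPercentages(scores) {
  const counts = { good: 0, 'needs improvement': 0, poor: 0 };
  for (const s of scores) counts[classify(s)] += 1;
  const pct = {};
  for (const band of Object.keys(counts)) {
    pct[band] = Math.round((counts[band] / scores.length) * 1000) / 10;
  }
  return pct;
}
```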

The charts show the proportion of websites in each Chrome Lighthouse performance audit category. For example, L'Oréal has 485 websites in its digital estate, 26.8% can be categorized as good, 3.0% as poor and 70.2% as needing improvement.

Green is good. Everything else needs attention. Visitors don't want to encounter slow websites and those sites could be losing online sales, receiving poor reviews on social media or just bouncing potential customers to the competition.

Overall, we found that 6.6% of the website homepages we audited fell into Lighthouse’s Poor category, 71.3% had room for improvement and 22.1% surpassed the threshold for Good performance. The fundamental message from testing this group of global consumer goods companies is that three out of four sites need attention to deliver good user experiences.


As mentioned above, one of Lighthouse’s many benefits is its diagnostic information. And, for the “slower” websites our testing identified, there is a widespread common cause: high Total Blocking Time (TBT). TBT measures the total time a page is blocked from responding to user input. In fact, Lighthouse assigns TBT the highest single weighting in its current scoring methodology: 30% of the total score.
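
The TBT calculation itself is easy to sketch: any main-thread task longer than 50 ms is a "long task", and only the time beyond 50 ms counts as blocking. This is a simplified illustration; in a real audit the task timings come from the browser's Long Tasks data between First Contentful Paint and Time to Interactive:

```javascript
// Simplified sketch of the TBT calculation: sum the portion of each
// main-thread task that exceeds the 50 ms "long task" threshold.
// Task durations are illustrative, in milliseconds.
function totalBlockingTime(taskDurations) {
  const THRESHOLD_MS = 50;
  return taskDurations
    .filter((d) => d > THRESHOLD_MS)
    .reduce((sum, d) => sum + (d - THRESHOLD_MS), 0);
}

// Three tasks of 30, 120 and 200 ms: only the last two are "long",
// contributing (120 - 50) + (200 - 50) = 220 ms of blocking time.
```

The arithmetic shows why a handful of heavy scripts dominates the metric: a single 500 ms task contributes far more blocking time than ten 50 ms tasks, which contribute none at all.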

The largest contributor to high TBT is avoidable JavaScript loading, parsing and execution. From other research we've carried out, we can see user input blocking is often exacerbated by multiple marketing tags (third-party JavaScript) being loaded. In the absence of accurate digital estate data, many organizations simply don't know how they are using tags across their websites or the impact on website visitors. 

Conclusions

When organizations deploy large numbers of websites (or social media accounts) to address audiences across different products, product categories, geographical markets and languages, it is easy to lose track of all the moving parts and be unaware of user experience and other issues.

In this article we addressed web page performance and its impact on user experience, but losing track of the details also impacts, among other things, whether:

  • sites are sufficiently accessible
  • sites comply with corporate policies, standards or best practices
  • sites meet local regulatory standards
  • current versions of approved technologies have been implemented

Not knowing the details means not knowing whether a site is working well for its audiences or telling a corporate story effectively. The solution is to test regularly, monitor the results and make changes as needed.

Blog photo image: eQAfy