Pharmaceutical companies - complex organizations with complex communications needs
Global organizations have multiple audiences for their digital presences and need to reach them in different languages and locations, all while complying with local regulations. As a result, multi-national companies typically develop large digital estates comprising hundreds of websites, social media presences and additional content hosted on third-party provider platforms.
Pharmaceutical companies have particularly complex digital estates. Between large numbers of products, regulatory requirements, professional and non-professional audiences, advocacy efforts, patient support and research activities, pharmaceutical companies use large numbers of digital presences to help their audiences find information or complete tasks. At the same time, other areas of the digital estate address the needs of investors, the media, job seekers, suppliers, potential partners and researchers.
In addition to being large, these sprawling website collections do not remain static. Individual site content is updated, new sites go live, old ones go off-line (at least, that’s the plan) and complex sets of rules expand to redirect visitors to more relevant content. With content publishing responsibilities often spread worldwide, changes can take place in isolation, without a wider sense of the amount of change or the impacts on the end user experience.
Given the pace of change it can be difficult to answer basic questions. How many websites or social media presences does our organization have? What condition are those presences in? Is their content current and accurate? Does the content meet our best practices? What kind of user experience do visitors to our digital estate have? Do we have any metrics that tell us how effective our digital presences are?
In our experience, many organizations cannot answer those detailed questions with confidence. Nor do many organizations have a good sense of the overall state of their digital estates.
The latter issue can be assessed by measuring each website’s user experience. A reasonable proxy is page performance, particularly as measured by the Lighthouse Performance Audit, which takes a user-centred approach to measuring performance.
Webpage performance as a proxy for user experience
Given the complexity of a typical pharmaceutical company’s digital estate and the roles the various websites within those estates need to play, we decided to measure their overall user experience by running the Lighthouse Performance Audit tool.
The Lighthouse performance audit tool is a good approach as it:
- Has an extensively documented approach to measuring actual user experience,
- Uses a clear scoring methodology based on large numbers of tests,
- Produces results that are readily reproduced by anyone using the Chrome browser, and
- Provides a set of diagnostics to guide improvements.
To keep matters simple, we tested 4,650 website homepages across the digital estates of six global pharmaceutical companies. We had identified these websites as part of an earlier research exercise – they represent most, but not all, of the sites in each organization’s digital estate.
To ensure consistency and reliable, reproducible data collection we ran the Lighthouse Performance Audits in a node.js batch process on an AWS instance located in Western Europe. The Lighthouse audits can be run in mobile or desktop mode, and we opted for the latter. It is possible that re-running the audits over slower mobile data connections may result in poorer performance results.
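As an illustration (not the script we used), a batch run of this kind can be sketched in Python by shelling out to the Lighthouse CLI. The `--preset=desktop` and `--only-categories=performance` flags mirror the configuration described above; the file names and URL list are hypothetical.

```python
import json
import subprocess

def audit_command(url: str, out_path: str) -> list[str]:
    """Build the Lighthouse CLI invocation for a single homepage audit."""
    return [
        "lighthouse", url,
        "--preset=desktop",               # desktop mode, as in our runs
        "--only-categories=performance",  # skip the other audit categories
        "--output=json",
        f"--output-path={out_path}",
        "--chrome-flags=--headless",
    ]

def run_batch(urls: list[str]) -> dict[str, float]:
    """Audit each URL in turn and collect the 0-100 performance score."""
    scores = {}
    for i, url in enumerate(urls):
        out_path = f"report_{i}.json"
        subprocess.run(audit_command(url, out_path), check=True)
        with open(out_path) as f:
            report = json.load(f)
        # Lighthouse reports the category score as a 0-1 fraction
        scores[url] = report["categories"]["performance"]["score"] * 100
    return scores
```

Running the audits sequentially on a single AWS instance, as here, keeps the test conditions (CPU, network) consistent across all sites.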
We used version 8 of the Lighthouse Performance Audit software, which measures six aspects of webpage performance. In this report we focus on the three measures accounting for 70% of the overall Lighthouse Performance score weighting:
- Total Blocking Time (30% weight) - TBT (in milliseconds) measures the total amount of time that a page is blocked from responding to user input: mouse clicks, screen taps, keyboard presses. It is calculated as the elapsed time beyond 50 milliseconds for each task occupying the main thread. If a task takes 90 milliseconds, TBT will be calculated as 40 milliseconds. On the other hand, a task taking only 40 milliseconds will have a TBT of zero. TBTs of less than 200 milliseconds are deemed to be good.
- Largest Contentful Paint (25%) – LCP (in milliseconds), measures the time needed to render the largest text block or image visible within the viewport (screen display) from when the page first starts loading. Good scores are achieved when the LCP is less than 2,500 milliseconds.
- Cumulative Layout Shift (15%) – CLS measures visual stability or the number of “layout shifts”: those annoying content jumps that sometimes take place as pages load. TBT and LCP are measured in milliseconds, but CLS is determined by sampling pages and using a complex (and evolving) calculation to assign scores for all the layout shifts taking place. A good CLS score is 0.1 or less.
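The TBT arithmetic above can be checked in a few lines of Python; the task durations below are the hypothetical examples from the TBT description (a 90 millisecond task contributes 40 milliseconds, a 40 millisecond task contributes nothing).

```python
def total_blocking_time(task_durations_ms):
    """Sum the portion of each main-thread task beyond the
    50 ms 'long task' threshold, per the TBT definition."""
    return sum(max(0, duration - 50) for duration in task_durations_ms)

print(total_blocking_time([90]))           # -> 40 (90 ms task, 40 ms over threshold)
print(total_blocking_time([40]))           # -> 0 (under the 50 ms threshold)
print(total_blocking_time([90, 40, 250]))  # -> 240 (40 + 0 + 200 ms)
```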
| Lighthouse Performance Audit Score | User Experience (Colour Code) |
| --- | --- |
| 0 – 49 | Poor (Red) |
| 50 – 89 | Needs Improvement (Orange) |
| 90 – 100 | Good (Green) |
Results and diagnosis
At the risk of slight oversimplification, improving website performance (user experience) means addressing the three factors discussed above, as these have the biggest impact on the overall Lighthouse Performance score.
How did the 4,650 pharmaceutical website homepages we scanned perform, and what general actions do the results of the audits suggest need to be taken?
The summary scores for the six pharmaceutical companies we analysed are set out below in a set of doughnut charts. We carried out a similar exercise for global consumer goods companies and overall the scores aren’t materially different, although two of the consumer goods companies had digital estates with more than 2,000 separate websites.
Realistically, in any large global corporation it may be difficult to put globally effective digital governance frameworks in place and even if they exist, it is tough to ensure websites across a large digital estate are always top performers.
A quick look at the set of doughnut charts shows the green (Good) segments of each chart are relatively small. The orange (Needs Improvement) segments predominate and the red (Poor) segments represent anywhere from 16% to 45% of the sites tested.
Across the 4,650 websites tested, just 7% emerged with a good rating.
We recorded the performance score from each test and grouped them by category: poor (red), needs improvement (orange) and good (green). The doughnut charts show the proportion of each digital estate (set of websites) in each category. Hovering a cursor over a chart segment will reveal the percentage of sites in a category.
Overall, we found that 29.8% of the website homepages we audited fell into Lighthouse’s Poor category, 63.1% had room for improvement and 7.0% surpassed the threshold for Good performance.
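The grouping behind those figures is straightforward to reproduce. A minimal sketch, using the score bands from the table earlier (the sample scores here are hypothetical, not our survey data):

```python
def category(score: float) -> str:
    """Map a 0-100 Lighthouse Performance score to its colour band."""
    if score < 50:
        return "Poor"
    if score < 90:
        return "Needs Improvement"
    return "Good"

def estate_breakdown(scores: list[float]) -> dict[str, float]:
    """Percentage of a digital estate's sites in each category."""
    counts = {"Poor": 0, "Needs Improvement": 0, "Good": 0}
    for score in scores:
        counts[category(score)] += 1
    return {band: round(100 * n / len(scores), 1) for band, n in counts.items()}

print(estate_breakdown([35, 55, 72, 88, 91, 95, 48, 60, 70, 80]))
# -> {'Poor': 20.0, 'Needs Improvement': 60.0, 'Good': 20.0}
```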
What are the main issues Lighthouse highlights?
One of Lighthouse’s benefits is its diagnostic information. In the next sections we examine the TBT, LCP and CLS metrics, the values we should be aiming for and the steps needed to make improvements.
Total Blocking Time
The Google Developers team behind the Lighthouse audits suggest that web developers should aim to keep TBT below 200 milliseconds.
| TBT (in milliseconds) | User Experience (Colour Code) |
| --- | --- |
| 0 – 200 | Good (Green) |
| 201 – 600 | Needs Improvement (Orange) |
| Over 600 | Poor (Red) |
Lighthouse sets 50 milliseconds as its threshold for ‘long tasks’. To see which tasks may be problematic, simply fire up the Performance panel of Chrome DevTools.
Largest Contentful Paint
Measures to determine how quickly a web page’s main content loads have evolved over the past few years, as groups like the W3C Web Performance Working Group have sought to find metrics that reflect user experience.
The current version of the Lighthouse Performance audit uses Largest Contentful Paint as its page content loading metric, as Google research work has shown that measuring when the largest element on a webpage is loaded accurately reflects the loading of a page’s main content.
| LCP (in milliseconds) | User Experience (Colour Code) |
| --- | --- |
| 0 – 2,500 | Good (Green) |
| 2,501 – 4,000 | Needs Improvement (Orange) |
| Over 4,000 | Poor (Red) |
The Google Developers team behind the Lighthouse audits suggests web developers should aim to keep LCP below 2.5 seconds at a 75th percentile threshold. In other words, 3 out of 4 visits to a webpage should experience an LCP of 2.5 seconds or less.
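The 75th percentile framing means the target is checked against the slower end of real visits, not the average. A quick sketch of such a check, over a hypothetical sample of LCP measurements:

```python
def percentile_75(values_ms: list[float]) -> float:
    """75th percentile of a sample, using the nearest-rank method."""
    ordered = sorted(values_ms)
    rank = -(-75 * len(ordered) // 100) - 1  # ceil(0.75 * n) - 1
    return ordered[rank]

def meets_lcp_target(samples_ms: list[float]) -> bool:
    """True when at least 75% of visits saw an LCP of 2,500 ms or less."""
    return percentile_75(samples_ms) <= 2500

print(meets_lcp_target([1200, 1800, 2100, 2400]))  # 75th percentile 2100 -> True
print(meets_lcp_target([1200, 1800, 2600, 4000]))  # 75th percentile 2600 -> False
```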
The LCP measurement methodology is somewhat abstruse, factoring in specific page content elements, their size as displayed on different devices and the point in time at which they should be measured.
In practice, reducing LCP times hangs on four key factors:
- Server response times – slower servers lengthen the time taken to load a page’s main content. Optimizing servers, using content distribution networks and content caching can all improve server response times.
- Render-blocking JavaScript and CSS – scripts and stylesheets that block rendering delay the display of main content. Deferring non-critical JavaScript and inlining critical CSS both help.
- Resource load times – large images, videos and web fonts take longer to fetch. Compressing and right-sizing assets, and preloading the most important resources, shorten load times.
- Client-side rendering – pages rendered entirely in the browser via JavaScript delay the display of content. Server-side rendering or pre-rendering key content reduces LCP.
Cumulative Layout Shift
The approach to measuring cumulative layout shifts has also evolved over time, with the most recent set of changes going live in June 2021, after soliciting wide-ranging feedback.
As noted earlier, CLS is not a time-based measure, but a metric calculated as the sum of individual layout shifts. It is well worth reading the Web.dev Cumulative Layout Shift page for the details of the CLS metric calculation.
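As a sketch of the post-June-2021 calculation: individual layout-shift scores are grouped into ‘session windows’ (shifts less than one second apart, with each window capped at five seconds), and CLS is the largest window sum. A simplified Python version, assuming each shift is a (timestamp in ms, shift score) pair; see the Web.dev page for the full rules.

```python
def cumulative_layout_shift(shifts):
    """Simplified CLS: the maximum session-window sum, where a window
    groups shifts occurring < 1 s apart and spans at most 5 s."""
    best = 0.0
    window_sum = 0.0
    window_start = prev_time = None
    for t, score in sorted(shifts):
        starts_new_window = (
            window_start is None
            or t - prev_time >= 1000    # a gap of 1 s or more ends the window
            or t - window_start > 5000  # windows are capped at 5 s
        )
        if starts_new_window:
            window_start, window_sum = t, 0.0
        window_sum += score
        prev_time = t
        best = max(best, window_sum)
    return best

# Two shifts close together form one window; the later shift stands alone.
print(cumulative_layout_shift([(0, 0.05), (500, 0.04), (3000, 0.08)]))  # ~0.09
```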
| CLS (fraction) | User Experience (Colour Code) |
| --- | --- |
| 0 – 0.1 | Good (Green) |
| 0.11 – 0.25 | Needs Improvement (Orange) |
| Over 0.25 | Poor (Red) |
The Google Developers team behind the Lighthouse audits suggest that web developers should target CLS to be below 0.1 at a 75th percentile threshold. In other words, 3 out of 4 visits to a webpage should experience a CLS of less than 0.1.
Reducing unexpected layout shifts means paying attention to three webpage design and implementation principles:
- Image and video elements need size attributes, as these ensure browsers allocate the appropriate space while images load,
- Except in response to user input, don’t insert new content above existing page content, as this creates the unexpected layout shifts we are seeking to avoid,
- Use animated transitions to provide context and continuity when generating content layout changes, as these create expected layout shifts, which are fine.
Many global organizations would find it difficult and time-consuming to produce a simple list of all the websites they own. They are not able to monitor the condition of their digital estates and thus lack information about their overall digital accessibility, user experience, or issues related to security, data privacy or technical infrastructure.
These knowledge gaps can have material consequences. Ineffective websites may mean healthcare professionals fail to engage with new initiatives because they can’t find the relevant information, programmes to assist with drug costs don’t recruit widely enough because the digital content isn’t accessible, or external organizations miss out on research collaboration opportunities because search wasn’t optimized.
Our study shows the six pharmaceutical company digital estates examined have significant opportunities to improve user experience, as a majority of the sites failed to achieve a Good rating.
While each site will need individual attention to improve its performance, overall the first step must be to put in place a mechanism for identifying all the relevant digital presences and then regularly monitoring their performance.
Not knowing the details means not knowing whether sites are working for their audiences or telling corporate stories effectively.
To fit with our research publication schedule, our original digital estate discovery exercise had a 48-hour processing time limit. As a result, it is likely we’ve underestimated the total number of websites. A pharmaceutical company digital estate typically comprises a main corporate website (for investors, the media, job seekers and others), individual product sites, sites grouping similar products, sites designed to inform healthcare professionals, sites intended for ‘end users’, advocacy sites, and sites for suppliers and other business partners: in multiple languages and designed to comply with local regulatory requirements.