Institutional oversight of higher education websites is often patchy or difficult to put in place. But, if you edit content, develop websites, are part of an IT organisation or belong to a higher education marketing and communications team, this blog post is for you.
We looked at 300 (297 to be exact) websites forming part of the web estate of a large publicly-funded North American university. Using our quality assurance software, we collected data about each website. We used that data to assess how the sites had been configured and to identify any systemic issues within this cluster of higher education institution (HEI) websites.
We identified five issues to address that will prevent websites causing reputational harm:
- Around 3% of the 297 websites appeared to be abandoned: content hadn’t been updated in more than two years – in one case, the latest events dated from 2007. Find dormant websites and take them offline.
- 12 different content management systems (CMS) underpin the 300 sites. As a result, the institution is running 11 different generations of WordPress and three of Drupal. Multiple CMSs dissipate expertise and leave sites running on legacy platforms that miss important functionality and security upgrades. In the absence of central control, identify a core set of actively developed CMSs, including fully hosted offerings, and consolidate around these.
- 75% of the sites have Google Analytics or a similar analytics service installed. The remaining sites aren’t gathering visitor activity and behaviour data to assess whether they are meeting their owner/operator’s objectives. Why have a website if you can’t know if it is working? Install Google Analytics (or another service) and use the default reports and dashboards to improve the experience of visitors to over 70 websites.
- 66% of the sites are mobile friendly (as assessed by Google). The remaining third needs a combination of tweaks and modifications to catch up and deliver an experience that better matches the devices their visitors are using. And, for sites where search-derived traffic matters, mobile friendliness will improve SEO.
- 20% of this study’s sites run on HTTPS (secure) connections. The balance should implement HTTPS as soon as possible (as Google recommends) to improve privacy and SEO.
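Several of these checks can be scripted across an entire web estate. As a minimal sketch (the analytics signatures below are illustrative assumptions, not an exhaustive or production rule set), a homepage’s URL and HTML can be triaged for HTTPS and for the presence of an analytics tag:

```python
import re

# Illustrative signatures for common analytics services (an assumption,
# not a complete list): Google Analytics/gtag, Matomo/Piwik, WP Stats.
ANALYTICS_SIGNATURES = [
    r"google-analytics\.com/analytics\.js",
    r"googletagmanager\.com/gtag",
    r"\bgtag\(",
    r"\b_gaq\b",
    r"piwik\.js|matomo\.js",
    r"stats\.wp\.com",
]

def triage(url, html):
    """Return quick pass/fail signals for a single homepage."""
    return {
        "https": url.lower().startswith("https://"),
        "analytics": any(re.search(p, html) for p in ANALYTICS_SIGNATURES),
    }
```

Run against each resolved site in turn, a check like this produces the raw counts behind the percentages above in minutes rather than days.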
The rest of the article provides further data and analysis to recognise the issues that typically need addressing across university web estates.
The Messy Details
If you don’t look you can’t know
For most universities and colleges, websites (plural!) are now their primary communications channel, but to be effective these sites must balance several competing factors. Websites need to communicate compelling messages, while ensuring that audiences have productive and enjoyable interactions and, at the same time, mitigating any content and configuration risks. This balancing act results in many fingers being in the pie and can only be successful if website stakeholders have the data to understand how their sites and content are configured and perform.
The evidence from auditing websites strongly suggests that organisations lack effective oversight data and thus struggle to balance the competing elements of running their sites or even recognise that they are out of balance.
In other words, if you don’t look you won’t know what condition your web estate is in.
Is this web governance?
In our work, we encounter the full continuum of institutional website management, from central control to no control. We’re not advocating a specific oversight model; we’re promoting the idea that any form of oversight needs reliable performance data. It is difficult for a policy to be relevant or workable in the absence of data about current practice.
Putting institution-wide website data in the hands of website owners/operators (along with guidance and support) empowers them to make their sites the effective communication medium they intended them to be.
A case study to illustrate the issues at stake
We ‘audited’ a portion of the web estate of a large (>15,000 students), publicly-funded North American research university to illustrate our thesis. This institution’s public-facing web estate comprises over 650,000 pages, along with 240,000 PDF documents, 4,500 Word documents and 4,000 PowerPoint presentations and Excel spreadsheets.
We selected the first 300 (297, after eliminating duplicates) distinct websites that our software was able to resolve from the much larger overall set of websites in order to complete our ‘audit’ within a reasonable period. Web estate audits at decentralised higher education institutions can uncover thousands of registered domain names of which only a minority are active or re-direct to active websites.
The institution we examined is representative of web estates we’ve encountered in Canada, the UK and the US, but less so for institutions in Australia, Ireland and New Zealand. However, don’t stop reading, as the individual site characteristics are still relevant to the latter set of countries.
Potential Financial Exposures
What is the cost of website support?
Using a combination of LinkedIn profiles and the institution’s online staff directory we were able to identify about 110 individuals providing web support services. This total is likely an underestimate. For smaller sites, content editing is often someone’s part-time job and many sites are supported by individuals without readily identifiable “web” job titles.
To a reasonable first approximation website development, support and maintenance costs this institution at least $5 million each year. We’ve no idea if this expenditure level is appropriate, but it is likely the institution doesn’t know either.
At minimum, it would be appropriate to know the overall number of websites and the associated staff costs of maintaining them, to ensure that an adequate return is being generated on the implied investment.
Three different generations of Drupal and 11 of WordPress are in use across the sample. Given the diversity of systems in use, some of the economies of scale that come from a focused set of content management systems are likely being lost. And, for the WordPress implementations, it is likely that patches to the core WordPress software have yet to be applied.
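Identifying which CMS (and which generation of it) runs a site is often possible from the page itself: WordPress and Drupal emit a `<meta name="generator">` tag by default, although it can be suppressed. A simple extraction sketch (assuming the default attribute order, with `name` before `content`):

```python
import re

def cms_generator(html):
    """Extract the CMS name/version from a <meta name="generator"> tag.
    Returns None when the tag is absent or has been suppressed."""
    m = re.search(
        r'<meta[^>]+name=["\']generator["\'][^>]+content=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    return m.group(1) if m else None
```

Aggregating the results across all sites is what surfaces findings like “11 generations of WordPress” in the first place.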
We also discovered three sites using Squarespace. Judging by the low page count in each case, this approach is likely very cost-effective for any entity primarily concerned with content rather than design or development.
Self-hosting versus external hosting?
We also checked the IP addresses in use, from which we deduced that a majority (76%) of the sites are hosted on servers at on-campus data centre(s). The remaining 24% of the sites use external hosting arrangements from 16 different hosting suppliers, including Amazon, Google and Rackspace.
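The on-campus versus external split falls out of a simple address-range test. A sketch using Python’s standard library (the campus CIDR blocks below are placeholders; a real audit would use the institution’s actual allocations, e.g. from its whois/ARIN records):

```python
import ipaddress

# Hypothetical campus address blocks (documentation ranges used as
# stand-ins) -- substitute the institution's real allocations.
CAMPUS_NETWORKS = [
    ipaddress.ip_network("192.0.2.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def hosting_location(ip):
    """Classify a resolved site IP as on-campus or external."""
    addr = ipaddress.ip_address(ip)
    return "on-campus" if any(addr in net for net in CAMPUS_NETWORKS) else "external"
```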
We make three observations in passing. First, the web hosting infrastructure available at major hosting providers likely exceeds that available at most university campuses: does on-campus website hosting make sense, either financially or operationally?
Second, do individual units within a university have the expertise to select external hosting providers, and if so are all 16 of comparable reliability?
Third, perhaps the overall organisation would be better served, financially and operationally, by having preferred arrangements with a focused set of vendors and directing site owner/operators to pre-packaged hosting solutions that meet cost, security and reliability criteria?
Potential Reputational Exposures
Although university web estates can comprise hundreds or thousands of individual websites, visitors to higher education websites assume they are visiting a single institutional website, albeit one segmented in ways that aid their “journey”. Visitors expect straightforward navigation and uniform quality and performance. Without some form of oversight, uniformity of experience is difficult to achieve, just as it would be in the quality of the courses an institution offers.
The situation is more complex, because there isn’t a typical higher education website visitor. Many institutions view prospective students as the key audience. But employees use the website(s) to find campus services, external audiences seek out research publications, the media searches for subject matter experts and past students look for transcripts. And these visitors enter the web estate at many points other than the optimised home page.
As a result, the overall institution faces a number of reputational exposures.
Abandoned or moribund sites
As we ‘audited’ the individual sites within our sample group we encountered nine that were dormant (no apparent home page updates within the past two years), thus displaying dated and possibly erroneous information to visitors.
We also scanned the dates used in any copyright statement on a home page. For more information about different approaches to providing copyright protection read one of our recent articles.
In the sample of sites, just over half (51.2%) had a dated copyright statement, with two-thirds of those (63.2%) using the current year, either alone or as part of a copyright period. In our experience, non-current dates are indicative of two situations. First, the copyright statement is hard-coded rather than driven by the system. Second, and more importantly, it is indicative of neglected website content.
For the sample sites, copyright dates reach back to 2002 (one of the ‘abandoned’ sites). The largest cluster (18 sites) is trapped in 2015. The page content may be current, but visitors could hesitate to rely on it if it doesn’t appear to have been updated recently.
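Scanning for stale copyright dates is easy to automate. A rough sketch (the markers and the 80-character window are heuristics we’ve chosen for illustration, not a definitive rule):

```python
import re

def copyright_years(html, current_year):
    """Find the years in a page's copyright statement and flag stale ones."""
    # Look for a copyright marker, then scan the text that follows it.
    m = re.search(r"(?:©|&copy;|\(c\)|copyright)[^<\n]{0,80}", html,
                  re.IGNORECASE)
    if not m:
        return {"latest": None, "stale": False}
    years = [int(y) for y in re.findall(r"(?:19|20)\d\d", m.group(0))]
    latest = max(years, default=None)
    return {"latest": latest,
            "stale": latest is not None and latest < current_year}
```

For example, a footer reading “© 2002–2015 Example University” would be flagged as stale against the current year, which is exactly the 2015 cluster described above.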
Abandoned and neglected websites are likely inconsistent with the branding and quality standards of most institutions.
How well is this site working for me?
The world has gone mobile, perhaps even more so for higher education institutions. It is a safe assumption that a plurality of visitors use mobile devices and expect sites to work accordingly. Moreover, as Google has cautioned, it gives higher priority to search results from mobile friendly pages.
If visitor satisfaction matters, pages should be mobile friendly; if search traffic matters, pages should be mobile friendly. So, are they?
For our sample set of sites, two-thirds have mobile friendly home pages – that is, those pages completely passed Google’s mobile friendly test.
The remaining sites need careful attention to a combination of set-up issues to bring them up to scratch. For more information, read our five-part series on fixing each of the errors that Google reports.
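A first-pass screen is possible without calling Google’s test at all. A page that never declares a responsive viewport almost always fails the mobile-friendly test, so a cheap proxy check (a heuristic, not a substitute for the full test) looks like this:

```python
import re

def is_mobile_ready_hint(html):
    """Cheap proxy for mobile readiness: does the page declare a
    viewport meta tag? Absence is a strong failure signal; presence
    does not guarantee passing Google's full mobile-friendly test."""
    return bool(re.search(r'<meta[^>]+name=["\']viewport["\']', html,
                          re.IGNORECASE))
```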
A second preference that visitors have for all websites is speed, especially over mobile data connections. We ran all 297 websites through Google’s PageSpeed Insights test and produced a composite performance score for mobile and desktop access.
The higher the score, the faster the page. About a third of the sites score 80 or above, indicating good performance. The remaining sites need attention to various technical aspects of how their pages are configured to boost page loading speeds.
However, the single simplest remedy to slow pages is image compression. One of the worst performing websites attempts to load 30MB of images as it renders its home page. Free services such as JPEG Optimizer or TinyPNG offer flexible image compression: just experiment to balance image quality with file size.
We also noted in passing that half the sites could benefit from turning on server-side (GZIP) compression, which would also boost performance at no cost.
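To see why GZIP is effectively free performance, Python’s standard library can demonstrate the effect on repetitive markup (the sample page below is synthetic; real HTML and CSS compress similarly well because they repeat the same tags and class names):

```python
import gzip

# Synthetic page: markup is highly repetitive, so it compresses well.
page = b"<ul>" + b"<li class='nav-item'>Department</li>" * 200 + b"</ul>"

compressed = gzip.compress(page)
ratio = len(compressed) / len(page)
print(f"{len(page)} bytes -> {len(compressed)} bytes ({ratio:.0%} of original)")
```

On real servers this is a configuration switch rather than code – e.g. `mod_deflate` on Apache or the `gzip on;` directive in nginx.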
Overall, our analysis suggests that the more complex sites have addressed many of the fundamentals of improving the user experience. However, knowledge and expertise is uneven and as a result, so is the overall experience of a visitor traversing different sites within the web estate.
How well is this website working for us?
We also need to ask how well a site is working for its owners/operators. Given that time, effort and financial resources were invested in a site, how do we know if it is meeting its objectives?
The answers come from the web analytics data being captured with each site visit. Analytics data provides insights into how visitors get to a website, which content captures their attention and what they do once they’ve arrived.
Just over 75% of the sites used Google Analytics, Piwik or WordPress Stats to capture visitor data. The balance would struggle to understand if their sites are operating as intended – although, they could deduce some elements of visitor interactions from their server logs, in extremis.
Campus-wide web analytics implementation would help identify abandoned websites, provide feedback on the urgency of dealing with mobile friendliness issues and clarify which content is no longer needed on any individual site.
Our experience from conducting Google Analytics reviews is that only a small slice of organisations has the training and expertise to ensure that they are collecting ‘clean’ data and using relevant activity and behaviour measures. However, even ‘dirty’ data can be filtered and cleaned to produce useful insights. It is better to have a web analytics tool sub-optimally installed than to be collecting no data at all.
We also observed that just 8% of the sites had implemented Google Tag Manager, a service that removes the need for developer involvement when measuring specific visitor interactions and behaviour or setting up third-party applications, such as social media sharing.
In the absence of widespread Google Analytics implementation, it is likely that traffic statistics (and other measures) cannot be aggregated for the entire institution, perhaps resulting in activity and interactions being understated and less well understood than they should be.
One other consequence of less-than-total analytics coverage is lack of information about where and how visitors enter the overall web estate. While the main home page has likely been ‘optimised’ for key tasks, without data from other important sites it is difficult to know how well main home page optimisation actually meets visitor needs.
It was outside the scope of this study, but a more detailed analysis of the individual sites would review content issues, page structure, cookie use and the accessibility of content to different audiences. All worth addressing in order to protect an institution’s reputation.
Security and Privacy Exposures
Unauthorised database access
Websites and their underlying databases may present security issues that require intensive testing to identify. We are aware that some content management systems and, specifically, their MySQL databases are accessed via third party database administration tools.
Anecdotal evidence suggests that these tools are sometimes left with default passwords in place thus allowing site content to be accessed and altered. Testing with the full authorisation of the site owners (and the institution’s IT staff) would be needed to determine if this issue exists within the sample set of sites.
Keep applying the patches
The 297 websites run on a mix of web servers (for example, Apache, nginx and IIS), some hosted internally; others externally. Consequently, these servers are on different release levels – opening up the possibility of some web servers not having all security patches in place.
Furthermore, the individual content management systems running the websites are at different release levels. As noted, we identified 11 generations of WordPress in operation – again this may result in security and functionality upgrades being missed.
Make browsing private
19% of the sites have implemented HTTPS to offer secure, private browsing connections for visitors to those specific sites. The balance of sites has yet to catch up, but a migration is clearly underway. As Google increasingly emphasises, non-HTTPS sites get ranked lower in search results. Many of the sites will benefit from search-referred traffic, so ranking matters and can be enhanced, for free, by upgrading to HTTPS.
Web analytics data will confirm which browsers visitors use most frequently. As Chrome evolves over the next few months, it will provide stronger warnings about non-secure sites. Visitors to this web estate will have a mixed experience of warning messages as they move from site to site: perhaps not entirely consistent with this institution’s reputational aspirations?
Whether an institution wants more control over its web properties or is happy for those properties to be developed and managed independently, there are substantial benefits to having the data to:
- Understand the lay of the land – how many sites do we own or manage? What technologies are being used? Who is responsible for each site?
- Identify and manage potential financial exposures – do we want to be in the web hosting business? Would it be better to use third party hosting services? Are we hosting sites that no one visits?
- Identify and manage potential reputational risks – do we offer the best user experience? Is our content current and accurate? Do we meet our own editorial and access standards?
- Identify and manage potential security risks – are we using the appropriate technology platforms? Are our sites secure and is browsing our sites private?
Want to understand how we collect data, perform our analysis and identify solutions? Or to discuss your web estate? Call us at +1 416 464 9771 or +44 203 290 3575.
Don’t have accurate and current information on all the websites you own? Not able to monitor and check each website’s content quality and risk status? Let’s talk about how we can help.