(Revised 5 January 2018)
Don't have time to read the book? Read the original article instead: 'The Checklist', The New Yorker, 10 December 2007. The thesis is that even experts benefit from systematically working through a process to ensure that important steps aren't missed or performed out of sequence.
The same 'pre-flight checklist' works for setting up and maintaining websites by confirming key website elements are in place. Otherwise you're potentially playing website Jenga.
What follows is a 39-point university or college website checklist to configure critical items from web server performance through SEO to Google Analytics.
1 Browser Caching. Implement browser caching by setting expiry dates on pages and static resources. Expiry dates instruct browsers to serve locally cached copies rather than re-downloading files on every visit. The result is faster page loading for an improved visitor experience.
Test if a server has browser caching implemented: Google PageSpeed Insights
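As a sketch, assuming an Apache server with mod_expires enabled (nginx and IIS have equivalent directives), browser caching can be configured along these lines; the lifetimes shown are illustrative:

```apache
# Illustrative .htaccess fragment; adjust lifetimes to suit update frequency.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Longer lifetimes suit files that rarely change, such as logos; shorter ones suit stylesheets that are updated with the site design.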
2 Enable Compression. Enabling compression on a web server reduces the amount of data being transferred, significantly improving page loading times. All modern browsers support compressed responses.
Test if a server has compression enabled: gzip Compression Test
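On an Apache server with mod_deflate (an assumption; other servers differ), gzip compression for text-based content can be enabled with a fragment like this:

```apache
# Compress text-based responses; images are already compressed formats.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>
```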
3 Enable Keep-Alive. Ensure Keep-Alive is enabled on a web server so that a single connection can handle multiple HTTP(S) requests, reducing latency for an improved visitor experience.
Test if a server has Keep-Alive enabled: Keep-Alive Test
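In Apache's main configuration (these directives don't belong in .htaccess), Keep-Alive is controlled by three settings; the values below are common defaults, not recommendations:

```apache
# Allow one connection to serve many requests, with sensible limits.
KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 5
```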
4 Enable HTTPS. Implement secure web server connections by using HTTPS and significantly improve data privacy and security for all site visitors.
Test HTTPS connection quality: SSL Labs HTTPS Test
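Once HTTPS is working site-wide, an HSTS header tells browsers to use HTTPS exclusively on future visits. A sketch, assuming Apache with mod_headers (test carefully before setting a long max-age, as the setting is hard to undo):

```apache
# Instruct browsers to require HTTPS for one year.
<IfModule mod_headers.c>
  Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
</IfModule>
```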
5 Enable Canonical Redirection. To eliminate ambiguity, browsers and search engines need to know which of the www and non-www hostnames is a site's definitive version. Once that is decided, the web server's redirection capability ensures browsers are redirected automatically. The same capability can also be applied to the HTTP and HTTPS versions of a site. Site visitors benefit by always being directed to the preferred, secure version of a site and its content.
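Assuming Apache with mod_rewrite and a hypothetical canonical version of https://exampleu.ca, both redirections can be handled in one rule:

```apache
# Send HTTP and www visitors to the canonical HTTPS, non-www version.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTPS} off [OR]
  RewriteCond %{HTTP_HOST} ^www\. [NC]
  RewriteRule ^ https://exampleu.ca%{REQUEST_URI} [L,R=301]
</IfModule>
```

The 301 status tells search engines the redirect is permanent, consolidating ranking signals on the canonical version.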
6 Implement Content Security Policy Settings. CSPs are designed to enhance website security by preventing certain types of attack, such as cross-site scripting. They work in conjunction with browsers and can be configured either in web server headers or in on-page meta elements.
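A deliberately simple starting policy, expressed as an Apache header (an assumption; the same value works in an on-page meta element). Real policies need tuning to a site's actual scripts, fonts and third-party resources:

```apache
# Only allow resources from this site's own origin, plus HTTPS images.
Header set Content-Security-Policy "default-src 'self'; img-src 'self' https:"
```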
7 Establish a Development Environment.
Establish a development version of the main website for testing and evaluation. Make changes in the development environment and promote tested modifications and updates to the production environment. This approach minimises embarrassing glitches being installed on live sites at the most visible and least opportune times. Everyone benefits.
8 Eliminate Inline HTML Styles. Inline styling becomes progressively more difficult to manage, so avoid it. Use CSS to establish site-wide styling so that formatting changes propagate consistently and accurately. Moreover, CSS files are cached, so pages load faster for website visitors.
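The difference in practice (the class name is hypothetical; the rule itself lives once in a cached site-wide stylesheet):

```html
<!-- Avoid: styling repeated and maintained on every page -->
<p style="color:#00539f; font-size:18px;">Apply now</p>

<!-- Prefer: one class, defined once in the site stylesheet -->
<p class="call-to-action">Apply now</p>
```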
9 Implement Sass, SCSS or Less. CSS pre-processors add significant capabilities to CSS, making site-wide font, colour or other variable changes far easier to implement.
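For example, in SCSS (variable names and values are illustrative), a brand colour or font is defined once and reused everywhere, so a rebrand becomes a one-line change:

```scss
// Site-wide variables: change once, recompile, done.
$brand-colour: #00539f;
$base-font: "Source Sans Pro", sans-serif;

body {
  color: $brand-colour;
  font-family: $base-font;
}
```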
10 Use Path-only References. Path-only references increase flexibility when making changes. For example, use
<a href="/blog.html"> rather than
<a href="https://cs.exampleu.ca/blog.html">. In the latter case, subsequently changing the domain name or access scheme (HTTP/HTTPS) would result in visitors encountering 404 errors.
11 Determine the Site-wide Character Set to Use. In almost all cases this should be UTF-8. Set the definition in the web server header or an on-page meta element. A declared character set ensures browsers interpret and render all characters correctly.
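The on-page form is a single element, placed early in the head:

```html
<meta charset="UTF-8">
```

The server-side equivalent on Apache is the `AddDefaultCharset utf-8` directive; either approach works, but the two should agree.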
12 Determine the Default Site Language. The language definition can be set in web server headers or in on-page HTML element attributes. The latter approach suits comprehensively multi-lingual sites.
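The language can be set on the root element, with per-element overrides for multilingual passages (the text below is purely illustrative):

```html
<html lang="en-CA">
  <body>
    <p>Welcome to Example University.</p>
    <!-- Override the page default for individual passages: -->
    <p lang="fr">Bienvenue à l'Université Example.</p>
  </body>
</html>
```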
13 Google Search Console. Register for Google Search Console (the new name for Google Webmaster Tools) and add the relevant website(s). The console's tools provide essential website information.
14 Install robots.txt. Install a
robots.txt file in the website's root directory and use the file to control which folders/subdirectories search engines index. Include a directive telling crawlers where to find an XML sitemap.
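A minimal example (the disallowed paths and domain are hypothetical):

```
# robots.txt in the site root
User-agent: *
Disallow: /drafts/
Disallow: /internal/

Sitemap: https://exampleu.ca/sitemap.xml
```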
15 Implement an XML Sitemap. Use a third-party application or a CMS's in-built tools to install 'sitemap.xml' in a website's root directory. The sitemap provides search engine crawlers with additional information about website content.
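The generated file follows the sitemaps.org protocol; a hand-written sketch with one hypothetical entry looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://exampleu.ca/admissions.html</loc>
    <lastmod>2018-01-05</lastmod>
  </url>
</urlset>
```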
16 Submit an XML Sitemap. Submit and test the relevant
sitemap.xml file in Google's Search Console. Confirm that the sitemap is automatically updated as content changes.
17 Use a CDN. Content Delivery Networks host images and other static files needed to render page content on strategically located servers. Site visitors experience noticeably faster page speeds as the relevant files load from the nearest server. If a site serves an international audience and those visitors use mobile devices for access, the speed improvements will be particularly noticeable. However, most 'commercial' users will need to pay for using a CDN.
18 Canonical References. To ensure search engines can resolve the definitive site version, use the
rel="canonical" link element. It points to the preferred version of a page, for example, indicating the non-www version is preferred to the www version, or vice-versa, as appropriate.
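Placed in each page's head, the element is one line (the URL is a hypothetical example):

```html
<link rel="canonical" href="https://exampleu.ca/programs/biology.html">
```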
19 Prioritise Rendering Visible Content. The content that displays 'above the fold' should be of most interest to site visitors, so treat it that way. Configure pages to load resources that render visible page elements first, reduce unnecessary network round trips to fetch content and load page footer elements last. Site visitors appreciate faster page loading.
22 Compress all Images. Large, uncompressed images on web pages contribute to slow page loading. This typically arises because content creation workflows do not include an image compression step. Free online services can be used to substantially reduce file sizes without compromising image quality.
23 Run Google's Page Speed Insights. Submit the site's home page to confirm that items 19, 20, 21 and 22 have been completed.
24 Run Google's Mobile Friendly Test. Run, at minimum, the home page through Google's Mobile Friendly Test to identify any systemic mobile display issues or robots.txt file directives blocking page rendering file access.
25 Implement Structured Markup. Search engines can better understand and present search results if key website content is encoded using schema.org structured markup. These supplementary tags allow addresses, contact details, events and the like to be readily identified in search results, helping site visitors reach key content.
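Structured markup is commonly expressed as a JSON-LD script in the page head; a sketch using schema.org's CollegeOrUniversity type with hypothetical details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "CollegeOrUniversity",
  "name": "Example University",
  "url": "https://exampleu.ca",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Exampleville",
    "addressCountry": "CA"
  }
}
</script>
```

Google's Structured Data Testing Tool can validate the result.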
26 Implement Social Media Markup. Facebook, Twitter and Google have specified web page meta tags to optimise content sharing. At minimum, implement Facebook's Open Graph markup, as it is used by default by most social media networks.
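A minimal Open Graph implementation consists of a few meta elements in the page head (all values here are hypothetical):

```html
<meta property="og:title" content="Admissions – Example University">
<meta property="og:description" content="How to apply to Example University.">
<meta property="og:image" content="https://exampleu.ca/images/campus.jpg">
<meta property="og:url" content="https://exampleu.ca/admissions.html">
```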
27 Ensure Every Page has a Unique Title. Every page on a site needs a title, and that title should be unique and describe the content found on the page. Search engines display the page title in search results, conveying important information to would-be site visitors.
28 Ensure Every Page has a Unique Description. Every page should have a unique description summarising its content. Search engines display the description in search results directly beneath the page title.
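Both items 27 and 28 come down to two elements in each page's head (the content shown is a hypothetical example):

```html
<title>Biology BSc – Example University</title>
<meta name="description"
      content="Course structure, admission requirements and career paths
               for the Biology BSc at Example University.">
```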
29 Enforce Page Structure. Search engines and screen reader software can understand content better if pages follow a hierarchy of one
<h1> </h1> main title element followed by multiple sub-title elements used in order, that is
<h2> </h2> before
<h3> </h3> and so on.
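A well-ordered page outline (indented here only to show the hierarchy; the headings are hypothetical):

```html
<h1>Department of Computer Science</h1>
  <h2>Undergraduate Programs</h2>
    <h3>BSc in Computer Science</h3>
  <h2>Graduate Programs</h2>
    <h3>MSc in Computer Science</h3>
```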
30 Tag all Images. Every image on a site should have a title attribute and an alt attribute to aid search engines and screen reader software. The title attribute shows up on screen when a cursor hovers over an image. The alt attribute shows up in search results and provides a description to screen reader software.
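In practice (file name and text are illustrative), the alt attribute should describe what the image shows, not repeat the file name:

```html
<img src="/images/convocation.jpg"
     alt="Graduates throwing their caps at the 2017 convocation ceremony"
     title="Convocation 2017">
```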
31 Keep HTML Clean. Creating content in Word helps ensure accurate spelling and grammar. However, copying and pasting text from Word directly into most CMSs can introduce proprietary text "tags" and other non-standard code into the resulting HTML. Paste as plain text, or use the CMS's paste-from-Word clean-up option, to keep the markup clean.
32 Repair Broken Links. Links that take visitors to 404 error pages are both irritating and a sign of neglect. Repairing broken links forms part of regular site maintenance.
34 Mobile Friendly Content. Google's Search Console allows all of a site's pages to be tested for mobile compatibility. Given the high proportion of mobile traffic to university and college websites it is important to confirm that content and visitor interactions work properly.
35 Cross-browser Support. Visitors will access sites using a variety of desktop, mobile and in-app browsers. It is critical to test that content and user interactions render and operate correctly for each of the browsers being used to access the site.
36 Website Accessibility. Regulations in many jurisdictions oblige public institutions, including universities and colleges, to meet offline and online accessibility standards. Even in the absence of regulatory requirements good design practice should emphasise accessibility and ease of use.
37 Implement Google Analytics. Don't just install Google Analytics and assume it will collect useful data; actually implement it. Google Analytics needs to be configured to ignore spam and visits from the web team, and to record goals and events. If Google Analytics is not philosophically acceptable, try Piwik instead.
38 Implement Google Tag Manager. Google Tag Manager allows non-technical staff to create, test and deploy tags and triggers to record and analyse visitor interactions – it's the natural complement to Google Analytics.
39 Regularly Test Site Performance. Updating content, implementing new pages and installing CMS updates can all impact site performance. Tools such as WebPageTest provide performance summaries and the ability to isolate specific performance issues.