Web Estate Registry
Higher education institutions typically let websites develop organically, an approach that leads to web estates with hundreds or thousands of autonomous sites. In such estates the number of sites may be unknown, individual site ownership can be unclear, and there may be no effective digital oversight.
Digital marketing and communications can be blunted by easily fixed website set-up, configuration and technical issues. And, poor website knowledge unnecessarily exposes institutions to compliance, security and user experience concerns.
Our web estate registry service finds every website. It captures key site data and monitors site changes on a user-defined schedule.
Data to help streamline digital marketing and communications. Data to enable digital governance and infrastructure oversight. And, data to let higher education institutions answer wider questions in areas such as:
- GDPR
- Internal Audit
- Accessibility
- Security & Privacy
Web Estate Registry and Registers
Web Estate Registry with Multiple Registers
A registry of all the sites within a web estate can be arranged as a set of sub-registers reflecting a university or college's reporting needs. The system reports the total number of sub-registers and the number of websites in each sub-register.
Sample Register and Websites
Registry and sub-register data can be filtered, searched and queried to answer institution-wide or site-specific questions. For example, which content management systems do our sites use? Which sites still need to upgrade to HTTPS? What Facebook or Twitter accounts do our sites use?
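As an illustration of this kind of filtering, the sketch below works over an exported list of site records in Python. The field names ("cms", "scheme", "social_links") and sample values are assumptions for the example, not the registry's actual schema.

```python
# Illustrative filtering of exported registry records; field names and sample
# values below are assumptions, not the product's schema.
sites = [
    {"url": "https://www.example.ac.uk", "cms": "WordPress", "scheme": "https",
     "social_links": ["https://twitter.com/exampleuni"]},
    {"url": "http://physics.example.ac.uk", "cms": "Drupal", "scheme": "http",
     "social_links": []},
]

# Which content management systems do our sites use?
cms_in_use = sorted({site["cms"] for site in sites})

# Which sites still need to upgrade to HTTPS?
needs_https = [site["url"] for site in sites if site["scheme"] != "https"]

# What Facebook or Twitter accounts do our sites use?
social_accounts = sorted({link for site in sites for link in site["social_links"]
                          if "facebook.com" in link or "twitter.com" in link})

print(cms_in_use)
print(needs_https)
print(social_accounts)
```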
Website Summary
Size Up Your Web Estate – Cut Risk, Boost Content
Discovery
Discovery itemizes all the websites in an institution’s web estate.
Our automated audits find the full scope and scale of institutional websites by scanning servers and links to extract every website.
Data Collection
Systematic scanning of the sites identified by the discovery exercise yields data about:
- technologies - security measures implemented, web server configuration and set-up, content management system(s)
- site configuration - cookies, policy and privacy links and page counts
- content - content types in use and metadata
Meticulous data collection delivers site-level data for subsequent analysis and reporting.
Data Consolidation
The populated Web Estate Registry produces:
- a central, single-source-of-truth database of all of an institution's websites, their 'business owners' and critical site-specific data
- the key information to explore, identify and evaluate website enhancement and risk minimization opportunities
And, regular, automated data updates ensure that the Registry delivers current and reliable results.
Discovery

Where to Start?
Higher education institutions often have website lists, or can poll departments and staff for candidate sites. The results can seed a comprehensive automated discovery exercise.
As well as needing somewhere to start, discovery exercises need intelligent boundaries to avoid page or link scanning that will not yield useful data.
Discovery planning needs to answer questions such as the following (a minimal configuration sketch follows the list):
- What IP address ranges are relevant?
- Which domains should be examined?
- Should the exercise apply to public-facing as well as internal websites?
- Do we need to distinguish 'services' sharing a web server from standalone websites?
- Can some 'well-known' domains be ignored as not relevant?
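One way to capture these boundaries is as a simple scope definition that is checked during scanning. The structure, field names and values below are illustrative assumptions, not the service's own configuration format.

```python
# Illustrative discovery scope; all names and values are assumptions,
# not the service's configuration format.
DISCOVERY_SCOPE = {
    "ip_ranges": ["192.0.2.0/24", "198.51.100.0/24"],  # relevant address ranges (example values)
    "include_domains": ["example.ac.uk"],               # domains to be examined
    "include_internal": False,                          # public-facing sites only
    "split_shared_services": True,                      # treat 'services' on a shared server separately
    "exclude_domains": ["facebook.com", "twitter.com", "google.com"],  # well-known, not relevant
}

def in_scope(hostname: str) -> bool:
    """Return True if a discovered hostname falls inside the defined boundaries."""
    def matches(domains):
        return any(hostname == d or hostname.endswith("." + d) for d in domains)
    return not matches(DISCOVERY_SCOPE["exclude_domains"]) and matches(DISCOVERY_SCOPE["include_domains"])

print(in_scope("physics.example.ac.uk"))  # True
print(in_scope("www.facebook.com"))       # False
```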
When to Stop?
Discovery means systematically checking a seed list's URLs and following every on-page link to identify other relevant sites.
In practice, limiting scans to a few thousand pages per site and selectively inspecting server time stamps for recent material can shorten the discovery time.
A scanning exercise delivers a candidate list of URLs for intelligent winnowing into a list ready to load into a Web Estate Registry.
Scanning and site identification is iterative, continuing until no new servers or sites are identified.
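A minimal sketch of that iterative loop, using only the Python standard library, is shown below. The seed URL, scope pattern and page cap are placeholder assumptions, and real discovery would add politeness controls (robots.txt, rate limiting) and more robust link parsing.

```python
# Minimal sketch of the iterative discovery loop: crawl from seeds, collect new
# in-scope hosts, and stop when no new pages remain or the page cap is reached.
import re
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

SEEDS = ["https://www.example.ac.uk/"]             # assumed seed list
IN_SCOPE = re.compile(r"(^|\.)example\.ac\.uk$")   # assumed institutional boundary

def discover(seeds, page_limit=500):
    queue, seen_pages, seen_hosts = list(seeds), set(), set()
    while queue and len(seen_pages) < page_limit:  # practical cap on scanned pages
        url = queue.pop(0)
        if url in seen_pages:
            continue
        seen_pages.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        except Exception:
            continue  # unreachable or non-HTML content; move on
        seen_hosts.add(urlparse(url).hostname)
        for href in re.findall(r'href=["\'](.*?)["\']', html):
            link = urljoin(url, href)
            host = urlparse(link).hostname or ""
            if IN_SCOPE.search(host) and link not in seen_pages:
                queue.append(link)
    return sorted(h for h in seen_hosts if h)      # candidate hosts for winnowing

print(discover(SEEDS))
```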
Data Collection

Data collection gathers information about each website's underlying technology infrastructure, its implementation and relevant page content.
With current, accurate data collected for each website you'll know:
- Protocols/schemes in use (HTTP vs. HTTPS)
- Web server technology reported by the server
- Web content management system (CMS) from the CMS and by independent analysis
And, be able to update and reconfigure systems and servers as appropriate.
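A per-site check along these lines can be sketched with the widely used requests package (assumed installed); the CMS hint below is a crude markup heuristic for illustration, not the registry's own detection or analysis.

```python
# Sketch of a per-site technology check. Assumes `requests` is installed; the
# CMS hint is an illustrative heuristic only.
import requests

def inspect_site(url: str) -> dict:
    response = requests.get(url, timeout=10, allow_redirects=True)
    html = response.text.lower()
    return {
        "final_url": response.url,                            # reveals HTTP -> HTTPS redirects
        "uses_https": response.url.startswith("https://"),    # protocol/scheme in use
        "server": response.headers.get("Server", "unknown"),  # web server technology as reported
        "cms_hint": "wordpress" if "wp-content" in html else
                    "drupal" if "drupal" in html else "unknown",
    }

print(inspect_site("https://www.example.ac.uk/"))  # example URL only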
Up-to-date site configuration and page content information lets you understand:
- Web page metadata – titles, descriptions and other elements
- JavaScript used to provide analytics, advertising and other on-page functions
- Cookies being used
- Whether privacy, accessibility and other policy statements are present
- Accessibility compared against the WCAG 2.0 guidelines
- Total counts of scanned pages for each site (recorded during the survey)
Allowing marketing staff, content editors and developers to identify and respond to user experience and related concerns, as needed.
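The same per-page survey can be sketched with requests and BeautifulSoup (both assumed installed); the policy-link keywords and field names are illustrative.

```python
# Sketch of page-level configuration and content checks. Assumes `requests` and
# `beautifulsoup4` are installed; keywords and field names are illustrative.
import requests
from bs4 import BeautifulSoup

def inspect_page(url: str) -> dict:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    description = soup.find("meta", attrs={"name": "description"})
    link_texts = [a.get_text(" ", strip=True).lower() for a in soup.find_all("a")]
    return {
        "title": soup.title.string.strip() if soup.title and soup.title.string else None,
        "meta_description": description.get("content") if description else None,
        "script_sources": [s.get("src") for s in soup.find_all("script", src=True)],  # analytics, ads, etc.
        "cookies_set": list(response.cookies.keys()),                                 # cookies set on first load
        "has_privacy_link": any("privacy" in text for text in link_texts),
        "has_accessibility_link": any("accessibility" in text for text in link_texts),
    }

print(inspect_page("https://www.example.ac.uk/"))  # example URL only
```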
Data Consolidation

Analysis
Each site's data is automatically updated on a user-defined schedule.
Analysis can be carried out across sites or within groups of sites to identify and resolve systemic issues, to identify common risk exposures and to support web governance initiatives.
Reporting
The web estate registry database can be queried to:
- generate status reports, for example all sites using a specific version of WordPress (sketched below)
- produce risk exposure reports covering financial, legal, regulatory and security risks
- highlight page configuration and content issues affecting digital marketing and communications campaigns.
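As an example of the first kind of query, the sketch below assumes a SQLite copy of the registry with a hypothetical sites table (url, cms and cms_version columns); the table, column and database names are not the product's actual schema.

```python
# Sketch of a status-report query against an assumed SQLite copy of the registry.
import sqlite3

def wordpress_version_report(db_path: str, version: str):
    """List all sites recorded as running a specific WordPress version."""
    with sqlite3.connect(db_path) as conn:
        return conn.execute(
            "SELECT url, cms_version FROM sites WHERE cms = ? AND cms_version = ?",
            ("WordPress", version),
        ).fetchall()

# Example call (database path is illustrative):
# print(wordpress_version_report("web_estate_registry.db", "5.9"))
```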