Analytics reporting
Analytics can provide evidence for questions like:
- are users looking for content on this topic? Is there a user need?
- is content meeting user needs?
- how popular or used is the content?
- what's the user journey?
- how should the website be structured?
- are users following links and other calls to action?
Think carefully about which metrics provide the best insights into these questions.
Be cautious when relying on analytics reporting. Use analytics data alongside qualitative data gathered through user research or feedback.
What to report
Useful metrics to report for public sector sites are:
- total users
- number of sessions
- page views
- number of sessions by session default channel group (how users arrived at the site)
- events on the page – feedback submissions, external link clicks and file downloads
Which metrics you report depends on what you’re trying to understand about user behaviour (your research objective).
Do not rely on a single metric, such as the number of page views, to make content decisions. Consider several metrics and context when designing and assessing content.
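If you use Google Analytics 4, one way to pull these metrics is through the Data API. This is a minimal sketch using the official Python client (google-analytics-data); the property ID and date range are placeholders to replace with your own.

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange,
    Dimension,
    Metric,
    RunReportRequest,
)

client = BetaAnalyticsDataClient()  # uses your Google Cloud credentials

request = RunReportRequest(
    property="properties/123456789",  # placeholder GA4 property ID
    date_ranges=[DateRange(start_date="28daysAgo", end_date="yesterday")],
    # Break the figures down by how users arrived at the site
    dimensions=[Dimension(name="sessionDefaultChannelGroup")],
    metrics=[
        Metric(name="totalUsers"),
        Metric(name="sessions"),
        Metric(name="screenPageViews"),  # 'views' in the GA4 interface
    ],
)

for row in client.run_report(request).rows:
    channel = row.dimension_values[0].value
    users, sessions, views = (m.value for m in row.metric_values)
    print(f"{channel}: {users} users, {sessions} sessions, {views} page views")
```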
For example, to evaluate the user journey through a form, you could report:
- traffic to the start page
- clicks on the start button
- the number of users who start the form
- the number who get to the final page
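Once you have those counts, a simple calculation shows where users drop out of the journey. The figures below are made up for illustration.

```python
# Hypothetical counts for each step of a form journey
funnel = [
    ("Start page views", 4200),
    ("Start button clicks", 2900),
    ("Users who started the form", 2750),
    ("Users who reached the final page", 1600),
]

previous = None
for step, count in funnel:
    if previous is None:
        print(f"{step}: {count}")
    else:
        # Share of users retained from the previous step
        print(f"{step}: {count} ({count / previous:.0%} of previous step)")
    previous = count
```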
Low page views alone are not a reason to archive content about a government service that’s only relevant to a small population of users.
Data on external link clicks and downloads is only retained for 13 months, so you can only report on the 13 months before the reporting date.
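As a quick check, you can work out the earliest date you can report on. This sketch assumes the python-dateutil package is installed.

```python
from datetime import date

from dateutil.relativedelta import relativedelta  # pip install python-dateutil

# Link click and download data is retained for 13 months
earliest = date.today() - relativedelta(months=13)
print(f"Earliest reportable date for event data: {earliest.isoformat()}")
```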
If you’ve got the same content in PDFs and HTML
If both PDFs and multi-page HTML have the same content, you should report ‘total users’. This shows how many users viewed the HTML pages or downloaded the PDF.
You’ll get a falsely inflated number of users if you report either:
- ‘sessions’ – because users may view the HTML multiple times
- clicks on the PDF download added to HTML page views – because the same user might view the content in both formats
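As a sketch, this asks Google Analytics 4 for ‘total users’ once across all pages of a guide, instead of adding PDF download clicks to page views, so each user is counted once. The property ID and page path are placeholders.

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange,
    Filter,
    FilterExpression,
    Metric,
    RunReportRequest,
)

client = BetaAnalyticsDataClient()

request = RunReportRequest(
    property="properties/123456789",  # placeholder GA4 property ID
    date_ranges=[DateRange(start_date="90daysAgo", end_date="yesterday")],
    metrics=[Metric(name="totalUsers")],  # counts each user once
    # Limit the report to the guide's pages (hypothetical path)
    dimension_filter=FilterExpression(
        filter=Filter(
            field_name="pagePath",
            string_filter=Filter.StringFilter(
                match_type=Filter.StringFilter.MatchType.BEGINS_WITH,
                value="/my-guide/",
            ),
        )
    ),
)

response = client.run_report(request)
if response.rows:
    print("Total users:", response.rows[0].metric_values[0].value)
```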
Using caution with analytics data
When relying on analytics to make decisions about website content, remember that:
- the data does not represent all users
- there can be spam traffic and data gaps
- sometimes numbers are rounded down to the nearest 100 and figures below 100 are grouped
You can interpret numbers in analytics tools as the lowest possible amount. The real number is likely greater.
When reading analytics data, also consider:
- the metrics being shown
- the date range
- filters applied to the data
Use data to look at trends, patterns and comparisons over time rather than focusing on exact numbers.
Why the data does not represent all users
Analytics data usually represents a subset of users, not all of them, because:
- the analytics tool samples the data if the volume is too large
- users can view content without being tracked
- users can opt out of analytics tracking, disable JavaScript or block analytics tools
Users can view content without being tracked by:
- viewing Google search result snippets
- following an external link directly to a PDF
You can find out the percentage of users who opt in to analytics (the ‘site-average opt in rate’). To do this, compare Google Analytics numbers with actual user numbers from server logs or an analytics tool that does not use cookies.
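As an illustration with made-up numbers, the opt in rate is the Google Analytics user count divided by the actual user count. You can then use it to roughly estimate real totals from Google Analytics figures.

```python
# Made-up figures for illustration
ga_users = 6180        # users reported by Google Analytics
actual_users = 14500   # users from server logs or a cookieless tool

opt_in_rate = ga_users / actual_users
print(f"Site-average opt in rate: {opt_in_rate:.0%}")

# A Google Analytics figure is a minimum; scale it up for a rough estimate
ga_page_views = 3200
print(f"Estimated total page views: {ga_page_views / opt_in_rate:.0f}")
```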
Be cautious about relying on analytics data if only a small percentage of users are represented. User research can provide more evidence for content decisions.
Spam traffic and data gaps
Some spam is likely even with efforts to minimise it.
Data gaps may be caused by:
- site downtime
- bugs
- changes to Design System components and patterns
- new components and patterns
Feedback, help and support
If you need help or support, you can email us at designsystem@gov.scot