To an analyst, there’s not a lot worse than having your boss question the validity of your work. If you’re making business decisions or providing recommendations based on data, then you obviously need it to be as accurate as possible. The problem is there are many different ways your data accuracy can go awry.
Here are a few common ones you may not be aware of:
- position of code on page – yes, this still impacts web analytics
- speed of connection / speed of click away – if the page hasn’t fully loaded and a visitor clicks away, they’re not tracked
- changes to browser technologies – browsers change all the time and they don’t care about your measurement
- cookies – some browsers deny them by default, some allow them, but cookies remain the mainstay of visitor identification
- tagging errors – mistagging something through the chain of command is common; what you describe to your dev team is not always what you get back
- missing tags – new feature sets across the site and new pages added with no tags are still very common problems
Because of this, web analytics is generally used as a directional tool – I don’t think you’d find anyone who will swear blind it’s 100% accurate. Out of the box, a standard implementation can normally come within 5-7% of internal systems when capturing things like revenue and orders. However, with a bit of extra work, it can be within 1% of internal systems – and that’s what we should all strive for.
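To make those percentages concrete, here’s a trivial sketch of how you’d quantify the gap between analytics and your system of record (the function name and figures are illustrative, not from any particular tool):

```python
def discrepancy_pct(analytics_value: float, internal_value: float) -> float:
    """Percentage gap between what analytics captured and the internal
    system of record (the internal figure is the denominator)."""
    if internal_value == 0:
        raise ValueError("internal value must be non-zero")
    return abs(analytics_value - internal_value) / internal_value * 100
```

For example, analytics reporting £94,500 of revenue against £100,000 internally is a 5.5% gap – inside the typical out-of-the-box range, but well short of the 1% target.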
Getting there can be a bit of a process. Staying there can be even harder.
So how can you ensure that you remain as accurate as possible over the long haul, sleep well at night, and confidently respond to the accuracy question? Well, it’s now a little easier thanks to tag auditing services.
A couple of years ago we started using a US-based service called ObservePoint to provide quick and cost-effective large-scale tag validation across websites. We were so impressed with the service that we’ve included it in our standard implementation and validation process.
ObservePoint crawls across the target website and captures all of the data that is sent into the analytics (and other) platforms. It then generates a set of reports that show whether pages are tagged, what’s been tagged, how many errors there are, what the errors are, etc. They now support 110 tags across Analytics, Advertising, Ad Measurement, Data Management, Behavioural Targeting, Split Testing, Video, Tag Management, eCommerce, Voice of Consumer, Lead Tracking and Social Media.
But don’t worry, they won’t inflate your numbers either: they intercept the vendor call and stop it from being made.
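In spirit, the detection side of a tag audit can be sketched like this – though note the signatures below are purely illustrative, and a real auditing service inspects the actual network calls a page fires rather than just its source:

```python
import re

# Illustrative signatures for two common analytics libraries. A real
# auditing service watches outbound network requests instead of
# pattern-matching the page source.
TAG_SIGNATURES = {
    "Adobe Analytics": re.compile(r"AppMeasurement|s_code\.js"),
    "Google Analytics": re.compile(r"google-analytics\.com|gtag\("),
}

def detect_tags(html: str) -> dict:
    """Report which known tags appear in a page's HTML source."""
    return {name: bool(sig.search(html)) for name, sig in TAG_SIGNATURES.items()}
```

Run across every page of a crawl, results like these roll up naturally into the tagged/untagged and error reports described above.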
They can also get “beyond the firewall”, checking tags on password-protected sites, dev environments, intranets and other non-public sites – perfect for that pre-launch comfort.
You can also get a full data extract into Excel showing the page URL and all of the props, eVars, events and campaign codes, plus a whole heap of other data – useful for quickly reviewing everything, particularly with the Excel Filter feature.
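The shape of that extract is easy to reproduce for your own spot checks; a minimal sketch, assuming one audit record per page (the column names here are illustrative):

```python
import csv
import io

def audit_to_csv(pages: list) -> str:
    """Flatten per-page audit records (dicts) into CSV, one row per URL,
    ready for filtering in Excel. Missing variables become empty cells."""
    columns = ["url", "prop1", "eVar1", "events", "campaign"]  # illustrative variable set
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns)
    writer.writeheader()
    for page in pages:
        writer.writerow({col: page.get(col, "") for col in columns})
    return buf.getvalue()
```

Filtering on an empty column in Excel then surfaces every page missing that variable in seconds.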
But it doesn’t stop there – you can automate a monthly scheduled audit, and you can get SMS or email alerts when tag values change or simulations fail.
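The change-alert idea boils down to diffing two audit snapshots; a minimal sketch, assuming each snapshot maps URLs to a captured tag value (the structure is illustrative, not ObservePoint’s actual format):

```python
def changed_pages(baseline: dict, current: dict) -> dict:
    """Compare two audit snapshots ({url: tag value}) and return the
    pages whose values differ -- the candidates for an alert. New or
    vanished pages show up as a None on one side."""
    return {
        url: (baseline.get(url), current.get(url))
        for url in set(baseline) | set(current)
        if baseline.get(url) != current.get(url)
    }
```

Anything this returns from the monthly run is exactly what you’d want pushed out by SMS or email.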
This all leads back to governance. Having your tags working correctly, and ensuring that development teams implement them correctly and consistently, all helps toward a stable and reliable data set. If you’re struggling with governance internally, you can use auditing to measure accuracy over time and, from that, calculate the impact on the business.
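Translating an audited tracking rate into a business figure is simple arithmetic; a sketch, with illustrative numbers:

```python
def untracked_revenue(tracked_revenue: float, tracking_rate_pct: float) -> float:
    """Estimate revenue invisible to analytics, given the tracking rate
    an audit measured (e.g. 95.0 means 95% of order pages carried tags)."""
    if not 0 < tracking_rate_pct <= 100:
        raise ValueError("tracking rate must be in (0, 100]")
    total = tracked_revenue / (tracking_rate_pct / 100)
    return total - tracked_revenue
```

If analytics sees £95,000 at a 95% tracking rate, roughly £5,000 of revenue is going unmeasured – the kind of figure that gets stakeholder attention.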
With information like that, you should be able to get stakeholder support for governance – especially if your boss consistently asks about validity.
Companies like ObservePoint allow us to sleep well at night, safe in the knowledge that someone is keeping a watchful eye on our tags and, in turn, helping us change our answer from “I think” to “I know”.