In part one, I wrote about a few sources to check when your analytics data looks skewed. In this post, I cover a few more, with an eye toward specific traffic influences: traffic from paid search campaigns or across related domains can introduce other instances where data is inflated. Here is a look at the causes of these differences.
Is that spike for real?
Once in a while, you will see a spike in traffic. The first question that comes to mind: is it a real influence or just an aberration?
To determine whether a spike is meaningful, verify how it influences the conversions in your reports. Depending on how your reports are set up, and whether there are multiple increases, examining the influence can help you prioritize which analytics tags should be adjusted.
You also need to know whether the change is a blip or something more substantial and enduring.
Look at the average session time to see if it has increased in a similar fashion. If possible, review conversion rates for the goals you have set. Increases are unlikely to be a one-to-one change.
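The checks above can be sketched as a small script. This is a minimal illustration, not any analytics API: the function names, metric fields, and thresholds are all assumptions you would tune to your own reports.

```javascript
// Sketch: sanity-check a traffic spike against supporting metrics.
// All names and thresholds are illustrative, not from any analytics API.

function pctChange(baseline, current) {
  return (current - baseline) / baseline;
}

// A spike is more believable when engagement moves with it. If sessions
// jump but average session time and conversion rate stay flat, suspect
// a tagging problem or bot traffic rather than real interest.
function spikeLooksReal(baseline, current) {
  const sessionJump = pctChange(baseline.sessions, current.sessions);
  const timeShift = pctChange(baseline.avgSessionTime, current.avgSessionTime);
  const convShift = pctChange(baseline.conversionRate, current.conversionRate);
  // Don't expect a one-to-one change; any supporting movement counts.
  return sessionJump > 0.5 && (timeShift > 0.05 || convShift > 0.05);
}
```

The exact cutoffs matter less than the shape of the check: compare the spike against at least one engagement metric before trusting it.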
Mismatched metrics in AdWords clicks and analytics sessions
It’s an honest assumption that AdWords and analytics visitor data should match exactly when compared. It is also a bit misleading: in many instances they don’t have to match.
An AdWords click represents a single action by your visitor: the visitor saw an ad and clicked the URL in that ad. Clicks are recorded on AdWords servers, so the AdWords manager displays visitor activity related only to the ad campaigns.
Analytics, on the other hand, relates that visitor’s activity to a website or app. It covers what visitors generally do while on a site or app page.
Comparing AdWords and analytics reporting side by side can reveal some differences. Some analytics sessions go unrecorded due to server latency, redirects, users clearing their browser cache, and other processing that occurs when a page loads. Recording a click is straightforward; recording a session depends on the tracking code executing, so latency can leave more clicks than sessions recorded. As a rule of thumb, a difference of up to 10% is considered acceptable.
Missing sessions can also signal poor tag functionality. I covered verifying tags in part one. If you are seeing clicks in AdWords but no associated sessions in analytics, the tag may not be registering page loads accurately.
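As a quick sketch of that diagnostic, the gap between clicks and sessions can be computed and checked against the 10% rule of thumb. The function names and messages here are illustrative, not part of any AdWords or analytics API.

```javascript
// Sketch: flag AdWords click counts that run well ahead of analytics
// sessions. The 10% threshold is the rule of thumb from the text;
// everything else here is an illustrative helper.

function clickSessionGap(clicks, sessions) {
  if (clicks === 0) return 0;
  return (clicks - sessions) / clicks; // fraction of clicks with no session
}

function diagnose(clicks, sessions) {
  const gap = clickSessionGap(clicks, sessions);
  if (sessions === 0 && clicks > 0) {
    return "no sessions recorded: check that the analytics tag fires on the landing page";
  }
  if (gap > 0.10) {
    return "gap above 10%: investigate redirects, latency, or tag placement";
  }
  return "within the rule-of-thumb tolerance";
}
```

The zero-sessions branch is the tag-failure case from part one; a moderate gap is usually just processing loss.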
Are visitors crossing domains…?
Cross-domain traffic – visitor activity across a set of domains related by topic or organization – raises a natural question: how do you track one visitor’s session across related domains, rather than counting that visitor as new on each domain when they really aren’t?
Accounting for that single visit means setting up cross-domain tracking. The setup can happen in a number of ways, depending on the analytics solution central to your measurement. In most cases, it means modifying the analytics script.
The key diagnostic for cross-domain setups is verifying that the script modifications register a test visitor’s movement from one domain to another, and that the visit is not counted as a new visitor on the second domain. In Google Analytics, for example, you add lines of code to the first site’s script that name the second site, then add lines to the second site’s script that name the first. You can also link three, four, or more sites; it just means checking that each one lists the right set of sites in its script.
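For a sense of what those added lines look like, here is a sketch of the analytics.js linker setup for two hypothetical sites, example-a.com and example-b.com, using the documented UA-XXXXX-Y placeholder for the tracking ID. This is a config fragment that runs inside a page that has loaded analytics.js, not a standalone script.

```javascript
// On example-a.com: allowLinker lets this property accept linker
// parameters, and autoLink decorates outbound links to example-b.com
// with the visitor's client ID so the session carries over.
ga('create', 'UA-XXXXX-Y', 'auto', {allowLinker: true});
ga('require', 'linker');
ga('linker:autoLink', ['example-b.com']);
ga('send', 'pageview');

// On example-b.com: the mirror image, naming the first site.
ga('create', 'UA-XXXXX-Y', 'auto', {allowLinker: true});
ga('require', 'linker');
ga('linker:autoLink', ['example-a.com']);
ga('send', 'pageview');
```

Linking more sites just means extending each autoLink array with the other domains in the set, which is exactly the check described above.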
For reports, you can also check whether filters are set correctly. Filters let analysts isolate traffic flow on one specific domain.
…And are visitors coming from different devices?
Given how easily applications sync from a desktop to a tablet or smartphone, chances are the answer is yes.
User ID offers a means to identify individuals who access your site from different devices or in different sessions. Many analytics solutions, from the ever-present Google Analytics to the niche open-source platform Piwik, have a User ID feature. Deploying User ID gives marketers a way to resolve duplicate visits from one person.
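In analytics.js, setting the User ID is a one-line addition to the tracking snippet. This is a config fragment, not a standalone script; the UA-XXXXX-Y placeholder is from Google's documentation, and where the ID value comes from (your sign-in system, for instance) is an assumption about your site.

```javascript
// Sketch: assign your own stable identifier for a signed-in visitor
// so sessions from their phone, tablet, and desktop unify under one
// user. The ID must come from your own system (e.g. after sign-in)
// and should not be personally identifiable information.
ga('create', 'UA-XXXXX-Y', 'auto');
ga('set', 'userId', 'USER_ID'); // replace with your system's identifier
ga('send', 'pageview');
```

The property also has to have the User ID feature enabled in its settings before these hits are unified in reports.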
Finally, capture what has been inspected
No matter how you uncover data errors, troubleshooting them alongside daily business tasks is easier when you journal the issues. A journal provides a central repository for tracking changes and informing teams of which tactics have been tried. You can also use annotations in Google Analytics.
However you journal, the key benefit is eliminating options that have already been tried and maintaining focus on what can be corrected.