This is our third article in the series Emerse on Quality, where we discuss topics in quality control of programmatic advertising campaigns. In our first article we discussed overly fast ad reload times, and in our second article we discussed ad stacking, both important quality problems to manage in programmatic advertising.
Before we move on to the topics of this article, we would like to mention that Emerse provides fully managed, quality-controlled services for delivery, analytics and optimization of programmatic advertising for brands, advertisers and agencies. Our tools and processes for quality control go beyond the ordinary. On a daily basis, we help brands deliver ad campaigns with more impact, higher quality and at lower cost. Contact our sales team today to get started working with us.
Intro: What is click discrepancy?
Click discrepancies between Google Analytics sessions and DSP (Demand-Side Platform) clicks occur when the number of clicks reported by a DSP doesn't match the number of sessions recorded in Google Analytics. Several factors contribute to these discrepancies:
Bot Traffic:
DSP clicks might include non-human (bot) traffic, which inflates click numbers. Google Analytics applies filters to exclude some bot traffic, reducing session counts. This creates a gap between what the DSP reports as clicks and what GA reports as sessions. We find this to be a very common issue in programmatic campaigns and will discuss it in more detail below.
Tracking Differences (Sessions vs. Clicks):
Google Analytics Sessions: A session is a group of interactions on a website within a specific time frame. A session is initiated when a user lands on a site and typically ends after 30 minutes of inactivity. If a user clicks an ad multiple times or revisits within that time, only one session may be counted.
DSP Clicks: Every click on an ad is recorded by the DSP, even if the user doesn't complete the landing process, encounters errors, or navigates away quickly.
Discrepancy Example: One user may click an ad multiple times but trigger only a single session in GA.
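To make the sessions-vs-clicks distinction concrete, here is a minimal Python sketch (with hypothetical timestamps) that groups one user's DSP-reported clicks into GA-style sessions using the 30-minute inactivity window:

```python
from datetime import datetime, timedelta

# Hypothetical click log for a single user: timestamps as reported by the DSP.
clicks = [
    datetime(2024, 5, 1, 10, 0),
    datetime(2024, 5, 1, 10, 5),   # second click 5 minutes later
    datetime(2024, 5, 1, 10, 20),  # third click, still within the same session
    datetime(2024, 5, 1, 12, 0),   # new session: more than 30 min of inactivity
]

SESSION_TIMEOUT = timedelta(minutes=30)

def count_sessions(click_times):
    """Group clicks into GA-style sessions using a 30-minute inactivity window."""
    sessions = 0
    last = None
    for t in sorted(click_times):
        if last is None or t - last > SESSION_TIMEOUT:
            sessions += 1
        last = t
    return sessions

print("DSP clicks:", len(clicks))              # 4
print("GA sessions:", count_sessions(clicks))  # 2
```

Four DSP clicks collapse into two GA sessions here, which on its own already produces a 50% "discrepancy" without any bots involved.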
Tracking Issues:
Some users may block tracking scripts or have disabled JavaScript, preventing Google Analytics from recording their session. DSPs, however, record the click because it happens on the ad server side, not on the user's browser.
As an example, here are some statistics on browsers that run some kind of blocking app that prevents Google Analytics from loading (many of these apps also block ads, so don't assume these users see or click your ads either, though some ads might pass the filter):
In total, 700 million or more browsers have one of these apps blocking GA from tracking their visits to your website.
Redirects and Page Load Failures:
DSP clicks are counted when the user clicks the ad. However, if the landing page fails to load properly (slow connections, server errors, or the user closing the page before it loads), Google Analytics may not track the session. This results in clicks being reported without corresponding GA sessions.
UTM Tagging and URL Mismatches:
If the landing page URL in the ad is incorrect or lacks proper UTM parameters for Google Analytics tracking, the session may not be attributed correctly. DSP clicks will still be counted, but GA will fail to register the session, leading to a discrepancy.
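As a sanity check on tagging, a small sketch like this (with hypothetical campaign values; the parameter names are the standard Google Analytics UTM fields) can build and verify the UTM-tagged landing URL before it goes into the DSP:

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical campaign values for illustration.
utm = {
    "utm_source": "dsp_example",
    "utm_medium": "display",
    "utm_campaign": "spring_launch",
}

landing = "https://www.example.com/landing?" + urlencode(utm)

# Quick sanity check that the tags survive in the final click-through URL.
params = parse_qs(urlparse(landing).query)
assert params["utm_source"] == ["dsp_example"]
print(landing)
```

Verifying the final click-through URL (after any DSP macros and redirects) against the expected parameters catches the mismatches that silently drop sessions from GA reports.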
Session Timeouts:
Google Analytics considers a session inactive after 30 minutes without activity. If a user clicks an ad but waits too long before interacting with the site, Google Analytics may not register a new session, even if the DSP reports multiple clicks.
Tracking discrepancies using a chart
For any programmatic campaign it makes sense to track discrepancies: both to be aware of current levels and to reduce them over time, keeping track of how different DSP and GA configuration changes affect the discrepancy, and so forth. At Emerse, creating charts like this and keeping them populated with data is part of our job when managing programmatic strategies for customers.
Before starting to tackle discrepancies, what we like to do at Emerse is to create a control chart. It plots the measurement points across a timeline together with the mean and a few upper and lower standard-deviation levels (typically 2 and 3 standard deviations). If you'd like our help setting up and running a chart like this, just contact us.
Here's an example chart based on mock-up data (not a real campaign) that shows how we track and visualize discrepancies at Emerse:
The control chart in this example includes weekly measurement data of discrepancies between Google Analytics sessions and DSP clicks.
Here's a deeper look at the chart itself:
We can see in the sample data that there is a clear issue with discrepancies: during some weeks they are very high. This indicates a problem that we need to do something about.
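The mean and standard-deviation levels behind a control chart like this are straightforward to compute. Here is a minimal Python sketch using mock weekly discrepancy values (illustrative numbers, not real campaign data):

```python
from statistics import mean, stdev

# Mock weekly discrepancy values (%) between GA sessions and DSP clicks.
weekly_discrepancy = [12.0, 13.0, 12.5, 35.0, 11.8, 12.9, 40.0, 13.5, 12.2, 13.1]

m = mean(weekly_discrepancy)
s = stdev(weekly_discrepancy)  # sample standard deviation

# Control limits at 2 and 3 standard deviations (floored at 0%).
limits = {
    "mean": m,
    "upper_2sd": m + 2 * s, "lower_2sd": max(m - 2 * s, 0.0),
    "upper_3sd": m + 3 * s, "lower_3sd": max(m - 3 * s, 0.0),
}

# Weeks outside the 2-standard-deviation band are worth investigating.
outliers = [v for v in weekly_discrepancy if abs(v - m) > 2 * s]
print(limits)
print("weeks beyond 2 standard deviations:", outliers)
```

Note that large spikes inflate the standard deviation itself, so with only a handful of data points a single bad week can hide a second one; the limits become more meaningful as more weeks of data accumulate.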
We have built a tool specifically for creating control charts that connects to Google Analytics and your DSP to produce automatically updated charts. It's called AdQMS and you can find it by clicking the logo below. Feel free to sign up to start tracking your own charts today:
Analysing the cause of discrepancies
As listed above, there can be many reasons behind discrepancies. It's important to rule out that configuration settings are causing issues, such as looking for clicks under a certain UTM tag that is not actually being used properly in the DSP, or the GA tag not firing on the page the traffic is directed to.
Once we have established that the configuration looks OK, it is time to look into the traffic itself.
Traffic analysis
To help the client improve their campaign, we first take a look at impression level data to see what level of bot traffic is visible when the ad tag is firing in the DSP. This is only sampled data and not the entire data set of the campaign:
From the impression-level ad-tag data we can see that there is a share of bot-generated impressions in the campaign (about 5%). This is interesting in itself, but it does not explain the whole discrepancy (which is higher). As shown above, there are many natural reasons why even a quality-controlled campaign will have some level of bot impressions (such as scraping bots frequently visiting major news sites to scrape their content).
The biggest news sites usually have good content, which makes them popular targets for scraping. Other sites, services and tech firms send bots to the large news sites to read and download their content, save it and do something with it: some are news aggregators, some are AI services reading news to learn, some use AI to rewrite the articles as their own, and so on. An important point here: even if you buy ads directly from the largest publishers, you will still get this bot traffic on your campaigns, because bots download ads as well, not just articles. So whether you buy programmatically or directly from big sites, the bot traffic is there regardless.
Next we dig into the actual click traffic from the ad-tag in the DSP to see what amount of clicks (not impressions) in the ad-tag originate from bots:
We see some interesting data here: about 8% of the clicks on the ad-tag are from bots. Again, there is very little publishers can do to prevent bot traffic, but in some cases the share is larger on certain publishers, and that data can be worth investigating further. Bot clicks can of course cause reporting errors and discrepancies in the GA/DSP ratios.
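One rough way to estimate the bot share in click logs is user-agent filtering. Below is a minimal sketch with hypothetical log rows and a deliberately non-exhaustive bot pattern; real classification would combine more signals than the user agent alone:

```python
import re

# Hypothetical click-log rows as they might come from an ad-tag export.
clicks = [
    {"publisher": "news-site-a", "ua": "Mozilla/5.0 (Windows NT 10.0) Chrome/124.0"},
    {"publisher": "news-site-a", "ua": "Mozilla/5.0 (compatible; Googlebot/2.1)"},
    {"publisher": "app-b", "ua": "python-requests/2.31.0"},
    {"publisher": "news-site-c", "ua": "Mozilla/5.0 (iPhone; CPU iPhone OS 17_0) Safari/604.1"},
]

# Common bot markers in user-agent strings; not an exhaustive list.
BOT_PATTERN = re.compile(r"bot|crawler|spider|scraper|python-requests|curl", re.IGNORECASE)

def is_bot(user_agent):
    """Very rough heuristic: flag clicks whose user agent matches known bot markers."""
    return bool(BOT_PATTERN.search(user_agent))

bot_clicks = [c for c in clicks if is_bot(c["ua"])]
share = len(bot_clicks) / len(clicks) * 100
print(f"bot click share: {share:.1f}%")
```

Declared bots like Googlebot identify themselves honestly; malicious bots often spoof normal browser user agents, which is why a heuristic like this gives a lower bound rather than a full picture.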
Next steps
Once you have (with our help if you like) identified the cause of discrepancies, the next step is to work to reduce them. Here it is clear that we need to identify which traffic sources are driving the bot clicks and find ways to block them from the campaigns. If clicks like these feed into CTR/CPC or (worse) CPA optimization, they will mislead the optimization algorithms, causing them to drive more and more traffic from the wrong places.
We are able to identify exactly which publishers, sites and apps are driving the bot clicks. This will help you remove them from your campaigns.
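As a sketch of how this can be done from log-level data (hypothetical records; in practice these would be joined from DSP logs and a bot-classification step like the one above), aggregating bot clicks per publisher highlights blocklist candidates:

```python
from collections import Counter

# Hypothetical per-click records: (publisher, classified_as_bot).
click_log = [
    ("news-site-a", False), ("news-site-a", True),
    ("app-b", True), ("app-b", True), ("app-b", False),
    ("news-site-c", False),
]

bot_by_publisher = Counter(pub for pub, bot in click_log if bot)
total_by_publisher = Counter(pub for pub, _ in click_log)

# Flag publishers where bots account for more than half of recorded clicks:
# these are candidates for a campaign blocklist.
blocklist = [
    pub for pub in total_by_publisher
    if bot_by_publisher[pub] / total_by_publisher[pub] > 0.5
]
print(blocklist)
```

The 50% threshold here is an arbitrary illustration; in practice the cutoff should account for the baseline bot level that, as noted above, every publisher carries through no fault of their own.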
It's important to note that some level of bot traffic will occur on any site or app. For example, reputable high-quality news sites are often scraped by bots that feed the content into AI systems and convert it into content on other sites (typically made-for-advertising sites). The publisher being scraped has nothing to do with this, so that level of bot traffic is hard to avoid. But there are also publishers that buy traffic from bot farms or ad networks that send large amounts of bot traffic, generating both impressions and clicks. These bots can manipulate DSP algorithms by clicking ads, fooling the algorithms into allocating more budget to them because they appear to have a high CTR.
Emerse delivers quality controlled programmatic advertising
Our services to deliver quality-controlled campaigns and programmatic advertising for customers mean that we take care of quality-assurance techniques such as the ones in this article for you.
If you are interested in letting Emerse manage your programmatic advertising with our quality and cost control processes as well as performance optimization, please contact our sales team today to discuss more.