We decided to post this article, written by Jaysen Juplessis, Head of The Performance Network, because we found clients kept asking us a common question:
Why do adserver and web analytics stats differ so much?
As long as there are different measuring systems there will always be different figures. The key, however, is understanding why the differences exist and at what point they become unacceptable. The reasons and typical discrepancies below should give you a benchmark to compare against, and perhaps a basis for some business rules: you accept a stated margin of difference, and anything over it gets reviewed.
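A business rule like that is easy to automate. Below is a minimal sketch, assuming you export matching totals (e.g. daily clicks) from both systems; the 10% tolerance is purely illustrative, so substitute whatever margin your own benchmarking supports.

```python
def discrepancy_pct(adserver_count, analytics_count):
    """Percentage difference, measured relative to the ad server's figure."""
    return abs(adserver_count - analytics_count) / adserver_count * 100

def needs_review(adserver_count, analytics_count, tolerance_pct=10.0):
    """Flag the period for review if the gap exceeds the agreed margin."""
    return discrepancy_pct(adserver_count, analytics_count) > tolerance_pct

print(needs_review(1000, 920))  # 8% gap: within tolerance
print(needs_review(1000, 850))  # 15% gap: flag for review
```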
Adserver vs Adserver
If you are a media buyer you are well aware that a website's or network's impressions and clicks are higher than your ad server's. The difference normally ranges from 10% to 20%, and the more ad servers there are in the chain, the higher the discrepancy. For instance, if the advertiser serves their banners through DoubleClick and books the campaign on a network, the network runs on another ad server like OpenX, and the network then puts the banners on websites that use either their own systems, DFP or Atlas, you end up very far removed from the original impressions. In that example you could be as much as 30% out.
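The compounding effect is worth spelling out: each hop in the serving chain loses a slice of the previous hop's count, so the overall gap grows multiplicatively. The per-hop loss rates below are made up for the example; real figures vary by setup.

```python
def remaining_after_chain(start_impressions, per_hop_loss_rates):
    """Apply each hop's loss in turn; return what the last server counts."""
    count = start_impressions
    for loss in per_hop_loss_rates:
        count *= (1 - loss)
    return count

# Publisher counts 100,000; each of three downstream servers drops ~10%.
final = remaining_after_chain(100_000, [0.10, 0.10, 0.10])
overall_gap = (100_000 - final) / 100_000 * 100
print(round(final))        # 72900 impressions reach the last count
print(round(overall_gap))  # 27% overall discrepancy
```

Three hops at 10% each gives roughly the 30% worst case described above, even though no single system looks badly out of line.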
Below are some of the reasons those discrepancies exist:
• Impression definitions: Publishers count the ad requested and advertisers count the ad displayed.
• Large creatives have long load times resulting in differences in impression counts.
• Latency: Any lag in the connection between the ad request and the displaying of the ad can create differences in counts; the user may navigate away before seeing the ad or page.
• Network connection and server reliability: An ad server may fail briefly, not receive a connection, or encounter an issue while logging a request, resulting in different counts.
• Ad blockers: Publishers issue an ad request, but the ad is prevented from being displayed by an ad blocker.
• Caching: A creative may be cached in the browser or on a proxy server; no ad request is seen by the advertiser server, which results in impression count differences.
• Trafficking errors: An ad tag may be implemented incorrectly so that one ad server is able to see the impressions and clicks while another server doesn’t (or only receives a subset of the statistics).
• Frequency capping: An advertiser’s frequency cap could prevent an ad request from being filled, which may cause different impression counts.
• Timing differences: Ad servers may operate on different time intervals or time zones, which results in temporal differences.
• Spam filtering: Ad servers may filter out spam impressions and clicks, impressions from robots and spiders, back-to-back clicks, and other activities. These filtering technologies are implemented in different ways; some servers may be more or less aggressive in their filtering, which results in spam and click count differences.
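The first point above, requests versus displays, drives much of the gap on its own. Here is a toy simulation, not a real ad server: it only illustrates how a publisher counting requests and an advertiser counting rendered ads arrive at different totals when some ads never finish loading. The 8% abandonment rate is an assumption chosen for the example.

```python
import random

random.seed(42)  # fixed seed so the example is repeatable

publisher_requests = 0
advertiser_renders = 0

for _ in range(10_000):
    publisher_requests += 1               # publisher logs the ad request
    abandoned = random.random() < 0.08    # ~8% navigate away, are blocked,
                                          # served from cache, time out, etc.
    if not abandoned:
        advertiser_renders += 1           # advertiser logs the displayed ad

gap = (publisher_requests - advertiser_renders) / publisher_requests * 100
print(publisher_requests, advertiser_renders, f"{gap:.1f}%")
```

Every bullet above is, in effect, another reason `abandoned` can be true, which is why the listed causes stack rather than substitute for one another.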
Google Analytics vs Adserver
As mentioned above, different systems measure impressions and clicks differently. There is almost always a discrepancy between ad serving and web analytics because they are two different platforms that measure data in different ways. In my experience, the web analytics side usually reports lower numbers than the ad server. Discrepancies are okay if they are consistently inconsistent. It helps to know how an ad server "defines" a set of data vs. your web analytics platform. For example, a "visit" may be defined differently by DART (clicks) vs. Omniture (visits to an actual web page). In this example, it would be possible for a user to click on an ad a few times and repeatedly visit your site, but the web analytics platform could count this as one visit.
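The clicks-versus-visits example can be made concrete. A simplified sketch, assuming the analytics platform groups a user's hits inside a 30-minute inactivity window into a single visit (a common analytics default, but an assumption here), while the ad server counts every click:

```python
from datetime import datetime, timedelta

SESSION_WINDOW = timedelta(minutes=30)

def count_visits(click_times):
    """A new visit starts only after 30 minutes of inactivity."""
    visits = 0
    last = None
    for t in sorted(click_times):
        if last is None or t - last > SESSION_WINDOW:
            visits += 1
        last = t
    return visits

# One user clicks the same ad three times in ten minutes:
clicks = [
    datetime(2010, 5, 1, 12, 0),
    datetime(2010, 5, 1, 12, 4),
    datetime(2010, 5, 1, 12, 9),
]
print(len(clicks), "ad server clicks ->", count_visits(clicks), "analytics visit(s)")
```

Three clicks, one visit: both systems are "right" by their own definitions, which is why the definitions matter more than the raw gap.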
Here are some of the reasons for these discrepancies:
• Ad Servers report clicks that result in a redirect to a web page. There is no guarantee that the visitor makes it to the webpage or isn’t further redirected.
• The statistics are affected by a user who closes a browser after clicking an ad, hijacking (toolbars that redirect traffic), bots, and in some cases an ad server that times out. Ad servers accurately measure ad displays and clicks. They are not so accurate at telling you how many people visited a website.
• A log analyser reports on pages served by a web server; it doesn't see pages served from caching proxies used by ISPs, nor pages served from a browser's cache. Log analysers accurately report server activity and nothing else.
• JavaScript-based metrics (like Google Analytics): These report accurately only if the end user has JavaScript enabled and no software that blocks your tracker (7-15% of computers do, depending on whose metrics you use). JavaScript-based metrics tell you within 7-15% which pages have been viewed.
• Coding errors on the website: checking web analytics tags is laborious, which means it is easy to miss something. You can use this tool to check tags: http://wasp.immeria.net/
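Since tag-based analytics under-counts by the blocked share, you can gross a reported figure back up to a plausible range for actual page views. A rough back-of-envelope sketch using the 7-15% band mentioned above (which will vary by audience):

```python
def estimated_true_range(reported_views, low_block=0.07, high_block=0.15):
    """Gross up tag-reported views for untracked (JS-blocked) traffic."""
    low_estimate = reported_views / (1 - low_block)
    high_estimate = reported_views / (1 - high_block)
    return round(low_estimate), round(high_estimate)

print(estimated_true_range(10_000))  # roughly (10753, 11765)
```

So 10,000 reported page views could plausibly mean 10,750-11,750 actual ones, a range worth keeping in mind when reconciling against an ad server's figures.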
Get in touch with our team for more advice on display media and best practice.
New Zealand: firstname.lastname@example.org
The Performance Network (TPN) is a premium performance-based advertising network offering key benefits to many of Australia's and New Zealand's leading advertisers and publishers. TPN offers a wide range of CPC (Cost per Click) and CPA (Cost per Acquisition) opportunities across display, email, links and co-registration media placements. Advertiser and Publisher opportunities with TPN are available exclusively from 3dinteractive.