There are many parameters that can skew your data analysis and prevent you from comparing data between AB Tasty and your web analytics tool.
- The data that you compare must cover the same scope of analysis
- Your web analytics tool’s configuration must match the configuration of your tests in AB Tasty
- The technical and network infrastructure (probes, robots, and crawlers) needs to be taken into account because it can impact data collection and create counting differences.
Calculations
It is important to understand that AB Tasty is not a web analytics tool (like Google Analytics or AT Internet): its job is to help you decide between several versions of your page by comparing their performance, based on indicators that AB Tasty calculates the same way for every variation. The calculation method is specific to the AB Tasty solution.
Web analytics tools each have their own specificities and calculation methods. It therefore doesn't make sense to compare, for example, the unique visitor data provided by Google Analytics with that of AT Internet.
AB Tasty measures unique visitors and unique conversions. Before attempting to compare indicators provided by our reporting and that of your analytics tool, you must make sure that what you are comparing is comparable.
A common mistake is to try to reconcile our “visitor” indicator with Google Analytics’ default “visits” indicator. Doing so is a first step toward inconsistent data. You must compare AB Tasty figures with the “unique visitors” numbers in Google Analytics.
This calculation method also affects the measured goals, because AB Tasty deduplicates conversions.
If an internet user converts repeatedly on your site (for example, by purchasing twice during the analysis period), AB Tasty will only record a single conversion.
For example, to compare a button click goal in AB Tasty, you must implement and monitor the “unique events” metric in Google Analytics.
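As an illustration, here is a minimal Python sketch of this deduplication principle; the visitor IDs and events are hypothetical, not AB Tasty's actual implementation:

```python
# Minimal sketch of unique-conversion counting: each visitor is counted
# at most once per goal, no matter how often they convert.
# The visitor IDs and events below are hypothetical.
conversion_events = [
    {"visitor_id": "v1", "goal": "purchase"},
    {"visitor_id": "v1", "goal": "purchase"},  # second purchase by the same visitor
    {"visitor_id": "v2", "goal": "purchase"},
]

unique_conversions = len(
    {e["visitor_id"] for e in conversion_events if e["goal"] == "purchase"}
)
print(unique_conversions)  # 2, not 3
```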
Configuration
Before running A/B testing campaigns, we recommend that you perform a calibration test, called an A/A test.
This will help you confirm:
- The random distribution of traffic according to the selected traffic modulation,
- The reporting of URL-based and click-based goals,
- The consistency of the data with your web analytics tool.
In certain cases, differences may appear between your results in AB Tasty and those in your analytics tool. Differences on the order of 10% are acceptable; beyond that, check that your scopes of analysis are the same and that you are comparing similar items.
You should ask yourself these questions:
- Do you know all of the exact settings of your web analytics tool?
- Does your tagging plan contain specific characteristics that affect the measurement of the indicators?
- Have you excluded certain IP addresses in your web analytics tool that are not excluded in AB Tasty?
- Is your conversion tunnel properly configured on the web analytics tool side?
- Does the URL of your target in the conversion tunnel take all possible cases into account? You must ensure that the definition of the conversion target in AB Tasty is identical to that of your web analytics tool.
- Have you enabled cross domain tracking on the analytics side?
Comparison of data from different scopes
You may often find discrepancies when comparing data between AB Tasty and a web analytics tool, because the scopes of these tools are sometimes quite different.
Here are some of the most common reasons for these discrepancies:
Different tagging plans: the AB Tasty tag and your analytics tool’s tag are not deployed in exactly the same way on your site. Depending on the age of your site, the work that has been done on it, and its various versions, the complexity of the tagging plan can vary greatly.
The absence of the AB Tasty tag on certain pages will mechanically affect the recorded data.
Scope of analysis changed by targeting options:
Your targeting options can reduce the scope of measurement.
For example:
- You target a test on the homepage
- Your objective is to measure the conversion rate of a variant of this page
- If a visitor does not see your homepage but arrives on your site via a deeper page and then converts, this conversion will be counted in your analytics tool but not in AB Tasty, because the visitor was never entered into the test.
It is therefore not appropriate to compare conversions between the two tools as they stand:
In your web analytics tool, you must restrict your analysis to only those users who saw the homepage, in order to compare like with like, as sketched below.
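Here is a hypothetical Python sketch of why the two counts diverge, using invented visitor data:

```python
# Hypothetical illustration: the analytics tool counts every conversion,
# while the testing tool only counts conversions from visitors who were
# actually exposed to the test (here: those who saw the homepage).
visits = [
    {"visitor_id": "v1", "saw_homepage": True,  "converted": True},
    {"visitor_id": "v2", "saw_homepage": False, "converted": True},  # entered via a deeper page
    {"visitor_id": "v3", "saw_homepage": True,  "converted": False},
]

analytics_conversions = sum(v["converted"] for v in visits)                  # 2
test_conversions = sum(v["converted"] for v in visits if v["saw_homepage"])  # 1
print(analytics_conversions, test_conversions)
```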
Overall, be especially vigilant when comparing data if you use restrictive targeting in your tests.
Incorrect target URL settings
Are you sure that you have entered the URL format of your target confirmation page correctly and have exhaustively taken all cases into account?
It is quite common to find forgotten URL formats for a goal, for which no conversions will have been recorded.
For example, you have entered the following URL format:
URL must be equal to http://www.example.fr/confirmation-target.
But can this page also be reached:
- via the SSL protocol? In which case conversions performed on https://www.example.fr/confirmation-target will not be taken into account,
- without the sub-domain? In which case the URL http://example.fr/confirmation-target will not be taken into account,
- with additional URL parameters you are unaware of? In which case conversions performed on http://www.example.fr/confirmation-target?parameter=a-specific-case will not be recorded.
You should therefore map out the possible URL scenarios beforehand.
You can then use the various operators offered by AB Tasty (exactly equal to, contains, regular expression) to configure your target URLs correctly; a regular expression can often cover several variants at once, as sketched below.
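For instance, a regular expression along the following lines could cover all the example variants above at once; this is a hypothetical sketch to adapt to your own site, shown here with Python's `re` module purely for testing:

```python
import re

# Hypothetical regex covering the variants discussed above:
# http or https, optional "www." sub-domain, optional query string.
pattern = re.compile(r"^https?://(www\.)?example\.fr/confirmation-target(\?.*)?$")

urls = [
    "http://www.example.fr/confirmation-target",
    "https://www.example.fr/confirmation-target",
    "http://example.fr/confirmation-target",
    "http://www.example.fr/confirmation-target?parameter=a-specific-case",
]
for url in urls:
    print(url, bool(pattern.match(url)))  # all True
```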
Take traffic modulation parameters into account
If you try to compare visitors and conversions between tools, remember to adjust your calculations according to the chosen traffic modulation. In a test with two variations (original + variant), the traffic split defaults to 50/50 and all of your visitors are included in the test.
If you modulate the traffic and apply, for example, 70% to the original and 30% to the variant, only 60% of your traffic will be included in the test (30% for the original and 30% for the variant, so that samples of the same size are compared).
While conversion rates are not affected by this modulation (the orders of magnitude will match your web analytics tool), the number of conversions will necessarily differ.
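Here is the arithmetic of the 70/30 example above as a short Python sketch, with a hypothetical visitor count:

```python
# Worked example of the 70/30 modulation described above.
# The total visitor count is hypothetical.
total_visitors = 10_000
original_share, variant_share = 0.70, 0.30

# To compare samples of the same size, the test only keeps as much
# "original" traffic as there is "variant" traffic.
included_share = 2 * min(original_share, variant_share)  # 0.60
visitors_in_test = total_visitors * included_share       # 6000.0
print(included_share, visitors_in_test)
```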
Absence of indicators on the web analytics side
Certain targeting options specific to AB Tasty are not visible in your web analytics tool: you cannot compare this data.
For example, if you target a specific type of page based on its content rather than its URL (e.g. only pages containing a badge with a promotional offer), you will retrieve conversion data for a more restricted population of users.
Unless you have a very detailed analytics tagging plan, you will probably not have indicators specific to this population in your analytics tool.
Temporality
The time range over which you compare data affects the differences you will see between your tools.
Take, for example, a comparison of results for a single day. On the AB Tasty side, a conversion is linked to a visitor for the total duration of the test (several days or weeks). If you filter AB Tasty reporting on a single day, the displayed number of unique visitors corresponds to the number of new visitors assigned to the test on that day.
The number of conversions displayed, however, includes all conversions made by these visitors, even those that take place later in the test.
Since AB Tasty is not a web analytics tool in the sense of Google Analytics, it does not display the number of conversions carried out on a given day, but the number of conversions eventually completed by the visitors who entered the test on that date, as sketched below.
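Here is a hypothetical Python sketch of this attribution logic, using invented dates and visitors:

```python
from datetime import date

# Hypothetical illustration of the attribution described above: filtering
# the report on one day keeps the visitors *assigned* to the test that day,
# together with their conversions even when those happen later in the test.
visitors = [
    {"id": "v1", "assigned": date(2023, 5, 1), "converted_on": date(2023, 5, 1)},
    {"id": "v2", "assigned": date(2023, 5, 1), "converted_on": date(2023, 5, 4)},  # converts 3 days later
    {"id": "v3", "assigned": date(2023, 5, 2), "converted_on": None},              # different cohort
]

report_day = date(2023, 5, 1)
cohort = [v for v in visitors if v["assigned"] == report_day]
conversions = sum(1 for v in cohort if v["converted_on"] is not None)
print(len(cohort), conversions)  # 2 visitors, 2 conversions (one made after report_day)
```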
Technical infrastructure and network
In very specific cases, we may either not collect all user data or, on the contrary, collect more data than the analytics tools you use.
If the user has a very slow internet connection:
AB Tasty includes a mechanism that prevents a test from launching if the page hosting your changes takes more than 2 seconds to load once the AB Tasty tag has been called.
On slower connections (such as 3G), this prevents the visitor from seeing a flickering effect.
If your mobile audience is significant, this mechanism may be a source of differences; it may be better to exclude mobile users from your test and compare only desktop traffic in your web analytics tool.
You use probes or robots to mimic the behaviour of your users (process testing, debugging, etc.), but their settings differ between third-party tools.
A frequent case is that they were configured so as not to distort the data reported by the web analytics tool, but were not updated when AB Tasty was integrated; AB Tasty then reports more data than your web analytics tool.
The latest version of the AB Tasty tag automatically excludes users of older browsers (Internet Explorer 8 or earlier) from tests. These browsers represent a tiny share of web traffic, but if a large share of your users still relies on them, this may explain counting differences.
See the dedicated article for specific details about the reasons for differences in data between AB Tasty and Google Analytics: https://support.abtasty.com/hc/en-us/articles/200324487-Reasons-for-differences-in-data-between-Google-Analytics-and-AB-Tasty