Introduction to the 2007 Analytics Shoot Out – by Jim Sterne
Every web analytics tool measures clickthroughs and page views a little differently. They’re all using slightly different yardsticks and getting slightly different results. The disparity is driving us to distraction.
Just how different are they?
To get the answer to that, you would have to install some of the most popular tools side by side on the same sites and compare the results. That’s what the team at Perficient Digital did and the results are informative.
Yes, it’s interesting to see which tools, looking at the same data, count higher and which count lower. But the eye opener for these very complex yardsticks is just how many variables there are in the implementation process. Very interesting reading.
Take me directly to the data!
Overview of the 2007 Analytics Shoot Out
The 2007 Analytics Shoot Out is targeted at evaluating the performance, accuracy, and capabilities of 7 different analytics packages as implemented across 4 different sites. The goals of the project are as follows:
- Evaluate ease of implementation
- Evaluate ease of use
- Understand the basic capabilities of each package
- Solve specific problems on each web site
- Discover the unique strengths of each package
- Discover the unique weaknesses of each package
- Learn about the structural technology elements of each package that affect its capabilities
- Learn how to better match a customer’s needs to the right analytics package
How the results of the Shoot Out will be delivered
The results of the Shoot Out will be delivered in two stages:
- This is the interim report, which was officially released at the Emetrics Summit in San Francisco on May 6, 2007.
- The final report, with more comprehensive results and analysis, was officially released on August 27, 2007. Click here to see the Final Report of The 2007 Analytics Shoot Out.
What you get in this interim report
- An analysis of how the user deletion/non-acceptance rates of third-party cookies and first-party cookies differ.
- Comparative data showing:
- Unique Visitors
- Page Views
- Specific segments as defined per site for 2 sites
- An analysis of the comparative data and discussion of the following topics:
- What the numbers tell us
- Range of results
- Does one package always report lower numbers than the others?
- Does one package always report higher numbers than the others?
Contents of the Final Report
The final report will contain the same basic types of data as the interim report but with more numerical detail and a more extensive analysis of that data.
In addition to enhancing the above data from the interim report, the final report will also:
- Highlight major strengths of each analytics package, and outline specific scenarios where that will matter to a website owner/webmaster.
- Highlight specific weaknesses (as well as workarounds and plans by each company to address these weaknesses) of each analytics package, and outline specific scenarios where that will matter to a website owner/webmaster.
- Provide detailed screenshots of the significant aspects of each analytics package.
- Provide overall commentary on each package.
2007 Analytics Shoot Out Details
The following companies actively contributed their time and effort to this project:
Each of these analytics packages was installed on multiple web sites, and each of these companies contributed engineering support resources to assist us during the project.
We were also able to evaluate the following analytics packages because they were already on one of the sites we used in the project:
- Omniture SiteCatalyst
Participating Web Sites
Each of these sites installed multiple analytics packages per our instructions and made revisions as we requested.
Matrix of Web Sites and Analytics Packages that were tested in the Shoot Out
|Site||Clicktracks||Google Analytics||IndexTools||Omniture||Unica Net Insight||WebSideStory HBX Analytics||WebTrends|
Thanks are also due to the following companies, who contributed to this project:
The major aspects of the Shoot Out methodology are as follows:
- All packages were run concurrently.
- All packages used first-party cookies.
- A custom analytics plan was tailored to the needs of each site.
- Visitors, Unique Visitors, and Page Views were recorded daily for each site.
- Content Groups and Segments were set up for each site. Numbers related to these were recorded daily.
- Detailed ad hoc analysis was done with each analytics package on each site.
- Critical strengths and weaknesses of each package were noted and reviewed with each vendor for comment.
- Each vendor was given an opportunity to present their product’s strongest features and benefits.
The next few sections present the interim results from the Shoot Out. Note that the time frames for the data from each site have been masked, and the time frame used for each site was different, but the numbers are real.
First-Party Cookies vs. Third-Party Cookies
Using WebSideStory’s HBX Analytics running on CityTownInfo.com, we ran the software for a fixed period of time using third-party cookies (TPCs). We then ran the software for the same amount of time using first-party cookies (FPCs).
During that same period we ran 3 of the other analytics packages (Clicktracks, Google Analytics, and IndexTools), all using first-party cookies.
The results were then compared by examining the ratio of the volumes reported by HBX to the average of the volumes reported by the three other packages, and then seeing how that ratio changed when we switched from third-party cookies to first-party cookies. In theory, this should give us an estimate of how the user deletion/non-acceptance rate of third-party cookies compares to the user deletion/non-acceptance rate of first-party cookies.
Here are the results we obtained while HBX Analytics was running third-party cookies:
|Package||Visitors||Unique Visitors||Page Views|
|WebSideStory’s HBX Analytics||48,990||47,813||102,534|
|Average of all but HBX Analytics||68,818||65,507||120,682|
|HBX Analytics % of Average||71.19%||72.99%||84.96%|
Visitor and unique visitor totals for HBX Analytics are 71 – 73% of the average of the other 3 packages. On the other hand, page views are roughly 85% of the average of the other 3 packages.
Now let’s take a look at the same data when HBX Analytics was making use of first-party cookies:
|Package||Visitors||Unique Visitors||Page Views|
|WebSideStory’s HBX Analytics||55,871||54,520||96,453|
|Average of all but HBX Analytics||68,033||64,655||115,484|
|HBX Analytics % of Average||82.12%||84.32%||83.52%|
|Relative Traffic Growth with FPCs||13.32%||13.44%|
With first-party cookies, the visitor and unique visitor totals for HBX Analytics are now 82 – 84% of the average of the other 3 packages. The page view relationship did not change significantly, remaining at roughly 84%.
Analysis and Commentary
By observing how the traffic reported by HBX Analytics increased with respect to the average of the other 3 packages, we can estimate how the third-party cookie deletion + non-acceptance rate differs from the first-party cookie deletion + non-acceptance rate.
According to this data, the third-party cookie deletion/non-acceptance rate exceeds the first-party cookie deletion/non-acceptance rate by a little more than 13%. WebSideStory also reported to Perficient Digital that it saw a 15-20% third-party cookie deletion/non-acceptance rate across sites that it monitored during a 2-week period in January, and about a 2% first-party cookie deletion/non-acceptance rate.
This data is fairly consistent with past industry data that estimates the third-party cookie deletion/non-acceptance rate at about 15%.
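The arithmetic behind the 13% figures can be reproduced from the two tables above. The formula — the change in HBX's percentage of the average, taken relative to the first-party-cookie percentage — is our inference from the reported figures; the counts themselves are from this report:

```python
# HBX Analytics vs. the 3-package average, from the two tables above
tpc = {"visitors": (48990, 68818), "unique visitors": (47813, 65507)}  # third-party cookies
fpc = {"visitors": (55871, 68033), "unique visitors": (54520, 64655)}  # first-party cookies

for metric in ("visitors", "unique visitors"):
    hbx_tpc, avg_tpc = tpc[metric]
    hbx_fpc, avg_fpc = fpc[metric]
    pct_tpc = hbx_tpc / avg_tpc              # e.g. 71.19% of average for visitors
    pct_fpc = hbx_fpc / avg_fpc              # e.g. 82.12% of average for visitors
    growth = (pct_fpc - pct_tpc) / pct_fpc   # relative traffic growth with FPCs
    print(f"{metric}: {growth:.2%}")         # 13.32% and 13.44%
```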
Note that comScore recently reported more than 30% of cookies are deleted or not accepted overall, and also seemed to show that the difference between TPC and FPC deletions / not acceptances was significantly smaller. What remains to be seen is the methodology they used. Nonetheless, our data above should provide a reasonable indication of how TPC deletions and non-acceptances differ from FPC deletions and non-acceptances.
Cookie deletion and acceptance rates are of great concern when evaluating web analytics. Every time a cookie is deleted or not accepted, it impacts the tool's visitor and unique visitor counts. In particular, the counting of unique visitors is significantly affected. If a user visits a site in the morning, deletes or doesn't accept their cookies, and then visits again in the afternoon, this will show up as 2 different daily unique visitors in that day's totals, when in fact one user made multiple visits and should be counted as only one unique visitor.
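To make the overcounting mechanism concrete, here is a minimal sketch (hypothetical code, not any vendor's implementation) of how a deleted cookie turns one person into two daily unique visitors:

```python
import uuid

def visitor_id(cookie_jar):
    """Return this browser's visitor ID, minting a new one when the
    cookie is missing (hypothetical sketch, not any vendor's code)."""
    if "visitor_id" not in cookie_jar:
        cookie_jar["visitor_id"] = str(uuid.uuid4())
    return cookie_jar["visitor_id"]

jar = {}                      # the browser's cookie store
morning = visitor_id(jar)     # the morning visit sets the cookie
jar.clear()                   # user deletes cookies before returning
afternoon = visitor_id(jar)   # a brand-new ID is minted
print(morning != afternoon)   # True: one person, two daily unique visitors
```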
It should be noted that the packages use different methods for setting their cookies. For example, HBX Analytics requires you to set up a CNAME record in your DNS configuration that maps a sub-domain of your site to one of their servers.
While this requires someone comfortable with configuring CNAME records, it does provide some advantages. For example, simple first-party cookie implementations still pass data directly back to the analytics vendor's servers, and memory-resident anti-spyware software can intercept and block these communications.
Using the CNAME record bypasses this problem: all the memory-resident anti-spyware software sees is a communication with a sub-domain of your site, and the redirection of the data stream to the HBX Analytics server happens at the DNS server.
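A sketch of what such a CNAME record looks like in a DNS zone file (both hostnames here are hypothetical, not the actual HBX configuration):

```
; Requests to metrics.example.com resolve to the vendor's collection
; server, but the browser only ever sees a sub-domain of example.com,
; so the tracking cookie it receives is a first-party cookie.
metrics.example.com.   IN   CNAME   collector.analytics-vendor.example.
```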
Unica offers a choice between a similar CNAME record-based approach for first-party cookies and a simpler first-party cookie implementation.
The other analytics packages used in this test (Clicktracks, Google Analytics, and IndexTools) have chosen an initial setup that requires no special configuration, allowing a less technical user to set them up and get started.
Have comments or want to discuss? You can comment on the Interim Analytics Report here
Visitors, Unique Visitors, and Page Views (aka “traffic numbers”)
The Uniques column is the summation of Daily Unique Visitors over a period of time. The resulting total is therefore not an actual unique visitor count for the time period (because some of the visitors may have visited the site multiple times, and have been counted as a Daily Unique Visitor for each visit).
This was done because not all of the packages readily permitted us to obtain Unique Visitor totals over an arbitrary period of time. For example, for some packages, it is not trivial to pull a Unique Visitor count for a 12-day period.
Regardless, the Uniques data in the tables below remains a meaningful measurement of how the analytics packages compare in calculating Daily Unique Visitors.
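The distinction can be illustrated with a toy example (hypothetical visitor IDs): summing Daily Unique Visitors over a period overstates the true unique count whenever anyone returns on a later day.

```python
# Sets of visitor IDs seen on each day of the period (hypothetical data)
daily_visitors = [
    {"alice", "bob"},    # day 1
    {"alice", "carol"},  # day 2
    {"bob"},             # day 3
]

summed_daily_uniques = sum(len(day) for day in daily_visitors)
true_uniques = len(set().union(*daily_visitors))
print(summed_daily_uniques, true_uniques)  # 5 vs. 3: alice and bob are double-counted
```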
The rows that show % information for each analytics package refer to that package's percentage of the average result. Note that the average does not necessarily mean “correct”; this information is intended simply to help identify which analytics package yielded results that differed substantially from the others.
The time period is not disclosed, in order to obscure the actual daily traffic numbers of the participating sites. In addition, the time period used for each site differed.
- Summary visitor, unique visitor, and page view data for CityTownInfo.com:
|Package||Visitors||Unique Visitors||Page Views|
|Unica Affinium NetInsight||607,475||593,871||1,027,445|
|WebSideStory HBX Analytics||524,055||510,882||910,809|
|Google Analytics %||100.36%||101.58%||101.47%|
|Unica Affinium NetInsight %||101.51%||103.43%||100.34%|
|WebSideStory HBX Analytics%||87.57%||88.98%||88.95%|
|Clicktracks Std Deviations||1.17||0.42||0.30|
|Google Analytics Std Deviations||0.05||0.28||0.24|
|IndexTools Std Deviations||0.40||0.66||1.23|
|Unica Affinium NetInsight Std Deviations||0.23||0.62||0.06|
|WebSideStory HBX Analytics Std Deviations||-1.85||-1.98||-1.83|
- Summary visitor, unique visitor, and page view data for HomePortfolio.com:
|Package||Visitors||Unique Visitors||Page Views|
|WebSideStory HBX Analytics||701,895||662,411||6,439,982|
|Google Analytics %||100.88%||99.82%||102.22%|
|WebSideStory HBX Analytics %||93.85%||93.48%||91.31%|
|Google Analytics Std Deviations||0.18||-0.03||0.41|
|IndexTools Std Deviations||-0.45||-0.51||0.07|
|WebSideStory HBX Analytics Std Deviations||-1.23||-1.07||-1.60|
|WebTrends Std Deviations||1.50||1.61||1.12|
- Summary visitor, unique visitor, and page view data for ToolPartsDirect.com:
|Package||Visitors||Unique Visitors||Page Views|
|WebSideStory HBX Analytics||103,724||91,847||582,887|
|Google Analytics %||127.44%||108.97%||131.86%|
|WebSideStory HBX Analytics %||82.64%||96.92%||81.82%|
|Clicktracks Std Deviations||0.20||-0.59||-0.53|
|Google Analytics Std Deviations||1.55||1.73||1.67|
|IndexTools Std Deviations||-0.77||-0.55||-0.18|
|WebSideStory HBX Analytics Std Deviations||-0.98||-0.59||-0.95|
- Summary visitor, unique visitor, and page view data for AdvancedMD.com:
|Package||Visitors||Unique Visitors||Page Views|
|Unica Affinium Net Insight||101,419||57,739||196,277|
|WebSideStory HBX Analytics||110,824||63,156||222,732|
|Google Analytics %||120.01%||104.54%||103.01%|
|Omniture Site Catalyst %||88.97%||105.30%||105.51%|
|Unica Affinium Net Insight %||81.87%||94.98%||87.34%|
|WebSideStory HBX Analytics %||89.46%||103.89%||99.11%|
|Clicktracks Std Deviations||1.54||0.62||0.75|
|Google Analytics Std Deviations||1.21||0.67||0.50|
|IndexTools Std Deviations||-0.35||-1.91||0.08|
|Omniture SiteCatalyst Std Deviations||-0.67||0.79||0.91|
|Unica Affinium Net Insight Std Deviations||-1.10||-0.74||-2.08|
|WebSideStory HBX Analytics Std Deviations||-0.64||0.58||-0.15|
Analysis and Commentary
- There were significant differences in the traffic numbers reported by the packages. While we are all conditioned to think of this as a purely mechanical counting process, it is, in fact, a very complex process.
There are dozens (possibly more) of implementation decisions made in putting together an analytics package that affect the method of counting it uses. The different types of first-party cookie implementation described above are just one example.
If we look at the standard deviations in the above data, the distribution appears to be roughly normal. For a normal distribution, 68% of scores should fall within 1 standard deviation of the mean, and 95% within 2 standard deviations. Our data above appears to be roughly consistent with this.
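The “Std Deviations” rows in the tables above are z-scores: each package's count expressed in standard deviations from the mean of all the packages for that metric. A minimal sketch of the computation (illustrative numbers only, not data from this study; the report does not state whether sample or population standard deviation was used):

```python
import statistics

def z_scores(counts):
    """Each count's distance from the mean, in standard deviations."""
    mean = statistics.mean(counts)
    stdev = statistics.stdev(counts)  # sample standard deviation
    return [(c - mean) / stdev for c in counts]

# Five packages' visitor counts for one hypothetical site
print(z_scores([100_000, 104_000, 96_000, 118_000, 82_000]))
```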
Here is a summary of the raw data:
1.1. While HBX Analytics tended to report the lowest numbers of all the packages, this was certainly not always the case. For example, on AdvancedMD.com, HBX was higher than 2 packages for visitors and unique visitors.
1.2. Clicktracks reported the highest numbers on AdvancedMD.com. Google Analytics reported the second highest numbers for this site. Google Analytics reported the highest numbers on ToolPartsDirect.com. Clicktracks reported the second highest numbers for this site.
AdvancedMD.com and ToolPartsDirect.com receive a large amount of their traffic from Pay Per Click campaigns. While it’s pure speculation on our part, perhaps this plays into some differences in the way that Clicktracks and Google Analytics count visitor, unique visitor, and page view data as compared to the other packages. We are seeking feedback from Clicktracks and Google Analytics on this point, and hope to provide more information in the Final Report.
1.3. On HomePortfolio.com, WebTrends reported significantly more visitors and unique visitors than the other vendors (about 20% more). This is the only site that we were able to look at WebTrends numbers for at this stage in the project.
Google Analytics reported the second highest numbers on this site.
1.4. On CityTownInfo.com, the highest numbers were reported by IndexTools.
- As Jim Sterne is fond of saying, if your yardstick measures 39 inches instead of 36 inches, it’s still great to have a measurement tool. The yardstick will still help you measure changes with a great deal of accuracy. So if tomorrow your 39-inch yardstick tells you that you are at 1 yard and 1 inch (i.e., 40 inches), you know you have made some progress.
In evaluating the data presented above, you can see that the analytics packages are all reasonably close to one another. For the purpose of evaluating the quality of a yardstick, we can conclude that these yardsticks are all similar in their measurement quality.
However, referring back to the comScore study, if the first-party cookie deletion rate is, in fact, greater than 30%, this would be of great concern. We will provide more commentary on this issue in the Final Report.
To put this in perspective, classic marketing vehicles have no direct form of measurement whatsoever. How do TV ads drive sales? How about Radio ads? There is no direct measurement of the return on these marketing expenditures. The web is unique in its ability to provide a direct measurement of user behavior to a high degree of accuracy. There is no other marketing vehicle like it.
For example, in web analytics, you can conduct A/B testing at a remarkably granular level. Using our yardstick, we can tweak our marketing message in numerous ways, and get direct feedback on how it affects the achievement of our business objective. We could, for example, measure the effects of such changes as:
- Change marketing copy
- Change layout
- Change the colors used on landing pages
- Change the actual offers made
There really is much more that you can do. But the kicker is that you can get close to real-time feedback, and you can rapidly hone your pitch to the customer. On high volume sites, you can get meaningful feedback in less than a day.
Used in this fashion, web analytics software is very accurate. Given enough data, our 39-inch yardstick can easily measure the difference between a 1.5% conversion rate and a 2% conversion rate.
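As an illustration of that claim, a standard two-proportion z-test (our sketch, not part of the study) shows that a single day of traffic on a high-volume site can distinguish a 1.5% conversion rate from a 2% one:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two observed conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)       # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical high-volume day: 20,000 visitors per variation,
# 1.5% conversion on A (300 sales) vs. 2.0% on B (400 sales).
z = two_proportion_z(300, 20_000, 400, 20_000)
print(z > 1.96)  # True: the difference is detectable at the 95% level
```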
The marketing person placing TV and radio ads might well sell their soul to have that 39-inch yardstick to measure their ad’s effectiveness with a similar level of accuracy.
Have comments or want to discuss? You can comment on the Interim Analytics Report here
Content Group Data
- Here is the form completion and content group page view data for each of the analytics packages and CityTownInfo.com:
|Package||Form 1||Form 2||Form 3||Group 1 Views||Group 2 Views||Group 3 Views|
|Unica Affinium NetInsight||172||572||70||60,699||4,713||12,291|
|WebSideStory HBX Analytics||162||560||69||54,889||4,274||14,763|
|Google Analytics %||100.94%||95.00%||88.06%||103.52%||104.77%||96.96%|
|Unica Affinium NetInsight %||100.94%||100.07%||104.48%||105.37%||105.17%||97.25%|
|WebSideStory HBX Analytics %||95.07%||97.97%||102.99%||95.28%||95.38%||116.81%|
- Here is the content group page view data for each of the analytics packages and HomePortfolio.com:
|Package||Group 1 Views||Group 2 Views||Group 3 Views||Group 4 Views|
|WebSideStory HBX Analytics||2,222,843||161,922||317,307||10,787|
|Google Analytics %||122.52%||128.98%||109.93%||103.86%|
|WebSideStory HBX Analytics %||55.82%||40.58%||77.80%||94.76%|
Analysis and Commentary
- Interestingly, this data appears to be more consistent than the traffic data (we discuss the exception of HBX Analytics running on HomePortfolio.com below). This is largely because it is page view based, and page views are inherently easier to track accurately than visitors, since the analytics software can't detect when a user leaves your site.
In addition, if your visitor arrives on the site, then goes to lunch for an hour, then comes back and clicks on a link on the page they arrived on, the analytics software sees this as a new visit, even though they never left the page.
The reason for this is that the analytics industry has standardized on a definition of sessions that says that 30 minutes of inactivity ends a session (a visit). Some criteria for dealing with these situations needed to be selected, and this is what they chose.
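A minimal sketch of that sessionization rule (our illustration; each vendor's actual implementation differs in its details):

```python
SESSION_TIMEOUT = 30 * 60  # the industry-standard 30 minutes, in seconds

def count_visits(hit_timestamps):
    """Count visits: a gap of more than 30 minutes between hits
    ends the current session and starts a new one."""
    visits = 0
    last = None
    for t in sorted(hit_timestamps):
        if last is None or t - last > SESSION_TIMEOUT:
            visits += 1
        last = t
    return visits

# The same page viewed before and after a one-hour lunch break:
print(count_visits([0, 3600]))  # 2: the gap starts a new visit
```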
- As an exception to this, the HBX Analytics content group data for HomePortfolio is quite a bit lower than that of the other packages. However, we discovered that this is due to an implementation error by our team.
Note that this is not a reflection of the difficulty of implementing HBX Analytics. Instead, it's a reflection of how important it is to understand exactly what you want the analytics software to do, to specify it accurately, and then to double-check that you are measuring what you think you are measuring.
In the case of the problem above, through no fault of the software, we set it up to track people who initially entered at pages in the content group, rather than tracking all the page views for the content group, which is what we wanted.
There is a key lesson in this. Implementation of an analytics package requires substantial forethought and planning. And, when you are done with that, you have to check and recheck your results, to make sure they make sense. Here is a summary of some of the issues you face in setting up your implementation correctly:
- Tagging errors – an error in tagging your pages can badly skew your data. You need to verify that every page carries the correct tags.
- Understanding the terminology – each package uses terms in different ways, and it’s important to understand them.
- Learning the software, and how it does things – each software package has its own way of doing things.
- Learning your requirements – learning your requirements will be a process all by itself. If you are implementing analytics for the first time it may be many months before you truly understand how to use it most effectively on your site.
- Learning the requirements of others in your organization – these are not necessarily the same as your personal requirements.
- Validating the data – even if you are not running more than one analytics package, you need to have a method of testing the quality of your data and making sure it makes sense. If it doesn’t, then perhaps some of the other steps above were not correctly executed.
One way to reduce many of these risks is to install multiple analytics packages. A substantial difference between the packages provides a visible clue that something went wrong!
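A sketch of such a sanity check (hypothetical daily counts): compare the daily numbers from two packages and flag any day where they diverge by more than an agreed tolerance.

```python
def flag_discrepancies(series_a, series_b, tolerance=0.20):
    """Return the day indexes where two packages' daily counts
    differ by more than the given relative tolerance."""
    flags = []
    for day, (a, b) in enumerate(zip(series_a, series_b)):
        if abs(a - b) / max(a, b) > tolerance:
            flags.append(day)
    return flags

# Package B's day-2 count is 40% below package A's: worth investigating.
print(flag_discrepancies([1000, 1020, 1000], [950, 990, 600]))  # [2]
```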
Have comments or want to discuss? You can comment on the Interim Analytics Report here