

2007 Web Analytics Shootout – Final Report

Introduction to the 2007 Analytics Shoot Out – by Jim Sterne

In this updated 2007 Analytics Shoot Out, Perficient Digital takes the same approach of head-to-head comparisons of major web analytics packages on real websites. Yes, they evaluate things like ease of implementation, use, and reporting. Yes, they look at the strengths and weaknesses of each package. But then they dig deeper into the conundrum of accuracy in web analytics data and discuss where accuracy matters. They look harder at first-party versus third-party cookies. They measure how JavaScript placement on the web page affects the resulting data. They also get practical, identifying which analytics tools are best for which types of websites.
Are you trying to compare and contrast the different tools out there? This is a great resource.
Jim Sterne
eMetrics Marketing Optimization Summit

Overview of the 2007 Analytics Shoot Out

The 2007 Analytics Shoot Out is targeted at evaluating the performance, accuracy, and capabilities of 7 different analytics packages as implemented across 4 different sites. The goals of the project are as follows:

  1. Evaluate ease of implementation
  2. Evaluate ease of use
  3. Understand the basic capabilities of each package
  4. Solve specific problems on each web site
  5. Discover the unique strengths of each package
  6. Discover the unique weaknesses of each package
  7. Learn about the structural technology elements of each package that affect its capabilities
  8. Learn how to better match a customer’s needs to the right analytics package

How the results of the Shoot Out are delivered

The results of the Shoot Out have been delivered in two stages:

  1. The interim report was officially released at the eMetrics Summit in San Francisco on May 6, 2007.
  2. This report, the final report, contains all of the material in the interim report, along with more comprehensive results and analysis.

What you get in this report

Section 1. An executive summary of the report, key findings, and key takeaways
Section 2. Information about how the study was conducted, and its methodology
Section 3. An analysis of how the user deletion rates of third party cookies and first party cookies differ
Section 4. *** Content Updated in the Final Report ***: Comparative data showing:

  1. Visitors
  2. Unique Visitors
  3. Page Views
  4. Specific segments as defined per site for 2 sites

These numbers have been updated and expanded from the Interim Report.
Section 5. *** All New Content ***: A section on “Why Accuracy Matters”

  1. An overall commentary on accuracy in analytics
  2. A discussion of scenarios where accuracy matters
  3. What this means for how you use analytics to help manage your business
  4. How the analytics vendors measure sessions

Section 6. *** All New Content ***: A detailed study of the effect that location of the JavaScript on the web page has on traffic data:

  1. Test results showing how JavaScript placement affects traffic results
  2. A discussion of what this means for website owners and marketers

Section 7. *** All New Content ***: A qualitative review of the major strengths and weaknesses of all of the packages we worked with during the study. As all of the packages have strong customer bases, we did not anticipate that we would pick winners and losers per se, and we frankly don’t feel that is the pertinent output from such an examination.
Picking winners would imply that one package is best at all things for all people, and that is not the case. Each package has different strengths and weaknesses that ultimately make it a better fit for some types of web sites than others. For many webmasters, cost is also a large factor that needs to be considered.

Section 1: Executive Summary

I have participated in countless discussions with people who have been concerned about the accuracy of their analytics solutions. I have also had the chance to talk with, and interview, many of the leading players in the analytics industry. These leaders have all indicated that accuracy was not a problem, provided that the tools are implemented and used properly.
While I’ve used analytics tools extensively, and followed this business with great interest for quite some time, pursuing this project ultimately required a spark. That spark was provided by Rand Fishkin in a blog post he did in November 2006, titled: Free Linkbait Idea. Basically, Rand suggested that someone do a study based on placing multiple analytics packages simultaneously on multiple web sites, recording the data, and then analyzing and publishing the results.
I signed Perficient Digital up to do the job, and this study is the result.
As for whether or not the packages are accurate, you’ll see that this is not a simple question. The pundits are right – and they are also wrong. Ultimately, web analytics packages are like any other tool. Used properly, they can certainly help you grow and understand your business. However, it is easy to use them improperly, and it takes a sophisticated level of expertise to use them in an optimal fashion.
Web analytics, done right, is hard. However, done right, web analytics can provide an outstanding ROI on the time and money you put into it, and doing it well provides you with a major advantage over your competitors who do it less well.

Key Findings

  1. Web analytics packages, installed on the same web site, configured the same way, produce different numbers. Sometimes radically different numbers. In some cases the package showing the highest numbers reported more than 150% of the traffic reported by the package showing the least.

  2. By far the biggest source of error in analytics is implementation error. A Web analytics implementation needs to be treated like a software development project, and must be subjected to the same scrutiny and testing to make sure it has been done correctly.

Note that we had the support of the analytics vendors themselves in the implementations done for the 2007 Web Analytics Shootout, so we believe that this type of error was not a factor in any of the data in our report, except where noted.

  3. Two other major factors drive differences in the results. One of these is the placement of JavaScript on the site: script placed far down on a page may not execute before some users leave the page. Traffic that is not counted because the JavaScript never ran can be considered an error, because the data for that visit is lost (or at least the data regarding the original landing page is lost and, if the visitor came from a search engine, the keyword data as well).

The other factor is differences in the definition of what each package is counting. The way that analytics packages count visitors and unique visitors is based on the concept of sessions. There are many design decisions made within an analytics package that will cause it to count sessions differently, and this has a profound impact on the reported numbers.
Note that this should not be considered a source of error. It’s just that the packages are counting different things, equally well for the most part.

  4. Page views tend to have a smaller level of variance. The variance in ways an analytics package can count page views is much smaller. JavaScript placement will affect page views, but differences in sessionization algorithms will not. Simply put, if the tracking JavaScript on a page executes, it counts as a page view.
  5. There are scenarios in which these variances and errors matter, particularly if you are trying to compare traffic between sites, or numbers between different analytics packages. This is, generally speaking, an almost fruitless exercise.

  6. To help address these accuracy problems, you should calibrate with other tools and measurement techniques when you can. This helps quantify the nature of any inaccuracies, and makes your analytics strategy more effective.

  7. One of the basic lessons is learning what analytics software packages are good at, and what they are not good at. Armed with this understanding, you can take advantage of the analytics capabilities that are strong and reliable, and pay less attention to the other aspects. Some examples of where analytics software is accurate and powerful are:

  1. A/B and multivariate testing
  2. Optimizing PPC Campaigns
  3. Optimizing Organic SEO Campaigns
  4. Segmenting visitor traffic
There are many other examples that could be listed. The critical lesson is that while the tools' absolute numbers are not accurate, their relative measurements are worth their weight in gold.

In other words if your analytics package tells you that Page A converts better than Page B, that’s money in the bank. Or if the software tells you which keywords offer the best conversion rates, that’s also money in the bank. Or, if it says that European visitors buy more blue widgets than North American visitors – you got it – more money in the bank.
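To make this concrete, here is a minimal sketch, with purely hypothetical numbers, of why a constant undercount leaves relative comparisons intact:

    // Minimal sketch, hypothetical numbers: suppose the tool sees only a
    // constant fraction k of all sessions.
    var k = 0.8; // hypothetical: 20% of traffic goes uncounted
    var pageA = { visits: 1000 * k, conversions: 50 * k };
    var pageB = { visits: 1000 * k, conversions: 30 * k };
    // The absolute counts are wrong, but the comparison is unaffected:
    console.log(pageA.conversions / pageA.visits); // 0.05, same as the true rate
    console.log(pageB.conversions / pageB.visits); // 0.03 - Page A still wins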
So enter the world of analytics accuracy below, and hopefully, you will emerge with a better appreciation of how to use these tools to help your business, as I did.

Section 2: 2007 Analytics Shoot Out Details

Analytics Packages

The following companies actively contributed their time and effort to this project:

  1. Clicktracks
  2. Google Analytics
  3. IndexTools
  4. Unica Affinium NetInsight
  5. Visual Sciences’ HBX Analytics

Each of these analytics packages was installed on multiple web sites, and each of these companies contributed engineering support resources to assist us during the project.
We were also able to evaluate the following analytics packages because they were already on one of the sites we used in the project:

  1. Omniture SiteCatalyst
  2. WebTrends

Participating Web Sites

  1. AdvancedMD (AMD)
  2. City Town Info (CTI)
  3. Home Portfolio (HPort)
  4. Tool Parts Direct (TPD)

Each of these sites installed multiple analytics packages on their sites per our instructions, and made revisions as requested by us. Here is a matrix of Web Sites and Analytics Packages that were tested in the Shoot Out:

Site    Clicktracks   Google Analytics   IndexTools   Omniture   Unica NetInsight   HBX Analytics   WebTrends
AMD     Y             Y                  Y            Y          Y                  Y               N
CTI     Y             Y                  Y            N          Y                  Y               N
HPort   Y             Y                  Y            N          N                  Y               Y
TPD     Y             Y                  Y            N          N                  Y               N

Additional Contributors

Thanks are also due to the following people, who contributed to this project:

  1. John Biundo of Perficient Digital
  2. Jonah Stein of Alchemist Media
  3. Rand Fishkin of SEOmoz
  4. John Marshall of Market Motive
  5. Dennis Mortensen of IndexTools

And a special thanks to Jim Sterne of Target Marketing, and the eMetrics Marketing Optimization Summit for his support of the Shoot Out.

Methodology

The major aspects of the Shoot Out methodology are as follows:

  1. For each package, except WebTrends, we installed JavaScript on the pages of the participating sites. WebTrends was already installed on one of the sites participating in the project, and the implementation used a combination of JavaScript tags and log file analysis.
  2. All the JavaScript was added to website pages through include files. As a result, we have eliminated the possibility of the JavaScript coverage varying by package.
  3. All packages were run concurrently.
  4. All packages used first party cookies.
  5. A custom analytics plan was tailored for the needs of each site.
  6. Visitors, Unique Visitors, and Page Views were recorded daily for each site.
  7. Content Groups and Segments were set up for each site. Numbers related to these were recorded daily.
  8. On one site, City Town Info, we varied the order of the JavaScript on the page for a period of time, to see how this altered the comparative statistics for the 5 analytics packages we had running on it.
  9. Also on City Town Info, we placed a tracking pixel at the top of the page, to see how that placement affected the counting of traffic.
  10. We measured the execution time of each of the analytics packages across 3 of the sites.
  11. Detailed ad hoc analysis was done with each analytics package on each site.
  12. Critical strengths and weaknesses of each package were noted, and reviewed with each vendor for comment.
  13. Each vendor was given an opportunity to present their product’s strongest features and benefits.

Section 3: First Party Cookies vs. Third Party Cookies

Using Visual Sciences’ HBX Analytics running on CityTownInfo.com, we ran the software for a fixed period of time using third party cookies (TPCs). We then ran the software for the same amount of time using first party cookies (FPCs).
During that same period we ran 3 of the other analytics packages (Clicktracks, Google Analytics, and IndexTools), all using first party cookies.
The results were then compared by examining the relationship of HBX reported volumes to the average of the volumes of the three other packages, and then seeing how that relationship changed when we switched from third party cookies to first party cookies. In theory, this should give us an estimate of how the user blocking and deletion of third party cookies compares to user blocking and deletion of first party cookies.
Here are the results we obtained while HBX Analytics was running third party cookies:

  Visitors Uniques Page Views
Clicktracks 72,224 66,335 120,536
Google Analytics 66,866 64,975 118,230
IndexTools 67,365 65,212 123,279
WebSideStory’s HBX Analytics 48,990 47,813 102,534
       
Average of all but HBX Analytics 68,818 65,507 120,682
HBX Analytics % of Average 71.19% 72.99% 84.96%

Visitor and unique visitor totals for HBX Analytics are 71 – 73% of the average of the other 3 packages. On the other hand, page views are roughly 85% of the average of the other 3 packages.
Now let’s take a look at the same type of information over the time period when HBX Analytics was making use of first party cookies:

  Visitors Uniques Page Views
Clicktracks 71,076 65,314 114,966
Google Analytics 65,906 64,030 112,436
IndexTools 67,117 64,621 119,049
WebSideStory’s HBX Analytics 55,871 54,520 96,453
       
Average of all but HBX Analytics 68,033 64,655 115,484
HBX Analytics % of Average 82.12% 84.32% 83.52%
Relative Traffic Growth with FPCs (*) 13.32% 13.44%
  (*) Calculated as 1 – (the HBX Analytics % of Average with third party cookies / the HBX Analytics % of Average with first party cookies)
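The calculation behind that last row can be expressed as a short script. The percentages below are taken directly from the two tables above:

    // Relative traffic growth with FPCs, from the tables above.
    // hbxPct = HBX Analytics as a % of the average of the other 3 packages.
    var hbxPctWithTPC = { visitors: 0.7119, uniques: 0.7299 };
    var hbxPctWithFPC = { visitors: 0.8212, uniques: 0.8432 };
    function relativeGrowth(before, after) {
      return 1 - before / after;
    }
    console.log(relativeGrowth(hbxPctWithTPC.visitors, hbxPctWithFPC.visitors)); // ~0.133, the 13.32% in the table
    console.log(relativeGrowth(hbxPctWithTPC.uniques, hbxPctWithFPC.uniques));   // ~0.134, the 13.44% in the table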

With first party cookies, the visitor and unique visitor totals for HBX Analytics are now 82 – 84% of the average of the other 3 packages. The page views relationship did not change significantly, and was roughly 84%.
By observing how the traffic reported by HBX Analytics increased with respect to the average of the other 3 packages, we can estimate how third party cookie blocking and deletion differs from first party cookie blocking and deletion.
According to this data, the third party cookie blocking and deletion rate exceeds the first party cookie blocking and deletion rate by a little more than 13%. Visual Sciences also reported to Perficient Digital that it saw a 15-20% third party cookie blocking and deletion rate across sites that they monitor during a 2 week period in January, and about a 2% first party cookie blocking and deletion rate.
This data is fairly consistent with past industry data that estimates the third party cookie deletion rate at about 15%. Visual Sciences reported to me recently that they see a 12% to 15% deletion rate on TPCs and about 1% on FPCs.
Note that the page view numbers do not vary much, because the process of counting page views is not dependent on cookies, so whether a FPC or TPC is used is irrelevant.
Note that comScore recently reported more than 30% of cookies are deleted overall, and also seemed to show that the difference between TPC and FPC deletions was significantly smaller. Note that there are many concerns about the accuracy of these numbers given the methods used by comScore to collect their data. In any event, our data above should provide a reasonable indication of how TPC deletions differ from FPC deletions.

Why Cookie Deletion Rates Matter

Cookie deletion rates are of great concern when evaluating web analytics. Every time a cookie is deleted it impacts the visitor and unique visitor counts of the tool. In particular, counting of unique visitors is significantly affected. If a user visits a site in the morning, deletes their cookies, and then visits again in the afternoon, this will show up as 2 different daily unique visitors in the totals for that day, when in fact one user made multiple visits, and should be counted only as one unique visitor.
It should be noted that the packages use different methods for setting their cookies. For example, HBX Analytics requires you to set up a CNAME record in your DNS configuration file (note that DNS A records can also be used) to remap a sub-domain of your site to one of their servers.
While this requires someone who is familiar with configuring DNS records, it does provide some advantages. For example, simple first party cookie implementations still pass data directly back to the servers of the analytics vendor. Memory resident anti-spyware software will intercept and block these communications.
Using the CNAME record bypasses this problem, because all the memory resident anti-spyware software will see is a communication with a sub-domain of your site, and the process of redirecting the data stream to the HBX Analytics server happens at the DNS level.
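As an illustration, here is roughly what such a setup looks like. The host names and the variable below are hypothetical, not any vendor's actual configuration syntax:

    // Hypothetical DNS record (zone file syntax) remapping a sub-domain of
    // your site to the vendor's collection server:
    //
    //   metrics.example.com.   IN   CNAME   collect.analytics-vendor.example.
    //
    // The tag on the page then sends its beacon to your own sub-domain, so
    // anti-spyware software sees only a first-party request; the redirection
    // to the vendor happens at the DNS level.
    var trackingHost = "metrics.example.com"; // hypothetical host name
    new Image().src = "http://" + trackingHost + "/beacon.gif?page=" +
        encodeURIComponent(location.pathname);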
Unica provides the option of either using a DNS A record based approach for first party cookies or going with a simpler first party cookie implementation. Note that an A record can be used to do the same thing as a CNAME record, with only some subtle differences.
Other analytics packages used in this test (Clicktracks, Google Analytics, and IndexTools) have chosen a simple first party cookie approach to initial configuration which requires no special configuration, and that allows a less technical user to set them up and get started.

Section 4: Visitors, Unique Visitors, and Page Views (aka “traffic numbers”)

For each participating site we show two sets of results below. First is the set of numbers presented in the Interim report published in May of 2007. The second set of numbers is completely new traffic data for the same sites, but over a different period of time. There was no overlap in the two time periods.
The goal with the second set of data is to determine if there were any major shifts in the data over time.

Notes

  1. The Uniques column is the summation of Daily Unique Visitors over a period of time. The resulting total is therefore not an actual unique visitor count for the time period (because some of the visitors may have visited the site multiple times, and have been counted as a Daily Unique Visitor for each visit).

This was done because not all of the packages readily permitted us to obtain Unique Visitor totals over an arbitrary period of time. For example, for some packages, it is not trivial to pull a 12-day Unique Visitor count.
Regardless, the Uniques data in the tables below remains a meaningful measurement of how the analytics packages compare in calculating Daily Unique Visitors. A short sketch following these notes illustrates why summed daily uniques overstate true uniques.

  2. The time period is not disclosed, in order to obscure the actual daily traffic numbers of the participating sites. In addition, the time period used for each site differed.
  3. One factor that we examined in detail was the effect of JavaScript order on the results. The details of this will be discussed in a later section of this report, but you can see a table of the placement of the JavaScript for each of the sites in Appendix A.
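As promised above, here is a minimal sketch, using a hypothetical visit log, of why summing Daily Unique Visitors overstates the true unique visitor count for a period:

    // Hypothetical visit log: visitor ID and the day of each visit.
    var visits = [
      { id: "A", day: 1 }, { id: "B", day: 1 },
      { id: "A", day: 2 }, { id: "C", day: 2 }
    ];
    function uniquesOn(day) {
      var seen = {};
      visits.forEach(function (v) { if (v.day === day) seen[v.id] = true; });
      return Object.keys(seen).length;
    }
    var summedDailyUniques = uniquesOn(1) + uniquesOn(2); // 2 + 2 = 4
    var all = {};
    visits.forEach(function (v) { all[v.id] = true; });
    var trueUniques = Object.keys(all).length; // 3 - visitor A is counted twice in the sum
    console.log(summedDailyUniques, trueUniques); // 4 3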

Traffic Data

1. City Town Info Table 1. The following data is the summary visitor, unique visitor, and page view data for CityTownInfo.com that was presented in the Interim Report:

CityTownInfo.com Analytics Data – Interim Report Data Visitors Uniques Page Views
Clicktracks 645,380 587,658 1,042,604
Google Analytics 600,545 583,199 1,038,995
IndexTools 614,600 595,163 1,099,786
Unica Affinium NetInsight 607,475 593,871 1,027,445
WebSideStory HBX Analytics 524,055 510,882 910,809
Average 598,411 574,155 1,023,928
       
Clicktracks % 107.85% 102.35% 101.82%
Google Analytics % 100.36% 101.58% 101.47%
IndexTools % 102.71% 103.66% 107.41%
Unica Affinium NetInsight % 101.51% 103.43% 100.34%
WebSideStory HBX Analytics% 87.57% 88.98% 88.95%
       
Standard Deviation 40209 31930 61868
Clicktracks Std Deviations 1.17 0.42 0.30
Google Analytics Std Deviations 0.05 0.28 0.24
IndexTools Std Deviations 0.40 0.66 1.23
Unica Affinium NetInsight Std Deviations 0.23 0.62 0.06
WebSideStory HBX Analytics Std Deviations -1.85 -1.98 -1.83

2. City Town Info Table 2. The following data is the summary visitor, unique visitor, and page view data for CityTownInfo.com that was recorded for the Final Report:

CityTownInfo.com Analytics Data – Final Report Data Visitors Uniques Page Views
Clicktracks 663,803 609,511 1,071,589
Google Analytics 603,619 586,580 1,045,327
IndexTools 638,602 618,376 1,138,659
Unica Net Insight 627,072 614,512 1,062,493
Visual Sciences HBX Analytics 525,038 513,020 922,692
Average 611,627 588,400 1,048,152
       
Clicktracks % 108.53% 103.59% 102.24%
Google Analytics % 98.69% 99.69% 99.73%
IndexTools % 104.41% 105.09% 108.63%
Unica Net Insight % 102.53% 104.44% 101.37%
Visual Sciences HBX Analytics% 85.84% 87.19% 88.03%
       
Standard Deviation 47435 39272 70278
Clicktracks Std Deviations 1.3 0.66 0.38
Google Analytics Std Deviations -0.2 -0.06 -0.05
IndexTools Std Deviations 0.67 0.94 1.46
Unica Affinium Net Insight Std Deviations 0.38 0.82 0.23
Visual Sciences HBX Analytics Std Deviations -2.15 -2.36 -2.03

3. Home Portfolio Table 1: The following data is the summary visitor, unique visitor, and page view data for HomePortfolio.com that was presented in the Interim Report:

HomePortfolio.com Analytics Data – Interim Report Data Visitors Uniques Page Views
Google Analytics 754,446 707,358 7,209,828
IndexTools 731,218 686,518 7,078,720
WebSideStory HBX Analytics 701,895 662,411 6,439,982
WebTrends 804,012 778,280 7,483,154
Average 747,893 708,642 7,052,921
       
Google Analytics % 100.88% 99.82% 102.22%
IndexTools % 97.77% 96.88% 100.37%
WebSideStory HBX Analytics % 93.85% 93.48% 91.31%
WebTrends % 124.83% 127.53% 106.10%
       
Standard Deviation 37370 43237 382779
Google Analytics Std Deviations 0.18 -0.03 0.41
IndexTools Std Deviations -0.45 -0.51 0.07
WebSideStory HBX Analytics Std Deviations -1.23 -1.07 -1.60
WebTrends Std Deviations 1.50 1.61 1.12

4. Home Portfolio Table 2: The following data is the summary visitor, unique visitor, and page view data for HomePortfolio.com that was recorded for the Final Report. Note that Clicktracks was not present in the first phase, but was included in the second phase.

HomePortfolio.com Analytics Data – Final Report Data Visitors Uniques Page Views
Clicktracks 906,264 767,128 6,761,954
Google Analytics 800,608 756,164 7,055,278
IndexTools 780,043 734,082 6,794,242
Visual Sciences HBX Analytics 778,789 750,734 6,451,555
WebTrends 1,003,683 964,480 7,312,397
Average 853,877 794,518 6,875,085
       
Clicktracks % 106.14% 96.55% 98.35%
Google Analytics % 93.76% 95.17% 102.62%
IndexTools % 91.35% 92.39% 98.82%
Visual Sciences HBX Analytics % 91.21% 94.49% 93.84%
WebTrends % 117.54% 121.39% 106.36%
       
Standard Deviation 88446 85648 290662
Clicktracks Std Deviations 0.59 -0.32 -0.39
Google Analytics Std Deviations -0.6 -0.45 0.62
IndexTools Std Deviations -0.83 -0.71 -0.28
Visual Sciences HBX Analytics Std Deviations -0.85 -0.51 -1.46
WebTrends Std Deviations 1.69 1.98 1.5

5. Tool Parts Direct Table 1: The following data is the summary visitor, unique visitor, and page view data for ToolPartsDirect.com that was presented in the Interim Report:

ToolPartsDirect.com Analytics Data – Interim Report Data Visitors Uniques Page Views
Clicktracks 129,900 91,879 639,892
Google Analytics 159,955 103,260 939,373
IndexTools 108,486 92,070 687,544
WebSideStory HBX Analytics 103,724 91,847 582,887
Average 125,516 94,764 712,424
       
Clicktracks % 103.49% 96.96% 89.82%
Google Analytics % 127.44% 108.97% 131.86%
IndexTools % 86.43% 97.16% 96.51%
WebSideStory HBX Analytics % 82.64% 96.92% 81.82%
       
Standard Deviation 22193 4906 136167
Clicktracks Std Deviations 0.20 -0.59 -0.53
Google Analytics Std Deviations 1.55 1.73 1.67
IndexTools Std Deviations -0.77 -0.55 -0.18
WebSideStory HBX Analytics Std Deviations -0.98 -0.59 -0.95

6. Tool Parts Direct Table 2: The following data is the summary visitor, unique visitor, and page view data for ToolPartsDirect.com that was recorded for the Final Report:

ToolPartsDirect.com Analytics Data Visitors Uniques Page Views
Clicktracks 318,189 222,270 1,568,546
Google Analytics 399,784 249,788 2,262,553
IndexTools 261,691 222,248 1,653,576
Visual Sciences HBX Analytics 249,067 220,813 1,417,426
Average 307,183 228,780 1,725,525
       
Clicktracks % 103.58% 97.15% 90.90%
Google Analytics % 130.15% 109.18% 131.12%
IndexTools % 85.19% 97.14% 95.83%
Visual Sciences HBX Analytics % 81.08% 96.52% 82.14%
       
Standard Deviation 59462 12144 321381
Clicktracks Std Deviations 0.19 -0.54 -0.49
Google Analytics Std Deviations 1.56 1.73 1.67
IndexTools Std Deviations -0.77 -0.54 -0.22
Visual Sciences HBX Analytics Std Deviations -0.98 -0.66 -0.96

7. AdvancedMD Table 1: The following data is the summary visitor, unique visitor, and page view data for AdvancedMD.com that was presented in the Interim Report:

AdvancedMD.com Analytics Data – Interim Report Data Visitors Uniques Page Views
Clicktracks 155,396 63,339 234,930
Google Analytics 148,665 63,554 231,511
IndexTools 116,757 52,949 225,859
Omniture SiteCatalyst 110,211 64,016 237,108
Unica Affinium Net Insight 101,419 57,739 196,277
WebSideStory HBX Analytics 110,824 63,156 222,732
Average 123,878 60,792 224,736
       
Clicktracks % 125.44% 104.19% 104.54%
Google Analytics % 120.01% 104.54% 103.01%
IndexTools % 94.25% 87.10% 100.50%
Omniture Site Catalyst % 88.97% 105.30% 105.51%
Unica Affinium Net Insight % 81.87% 94.98% 87.34%
WebSideStory HBX Analytics % 89.46% 103.89% 99.11%
       
Standard Deviation 20494 4101 13651
Clicktracks Std Deviations 1.54 0.62 0.75
Google Analytics Std Deviations 1.21 0.67 0.50
IndexTools Std Deviations -0.35 -1.91 0.08
Omniture SiteCatalyst Std Deviations -0.67 0.79 0.91
Unica Affinium Net Insight Std Deviations -1.10 -0.74 -2.08
WebSideStory HBX Analytics Std Deviations -0.64 0.58 -0.15

8. AdvancedMD Table 2: The following data is the summary visitor, unique visitor, and page view data for AdvancedMD.com that was recorded for the Final Report:

AdvancedMD.com Analytics Data – Final Report Data Visitors Uniques Page Views
Clicktracks 1,398,365 600,855 2,039,587
Google Analytics 1,345,801 603,627 2,012,420
IndexTools 1,067,819 489,605 1,960,184
Omniture SiteCatalyst 1,016,563 605,550 2,094,566
Unica Net Insight 944,008 540,424 1,717,584
Visual Sciences HBX Analytics 1,023,003 594,677 1,920,104
Average 1,132,593 572,456 1,957,408
       
Clicktracks % 123.47% 104.96% 104.20%
Google Analytics % 118.82% 105.45% 102.81%
IndexTools % 94.28% 85.53% 100.14%
Omniture Site Catalyst % 89.76% 105.78% 107.01%
Unica Net Insight % 83.35% 94.40% 87.75%
Visual Sciences HBX Analytics % 90.32% 103.88% 98.09%
       
Standard Deviation 173842 43316 120766
Clicktracks Std Deviations 1.53 0.66 0.68
Google Analytics Std Deviations 1.23 0.72 0.46
IndexTools Std Deviations -0.37 -1.91 0.02
Omniture SiteCatalyst Std Deviations -0.67 0.76 1.14
Unica Affinium Net Insight Std Deviations -1.08 -0.74 -1.99
Visual Sciences HBX Analytics Std Deviations -0.63 0.51 -0.31

Initial Observations

There were significant differences in the traffic numbers revealed by the packages. While we might be inclined to think that this is a purely mechanical counting process, it is in fact a very complex process.
There are dozens (possibly more) of implementation decisions made in putting together an analytics package that affect the method of counting used by each package. The discussion we provided above about different types of first party cookie implementation is just one example.
Another example is the method used by analytics packages to track user sessions. It turns out that this is done somewhat differently by each package. You can see more details on what these differences are in Appendix B.
Other examples include: whether or not configuration of the package is done primarily in the JavaScript or the UI, and how a unique visitor is defined (e.g., is a daily unique visitor defined as over the past 24 hours, or for a specific calendar day?).
If we look at the standard deviations in the above data, the distribution appears to be pretty normal. Note that for a normal distribution, 68% of scores should be within 1 standard deviation, and 95% of the scores should be within 2 standard deviations. In our data above, this indeed appears to be holding roughly true.
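For reference, the Standard Deviation and Std Deviations rows in the tables above can be reproduced as follows; the tables appear to use the population (divide by N) form. The numbers shown are the Interim Report visitor figures for CityTownInfo.com:

    // Population standard deviation, matching the tables above.
    var visitors = [645380, 600545, 614600, 607475, 524055]; // CTI interim data
    var mean = visitors.reduce(function (s, v) { return s + v; }, 0) / visitors.length; // 598,411
    var variance = visitors.reduce(function (s, v) {
      return s + Math.pow(v - mean, 2);
    }, 0) / visitors.length;
    var stdDev = Math.sqrt(variance); // ~40,209
    // Each package's "Std Deviations" value is its distance from the mean:
    visitors.forEach(function (v) {
      console.log(((v - mean) / stdDev).toFixed(2)); // 1.17, 0.05, 0.40, 0.23, -1.85
    });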

Charting the Data

The following three charts provide a graphical representation of the tables above. In order to give them more meaning, we have normalized the data to the same scale.
Here is a summary of the visitor data in a chart:
Visitor Data Chart

 

Here is a summary of the raw unique visitor data in a chart:
Chart Showing Unique Visitor Data

 

Here is a summary of the raw page view data in a chart:
Page View Data Chart

 
  1. While HBX Analytics tended to report the lowest numbers of all the packages, this was not always the case. For example, on AdvancedMD.com, HBX was higher than 2 packages for visitors and unique visitors. In particular, note the scenario labeled “CTI2” (City Town Info, Scenario 2) which corresponds to the time when the JavaScript order was changed on CTI. HBX Analytics was the fourth JavaScript in the HTML before the change, and first after the change, and the HBX results were on the higher side after the change.
  2. Google Analytics appears to count significantly higher than any of the other vendors on Tool Parts Direct (TPD). However, on TPD, the Google Analytics code is present in the HTML header, and all the other vendors are placed immediately before the closing </body> tag at the bottom of the HTML for the page.

We measured the average time between the completed execution of Google Analytics, and the completed execution of IndexTools (the next analytics package to execute), and that time delay was about 3.3 seconds (Google was finished at 0.7 seconds after the page began loading, and IndexTools was finished at around 4 seconds).
In another test, for which the results are shown in Section 6, we showed that an execution delay of 1.4 seconds would result in a loss of 2% to 4% of the data. It is our theory that as the delay in execution expands the amount of lost data increases.
The loss occurs because the user sees the link they want, and clicks on it before the analytics software ever executes. On TPD, because the Google Analytics JavaScript is in the header, it always executes before the page is displayed. IndexTools was not finished for another 3.3 seconds. It is reasonable to project that a significant number of users will have moved on by that point in time. In Section 6, we speculate that this number may be 12.2% of the users.

  3. Clicktracks reported the highest numbers on AdvancedMD.com, and the second highest numbers on ToolPartsDirect.com. Our later analysis shows reasons why Clicktracks may tend to count quite a bit higher on PPC driven sites (which is the case for AMD and TPD).

Clicktracks uses a shorter inactivity timeout for sessions (see Appendix B for more details on this), and also will treat any new PPC visit to a site as a new session. Clicktracks is more heavily optimized for the management of PPC campaigns than other packages, and this is one of the results of that.

  4. On HomePortfolio.com, WebTrends reported significantly more visitors and unique visitors than the other vendors (about 20% more). This is the only site that we were able to look at WebTrends numbers for at this stage in the project.

Google Analytics reported the second highest numbers on this site.

  5. On CityTownInfo.com, the highest numbers were reported by IndexTools.

Content Group Data

  1. Here is the form completion and content group page view data for each of the analytics packages and CityTownInfo.com:
  Form 1 Form 2 Form 3 Group 1 Views Group 2 Views Group 3 Views
Clicktracks 169 567 69 45,646 3,833 9,423
Google Analytics 172 543 59 59,638 4,695 12,255
IndexTools 177 616 68 67,166 4,891 14,461
Unica Affinium NetInsight 172 572 70 60,699 4,713 12,291
WebSideStory HBX Analytics 162 560 69 54,889 4,274 14,763
             
Average 170 572 67 57,608 4,481 12,639
             
Clicktracks % 99.18% 99.20% 102.99% 79.24% 85.54% 74.56%
Google Analytics % 100.94% 95.00% 88.06% 103.52% 104.77% 96.96%
IndexTools % 103.87% 107.77% 101.49% 116.59% 109.14% 114.42%
Unica Affinium NetInsight % 100.94% 100.07% 104.48% 105.37% 105.17% 97.25%
WebSideStory HBX Analytics % 95.07% 97.97% 102.99% 95.28% 95.38% 116.81%
  2. Here is the content group page view data for each of the analytics packages and HomePortfolio.com:
  Group 1 Views Group 2 Views Group 3 Views Group 4 Views
Google Analytics 4,878,899 514,704 448,355 11,823
IndexTools 4,844,642 520,521 457,857 11,540
WebSideStory HBX Analytics 2,222,843 161,922 317,307 10,787
         
Average 3,982,128 399,049 407,840 11,383
         
Google Analytics % 122.52% 128.98% 109.93% 103.86%
IndexTools % 121.66% 130.44% 112.26% 101.38%
WebSideStory HBX Analytics % 55.82% 40.58% 77.80% 94.76%

Analysis and commentary on Content Group data

  1. Interestingly, this data has less variation across packages than the traffic data (we discuss the exception of HBX Analytics running on HomePortfolio.com below). This is largely because it is page view based, and page views are inherently easier to track accurately than visitors.

The reason for this is that page views are, generally speaking, easy to count, and there is less variance in the algorithms that the web analytics packages use. Basically, every time the JavaScript runs, the page view count is updated.
Tracking visitors and unique visitors is quite a bit more complicated. In Appendix B, we explain why in more detail, but basically, session tracking relies on cookies and post processing to count visitors and unique visitors. There are a number of heuristics and basic implementation decisions that each vendor makes that have a major impact on visitor and unique visitor totals.

  2. As an exception to this, the HBX Analytics content group data for HomePortfolio is quite a bit lower than that of the other packages. However, we discovered that this is due to an implementation error by our team.

Note that this is not a reflection of the difficulty in implementing HBX Analytics. Instead, it’s a reflection of how important it is to understand exactly what it is that you want the analytics software to do, specifying it accurately, and then double checking that you are measuring what you think you are measuring.
In this case, we set up HBX Analytics to track people who initially entered at pages in the content group, rather than tracking all the page views for the content group, which is what we wanted.
There is a key lesson in this. Implementation of an analytics package requires substantial forethought and planning. And, when you are done with that, you have to check, and recheck your results, to make sure they make sense. Here is a summary of some of the issues you face in setting up your implementation correctly:

  1. Tagging errors – an error in tagging (placing JavaScript on) your pages can really throw you for a loop. These errors are easy to make: tagging pages is essentially a programming task, you need to remember to tag every page, and the job gets harder still once you begin customizing the JavaScript. You need to do a comprehensive job of setting the software up for success.
  2. Understanding the terminology – each package uses terms in different ways, and it’s important to understand them.
  3. Learning the software, and how it does things – each software package has its own way of doing things.
  4. Learning your requirements – this is a process all by itself. If you are implementing analytics for the first time it may be many months before you truly understand how to use it most effectively on your site.
  5. Learning the requirements of others in your organization – these are not necessarily the same as your personal requirements. For example, your CEO may need one set of information, your VP of Sales something else, and your business analyst something else entirely.
  6. Validating the data – even if you are not running more than one analytics package, you need to have a method of testing the quality of your data and making sure it makes sense.

One way to reduce many of these risks is to install multiple analytics packages. We often put Google Analytics on sites, even if they already have an analytics package on them. This is not to say that Google Analytics is the gold standard. With this approach, however, if you spot substantial differences (30% or more, for example) between the two packages, that would provide you a visible clue that something may have gone wrong in your tagging or setup!
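A simple way to operationalize that cross-check is to pull the same metric from both packages and flag large discrepancies. A minimal sketch, with hypothetical numbers and the 30% threshold suggested above:

    // Hypothetical daily visitor totals from two packages on the same site.
    var primaryPackage = 52000;
    var googleAnalytics = 36000;
    var ratio = Math.max(primaryPackage, googleAnalytics) /
                Math.min(primaryPackage, googleAnalytics);
    if (ratio > 1.3) {
      // More than a 30% gap suggests a tagging or setup problem, not just
      // normal variance between sessionization algorithms.
      console.log("Investigate implementation: packages differ by " +
                  ((ratio - 1) * 100).toFixed(0) + "%"); // 44%
    }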

Section 5: Why Accuracy Matters

As Jim Sterne is fond of saying, if your yardstick measures 39 inches instead of 36 inches, it’s still great to have a measurement tool. The 39 inch yardstick will still help you measure changes with a great deal of accuracy. So if tomorrow your 39 inch yardstick tells you that you are at 1 yard and 1 inch (i.e., 40 inches), you know you have made some progress.
Having explained the value of a 39 inch yardstick, it is worthwhile to take a moment and consider the value of accuracy in analytics. To evaluate how far apart our yardsticks are getting, we looked a bit further at our data to see how the difference between the packages reporting the most traffic, and the least traffic varied, for each site:
Max Differential Per Site – Visitors

AMD 153.22% Clicktracks / Unica Net Insight
TPD 154.21% Google Analytics / HBX Analytics
HP 114.55% WebTrends / HBX Analytics
CTI 123.15% Clicktracks / HBX Analytics

Max Differential Per Site – Unique Visitors

AMD 120.90% Omniture / Unica Affinium NetInsight
TPD 112.42% Google Analytics / HBX Analytics
HP 136.43% WebTrends / HBX Analytics
CTI 116.50% IndexTools / HBX Analytics

Max Differential Per Site – Page Views

AMD 120.80% Omniture / Unica Affinium NetInsight
TPD 161.15% Google Analytics / HBX Analytics
HP 116.20% WebTrends / HBX Analytics
CTI 120.75% IndexTools / HBX Analytics

As you can see, the differences in the above data between the low counting software and the highest counting software are substantial.
Given the notion of a 39 inch yardstick, how much does this matter? Actually, in some situations, it matters a lot. Here are five example scenarios I have heard about recently:

  1. Company A acquires company B’s web site, and one of the key metrics discussed during the acquisition is the traffic level to the site. One reason that traffic may be a key metric, for example, is that you may know that you have an ability to achieve a certain amount of revenue per visitor, based on the way your analytics package counts visitors.

But if the site you just acquired is running a different analytics package that reports 50% more traffic on the acquired site than your analytics package does, you are going to be extremely unhappy once you set up your analytics package on the acquired site and see the “real numbers”.
This is a clear scenario where you need to calibrate your analytics. Ideally, you should get your analytics software installed on the site to be acquired prior to finalizing the acquisition, so you can see the traffic numbers in real terms that you are familiar with.
A backup plan would be to take one of the free packages, such as Google Analytics or Clicktracks Appetizer, and place it on both the site to be acquired and your own site, so you can get a clear reference point on the traffic.

  2. Company A has been running one analytics software package for a long time, but decides to switch to another one. Perhaps there is a limitation in the first package causing them to make the switch.

They get it running, and they find the traffic numbers vary wildly by category of data. In some cases the discrepancy is quite large. Now management has lost all confidence in the analytics data they are dealing with. The team that has done the implementation is in all kinds of hot water.
Not having any confidence in the metrics for your business can be considered a small scale disaster. Consider this: If you are a senior manager in the business, and you don’t believe the numbers coming from your analytics software, wouldn’t you consider not spending any more money on it?

  3. Company A is running a PPC campaign. They know from other tracking mechanisms that they have in place (such as a parameter on the URL) that they are getting a 27% margin on their PPC campaigns. Now they want to use their analytics solution to give them the insight to further optimize and improve their campaign.

The problem occurs when they start seeing a different set of results from their analytics data. This causes them to lose confidence in the data that they are looking at, and therefore they may choose not to proceed with using the analytics software to help tune their PPC campaigns.

  4. Company A is running a PPC campaign. They are comparing their incoming click data reports from the search engine with the data they see in their analytics, and they don’t match up. They are wondering if the search engine is ripping them off.
  5. Company A is selling impression based advertising to Company B. They are using Company A’s web analytics software to measure the number of impressions generated. Both companies want to make sure that the count is accurate.

Accuracy Summary

The stories above are not uncommon. However, analytics solutions can be extremely effective in helping you tune your web site. The first thing you should know is that often the largest source of error in web analytics, when using a JavaScript solution, is an implementation problem.
During the Shoot Out, we went to great pains to make sure that we had correct implementations for all the tools, and the analytics vendors helped us with this. But there are many different sources of error. In our test, some of these errors would affect all the analytics packages tested equally. For example, a user who uses multiple computers would likely be seen as multiple users by all the packages.
Here is a summary of some factors that would potentially cause the analytics packages we tested to report different results:

  1. Placement of the JavaScript on the page – as we will see in Section 6, this does affect counting significantly.
  2. The session timeout algorithm used (see Appendix B for more on this; a minimal sketch of an inactivity-timeout sessionizer follows this list).
  3. Other factors that drive the initiation of new sessions, such as beginning a new session on any new visit from a PPC search engine (see Appendix B for more on this).
  4. Aggressiveness with which questionable sessions are discarded (see Appendix B for more on this).
  5. Cookie blocking – when cookies are blocked, some packages fall back on a combination of pixel tracking and/or IP and user agent detection to still count those visitors, and some packages do not; in addition, there are multiple ways to implement the fallback.
  6. Spyware blocking communications with the analytics server. This will not affect implementations where first party cookies are set up at the DNS level.
  7. Analytics server down time (rare).
  8. Network problems preventing communication with the analytics server.
  9. Analytics servers being blocked by firewalls (e.g. a corporate firewall).
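To illustrate items 2 and 3, here is a minimal sketch of an inactivity-timeout sessionizer. The 30-minute timeout and the PPC rule are representative choices, not any particular vendor's algorithm (Clicktracks, for example, uses a shorter timeout, per Appendix B):

    // Count sessions from a time-ordered list of hits for one visitor.
    // The timeout and the PPC rule are illustrative, not vendor-specific.
    function countSessions(hits, timeoutMinutes) {
      var sessions = 0;
      var lastHitTime = -Infinity;
      hits.forEach(function (hit) {
        var gapMinutes = (hit.time - lastHitTime) / 60000;
        var newPpcClick = hit.source === "ppc"; // a fresh paid-search click
        if (gapMinutes > timeoutMinutes || newPpcClick) {
          sessions++; // inactivity timeout or new PPC visit starts a session
        }
        lastHitTime = hit.time;
      });
      return sessions;
    }
    // A shorter timeout splits the same hit stream into more sessions, which
    // is one reason different packages report different visit totals.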

Here is a summary of some factors that would affect the results of the analytics packages we tested equally:

  1. Multiple users on one computer will be treated as a single user.
  2. One user who uses multiple computers will be counted as multiple users.
  3. JavaScript is disabled on the user’s browser.

In rough terms, for two of our sites, our data showed that the highest count was about 20% above the average of all the packages, and the lowest showed data about 20% below the average of all the packages.
This variance is largely attributable to design and implementation decisions made by the software development teams that created each package, resulting in greater or lesser accuracy (but there is no way to know which one was most accurate).

What to do about it

Now we get back to our 39 inch yardstick. Perhaps based on our data we should be referring to this as a 43 inch yardstick (120% of 1 yard). Should we be alarmed at this level of variance in the results? Not really, but it is important to understand that these sources of error exist, and it’s important to understand how to deal with them.
First of all, some of the largest sources of error, such as those that relate to session management and counting, do cause a variance in the traffic results returned by the packages, but they do not affect the ability of the program to monitor the key performance indicators (KPIs) for your site. For example, one large potential source of error is the aggressiveness with which questionable sessions are filtered out.
An example of a potentially questionable session is a visit from a new IP address with an unknown user agent, that views only 1 page, and that has no referrer. One package might throw this out, and another might leave it in. This type of decision-making could have a large effect on traffic counts, but the visitors we are talking about are untraceable.
The analytics packages that are throwing this data out have made the decision that the data is not useful or relevant. There is a chance that they are wrong in some cases. However, even if they do throw out some relevant data, the analytics package is measuring the behavior of the great majority of your users.
Even if an analytics package is measuring the behavior of only 80% of your users, it remains highly relevant and valuable data. By contrast, the traditional print industry relies on subscriber surveys, and feels lucky if it gets a 20% response. They would die for data on 80% of their customers.
The fact is that some percentage of the questionable data is bad, and some of it may actually relate to a real user. The package that throws out too little gets skewed in one direction, and the package that throws out too much gets skewed a little bit in the other direction.
Neither of these changes the ability of these packages to measure trends, or to help you do sophisticated analysis about what users are doing on your site.
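Here is a sketch of the kind of heuristic involved, using the example just described; the field names are hypothetical:

    // Hypothetical heuristic: flag sessions that carry no traceable signal.
    function isQuestionable(session) {
      return session.pageViews === 1 &&  // bounced after a single page
             !session.referrer &&        // arrived with no referrer
             session.newIp &&            // IP address never seen before
             !session.knownUserAgent;    // user agent not recognized
    }
    // One package may discard sessions like this; another may keep them.
    // Neither choice changes the measured behavior of traceable visitors.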
Here are some suggestions on what you can do to deal with the natural variances in analytics measurement techniques:

  1. Realize that there are variances and errors, and get comfortable with the fact that the tools all provide very accurate relative data measurement. As we said before, if your 43 inch yardstick tells you that page A is converting better than page B, or that visitors from Europe buy more blue widgets than visitors from North America, that is solid and dependable information.

Similarly, if your 29 inch yardstick told you that you have 500,000 unique visitors two months ago, and also tells you that you received 600,000 unique visitors last month, you can feel comfortable that your business grew by approximately 20%.

  2. Don’t get hung up on the basic traffic numbers. The true power of web analytics comes into play when you begin doing A/B testing, multivariate testing, visitor segmentation, search engine marketing performance tracking and tuning, search engine optimization, etc.
  3. Calibrate whenever you can. For example, if you have a PPC campaign, use some other mechanism to see how your results compare at a global level. This other mechanism will help you cross check the accuracy of your analytics data, and help ferret out any implementation errors.

Note that the analytics package will be able to do many other extremely valuable things that other tracking mechanisms can’t, such as match up conversions with landing pages, navigation paths, search terms and search engines, etc.
Or, using the acquisition example we talked about above, use a common analytics package between two different sites to get a better idea as to how the data between the two sites compares. For example, part of the due diligence process could be the installation of Google Analytics on your site, and the site you are looking at acquiring, and then comparing the numbers from Google Analytics side by side.
Comparing two sites using the same analytics tool will remove the largest source of error beyond your control, namely, the specific design and implementation decisions made in building the tool.

  4. Realize that the biggest sources of error are JavaScript implementation errors. This could be as simple as pages that are missing the JavaScript, pages with malformed JavaScript, or problems that crop up as pages get added to the web site, moved, or removed from the web site.

This is an error completely within your control, and one that is quite potentially more devastating than any variance in the counting techniques used by the packages.
Note that we do not believe that this affected our report, except where noted, because we had the active help of the analytics vendor in setting up their JavaScript and making sure that we were error free.
In addition, the JavaScript was installed using include files, so any failure to place one vendor’s tag on a given page would result in all the vendors not being on that page.
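Even with include files, it is worth spot-checking rendered pages. Here is a minimal audit sketch one could run in a browser; the substrings are placeholders and would need to match each vendor's actual tag URLs:

    // Run in the browser on a rendered page: verify each vendor's script
    // reference is present. The substrings are placeholders, not real URLs.
    var expectedTags = ["urchin", "indextools", "clicktracks", "hbx", "netinsight"];
    var html = document.documentElement.innerHTML.toLowerCase();
    expectedTags.forEach(function (tag) {
      if (html.indexOf(tag) === -1) {
        console.log("Missing analytics tag: " + tag);
      }
    });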

Section 6: JavaScript Placement Matters

One of the things I heard from the vendors was a concern about where their JavaScript was placed on the page, with respect to that of the other vendors. The notion is that people may arrive at a web page and leave it, by clicking on something or leaving the site, before the JavaScript finishes executing.
Obviously, the longer it takes a page to load, the worse this problem would be. The further down the page the JavaScript is placed, the lower its chance of executing before the user moves on. In addition, if there is a problem with one vendor’s server, the vendors further down in the stack of JavaScript would never get to execute.
So after hearing this concern a number of times, we decided to see just how much it really mattered. To test this, we took City Town Info, and shuffled the order of the JavaScript. The original order was as follows:

  1. IndexTools
  2. Clicktracks
  3. Google Analytics
  4. HBX Analytics
  5. Affinium NetInsight

In the second stage of the test, the order became:

  1. HBX Analytics
  2. Affinium NetInsight
  3. IndexTools
  4. Clicktracks
  5. Google Analytics

Here is the data we received using the original JavaScript ordering:

CityTownInfo.com Analytics Data – Pre JS Change Visitors Uniques Page Views
Clicktracks 663,803 609,511 1,071,589
Google Analytics 603,619 586,580 1,045,327
IndexTools 638,602 618,376 1,138,659
Unica Net Insight 627,072 614,512 1,062,493
WebSideStory HBX Analytics 525,038 513,020 922,692
Average 611,627 588,400 1,048,152
         
Clicktracks % 108.53% 103.59% 102.24%
Google Analytics % 98.69% 99.69% 99.73%
IndexTools % 104.41% 105.09% 108.63%
Unica Net Insight % 102.53% 104.44% 101.37%
WebSideStory HBX Analytics% 85.84% 87.19% 88.03%
         
Standard Deviation 47435 39272 70278
Clicktracks Std Deviations 1.3 0.66 0.38
Google Analytics Std Deviations -0.2 -0.06 -0.05
IndexTools Std Deviations 0.67 0.94 1.46
Unica Net Insight Std Deviations 0.38 0.82 0.23
WebSideStory HBX Analytics Std Deviations -2.15 -2.36 -2.03

Here is the data we received with the revised JavaScript ordering:

CityTownInfo.com Analytics Data – Post JS Change Visitors Uniques Page Views
Clicktracks 716,541 662,366 1,188,151
Google Analytics 682,844 664,419 1,214,729
IndexTools 724,204 700,283 1,304,817
Unica Net Insight 812,109 710,423 1,289,562
WebSideStory HBX Analytics 719,063 704,896 1,288,945
Average 730,952 688,477 1,257,241
         
Clicktracks % 98.03% 96.21% 94.50%
Google Analytics % 93.42% 96.51% 96.62%
IndexTools % 99.08% 101.71% 103.78%
Unica Net Insight % 111.10% 103.19% 102.57%
WebSideStory HBX Analytics% 98.37% 102.38% 102.52%
         
Standard Deviation 43117 20742 46678
Clicktracks Std Deviations -0.36 -0.82 -1.12
Google Analytics Std Deviations -1.2 -0.75 -0.69
IndexTools Std Deviations -0.17 0.37 0.77
Unica Net Insight Std Deviations 2.02 0.69 0.52
WebSideStory HBX Analytics Std Deviations -0.3 0.51 0.51

Clearly, the JavaScript execution order is a factor in the test. We decided to summarize the data by comparing the placement of the JavaScript with the ranking of the 3 traffic metrics, as follows:

City Town Info Results Summary – Pre JS change JS Order Visits Uniques PVs 3 Metric Average
IndexTools 1 2 1 1 1.3
Clicktracks 2 1 3 2 2
Google Analytics 3 4 4 4 4
Visual Sciences HBX Analytics 4 5 5 5 5
Unica Affinium Net Insight 5 3 2 3 2.7

Here is the same table, but showing the results after the JavaScript order changed:

City Town Info Results Summary – Post JS change JS Order Visits Uniques PVs 3 Metric Average
Visual Sciences HBX Analytics 1 3 2 3 2.7
Unica Affinium Net Insight 2 1 1 2 1.3
IndexTools 3 2 3 1 2.0
Clicktracks 4 4 5 5 4.7
Google Analytics 5 5 4 4 4.3

A few observations emerge from this:

  1. In the first set, the order of the results almost exactly mirrored the order of the JavaScript, with the notable exception of Unica’s Affinium Net Insight, which finished in the middle of the five packages, even though it was in the 5th position.
  2. In the second set, Affinium Net Insight scored first from the second position, and Visual Sciences HBX Analytics scored third, even though it was in the first position.

  3. The clear indication is that JavaScript placement matters. Bear in mind that in Section 4 we showed two different sets of results for City Town Info that were run over different and non-overlapping time intervals, and the differences in the relative data comparisons were small. So the differences in numbers due to JavaScript placement, as documented in this section, are significant.

It’s important to focus on this in the right way. The critical issue affecting how a package counts visits is the delay between when the page begins to load and when the JavaScript finishes executing. It’s the time delay that matters, because it provides users with increasing opportunities to see what they want, and to click on the next link, or to simply leave the site. (A minimal sketch at the end of this section shows one way to measure this delay.)
We have no reason to suspect any other interaction issues between the various packages.

  4. From this data, we can hypothesize, but cannot prove, that HBX tends to count a bit lower than other packages on sites like CTI, and that Unica Affinium NetInsight tends to count a bit higher than other packages on sites like CTI. Clearly, JavaScript execution order is not the only factor at work here in the differences between our packages.
  5. Lastly, we should point out that data lost due to JavaScript placement should be considered an error, not a variance. By this we mean, it represents truly lost data.

For example, if a user arrives on your site’s home page, and then clicks on a link to go to another page on the site before the JavaScript for your analytics package can execute, your analytics will not count that visitor as having landed on the home page.
Most likely, it will count the next page as the landing page, and consider the home page of your site as the initial referrer. If the user had in fact come from a search engine you will likely have lost information on the keyword used as well.
In addition, if a user arrives on your home page, and then leaves (bounces) before the JavaScript runs, the analytics software will not count that user in the bounce rate calculations for your site.
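As referenced above, one simple way to measure the delay that matters here is to record a timestamp as early in the page as possible and compare it to the moment each tag finishes. A minimal sketch; the variable names are our own:

    // As early as possible in the <head> of the page:
    var pageStart = new Date().getTime();

    // Immediately after a vendor's tag, near the bottom of the page:
    var delayMs = new Date().getTime() - pageStart;
    // Every user who leaves within delayMs of the page starting to load is
    // invisible to that tag; comparing delayMs across vendors quantifies
    // the execution-order effect measured in this section.
    console.log("Tag finished " + delayMs + " ms after page start");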

Tracking pixel test

Next, we did an additional test. Working together with IndexTools, we implemented a tracking pixel at the top of the page code on City Town Info. This tracking pixel was placed just after the opening <body> tag, and the remainder of the JavaScript was left in place at the other end of the file, immediately before the closing </body> tag.

  Visitors Uniques Page Views
IndexTools JavaScript 520,672 503,371 937,372
IndexTools Tracking Pixel 614,393 525,159 952,491
Ratio 1.180 1.043 1.016

When you look at this data, the Visitors number jumps out at you: it’s a startling 18% higher for the tracking pixel at the top of the source code than for the JavaScript at the bottom. However, it turns out that visitor counting using a tracking pixel has some inherent inaccuracies that result in a bit of double counting.
The short explanation of what is transpiring is that the analytics server undertakes multiple communications with the user’s machine to determine whether it can set a cookie. The server has a timeout (in IndexTools’ case, the timeout used is 5 seconds), after which it assumes a cookie can’t be set, and then moves on to performing IP and User Agent based tracking.
Some of the time, the user’s computer actually does allow a cookie, and the confirmation comes back after the 5 seconds have expired. When this happens, the software still sets the cookie, and the user gets double counted.
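
Here is a minimal sketch of that race condition. The function and counter names are invented purely to illustrate the timing logic described above; this is not IndexTools’ code:

```javascript
// Sketch of the double-counting race (assumed logic, not vendor code).
const TIMEOUT_MS = 5000; // the 5 second cookie timeout described above

function trackFirstHit(hit, counters) {
  // If no cookie confirmation arrives within the timeout, fall back to
  // IP + User Agent based counting.
  const fallback = setTimeout(() => {
    counters.visitors += 1; // counted once, via IP/UA
  }, TIMEOUT_MS);

  // The returned callback runs when the cookie confirmation arrives.
  return function onCookieConfirmed() {
    clearTimeout(fallback);  // a no-op if the fallback already fired
    counters.visitors += 1;  // counted again via cookie if we were too late
  };
}

// A confirmation that arrives after 6 seconds gets counted twice:
const counters = { visitors: 0 };
const confirm = trackFirstHit({ ip: '1.2.3.4', userAgent: 'Mozilla' }, counters);
setTimeout(confirm, 6000); // counters.visitors ends up at 2: double counted
```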
In any event, this error affects only the visitor count in our data above, and it means we can’t use that metric to determine the difference between being at the top of the page and the bottom. However, we can still use the unique visitor and page view numbers to get a sense of how much the placement affected our counting.
This turns out to be about 1.6% for page views and 4.3% for unique visitors. What is driving this difference? Ultimately, it is the time delay between when the tracking pixel finishes executing and when the JavaScript finishes executing. Some visitors will leave the site, or click on a link and move on to the next page, before the JavaScript finishes executing.
In a few sample data points we determined that the average delay between the completed execution of the tracking pixel and the JavaScript was about 1.4 seconds. The site on which we ran the test has a pretty fast execution time, even with all the JavaScript that was present on the pages.
Slower loading sites are likely to have a more significant impact due to the positioning of the JavaScript at the top or the bottom of the page. Some industry studies suggest that users make their decision about your web page in roughly 3 seconds. On our test site, the JavaScript generally finished executing at around 2.5 seconds.
This was still enough to be a factor in counting, but a relatively small factor as noted above. I would conjecture that conducting this test on a site where the JavaScript did not finish executing until 4 or 5 seconds would show even more dramatic results.

Google Analytics and Tool Parts Direct

We saw earlier in Section 4 that Google Analytics reports significantly higher numbers than the other packages that were run at the same time on the Tool Parts Direct site. We also stated our belief that this was due to the Google Analytics JavaScript being placed in the HTML header of that site, while the JavaScript for the other packages was installed just before the closing </body> tag in the HTML.
We measured the time from when Google Analytics JavaScript finished executing to when the next package’s (IndexTools) JavaScript finished executing, and we saw that the average time between the two was 3.3 seconds.
It’s worth looking at this in a bit more detail, to see if we can estimate the impact of this additional 3.3 second delay and how it compares to the 1.4 second delay we measured and discussed above. To see how Google Analytics fares in general compared to the other packages across the sites in the test, let’s look at the following table:
Google Analytics as a % of the Other Packages’ Averages (Visits)

AMD | TPD | HP | CTI | CTI JS Changed
124.99% | 140.27% | 92.32% | 98.37% | 91.91%

What this table shows is the number of Google Analytics visits divided by the average of the other analytics packages we ran on the same site. For example, the AMD number of 124.99% represents the Google Analytics number of visits, divided by the average of the other five packages installed on the AMD site.
Google Analytics tended to be in the middle of the pack on HP, CTI, and CTI JS Changed; these sites derive most of their traffic from organic search. On AMD and TPD, the traffic is largely PPC based.
To be conservative in our speculation, let’s use AMD as an indicator of what to expect from Google Analytics on a PPC-driven site. We can then further speculate that the delay of 3.3 seconds results in a loss of about 12.2% of the data (dividing the TPD number by the AMD number from the above chart: 140.27% ÷ 124.99% ≈ 1.122).
This is all pure speculation. However, it does stand to reason that a greater delay would provide more users with the opportunity to leave the site prior to the JavaScript executing.

JavaScript Execution Time

Since JavaScript placement does seem to have an impact, we decided to measure the exact execution time of the JavaScript of the respective vendors. In theory, page placement is a factor because the longer it takes for your analytics JavaScript to begin and finish executing, the more likely it is that a user will click on some link they see and leave the page they are on before the JavaScript can finish.
As a result, analytics software with a long execution time would be more prone to these types of errors. The following measurements were taken using a tool known as HttpWatch (http://www.httpwatch.com/).
For each analytics package, we measured the JavaScript execution time on 3 of the sites in the study (Advanced MD, City Town Info, and Tool Parts Direct). On each site, we took 4 measurements: two on one day, and two more 14 days later. All measurements were taken during the early afternoon, Eastern Standard Time.

JavaScript Execution Time (seconds)

Tool | Advanced MD | City Town Info | Tool Parts Direct
Omniture | 1.41 | n/a | n/a
IndexTools | 0.51 | 0.39 | 1.30
HBX | 0.61 | 0.57 | 0.61
IndexTools Tracking Pixel | n/a | 0.57 | n/a
Google | 0.29 | 0.23 | 0.42
Unica | 0.30 | 0.21 | n/a
Clicktracks | 0.30 | 0.22 | 0.26

For an aggregate look at the data, here are the average results for each tool.

# of Sites | Tool | Average Time (seconds)
1 | Omniture | 1.41
3 | IndexTools | 0.73
3 | HBX | 0.60
1 | IndexTools Tracking Pixel | 0.57
3 | Google | 0.31
2 | Unica | 0.25
3 | Clicktracks | 0.26

Omniture, with the sample size of the 1 site we ran it on, seemed to take the longest. Second in length was IndexTools, although it should be noted that the time it took for the IndexTools JavaScript to run on Tool Parts Direct was quite a bit longer than for the other sites.
The IndexTools code running on Tool Parts Direct was the same code that was run on Advanced MD and City Town Info, so this suggests that some external factor was in play here. We have not identified what that factor may be, if any, at this point in time.

Section 7: Qualitative Comparisons

This section will discuss some of the strengths and weaknesses of each of the 5 packages that worked closely with us during the 2007 Web Analytics Shootout. We don’t suggest that this is a comprehensive analysis of all the aspects of the products reviewed, but it does cover a number of factors with regard to each package.
In particular, we try to focus our efforts here on information that may impact your purchase decisions, or use of your analytics software.

7.1 Clicktracks

In general terms, Clicktracks focuses on ease of installation and setup and ease of use, and also offers excellent pay per click campaign management tools. The product does not offer the same level of configurability and options available to customers of Omniture, Visual Sciences, or Unica.
However, it offers a powerful package for companies that are ready to do deeper analysis than they can with Google Analytics, at a much lower price tag than some of the other vendors. In addition, the ease of installation and setup will be a big positive for those who want to go deeper with analytics, but are not yet ready to invest heavily in web analytics development tasks.
Clicktracks also has a lot to offer customers who want to manage their pay per click (PPC) campaigns. This includes basic bid management capability built directly into the analytics application, and a Click Fraud Report that helps customers track down potential click fraud. This is a report that carries some authority in the eyes of the search engines.
Lastly, it should be mentioned that Clicktracks also offers a free application known as Clicktracks Appetizer (http://www.clicktracks.com/products/appetizer/). The most significant difference between Clicktracks Appetizer and Google Analytics is that Clicktracks Appetizer reads log files, and does not rely on JavaScript tags.
As a result, it makes a great choice as a free application for website owners who want search engine robot data included with their analytics.
Key Technical Points:

Cookie Type: First Party
Cookie Setup: None required
Session Inactivity Timeout: 15 minutes (user configurable)
Continuous Session Timeout: 15 minutes (user configurable)
Blocked Cookie Handling: IP and UA tracking
Other Session Factors: Search engine hits always initiate a new session
Real Time: No, but updates (up to every 15 minutes) are possible with the log file version of Clicktracks
Data Store: Proprietary
JavaScript Customization: None required, except for detailed conversion tracking

Clicktracks – Key Strengths:

The first 10 items are taken from the article we developed with the help of Clicktracks titled: 10 Cool Things you can do with Clicktracks.
The article provides a more complete description of each of the first 10 items listed below, along with a rich array of screenshots. Additional strengths (beyond these 10) are listed starting with item 11.

  1. Optimize your PPC Campaigns: Clicktracks allows you to perform basic bid management functions without having to purchase an additional software component, as is required by some of the other vendors. All that is required is to enter your Google AdWords and Yahoo Search Marketing account information into the Campaign Manager screen (or import any other campaign reports using the Clicktracks campaign template).
  2. Slice and Dice your Visitors with Segmentation: Clicktracks offers great segmentation capability in the form of “labels”. Labels allow you to select a wide range of criteria to look at specific groups of visitors, and analyze their behavior separately. For example, you can look at your organic traffic and PPC traffic separately. You can also analyze visitors based on where they are located geographically. Or you can segment based on the pages they visited, entered on, or exited from.

The Clicktracks label setup is simple, elegant, and powerful. Best of all, the labels automatically show up in all of the reports throughout Clicktracks. You can make the labels invisible in some or all reports by clicking a box in a configuration screen.
One of the niftiest aspects of this feature is that you can create labels which are combinations of other labels. This allows you to do some pretty sophisticated analysis.
For example you could create a label that tracks people who came from Canada and stayed on the site for longer than 3 minutes. Or, you could create a label for someone who came to the site from Google, visited a certain page, and generated at least $10 in revenue.

  3. Apply New Analysis to Old Data: Ad hoc analysis is a key capability in analytics. An example of an ad hoc analysis occurs when you decide that you want to see how conversion rates were last Christmas sorted by product, yet you were not specifically capturing that data last Christmas. With ad hoc analysis in Clicktracks you can define a new label, do a reanalysis of the data, and presto, you have that label applied to the historical data.
  4. Improve Your User Experience: Clicktracks offers some nice tools for gaining insight into your site’s visitors. Clicktracks provides Entrance Path and Exit Path analysis as a way to see the flow of traffic through your site. This type of information can help you better understand how users are experiencing your site.

You can also configure an unlimited number of funnels to track progression through your site. Combined with labels, you can track specific groups of users.

  5. SEO Optimization Tools: Clicktracks offers a search engine report that provides you with information on the keywords that are bringing traffic to your site, broken out by search engine. This helps you see what terms are ranking highly on a search engine by search engine basis. You can also easily see plenty of other data, such as the conversion rate, average time on site, cost per visitor, revenue per visitor, and total revenue for each keyword.
  6. Contextual Analysis: By displaying the statistics for one of the reports in a browser, Clicktracks’ Navigation Report allows you to see information in context, while navigating your site the same way a user would travel through it. Seeing your actual web page the way a user sees it, with analytics data embedded on the screen and other key data on the right, puts the numbers in context.

  7. Powerful Testing Capabilities: Testing is the easiest way to get a positive ROI on your analytics investment. Clicktracks’ What’s Changed Report provides you with quick access to information about significant site changes, even with just a few days worth of data. This can help you in A/B testing, where you can create two different ad groups with different landing page URLs, and then swap the landing page URLs after a while, and see at a glance which combination of ads and landing pages brought the best results.

  8. Keyword Analysis and Research: Clicktracks’ Search Report helps you rapidly determine which of your keywords are performing the best. Chances are that these are vertically oriented terms, rather than major brand names, and rapidly identifying the most productive keywords (using metrics such as high average time on site, highest conversion rate, total revenue, and ROI) quickly helps you increase the profitability of your PPC campaign.

  9. Click Fraud Detection: This is one of the gems of Clicktracks. The Click Fraud Report provides you with a way to identify click fraud when it happens. It can also help you rapidly determine when suspicious looking activity is not likely to be click fraud, but something else, such as a poorly performing ad.

  10. Track KPIs Over Time: Clicktracks makes it easy to track your Key Performance Indicators (KPIs) over time. Once you have your KPIs set up, all you need to do is select one in the report you are looking at, and right there on the fly you will see the KPI data in that report.

  11. Log file support: While the 2007 Web Analytics Shootout focused on analyzing all the vendors using JavaScript data collection, one of the limitations of JavaScript is that it can’t track the behavior of search engine robots (because search engines don’t execute JavaScript).

Clicktracks offers you the ability to purchase their software, install it on your own server, and then perform its data analysis on your log files. This provides you with the ability to capture search engine robot behavior analysis, and provides you with the security of having all of your website data kept under your own roof.
Better still, Clicktracks allows you to operate in a mixed mode, where you can use log file analysis for robot tracking, and JavaScript data collection for visitor tracking.

  12. Labels support Perl Compatible Regular Expressions: This may sound pretty specific and detailed to some, but if you are looking to create content groups (i.e., a collection of related URLs that you want to analyze as a single entity), there will be times when the string processing power of Perl is the only way you can accurately identify the group of pages you want to isolate.

For example, on City Town Info, we wanted to look at all the city pages as a single content group. But the URL naming structure of the site did not make it possible to simply look for a specific string in the URL to identify a page as a city page. We were, however, able to describe city pages accurately using a Perl Compatible Regular Expression, along the lines of the sketch below.
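
As an illustration, a content-group expression might look like the following. The pattern and URLs are hypothetical (the actual City Town Info rule was more involved), and the syntax is shown in JavaScript, which supports the same constructs:

```javascript
// Hypothetical "city pages" content group: /<state>/<city>.html,
// excluding index pages and top-level pages.
const cityPage = /^\/[a-z-]+\/(?!index\.html$)[a-z-]+\.html$/;

console.log(cityPage.test('/massachusetts/boston.html')); // true  -> in the group
console.log(cityPage.test('/massachusetts/index.html'));  // false -> excluded
console.log(cityPage.test('/about.html'));                // false -> excluded
```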

  13. What’s Changed Report: The What’s Changed Report makes it easy and fast to see what happens when you implement a new campaign, or when you are doing A/B testing.
  14. Campaign Analysis: Setting up PPC campaigns and tracking their performance is simple in Clicktracks. Enter your Google and Yahoo account information, or import a report from any other engine (such as MSN), and you can easily see what’s losing money and where the big winners are, and take action to improve your overall results.

  15. Education: Clicktracks offers regular free education sessions and demos that are available to all customers to help them get more out of the product.

Clicktracks Weaknesses:

  1. Clicktracks does not yet offer a true ASP mode. All versions of Clicktracks, even its browser based version, currently require you to download large compilations of log file data and perform detailed processing on your own machine. Clicktracks has let me know that they will be offering a new product with a true ASP mode in the near future.

  2. The interface is hard to follow at times. While this is obviously a bit subjective, we found that remembering where to configure one item versus another was at times confusing, even when we had done it before.

  3. No customizable dashboards. Clicktracks is a one size fits all interface. Individual users can customize some aspects of the data they see, such as creating their own labels, setting up revenue, importing campaigns, configuring funnels and Internal Search/Data Dissection reports, but the level to which the experience can be individualized is fairly limited.

However, Clicktracks does offer the ability to selectively deliver reports via email, so that your CEO and VP of sales can find the numbers they are looking for in their email inbox, without having to login at all.

  4. Labels are great, but the ability to apply them selectively is limited. You can configure what reports each label shows up in through a straightforward configuration screen, but you can’t do it directly in any one report. This makes it a multi-step process if you just want to look at one label at a time, without the clutter of the other labels. One way to work around this is through the use of multiple datasets.

Note that this is only an issue if you have a large number of label scenarios you are interested in looking at, and as a result, having all labels active on the screen at one time is an obstacle to your goals.

  5. No MSN support. The campaign tracking does not currently support MSN’s adCenter API. You can, however, run the Dynamic Ad Text report from MSN and import it into Clicktracks.

7.2 Google Analytics

Google made a big splash when they acquired Urchin Software Corporation. They overhauled the hosted version, renamed it Google Analytics, and made it free to all comers. The philosophy behind Google Analytics is extreme ease of use and installation, and quick easy access to analytics information to promote faster decision-making throughout the organization.
In May of 2007, Google Analytics had its first major update in some time, and lots of weaknesses in the product were addressed, including the addition of a configurable dashboard, and the ability to email out reports. Google has also made at least two smaller incremental releases since May 2007.
Google Analytics does not contain the deeper analytics capabilities of the other packages surveyed here, but offers a lot of information for website owners to get started with. In fact, many website owners will find that this is all the capability that they will ever want.
Google Analytics does have some enterprise level customers that are evidently satisfied with what they get in the tool, such as RE/MAX. So whether there is enough for you in Google Analytics is not solely determined by your company’s size. How you are using the analytics is the determining factor, but, of course, larger companies tend to have more complex requirements.
Key Technical Points:

Cookie Type: First Party
Cookie Setup: None required
Session Inactivity Timeout: 30 minutes
Continuous Session Timeout: Sessions end at 12 midnight
Blocked Cookie Handling: Does not track visits or uniques
Other Session Factors: None
Real Time: No
Data Store: Proprietary
JavaScript Customization: None required, except for detailed conversion tracking

Google Analytics – Key Strengths:

Items 1 through 6 below are taken from the article we wrote titled: 10 Cool Things you can do with Google Analytics.
The article provides a more complete description of the 10 things you can do with Google Analytics, along with additional information and screenshots; only the differentiating items from it are repeated below. Additional strengths are listed starting with item 7.

  1. Email out your reports: While this feature exists in all the other packages, it did not exist in Google Analytics until the May 2007 release of the service. This was one of the major weaknesses of Google Analytics prior to that release.
  2. A/B Testing: Google Analytics offers support for A/B testing, and tracking the results to see how the two tests compare. It also integrates with Google’s complementary multivariate testing platform, Website Optimizer (also free).

  3. Drilling down on referrers: Google Analytics makes it easy to track what types of content get the biggest response from a particular referrer. For example, you can use this to see how a social media site responds to different content on your site.

  4. Navigation summaries for individual pages: This feature allows you to quickly and easily track the traffic history for a specific page on your site. This is very useful in understanding and analyzing the life of a particular article or piece of content.

  5. Entrance Sources Report: This report provides a really simple way to see your inbound traffic on a page by page basis. Want to see who the biggest referrers are for a newly launched piece of content? This report makes it easy to do.

  6. Entrance Keyword Analysis: Google Analytics provides a standard report that allows you to see the entrance keywords on a page by page basis. This is helpful in combination with the Entrance Sources Report.

  7. Entrance Pages Report: This report is similar to the entrance sources report, but is structured around internal traffic on your site. It makes it easy to examine and monitor the path of traffic through your site.

  8. Price: When evaluating Google Analytics you need to include its price — free — as one of its strengths. Not ready to spend a lot of money on a higher end package just yet? Use Google Analytics to get started, learn a lot about your site, and better understand what your requirements are before deciding on a higher end, and more expensive, package.

  9. Ease of sign-up and setup: No other package is easier to get started with than Google Analytics. Simply sign up with your Google account, place the JavaScript on your site, and you are collecting data.

  10. Compare Traffic Between Sites: As the data in this report demonstrates, analytics packages measure differently from one another. So if you are trying to compare traffic between Site A and Site B, and they are running different analytics packages, you simply can’t rely on those numbers. The solution? Install Google Analytics on both sites and now you have data that you can compare.

Google Analytics Weaknesses:

  1. Limited Ability to Customize: You won’t find a wide range of capabilities to customize Google Analytics. That’s not to say that there is no such capability; it’s just limited in scope compared to that of the other packages.

  2. No Log File Analysis Capability: As a result, search engine crawling data is not available within Google Analytics. Google does offer a paid Urchin Web Analytics solution that reads log files and can report on all search engine robot activity.

  3. Ad Hoc Analysis is not Possible: All of your historical data is available within Google Analytics, but once you set up a new filter, you cannot reprocess the historical data. The result is that you can only look forward in time with Google Analytics.

  4. Email Only Support from Google: Google provides free email support in the 19 languages they currently support with Google Analytics. They also have created the Conversion University, a detailed help center and a GA Google Group for peer-to-peer support. For professional services, they have partnered with a third party network of companies who specialize in a variety of services, but Google does not offer these services directly unless you are a large AdWords customer with a direct account relationship.

7.3 IndexTools

In general terms, IndexTools positions itself as offering enterprise level capability at a much lower price than competitors such as Omniture, Visual Sciences, and WebTrends. IndexTools is designed to be very easy to set up and install, but still offers a powerful set of customization capabilities.
In fact, IndexTools’ philosophy is to provide minimal out of the box reports, and to give users access to powerful customization capabilities. This approach emerges quickly within the interface, where all reports can be easily customized.
The IndexTools interface was one of the best we reviewed. Information was easy to access and find. The product allows a large array of filters, content groups, and segments to be set up directly within the UI, without a need to resort to editing the JavaScript. However, customizing the JavaScript is an option in those situations where the complexity of the filter exceeds what can be done in the UI.
IndexTools has a wide array of customers of all sizes, including a substantial number of enterprise customers. Its customization capabilities exceed those of Google Analytics or Clicktracks by a substantial margin, but are not quite as extensive as the level of customization that can be done with Omniture, WebTrends, and Visual Sciences.
However, installation and setup are in general easier than with those other applications, and in our experience the pricing IndexTools offers is substantially more aggressive.

Key Technical Points:

Cookie Type: First Party
Cookie Setup: None required
Session Inactivity Timeout: 30 minutes
Continuous Session Timeout: 8 hours
Blocked Cookie Handling: Tracks visits and uniques with IP and User Agent
Other Session Factors: None
Real Time: Yes
Data Store: Proprietary
JavaScript Customization: To supplement UI based customization

IndexTools – Key Strengths:

The first 10 items are taken from the article we developed with the help of IndexTools titled: 10 Cool Things you can do with IndexTools.
The article provides a more complete description of each of the first 10 items listed below, along with a rich array of screenshots. Additional strengths (beyond these 10) are listed below, starting with item 11.

  1. Customize Reports: IndexTools makes it easy to customize any of its reports. Just go to the report you want to customize and click on the “Customize Report” button, and you are presented with a drag and drop interface to do your customizations. You can pick any of a wide range of groupings or metrics, so there is a lot of flexibility here. In addition, you can bookmark the report so you can look at your customized report any time you want.
  2. Customize Dashboards: IndexTools provides you with the ability to offer different dashboards for each user, or even a custom set of dashboards for each user. This allows your CEO to have one experience, your VP of sales to have a different experience, and your business analyst another experience altogether.

  3. Ad Hoc Scenarios: Business analysts constantly want to know what has happened in the past. IndexTools supports this ability quite simply. All you need to do is to set up the scenario you want to analyze (by setting up filters or a custom report) and then use the calendar settings to go back to any date or date range setting you choose.

  4. Filters: In any of the standard reports in IndexTools you can apply custom filtering right there on the screen. All you need to do is pick “Show Filters”, select your filter type, and you can immediately see the results. You can take this further by implementing more than one filter, customizing the report as described above, or you can bookmark the report for later retrieval.

  5. Merchandising: IndexTools allows you to upload a spreadsheet with all the custom category information for your eCommerce site. Once this is done, these categories are readily accessible within IndexTools as filters or in merchandising reports. The merchandising reports also provide you with a broad capability to mix, match, and sort your various categories on the fly, so you can see the data you want to see.

  6. Path Explorer: The visual overlay feature within IndexTools is particularly powerful. For example, it deals with some of those hard to handle scenarios, such as DHTML menus, Ajax, Flash, and other interactive elements. In addition, you can define specific content areas within each screen, and analyze just that portion of the screen.

For example, in a newspaper business, knowing the historical click through rate for position 1 of a page can help you see if the current article in that position is doing better or worse than historical averages. This is easily handled in IndexTools.

  7. Alerts, Events, and Color Coding: IndexTools allows you to define events that you want to know about, such as a jump in traffic, or a drop in sales. Once an event occurs, an alert can be triggered, and you can see a color coded report that provides you with the details of what has taken place.
  8. Segmentation: IndexTools segmentation capabilities are straightforward and easy to use. One of the powerful features that IndexTools offers is that you can apply more than one segment at a time, to get a more detailed look at the data based on the segments you have already defined, rather than needing to implement another segment combining those attributes. Segments in IndexTools also get applied in real time, so there is no need to wait for a few hours after creating them before you can see the results.

  9. Campaign Management: All the analytics tools we looked at offer campaign management capabilities. IndexTools does this as well, but IndexTools also treats your organic traffic as a campaign. This allows you to see all of your results, including the organic results, in one place.

  10. Custom Fields: IndexTools provides you the ability to define custom fields. You can use these custom fields to track attributes of your business that are specific just to your business. Sell shirts? You might want to track men’s vs. women’s, or shirt size, or color, or manufacturer. These can be setup using IndexTools professional services and are then available within the UI.

  11. Real Time Tracking: IndexTools offers real time tracking (with a 3 second delay). While this may not matter to many businesses, there are businesses where real time data can have a large impact, such as high volume media businesses where the performance of each piece of content in generating page views is a key driver of financial results. Have content that’s underperforming? Get it out of there quickly and replace it with something that provides better results.

  12. Filters support Perl Compatible Regular Expressions: This newly added feature provides the power of Perl Compatible Regular Expressions in defining segments, scenarios, filters, and content groups within IndexTools. The big benefit of this is that it should reduce the amount of custom JavaScript tagging you need to do on your site.

  13. Scenario Analysis: Using scenario analysis in IndexTools allows you to model loose funnels that don’t require users to take steps in an exact order, but still show you whether or not your users are progressing towards the close.

IndexTools Weaknesses

  1. Measures daily unique visitors on a rolling 24 hour basis. The industry standard is to measure daily unique visitors within a calendar day. IndexTools tells me that this is being changed as of their next release, which is coming soon.
  2. No log file analysis. As a result, you can’t get search engine robot crawling data from IndexTools.

  3. Can’t place scenarios in custom reports. The tool provides a powerful ability to do your own scenario analysis. However, those scenarios cannot be customized or placed in a custom report and then bookmarked. This prevents you from emailing the data to someone.

As a result, sometimes you end up implementing something in a scenario, and realize that you want to email the data to someone on a recurring basis. To do this, you find that you need to recreate the definition of the scenario in a custom report.

7.4 Unica Affinium NetInsight

Unica did not start out as a web analytics company. They originally focused on the management of offline customer data, and on providing related marketing tools (campaign management, optimization, lead management, e-mail, marketing management, etc.). As a result, the company has an enterprise customer focus, and provides a rich array of cross marketing capabilities.
The interface is simple, clean, and elegant. It was one of our favorite interfaces to work with. Customization of reports within Affinium NetInsight is usually simple and slick. The highly graphical presentation format is also nice.
The special differentiator for Unica is the degree to which they have integrated offline customer data into Affinium NetInsight. This enables some neat features to cross reference the offline data, and perform integrated marketing. Some of the things you can do with this are detailed below.

Key Technical Points:

Cookie Type: First Party
Cookie Setup: DNS A record must be created
Session Inactivity Timeout: 30 minutes
Continuous Session Timeout: 24 hours
Blocked Cookie Handling: IP+UA is used and sessions are tracked that way
Other Session Factors: Inactivity timeout is user configurable; user name and parameters can also be used to define session criteria
Real Time: Not standard, but upon request Unica will offer up to 15 minute freshness of data
Data Store: Data warehouse
JavaScript Customization: To supplement UI based customization

Unica Affinium NetInsight – Key Strengths:

The first 12 items are taken from the article we developed with the help of Unica titled: 12 Cool Things you can do with Unica’s Affinium NetInsight.
The article provides a more complete description of each of the first 12 items listed below, along with a rich array of screenshots. Additional strengths (beyond these 12) are listed starting with item 13 below.

  1. Creating custom dashboards: It’s easy to set up custom dashboards in Affinium NetInsight. The benefit of this feature is that it allows you to provide a custom experience, based on the needs of the person using it. For example, the VP of sales can see what she wants to see, without the clutter of a bunch of other numbers that are of no interest to her.
  2. Ad-Hoc Analysis: Affinium NetInsight provides you with the ability to apply segments, filters, and content groups to historical data. This allows you to conduct historical research on user behavior that can be incredibly valuable to your business.

  3. Drag, Drop, and Drill Down: Affinium NetInsight offers extensive filtering capabilities. You can apply as many filters as you want simultaneously, and as mentioned above, you can look back in time. The range of filtering options provided is extensive.

  4. Correlate Data: Cross referencing data is another strength of Affinium NetInsight. Not only can you set up multi-level tables of data, you can easily re-arrange the table structure, dragging parameters and dropping them where you want them, and the table dynamically rebuilds itself on the fly.

  5. A/B Analysis mode: Affinium NetInsight provides a simple and elegant way to set up and see the results of two different scenarios. These can be seen on a single split screen so you can look at the results side by side.

  6. Integrate Offline Customer Data: Due to its origins, Unica has provided extensive capabilities for integrating offline customer data together with online customer data. Among other things, this allows offline data attributes to be used as filters and segments within Affinium NetInsight. For example, you may want to filter visitors based on whether or not they are customers through your brick and mortar stores, or belong to your “Gold Membership Program”.

  7. Examine Individual Click streams: Affinium NetInsight offers you the ability to review the detailed behavior of any single individual. B2C sites can use this to sample customer behavior data, but the real power is for B2B sites. For example, a B2B site’s sales person can be looking at the click stream data of a potential B2B customer while handling a sales call with them.

  8. Robot/Spider analysis: Affinium NetInsight offers the ability to read log file data to extract information on the behavior of search engine robots. As an interesting add-on to this, you can also follow the precise path of the robot step by step through your site, to see how the site is being crawled.

You can either run Affinium NetInsight as a pure log file based solution, or in a hybrid mode that uses the log file to get robots data, and JavaScript to get user traffic data.

  9. Remarketing: With Affinium NetInsight you can automatically implement fast direct marketing responses to events on your site. For example, if a user abandons their shopping cart part way through, and you have their email address, you can see what product they were thinking of buying and automatically email them an offer for a 10% discount on that product.
  10. Ask NetInsight Wizard: This is a unique feature. Not sure how to get the data you want? Use the Ask NetInsight Wizard to get it. Pick from a list of standard questions and the report is generated on the fly, or ask a completely unique question, and the wizard will try to get you the data you want.

  11. Heat Map Overlay: The Affinium NetInsight site overlay capability has the added feature of allowing you to look at segments within the overlay. The segment data will show up in a heat map format, making it visually easy to figure out how that particular segment behaved on that page.

  12. Date Comparison Reporting: Affinium NetInsight was unique in the packages we reviewed in its ability to allow you to compare the results of 2 different dates side by side.

  13. Perl Compatible Regular Expression Support: The ability to use Perl Compatible Regular Expressions in defining content groupings, segments, and filters makes it possible to do less custom JavaScript work for some sites. This comes in handy especially if you have a site with dynamic URLs or complex content groupings.

  14. On Demand or On Premises: You can run Affinium NetInsight in either an ASP mode with the data hosted at Unica, or buy the software and run it in-house, and keep the data on your own servers.

Unica Affinium NetInsight Weaknesses:

  1. No Default Cookie Handling. You can’t get started with Affinium NetInsight On Demand until you have set up a custom DNS record for your cookies, or have set it up to piggyback on an existing persistent cookie. If the site does not have a persistent cookie, NetInsight offers a web server plug-in that can be installed on the web servers to automatically generate a server side cookie.

  2. No Web Services API for Retrieving Reports. Affinium NetInsight offers an enterprise level solution, but does not have a web services API for retrieving reports such as the ones offered by Omniture and Visual Sciences. A workaround exists in the form of a database API, but using it with Unica’s JavaScript based product is a bit more involved.

  3. Real time data needs to be requested. You can get data with 15 minute freshness from Affinium NetInsight, but you need to request it from Unica. It may also cost extra, depending on the rest of your relationship with Unica.

  4. Education Curriculum is not as Extensive. Unica does offer an educational curriculum, but it’s not as extensive as that of some of the other vendors.

7.5 Visual Sciences HBX Analytics

Visual Sciences, Inc. (formerly known as WebSideStory), has been one of the major players in web analytics for many years. The company focuses on high end solutions that offer a wide range of flexibility and power. Correspondingly, there may be some configuration and setup work to do, in order to gain access to that additional capability.
The company was originally named WebSideStory, but re-branded itself to Visual Sciences after its acquisition of the company by that name in February of 2006. The move marks the company’s increasing focus on multi-channel applications.
Visual Sciences Report Builder is a powerful tool that provides direct access to the data within the HBX Analytics database as an Excel plug-in, with which that data can be analyzed and manipulated like any other data in Excel.
This is part of a family of capabilities built on the APIs within the tool, which allow third party applications to extract data and use it in a wide variety of ways. This programmability is one of the hallmarks of HBX, and permits flexible data access and manipulation that can be built directly into your web application.
You may be able to use a cheaper package to meet your needs, but HBX Analytics and its companion products offer the type of capability required by the most demanding applications.

Key Technical Points:

Cookie Type: First Party or Third Party
Cookie Setup: DNS CNAME record must be created
Session Inactivity Timeout: 30 minutes
Continuous Session Timeout: No timeout
Blocked Cookie Handling: Does not track visits or uniques
Other Session Factors: None
Real Time: Yes
Data Store: Data warehouse
JavaScript Customization: Yes, primary method of customization

Visual Sciences HBX Analytics – Key Strengths:
The first 11 items are taken from the article we developed with the help of Visual Sciences titled: 11 Cool Things you can do with HBX Analytics.
The article provides a more complete description of each of the first 11 items listed below, along with a rich array of screenshots. Additional strengths (beyond these 11) are listed starting with item 12 below.

  1. Active Viewing: Visual Sciences refers to site overlay functionality as Active Viewing. In HBX Analytics the expanded capability includes the ability to:
       1. Look inside DHTML menus to see click through data on individual menu items
       2. Place an overlay on top of a form
       3. Place an overlay on top of Flash
       4. Apply segmentation within your overlay

The segmentation feature allows you to perform path analysis on different types of user groups on your site. In addition, when multiple links on a given page point to the same page, HBX will track the click through rates of each link as separate items (most vendors do not do this).

  2. Customizable Dashboards: HBX offers customizable dashboards, so the experience of each individual user can be tailored to their specific needs. This enables each user to get what they need, and only what they need.
  3. Navigate the Funnel: All the analytics packages provide funnel capability. However, not every package enables you to define a “loose funnel”, which HBX defines as one which can be interrupted, yet the funnel will keep tracking the user.

A common scenario is where users get part way through the funnel, and then decide to check the About Us page or the Privacy Policy page on your site before continuing. In a strictly defined funnel this would be seen as an exit. With a loose funnel you can still get a clear indication of how these users progressed through the process.

  4. Filtering Conversions: HBX Analytics lets you set up filters based on conversion type (e.g. type of product purchase, newsletter signup, contact us request). This provides you a way to track the path of visitors specific to each conversion type.
  5. Active Segmentation: HBX provides a simple and elegant way to take advantage of the segments you have defined. A list of all available segments shows up as a pull down list on the top of all the report screens, so just select the segment and you are off to the races.

  6. Custom Metrics: Even with all the slicing and dicing power of HBX, sometimes you need to do more. HBX allows you to define Custom Metrics and build them into your analyses. For example, Visual Sciences’ Site Search tool can be integrated into HBX Analytics to offer a rich array of specific attribute data: if you are an eCommerce vendor, you may want to capture data on price, size, color, etc., and this is something you can do with HBX and Custom Metrics.

  7. Campaign Attributes: You can also isolate and filter on different types of campaign attributes within HBX. For example, if you are running a banner ad campaign, you can look at the size, color, and message as different attributes, or in an email campaign you can look at the effect of link position in the email. This capability helps in rapid optimization of campaign results.

  8. Report Builder: Of the packages reviewed in this section, only HBX offers anything like Report Builder, and it’s special. You install it as a separate module that is a Microsoft Excel add-on. With it installed, you can extract data directly from HBX into Excel, and then use that information the same way you use any other data in Excel.

With this capability you have a powerful combination of Excel and HBX that allows you to do a lot with your data. Better still, the information in the spreadsheet can be set up to automatically update on a regular basis (e.g., daily) so that you can see the updated information in Excel without having to do anything at all.
In addition, I believe that the ability to copy and paste that data, and to expand the amount of data extracted from HBX on the fly, is absolutely unique to Report Builder.

  9. Active Dashboard: HBX allows for simple modeling of “what if” scenarios with its Active Dashboard functionality. Basically, this allows the integration of Flash into PowerPoint, with the result that you can take certain KPIs and implement them as sliders. Moving the sliders around will then dynamically update the rest of the data (according to the business rules you have defined).

While this takes some time to setup, it makes for a very powerful modeling and presentation tool.

  10. APIs: Everything within HBX Analytics and Report Builder can be accessed through APIs provided by Visual Sciences. This allows for the design of rich applications that leverage that data on behalf of your site. For example, you can integrate the API into your content management system, and have it use live analytics data to drive content decisions on your site.
  11. Ad Hoc Analysis: HBX allows you to perform analysis on historical data. This is useful in looking back in time to check past results and apply filtering and segmentation where it’s of interest.

  12. Regular Classes: Visual Sciences runs free online training classes on a regular basis that any customer can join. These are invaluable in helping customers get more out of the Visual Sciences tools.

Visual Sciences HBX Analytics Weaknesses:

  1. More Setup Required. HBX requires more setup than some of the other packages. For example, the default type of cookie used is a third party cookie, and you need to modify your DNS record to be able to use a first party cookie.

There are also options that are not available by default, like the Navigation Report and Site Overlay. To get these capabilities, you need to set the hbx.lt parameter to “auto” on all the pages of the site for which you want them.

  2. JavaScript tuning is a potential source of error. While JavaScript tuning offers substantial power and flexibility to do whatever you might want to do, it is also a potential source of error. Simple coding errors can result in misleading data. Also, when changes are made to the web site, updating the HBX JavaScript needs to be part of the development plan.
  3. Can’t measure ROI on a keyword without an additional tool: The basic HBX Analytics tool does not come with the ability to measure ROI on a keyword by keyword basis. However, Visual Sciences offers a separate bid management tool with this capability.

  4. Can’t trap on URLs to look back in time: HBX does not store URL data by default, which means that if you want to do an ad hoc analysis, you need to rely on other parameters (such as Page Name and Content Group as defined in the JavaScript) to do this type of analysis.


Appendix A – Placement of JavaScript on the participating sites

The following table shows the order of the JavaScript for each of the sites.

Placement/Site | AMD | CTI Initial | CTI Second | HPort | TPD
In the header | | | | | Google Analytics
Just before the closing BODY tag (in order) | Omniture | IndexTools | HBX Analytics | IndexTools | IndexTools
 | Clicktracks | Clicktracks | NetInsight | Google Analytics | Clicktracks
 | Google Analytics | Google Analytics | IndexTools | Clicktracks | HBX Analytics
 | IndexTools | HBX Analytics | Clicktracks | HBX Analytics | 
 | HBX Analytics | NetInsight | Google Analytics | | 
 | NetInsight | | | | 

It’s been suggested that JavaScript order will have a significant impact on the results. We specifically analyzed this during the course of the study. See Section 6 above for more information on the results (hint: there are bigger issues than JavaScript order!).
In the table above, CTI Initial refers to the initial order of the JavaScript placed on City Town Info. CTI Second refers to the order that was used in the second phase test for City Town Info, in which we tested the effect of the order in which the JavaScript executed. This is reported on in detail in Section 6 of this report.
It should also be noted that JavaScript is disabled on some users’ computers. When that happens, none of the web analytics packages in our study will record data, with the exception of WebTrends, whose analysis was log file based.
Checking across our 4 participating web sites, we found that about 3.5% of users had JavaScript disabled. This means that the true user counts are potentially 3.5% higher than what was reported in the study.
Note, however, there are plenty of other sources of error in analytics, and this particular factor would be in common in all analytics packages that rely on JavaScript.

Appendix B – Sessionization

One of the challenges of web analytics that uses a JavaScript implementation (as we did for all packages in the 2007 Web Analytics Shoot Out) is that the software can only tell when a user loads a page that causes the JavaScript to execute.
This is naturally very accurate when counting page views, because every time the JavaScript executes it equates to a page view. There are still sources of error, such as a user clicking on a link and leaving the page before the JavaScript executes, but these are relatively small.
Total Visitors and Unique Visitors, however, rely on cookies and session tracking. Every time the JavaScript executes, it must:

  1. See if a cookie already exists. If it doesn’t, try to set a cookie, and record the time of the user’s visit in it. This equates to the start of a new session. In the event that the cookie does exist, then proceed to the following step.
  2. Open the cookie and see if this is a continuation of an existing session (in principle, a continuous visit to the site), or a new one. Update the internal tracking data appropriately. A sketch of this check follows the list.
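
A simplified sketch of that check, using the 30 minute inactivity rule most packages default to (this is assumed, generic logic; no vendor implements it exactly this way, and the beacon URL is hypothetical):

```javascript
// Simplified sketch of the per-page-view cookie check (not vendor code).
const SESSION_GAP_MS = 30 * 60 * 1000; // 30 minute inactivity rule

function readCookie(name) {
  const m = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
  return m ? decodeURIComponent(m[1]) : '';
}

function writeCookie(name, value) {
  document.cookie = name + '=' + encodeURIComponent(value) + '; path=/';
}

function recordPageView() {
  const now = new Date().getTime();
  const last = parseInt(readCookie('last_hit'), 10); // NaN if no cookie yet
  const newSession = isNaN(last) || now - last > SESSION_GAP_MS;

  // Tags of this era shipped data home via a 1x1 image request.
  new Image().src = 'https://stats.example.com/hit.gif?new=' + (newSession ? 1 : 0);

  writeCookie('last_hit', String(now)); // record the time of this hit
}

recordPageView();
```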

This still sounds relatively simple in principle, but it gets complicated when you factor in the processing that then has to take place. The analytics software has to parse through all the click stream data in the log created by the JavaScript, and match up the various clicks it sees with sessions.
It does this by keeping sessions open in memory. Once a session is deemed closed, usually because no new clicks related to that session have arrived for a fixed period of time, such as 30 minutes, the session gets written out to disk.
As you can imagine, this task can place a heavy demand on processing power and memory. For practical purposes, analytics software can’t keep sessions open forever. In general, the industry has settled on a session ending after 30 minutes of inactivity. Note that Clicktracks actually uses 15 minutes as a default, and both Clicktracks and Unica make this a user configurable setting.
Let’s see how that affects two common scenarios (a sketch of the logic follows the list):

  1. A user comes to your site, and then goes to lunch for an hour or so. They then come back and continue browsing your site. In theory, this should be considered one session, but all the packages in our study would treat this as two sessions.
  2. A user comes to your site, then goes to another site, and 10 minutes later comes back to your site by typing in the URL directly, or via a bookmark. In theory, this should be considered two sessions, but all the packages in our study would treat it as one session.
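
As a sketch of how a 30 minute inactivity rule produces both outcomes above (assumed, simplified logic; real packages also handle referrers, midnight cutoffs, and other session criteria):

```javascript
// Simplified sessionization over a sorted click stream (not vendor code).
const INACTIVITY_MS = 30 * 60 * 1000;

// hits: [{ visitorId, timestamp }], sorted by timestamp
function countSessions(hits) {
  const lastSeen = new Map(); // visitorId -> timestamp of that visitor's last hit
  let sessions = 0;

  for (const hit of hits) {
    const prev = lastSeen.get(hit.visitorId);
    if (prev === undefined || hit.timestamp - prev > INACTIVITY_MS) {
      sessions += 1; // 30+ minutes of silence starts a new session
    }
    lastSeen.set(hit.visitorId, hit.timestamp);
  }
  return sessions;
}

// Scenario 1: a 60 minute lunch break splits one visit into two sessions.
countSessions([
  { visitorId: 'a', timestamp: 0 },
  { visitorId: 'a', timestamp: 60 * 60 * 1000 },
]); // -> 2

// Scenario 2: leaving and returning 10 minutes later stays one session.
countSessions([
  { visitorId: 'b', timestamp: 0 },
  { visitorId: 'b', timestamp: 10 * 60 * 1000 },
]); // -> 1
```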

Another factor to be considered is what happens when cookies are blocked on the user’s machine. Some packages simply ignore the visitor for purposes of counting visits and unique visitors. Others fall back on pixel tracking and/or IP and user agent based tracking.
Across the sites in our study, about 3.5% of users did not have JavaScript enabled, and 3.5% is a pretty large potential source of error. So even though pixel tracking and IP and User Agent based tracking are less accurate in tracking sessions, they are nonetheless more accurate than ignoring 3.5% of the visitors.
The following table outlines some of the session handling defaults of the active participants in our study:

Tool | Inactivity Timeout | Continuous Session Timeout | Blocked Cookie Handling | Other
Affinium NetInsight | 30 minutes | 24 hours | IP and UA tracking | Inactivity timeout is user configurable
Clicktracks | 15 minutes | 30 minutes | IP and UA tracking | Any new PPC SE hit will initiate a new session
Google Analytics | 30 minutes | No timeout | Does not track visits or unique visitors | 
HBX Analytics | 30 minutes | No timeout | Does not track visits or unique visitors | 
IndexTools | 30 minutes | 8 hours | IP and UA tracking | 

