The following is the transcript of an interview with Eric T. Peterson. Eric is currently Vice President with the Visual Sciences division of web analytics firm WebSideStory, is the author of several books on web measurement, and moderates several popular discussion lists on analytics. He also manages an excellent site on web analytics called Web Analytics Demystified. He previously worked as an analyst with JupiterResearch and has been cited in the Wall Street Journal, Washington Post, New York Times, CNN, Business 2.0 and others as an expert on the topic.
Mr. Peterson will be giving a free webinar on web analytics on behalf of the American Marketing Association and Aquent on March 6th, 2007 titled “Web Analytics Demystified: Ten Simple Strategies for Using Web Analytics to Improve Your Online Marketing Efforts.”
Interview Transcript
Eric Enge: Why don’t we start with a brief background on Eric T. Peterson?
Eric Peterson: Sure. I started working in the web analytics industry in 1998 as a webmaster for WebTrends Corporation. At WebTrends, I could see that web analytics was going to be a very big part of how business was done online, and how people would justify the expenses they were making. I eventually ended up at WebSideStory as a strategic business consultant, somebody who could not just explain how the technology was implemented, but also what you would actually do with it. I was given an opportunity in 2002 to go to JupiterResearch to cover the web analytics, site search, and content management technology markets. After working as an analyst for a couple of years I realized that I am more of a roll-up-your-sleeves kind of person, and the ten-thousand-foot view is not necessarily where my talents are best served, so I took a position at Visual Sciences as Vice President, Strategic Services. In addition to Web Analytics Demystified, I did another book with O’Reilly & Associates, Web Site Measurement Hacks, and now I have a third book, The Big Book of Key Performance Indicators.
Eric Enge: Can you talk a little bit about the merger of Visual Sciences into WebSideStory?
Eric Peterson: I am limited in what I can say about the merger of the two companies. Recent news includes the announcement that Jim MacIntyre, CEO of Visual Sciences, has been promoted and is now the CEO of WebSideStory as well as Visual Sciences. I think Jim’s whole-company view creates tremendous potential for our customers and for our prospects. The combined companies, we believe, provide the broadest and deepest technology platform available in the market today. HBX, we believe, is an excellent monitoring and reporting solution, one highly appropriate for companies of all sizes. We also offer Visual Site, the web analytics industry’s high bar for analytics capabilities. Our approach to visualizations, our approach to data integration, and our approach to multi-channel customer analytics are all completely unique, not something that is easily replicated. And so, we are very proud to be able to provide not just an analytics solution to any company that approaches us with a need today, but also a clear and powerful growth path for them for the future.
Eric Enge: What are the key differentiating features of the HBX Analytics product?
Eric Peterson: Well, I think some of the differentiating features for HBX include the breadth of strategies that we have for reporting. WebSideStory was the first company to deploy a really powerful and easy-to-use Microsoft Excel plug-in (HBX Report Builder). Report Builder really took advantage of the fact that people want web analytics data provided in a variety of formats. Some people in the organization would want Key Performance Indicators and simplified reports; they would want something that spoke directly to them.
We have a number of other products, such as the Active Dashboard, Flash-based reporting for when Excel is not quite “flashy” enough. Active Dashboard provides the ability to do what-if analyses and see how the numbers would change if you increased marketing spend for a given campaign. We also provide Active Viewing, our browser overlay technology, designed to make critical web analytics data available to users with varying levels of comfort with analytics data and technology.
I think that’s the real strength of WebSideStory and Visual Sciences today, the recognition that one-size-fits-all solutions don’t really fit everybody. So we built an organization, people, and tools that allow us to go into extremely large organizations with a clear message about how metrics will be widely distributed throughout the organization. Our goal is to make our customers very successful, not to force them to use a single interface to the data.
Eric Enge: Clearly one of the big things in analytics is working out who needs what data, so they get something they will actually make a decision on, rather than just issuing reports and getting a warm and fuzzy feeling. Everybody’s needs are different. The VP of sales needs one thing, and the business analysts need something else altogether.
Eric Peterson: Yup. I think the opportunity for the analytics vendors is to work out how to help people take better advantage of the data, and not to continue to roll out slicker-looking user interfaces or additional drag-and-drop functionality. All that functionality doesn’t help you at the end of the day if nobody actually uses it.
Eric Enge: Make the data more accessible and useful.
Eric Peterson: Work with the organization to help them understand what the web analytics business process is.
Eric Enge: Yes, everybody’s needs are different, so it’s not like you can have one off-the-shelf solution that helps everybody, because the business problem one company is trying to solve is completely different from another’s.
Eric Peterson: That is my essential belief, but that is not the universal belief in the analytics market. It is my core belief that it is not what your analytics software is capable of doing; it is what you can do with the software that people will understand and utilize.
Eric Peterson: We think that the Visual Site offering is exciting because there are companies that are actively trying to do a deeper analysis of the data. They need on-the-fly visitor segmentation: the ability to create different segments that are immediately applied across all historical data and available in all of the reports or visualizations that you have created. This kind of segmentation allows people to get beyond the low-hanging fruit and start to answer some really, really hard questions, and I believe that Visual Site provides for visitor segmentation better than any vendor in the market today.
The other real advantage of Visual Site and our Platform 4 technology is native support for multi-channel data integrations that generate customer intelligence solutions. For example, some of our financial services customers are not just looking at web data; they are also looking at customer call data, their IVR center data, or their call agent data in the same stream. These customers are “sessionizing” their call data, which allows them to analyze and understand what is causing people to call, especially when they are calling about something they could already do on the website. Multi-channel data integration is not for every organization, but those companies with critical cross-channel analysis questions depend on powerful, scalable technology to help them answer their questions.
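To make the idea of “sessionizing” call data concrete, here is a minimal sketch of the general technique: group a caller’s events into sessions whenever the gap between events exceeds a timeout. The record shape, field names, and 30-minute timeout are assumptions for illustration, not Visual Sciences’ actual implementation.

```typescript
// Minimal sketch of "sessionizing" call-center events (illustrative only).
interface CallEvent {
  callerId: string;   // assumed identifier for the caller
  timestamp: number;  // epoch milliseconds
  subject: string;    // e.g. "billing", "password reset"
}

const SESSION_TIMEOUT_MS = 30 * 60 * 1000; // assumed 30-minute inactivity cutoff

function sessionize(events: CallEvent[]): CallEvent[][] {
  // Bucket events by caller.
  const byCaller = new Map<string, CallEvent[]>();
  for (const e of events) {
    const list = byCaller.get(e.callerId) ?? [];
    list.push(e);
    byCaller.set(e.callerId, list);
  }

  // Within each caller, sort by time and start a new session at every long gap.
  const sessions: CallEvent[][] = [];
  for (const callerEvents of byCaller.values()) {
    callerEvents.sort((a, b) => a.timestamp - b.timestamp);
    let current: CallEvent[] = [];
    for (const e of callerEvents) {
      const last = current[current.length - 1];
      if (last && e.timestamp - last.timestamp > SESSION_TIMEOUT_MS) {
        sessions.push(current);
        current = [];
      }
      current.push(e);
    }
    if (current.length > 0) sessions.push(current);
  }
  return sessions;
}
```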
Eric Enge: Can you talk a bit about the underlying technology platform?
Eric Peterson: Visual Site and Visual Call are built on the exact same technology platform. Visual Call picks up IVR data, whereas Visual Site picks up data directly from web servers, indirectly from JavaScript page tags, or from both sources using a hybrid data-collection approach. In the “Visual” family of products, we can work on any kind of structured data.
Visual Call questions are often surprisingly simple. For example, which call subjects in an IVR system lead more frequently to operator assistance during the call? Why does that seem to be happening? Does it happen because of someone’s geographic location? Or do callers make different choices based on the products they’ve purchased? Or perhaps their age group? These types of questions are easily answered with Visual Call.
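As a rough sketch of the first question above, escalation rate by call subject reduces to a simple group-by. The record shape here is invented for illustration and is not Visual Call’s actual data model.

```typescript
// Illustrative only: what fraction of calls on each subject ended up with an operator?
interface CallRecord {
  subject: string;              // IVR call subject
  escalatedToOperator: boolean; // did the caller reach a live agent?
}

function escalationRateBySubject(calls: CallRecord[]): Map<string, number> {
  const totals = new Map<string, { calls: number; escalations: number }>();
  for (const c of calls) {
    const t = totals.get(c.subject) ?? { calls: 0, escalations: 0 };
    t.calls += 1;
    if (c.escalatedToOperator) t.escalations += 1;
    totals.set(c.subject, t);
  }
  const rates = new Map<string, number>();
  for (const [subject, t] of totals) {
    rates.set(subject, t.escalations / t.calls);
  }
  return rates;
}
```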
Eric Enge: Your book, Web Analytics Demystified, talks about the pyramid model of web analytics data, and you note some points of confusion or ambiguity in the definition of things like page views, visits, and unique visitors. Can you talk about those?
Eric Peterson: For the longest time there was this strange notion of “what do you mean by unique visitor?” Are unique visitors defined by browser cookies, by an IP address, or by the combination of an IP address and the browser’s user agent on a given operating system? If you don’t know what defines a unique visitor, then you don’t know exactly what you are talking about when you use that term.
Early on, when people said unique visitors, laypeople immediately pictured human beings. Unique visitors, well, that sounds like a person to me, but that’s not right. I might use my work computer, my home computer, and maybe a mobile device; the unique visitor count will show that as three unique visitors. Or, if I delete my cookies, then I show up as multiple unique visitors too.
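The ambiguity is easy to see in code. Below is a hedged sketch, not any vendor’s actual algorithm, of two common “unique visitor” definitions: a persistent cookie ID when one exists, and a hash of IP address plus user agent as a fallback. The field and function names are hypothetical.

```typescript
import { createHash } from "node:crypto";

interface Hit {
  cookieId?: string; // persistent visitor cookie, if set and not deleted
  ip: string;
  userAgent: string;
}

function visitorKey(hit: Hit): string {
  if (hit.cookieId) {
    return "cookie:" + hit.cookieId;
  }
  // Fallback definition: hash of IP + user agent. The same person on a work
  // PC, home PC, and mobile device produces three keys, and a deleted cookie
  // produces a brand-new key -- exactly the overcounting described above.
  return "ua-ip:" + createHash("sha256").update(hit.ip + "|" + hit.userAgent).digest("hex");
}

function countUniqueVisitors(hits: Hit[]): number {
  return new Set(hits.map(visitorKey)).size;
}
```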
Now in 2007, with the emergence of Web 2.0, you can begin to question what page views mean. For example, what if the HTML loads a rich internet application based on AJAX and people spend forty minutes clicking around in that application?
Eric Enge: Now you begin to talk about measuring clicks on a page.
Eric Peterson: Yes, when you are viewing and using an application you might stay on that page for a long time. The result is additional ambiguity about what the metrics are telling you.
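One common answer is to fire explicit event beacons from inside the rich application rather than relying on page views alone. The sketch below is a generic, modern-browser illustration; the collection endpoint, parameters, and data attributes are invented, not a specific vendor’s tag API.

```typescript
// Fire a tracking event for in-application interactions (illustrative only).
function trackEvent(action: string, detail: Record<string, string> = {}): void {
  const params = new URLSearchParams({
    action,
    page: window.location.pathname,
    t: Date.now().toString(),
    ...detail,
  });
  // A 1x1 image request is the classic page-tag transport.
  new Image().src = "https://example.com/collect?" + params.toString();
}

// Example: count clicks on instrumented widgets inside a long-lived AJAX page.
document.querySelectorAll<HTMLElement>("[data-track]").forEach((el) => {
  el.addEventListener("click", () => {
    trackEvent("click", { widget: el.dataset.track ?? "unknown" });
  });
});
```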
The ideal situation is when you can say I know who this person is. This can be done easily when someone creates an account on your site. This is an opportunity to do more with the data. Somebody logs in and now you know that much more about them. This can include past customer purchase history, support calls they have made, and more.
Eric Enge: Is there a higher level of effort required to make good use of your uniquely identified user’s data?
Eric Peterson: I don’t think there is any major new additional effort to get the data. From an analysis standpoint, many new opportunities are created, but these are all about maximizing revenue and return. You can pick and choose questions to get answers for and focus on pursuing those that will really grow your business.
People who come to us asking about multi-channel data integration or customer intelligence are usually pretty well set to take advantage of this information. It’s a self-selected group to some extent.
Eric Enge: By the time they understand the question, they are on the road.
Eric Peterson: Yes. Very rarely do questions about multi-channel data integration come up, but when they do, they are pretty well-thought-out questions, at least in my personal experience.
Eric Enge: Can we talk a little bit about content groups?
Eric Peterson: Content groups are typically defined in one of a few ways: by analyzing the URL path, through on-page JavaScript tags, or by sending content groups over from web-based applications, such as WebSphere, Active Server Pages, JSP pages, PHP, or whatever. You are explicitly saying this page is in this content group. Eventually, you work all the way up to a system that can do ETL, one that allows you to change content groupings post-data-collection, supports multiple content groupings for individual pages, and can dynamically group content into packages for analysis. In the analytics industry, there are solutions that range from the very simple to the very sophisticated. The central challenge that I see is that one group of people in the enterprise will want one set of content groups, but this grouping will not make sense to another group of people. The best-case scenario is that the content grouping can be changed, or multiple content groupings can be used, so that each group gets content that is right and relevant for them.
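As a simple illustration of URL-path-based grouping with more than one grouping per page, here is a sketch with invented rules and dimension names. Real products expose this through configuration rather than code, and because rules like these are applied to stored URLs, they can be changed and re-applied after data collection.

```typescript
// Assign content groups from the URL path across multiple dimensions (illustrative only).
interface GroupingRule {
  dimension: string; // e.g. "section", "product-line"
  pattern: RegExp;
  group: string;
}

const rules: GroupingRule[] = [
  { dimension: "section", pattern: /^\/blog\//, group: "Editorial" },
  { dimension: "section", pattern: /^\/store\//, group: "Commerce" },
  { dimension: "product-line", pattern: /\/widgets\//, group: "Widgets" },
];

function contentGroups(urlPath: string): Map<string, string> {
  const groups = new Map<string, string>();
  for (const rule of rules) {
    // First matching rule wins within each dimension.
    if (!groups.has(rule.dimension) && rule.pattern.test(urlPath)) {
      groups.set(rule.dimension, rule.group);
    }
  }
  return groups;
}

// contentGroups("/store/widgets/blue") -> section: "Commerce", product-line: "Widgets"
```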
Eric Enge: Let’s talk about an example. We have a website with information on twenty thousand US cities and towns, and it has one master page for each city and town. There is little point in looking at the data for the Ames, Iowa page on its own, because looking at that data would not result in any action.
But if you can look at all twenty thousand city and town pages in the aggregate, you now have something worth analyzing.
Eric Peterson: That’s a good example, where you are trying to create the ability to do geographic analysis in a system that is not set up to do that. In this example, with content groupings, you might be talking about two dimensions. One dimension describes the physical location; the other dimension might be the type of information, such as restaurants, which could be located anywhere.
Some systems don’t give you the ability to cross dimensions very effectively. They force you to create content groupings such as /North America/USA/Iowa/Ames/Restaurants/Downtown, but these cannot be cut the other way when you want to look at downtown restaurants across the entire country.
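A small sketch of the difference: when location and content type are stored as independent dimensions rather than one fused path, the same records can be cut either way. The field names here are illustrative only.

```typescript
// Page-level records with two independent dimensions (illustrative only).
interface PageRecord {
  city: string;
  state: string;
  contentType: string; // e.g. "Restaurants/Downtown"
  pageViews: number;
}

// "Downtown restaurants across the entire country" -- the cut that a single
// fused hierarchy cannot give you without re-grouping the data.
function pageViewsByContentType(records: PageRecord[], contentType: string): number {
  return records
    .filter((r) => r.contentType === contentType)
    .reduce((sum, r) => sum + r.pageViews, 0);
}

// ...and the aggregate geographic view from the interviewer's example.
function pageViewsByState(records: PageRecord[]): Map<string, number> {
  const totals = new Map<string, number>();
  for (const r of records) {
    totals.set(r.state, (totals.get(r.state) ?? 0) + r.pageViews);
  }
  return totals;
}
```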
What you need to do gets back to why you invested in a web analytics system in the first place. Both the tag-based systems and the log-based solutions support alternatives in this regard. But if the client doesn’t know what they want at setup time, then they will typically get something that’s useful for some people, and useless for others.
Eric Enge: We hear people recommend that you begin by deciding what you want the analytics solution for, and then buy the package that fits. Once it is set up and running, you should still expect that it will take some time before you are getting everything you want out of the analytics software.
Eric Peterson: There is a learning curve at the beginning. There is a related problem with trying to accommodate multiple reporting and analysis needs, because things can get worse as you iterate. You may have somebody who says, no, I am still using that thing you changed. You need to be thoughtful and cautious about it. The ideal solution is one where you can provide that information independently, so that content grouping is one dimension and page is another, and page and content grouping can be combined.
Eric Enge: Most of the systems out there today seem to be focused on using a JavaScript-based tagging approach, although there are still a couple of products that do log file analysis. Can you comment on the best approach?
Eric Peterson: I tend to agree that JavaScript-based solutions are a bit more accurate, but if what you need is a combined view of how your website is performing from a search marketing and search optimization standpoint, then maybe JavaScript is not the best thing for you. One problem with JavaScript is that you can’t get any data on search engine robot visits, as many ‘bots do not execute the JavaScript tags. However, if you don’t need to spend time paying attention to robots and spiders, JavaScript is fine.
The approach that I prefer is a hybrid approach that leverages web server data and JavaScript, using JavaScript to bust the cache, to properly account for back-button views, and to allow you to measure content globally distributed on CDNs like Akamai. When we swung away from log file analyzers, the whole market decided that JavaScript was the way of the future; that assessment may have been a tad premature.
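For the client-side half of such a hybrid approach, the essential trick is a JavaScript-generated request with a unique query string, so that cached and CDN-served pages still produce a hit the collection server (and its logs) can see. The endpoint below is a placeholder, not any vendor’s actual tag.

```typescript
// Cache-busting image beacon (illustrative only).
function sendHit(): void {
  const beacon = new Image();
  beacon.src =
    "https://example.com/hit.gif" +
    "?page=" + encodeURIComponent(window.location.pathname) +
    "&ref=" + encodeURIComponent(document.referrer) +
    // The unique "cb" value defeats browser, proxy, and CDN caching, so
    // back-button views and cached pages are still counted.
    "&cb=" + Date.now() + "-" + Math.random().toString(36).slice(2);
}

sendHit();
```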
Eric Enge: It seems like one of the big issues with JavaScript is that it presumes that you have implemented it correctly, particularly when you get into a lot of custom tagging.
Eric Peterson: I had a conversation with an industry analyst recently who commented that clients often forget that making any subtle change to the page tag for data collection purposes really should necessitate a complete retesting of the page and/or the entire site, depending on the scope of the change. JavaScript is a programming language, and making changes to it carries the same risks as making changes in any other language. You can introduce bugs pretty easily, and there are browser compatibility issues with JavaScript. There are lots of potential issues. The vendors work really hard to support their customers, providing advanced implementation services and lots of support during the implementation process, but the most important thing is to just be careful.
I have seen some really, really good advanced implementations, where people understand what they are trying to do, take advantage of custom variables, and really use JavaScript effectively to get a lot of information about the use of their website. If you are careful with it you can do great things.
Eric Enge: Even if you don’t change your JavaScript, but you change the website itself, you also need to retest your JavaScript, since it’s tied to the pages.
Eric Peterson: Too often organizations treat web analytics as unstructured tasks, essentially a series of ad hoc pseudo-processes. It’s hard to succeed that way. The people who succeed are successful because the organization understands the process of doing web analytics. When you update your website, the process includes a series of checkmarks: make sure the pages are tagged; check the page tagging; QA the page tagging to make sure the data looks correct in the development environment; and make sure that when the pages go live, the data is being collected properly. One challenge is that in the beginning people can stumble into success, but that is not scalable and repeatable. With the right process in place, it’s far more scalable. For example, if the person who did the initial JavaScript takes another job, are you ready to handle that?
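One of those checkmarks, “check the page tagging,” can be partially automated. The sketch below fetches a list of development URLs and verifies that each page includes the tag script; the marker string and URLs are placeholders, and a full QA pass would also confirm that hits actually arrive in the reporting environment.

```typescript
// Verify that a set of pages includes the analytics page tag (illustrative only).
const TAG_MARKER = "hbx.js"; // assumed filename of the page-tag script

const pagesToCheck = [
  "https://dev.example.com/",
  "https://dev.example.com/products",
  "https://dev.example.com/checkout",
];

async function auditTagging(urls: string[]): Promise<void> {
  for (const url of urls) {
    const html = await (await fetch(url)).text();
    const tagged = html.includes(TAG_MARKER);
    console.log(`${tagged ? "OK     " : "MISSING"}  ${url}`);
  }
}

auditTagging(pagesToCheck).catch((err) => console.error(err));
```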
Eric Enge: Avinash Kaushik said at one point that you should plan on spending 10% of your budget on the analytics package and 90% for the implementation and ongoing analysis.
Eric Peterson: I have talked to Avinash about this a number of times; I disagree with the 90-10 model because it is impractical. I mean, Avinash is fortunate to work at Intuit, which has a number of very bright people with web analytics backgrounds in a variety of business units. But many companies don’t have the hiring power or the resources to spend that kind of money.
And so, if you can’t do that on your own, what do you do? If you only spend a dollar, you should spend it on a solution that is well supported, and a solution from which you can buy professional services to augment your efforts, so you don’t have to try and hire that help. Instead of 90-10, a better answer might be 60-40.
Eric Peterson: And hiring the right people is difficult. Qualified web analytics professionals are earning six-figure salaries because there aren’t very many qualified people out there looking for these jobs.
Eric Enge: What advice would you give to companies that are looking to revamp and greatly enhance their existing analytics strategies?
Eric Peterson: Embrace the value of process in your analytics strategy. I really think that companies should take a closer look at the processes they have used to try to be successful with web analytics in the past. The ideal situation is one where everyone is getting regular access to the data they need to do their job, and where there is a known process for drilling down more deeply to mine for valuable insights.
Eric Peterson: I should mention, I’m giving a free webinar on web analytics on behalf of the American Marketing Association and Aquent on March 6th titled “Web Analytics Demystified: Ten Simple Strategies for Using Web Analytics to Improve Your Online Marketing Efforts.” All of your readers are welcome to attend and they can sign up at http://www.marketingpower.com/webcast332.php.
Eric Enge: Thanks for taking the time to speak with us today.
Eric Peterson: Thank you!