Key Interview Points
Let me start with the big ones up front. This interview had two startling parts:
- The huge weight placed by Bing on user interaction with the search results as a ranking factor. This was amazing stuff. Basically, Bing is willing to test any page by indexing it. In fact, you can pretty much directly inject any URL you want into their search results using the Submit URL feature of Bing Webmaster Tools. Then they will test it, and if the click interaction data is bad, out (or down) you go.
- The ranking of the priorities for publishers in Duane’s eyes. #1 Content #2 Social Media #3 Links. Links were rated as the third most important area. Third.
This was fascinating stuff. Of course, each search engine is different, and each tests different things. But this is a very different mindset than what we are used to in the world of SEO. Here are some of the other key points from the interview:
- (Duane): “You need to remember that the search engine sees everything across the web on every layer and as a whole, all at the same time. So, when you delight someone with the best user experience possible, we pick up all those signals that person shares about their delight and those signals will help influence our perception of your quality.”
- (Duane): “The end goal of everything we do at Bing is to provide a better result set for a searcher. That’s the core reason why a search engine exists.” To help put this comment in perspective, I find it useful to think of the search engine’s goal as providing the searcher with the fastest possible answer to their question. This is an incredibly important mindset to establish as you work on your Internet marketing strategy.
- (Duane): “If we are actually finding your pages, but we are not keeping them in the index, there is a reason for that”.
- (Duane): “Search engines are evolving and things like RSS are going to become a desired way for us to find content … It’s a dramatic cost savings for us”.
- (Duane): “Your Sitemaps need to be clean. We have a 1% allowance for dirt in a Sitemap. Examples of dirt are if we click on a URL and we see a redirect, a 404 or a 500 code. If we see more than a 1% level of dirt, we begin losing trust in the Sitemap”.
- (Duane): “Millions of movements per hour are happening across Bing”.
- Bing Webmaster Tools offers a crawl-scheduling feature where you can indicate the times of day when you prefer that Bingbot does its crawling.
- Bing does not use page performance as a ranking factor. This is because a page with a 4-second load time that has all the content someone wants may well be a better experience than a page with a 1-second load time that does not answer the question as well.
- Bing’s Webmaster Tools inbound links feature shows a reasonable representation of links with a focus on those that matter most to your site.
- (Duane): “We have our internal metric that folks may or may not be familiar with. It’s called Static Rank and this is where we judge the value of a particular URL as we perceive it”.
- Take advantage of the Bing Webmaster Tools data to see the Average Impression Position for the search phrases that are delivering traffic to your site. If the reported position is decreasing (i.e., moving up in the search results), this is an indication that Bing’s trust in the page is growing.
Interview Transcript
Eric Enge: Duane, you are now running the Bing Webmaster program at Microsoft, but you were formerly an SEO. How has that transition been?
Duane Forrester: It’s like being a kid in a candy store. What every SEO wants is a peek under the hood. When I started, I was told to come down to the garage and we will show you what we are doing now, and what we are planning for the future.
Eric Enge: What perspective would you share with SEOs who are still on the outside?
Duane Forrester: People need to wrap their heads around the fact that things are changing. If they say, “I will tweak my title tag and put an H1 tag in there and I’ll be fine”, they will be left behind. This is why we are investing so heavily in Webmaster Tools.
It’s a way for us to reach out directly to people who control websites and give them all kinds of data that we are willing to share, but they need to take action on it. Those who adapt will be the ones who survive long-term.
You need to remember that the search engine sees everything across the web on every layer and as a whole, all at the same time. So, when you delight someone with the best user experience possible, we pick up all those signals that person shares about their delight, and those signals will help influence our perception of your quality. It could be as simple as I went to site ABC.com and absolutely loved it as a notation in a Facebook post.
Folks need to take a world view of things and not only focus on the minutiae and details. These tools can help people understand when they need to go back to the drawing board and say, “how do I improve my user-experience? What do I need to invest in?”
My perennial example is the eBay school of selling. When you want to sell something on eBay, you don’t simply post a single photo of the item. You take a dozen pictures from different angles. You take photos of the box it came in, photos of the paperwork and its stationery, photos of it in action, and you write a detailed description. You put all of this out there because, if the buyer can’t lay their hands on the product, the next best thing is to describe it in detail.
When you look at the user experience, you need to ask: if I put this piece of content on the internet, is it an authoritative piece of content? Is it something that will wow users when they click on that search result and come to me? Will they realize they found their home, found the answer they need, can get their task completed, and can finish whatever it is they set out to do?
Eric Enge: Right, it is a holistic view of the situation.
Duane Forrester: Yes, it’s a holistic view of the situation and, at the same time, the view needs to be user-centric.
Eric Enge: You mentioned earlier that people who focus exclusively on optimizing their titles and H1 tags will get left behind. The discussion you just walked us through draws that out a bit more.
If I could offer my paraphrase of this: the first step is to forget the search engine and create a kick-ass user experience. Then create a site that the search engine can read. Next, promote your material effectively so that signals are emitted and people become aware of it: you get mentioned on Facebook, you get that write-up in the San Francisco Chronicle, and so on. That’s a holistic view of the whole thing.
Duane Forrester: Yes, you are bang on with that.
Bing Webmaster Tools – a conduit of information for the webmaster
Eric Enge: What are the goals of Bing Webmaster Tools?
Duane Forrester: First and foremost, the tools are a space where we have a conversation with the owner of the website. The goal is to give them something of value that will help them create a better product. The end goal of everything we do at Bing is to provide a better result set for a searcher. That’s the core reason why a search engine exists.
Sometimes we provide a cool new feature that wows people, but often the value comes through small things, such as helping the webmaster understand that we cannot access a particular piece of content, but if they remove the block in robots.txt, then we can get in there.
One main goal is to make sure people understand how we are viewing their websites. This is a critical component because Bing is highly centralized around the idea of quality content and a quality user experience overall.
We need to make sure we share as much of this data as possible whether it’s your average impression or your average click location.
We hope to educate the site owner so they understand where they need to make changes on their website to ensure the search engine is finding all their content. If we are finding your pages, but not keeping them in the index, there is a reason for that. If we have 17 million other results that are better, there is no need for us to keep 17 million and one if we know we are never going to use it. That’s an indication that the website has some work to do around the idea of quality.
Eric Enge: So, it’s a tool whose design is to allow publishers to improve their sites and give them a better chance of being one of the best results for search queries. What would you say you have done, and are thinking about doing, to guide people towards the holistic view you outlined earlier?
Duane Forrester: One of our main areas of focus within Bing Webmaster Tools is making sure we expose all of the correct data. We ask ourselves, “If I am a small business owner, I need to come in here and get something done. What is it I need to get done?” We back that up and then say, “From the search engine’s perspective, what do I need you to get done and how do I guide you in that regard?”
A large area we continue to invest in is help documentation. It’s understanding the most frequently encountered issues and creating documents that clearly explain how to take action on each of them. The tools themselves are designed to make sure that core pieces of information are being shared with you.
Sitemaps and the Index Explorer Tool
Eric Enge: What can we use within Bing Webmaster Tools to see how Bing views our sites?
Duane Forrester: Recently, we rolled out expanded support for Sitemaps. You can provide a regular sitemap.xml, or you can give us an RSS 2.0 feed, an Atom 1.0 to 1.3 feed, or plain text files. This gives users more ways to tell us about the pages on their site.
Search engines are evolving and things like RSS are going to become a desired way for us to find content. Imagine if we can ingest your RSS feed and every time you post up it comes directly to us. It’s a dramatic cost savings for us.
These things are very important to an engine. These are areas where we look for efficiencies as they save us crawl time.
It also allows us to get much closer to publication time because Bingbot can’t be everywhere at the same time. By sourcing these feeds as a location for finding content, when the feed pings and says I have something fresh, we get it instantaneously. That’s a big step forward.
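For site owners who want to act on this, here is a minimal sketch of an RSS 2.0 feed built with Python’s standard library. The site name and URLs are hypothetical placeholders; a real feed would be generated from your own publishing system.

```python
# A minimal sketch of an RSS 2.0 feed a site could expose so that a
# search engine can discover new content without a full recrawl.
# All titles and URLs below are hypothetical placeholders.
from datetime import datetime, timezone
from email.utils import format_datetime
import xml.etree.ElementTree as ET

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Example Site"
ET.SubElement(channel, "link").text = "https://www.example.com/"
ET.SubElement(channel, "description").text = "Latest articles from Example Site"

# One <item> per new piece of content; pubDate uses the RFC 822 date
# format the RSS 2.0 spec calls for.
item = ET.SubElement(channel, "item")
ET.SubElement(item, "title").text = "Our newest article"
ET.SubElement(item, "link").text = "https://www.example.com/newest-article"
ET.SubElement(item, "pubDate").text = format_datetime(datetime.now(timezone.utc))

ET.ElementTree(rss).write("feed.xml", encoding="utf-8", xml_declaration=True)
```

The resulting feed.xml can be submitted in Bing Webmaster Tools the same way as a sitemap.xml, per the expanded Sitemap support Duane describes above.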
Another core area we are focused on is making sure people understand what the structure of their website looks like in our index. We have a specific tool, called Index Explorer, that enables people to see everything that’s in the index. It will show you, from a top-down view, the actual structure we see; it’s a really cool, useful feature within the toolset.
As a webmaster, you can export all this data, compare it against your own Sitemaps, and see if there are any gaps in there. Maybe your newest content was never in there. Why is that? Your Sitemaps need to be clean. We have a 1% allowance for dirt in a Sitemap. Examples of dirt are if we click on a URL and we see a redirect, a 404 or a 500 code. If we see more than a 1% level of dirt, we begin losing trust in the Sitemap.
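Duane’s 1% “dirt” threshold is easy to audit on your own site. Below is a minimal sketch, assuming a standard sitemap.xml and using the third-party requests library; it checks each URL without following redirects, so the redirects, 404s, and 500s Duane lists all count as dirt.

```python
# A minimal sketch that audits a Sitemap for "dirt" as Duane defines it:
# redirects, 404s, and 500s. The sitemap URL is a hypothetical placeholder.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]

dirty = []
for url in urls:
    # allow_redirects=False so a 301/302 counts as dirt rather than
    # being silently followed to its destination.
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        dirty.append((url, status))

dirt_pct = 100.0 * len(dirty) / max(len(urls), 1)
print(f"{len(dirty)} dirty URLs out of {len(urls)} ({dirt_pct:.2f}%)")
if dirt_pct > 1.0:
    print("Over Bing's stated 1% allowance -- worth cleaning up:")
    for url, status in dirty:
        print(f"  {status}  {url}")
```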
When it comes to feeds, it’s more straightforward. If there are gaps, you need to check that the feed is working properly. Are we actually finding that content, or do you have a redirect set up that moves people over to another place? That’s not the kind of thing we want to see happening from the user’s perspective.
Inserting a URL directly into Bing’s index
Eric Enge: The tool to insert a URL looks handy.
Duane Forrester: It is a really powerful tool. It inserts the URL directly into our index. There are limitations placed on it to protect against spam, such as how many URLs you can submit and how frequently.
Eric Enge: 10 a day, 50 a month, right?
Duane Forrester: Exactly. It is a fantastic tool because if you have your latest and greatest, and you want to make sure it hits the Bing index, go in there and insert it via this tool. When we crawl content or ingest it from a Sitemap, it passes through layers of filtering where we do quality control on it, see what it looks like, rate it, and then put it into the full index. We then see how it populates and how users interact with it.
If you use this tool to inject a site or a URL, it goes directly into the index and shows up almost instantly inside the SERPs. You are then at the mercy of user experience to tell us if it is good content. This is a new URL, so there is no history, there are no links pointed to it, and we have no other signals, but we are willing to give it a try and it goes in the index.
If the users love it, it stays. If the users don’t like it, it gets dropped. This is a way to determine if the users feel this was a quality result.
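The 10-per-day, 50-per-month Submit URL limits Eric and Duane confirm above are easy to respect with a small local ledger. This sketch only does the bookkeeping; the actual submission happens inside Bing Webmaster Tools, and the ledger file name is a hypothetical choice.

```python
# A sketch of local bookkeeping for the Submit URL quotas discussed
# above (10 per day, 50 per month). It only tracks your own usage;
# the actual submission happens inside Bing Webmaster Tools. The
# ledger file name is a hypothetical choice.
import json
from datetime import date
from pathlib import Path

LEDGER = Path("bing_submissions.json")
DAILY_LIMIT, MONTHLY_LIMIT = 10, 50

def _load() -> list[str]:
    return json.loads(LEDGER.read_text()) if LEDGER.exists() else []

def can_submit() -> bool:
    today = date.today().isoformat()          # e.g. "2011-08-15"
    entries = _load()
    day_count = sum(1 for d in entries if d == today)
    month_count = sum(1 for d in entries if d[:7] == today[:7])
    return day_count < DAILY_LIMIT and month_count < MONTHLY_LIMIT

def record_submission(url: str) -> None:
    entries = _load()
    entries.append(date.today().isoformat())
    LEDGER.write_text(json.dumps(entries))
    print(f"Recorded submission of {url}")

if can_submit():
    record_submission("https://www.example.com/new-page")
else:
    print("Quota reached -- wait before submitting more URLs.")
```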
Eric Enge: How do you decide if it’s a quality result? I’ll give you a scenario. Someone puts a URL in there and the first 10 users go to it and, as far as you know, they view only one page on the site.
Duane Forrester: We are looking to see, if we show your result at #1, does it get a click, and does the user come back to us within a reasonable timeframe or do they come back almost instantly?
Do they come back and click on #2, and what’s their action with #2? Did they seem to be more pleased with #2 based on a number of factors or was it the same scenario as #1? Then, did they click on anything else?
We are watching the user’s behavior to understand which result we showed them seemed to be the most relevant in their opinion, and their opinion is voiced by their actions.
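To make that behavior concrete, here is a sketch of the kind of click-interaction classification Duane outlines, run over a hypothetical SERP click log. The log format and the 30-second “satisfied” threshold are illustrative assumptions, not Bing’s actual signals or values.

```python
# A sketch of the click-interaction signal Duane describes: did the
# searcher click a result and stay, or bounce straight back to the
# SERP and try the next one? The log format and the 30-second
# threshold are illustrative assumptions, not Bing's actual values.
from dataclasses import dataclass

@dataclass
class Click:
    position: int          # rank of the result that was clicked
    dwell_seconds: float   # time before returning to the SERP
                           # (float("inf") if the searcher never returned)

def classify(session: list[Click]) -> str:
    if not session:
        return "no clicks: the title/description/URL failed to earn a click"
    last = session[-1]
    if last.dwell_seconds >= 30:           # assumed satisfaction threshold
        return f"satisfied at position {last.position}"
    return "pogo-sticking: the searcher kept bouncing back to the SERP"

# The searcher clicks #1, returns in 3 seconds, clicks #2 and stays.
session = [Click(position=1, dwell_seconds=3.0),
           Click(position=2, dwell_seconds=float("inf"))]
print(classify(session))   # -> satisfied at position 2
```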
Eric Enge: So, the primary data point is interaction with the SERPs. Either they don’t click on it or they click on it and are back in a second.
Duane Forrester: There is more to it than that. Remember, we parse against roughly a thousand parameters in our algorithm so we are also looking at things such as the overall trust of the website, the age of the website, the general tone of the website, the general sentiment about the website. These things play a role.
You can imagine CNN publishes a new blog post and that’s probably more trustworthy than DuaneForrester.com.
There are many other factors that come into play. To simplify it, if we show your result and no one clicks on it, then obviously they are not happy with it for some reason.
Eric Enge: Yes, absolutely.
Duane Forrester: This is really simple. It is your URL, your description, or your title. That’s all that’s shown, so something in there is not grabbing their attention and not compelling them to check your page out by clicking on you.
Not all content is kept in Bing’s index
Eric Enge: You mentioned earlier that Bing doesn’t index all the content it crawls. Would you expand more on that?
Duane Forrester: Our goal is not to crawl the entire internet end to end every week. That’s not something we want to do. We want to find the best content out there, so we crawl the internet, over and over again, to try to find that good content.
Eric Enge: And recognize how it changes and evolves?
Duane Forrester: Exactly. That doesn’t mean we keep it all because, and this may be a shock to you and your readers, there is some garbage on the internet. I know you heard it here first; there is garbage on the internet.
Eric Enge: I’d like to articulate that because it is an interesting concept. If you find content, whether it’s through Submit URL, by crawling, or from someone’s Sitemap, and you see that users’ experience with those pages is not good, then you drop it?
Duane Forrester: Let’s define “drop it”. It could be that your ranking drops back to the second or third page. We may know, based on the history of your website, that you have a slow cadence of building out your content set, so the page will probably improve over the next few months.
If we know you will be a better result, we won’t drop you out of the index. We simply move that result back and wait for it to become a better item. Over time, as we check back and see changes happening, we try it again in a high ranking spot and see how people respond to it.
These changes, these movements, happen every day and every second against every query. Millions of movements an hour are happening across Bing. It is designed to help us understand if it is a good result for a user. Did they interact with it in a way that indicates they are happy with that result, that it wowed them?
That’s why we keep coming back to the user-experience and wowing your visitors because that’s what you are trying to do. If you partner with us, by giving us good content that wows people when we use it as a search result, we are going to love you for that. If you give us dreck we are not going to show it and are not going to keep it in our index because keeping that in our index is a hard cost to us.
Every site should submit a Sitemap
Eric Enge: Let’s go back to Sitemaps again. As we discussed earlier, you have a way to indicate what your Sitemaps are and see what Bing is finding in the Webmaster Tools.
What about sites that don’t have Sitemaps? Is that a major disadvantage for them if it’s a small site with a lot of trust and authority? Or should absolutely everybody do a Sitemap?
Duane Forrester: We recommend everyone do a Sitemap. It helps us because there is a protocol built around it, and we adhere to the protocol. We look for your robots.txt in the first instance and your XML Sitemap in the second. If you don’t have those, then it’s more of a problem for us than it is for you. We will crawl the website, we will go through the navigation as best we can, we will find your content, and we will ingest it that way.
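The discovery order Duane describes, robots.txt first and then the Sitemap it declares, is easy to verify for your own site with Python’s standard library (the site_maps() helper requires Python 3.8+). The domain below is a hypothetical placeholder.

```python
# A sketch verifying the discovery order Duane describes: fetch
# robots.txt first, then read the Sitemap(s) it declares. The domain
# is a hypothetical placeholder.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

# Is a crawler allowed to fetch the homepage at all?
print("bingbot allowed on /:",
      rp.can_fetch("bingbot", "https://www.example.com/"))

# Sitemap: lines declared in robots.txt (site_maps() needs Python 3.8+).
sitemaps = rp.site_maps() or []
print("Declared Sitemaps:",
      sitemaps or "none -- the crawler must fall back to following navigation")
```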
Sometimes we may find ourselves up against navigational elements that we can’t get through, or an encoded link that essentially gives us a blank wrapper that we can’t see inside so we can’t see the link.
If we can’t see what’s inside the technology on the URL, it becomes a very dicey proposition for us to take that URL and start returning it as a search result. In most instances, it won’t be a problem, but in some instances, there may be bad content such as content no one should see, content that’s inappropriate, or content that’s not relevant.
Eric Enge: Or, you have to trust it less.
Duane Forrester: That’s exactly what it comes down to. In some cases, people will employ navigational elements that we cannot crawl through. If we can’t crawl through it then we can’t find the URL which means we can’t find the rest of your content.
Eric Enge: If you have a case where a site is too hard to crawl, then given what you know about it, how much you trust it, and how much authority you associate with it, perhaps you will crawl it less? If they had given you a Sitemap, would you crawl it more?
Duane Forrester: I understand where you are coming from, and the short answer is no, not really. We are voracious about looking for new content. That is the lifeblood for search, the freshest content available. The beauty of all of this being crawler-based is that they just keep going.
We invest a lot of time in managing our ability to crawl at a certain pace and not melt people’s servers. We have to police this carefully and make sure we are not harming people in our efforts to get their content.
Eric Enge: In the Webmaster Tools, you offer the ability to throttle the crawl, not just at the site level, but by time of the day as well.
Duane Forrester: That’s huge. It’s the ability to say, “Look, I have a bandwidth cap”. Thinking with my small business hat on: I pay for bandwidth every month, and if I go over that cap it’s like my power meter. I don’t get charged for what I use, I get charged for moving into a new band. My costs jump up, exponentially in some cases, because I hit that cap and went through it.
If you are the business owner, you want to protect against that. It’s something I wish existed years ago for all engines because you need a way to manage those costs. In most cases, people will leave it on the default setting and let us figure it out for them, and that’s okay. We will try to minimize the amount of time we spend there and maximize the amount of content we discover.
We sometimes hit servers that are shared servers, and if that server is running slowly, we back away. You can also easily opt to change the settings to follow a preferred pattern based on your own needs. Have Bingbot crawl faster at night and slower during the day, making your bandwidth available to customers instead of crawlers. This control is entirely in your hands.
If you set something up on a free blogging system, you give up a lot of control and place yourself into a neighborhood where there is homogeneity, and you have to accept what comes along. Yesterday someone commented on the Bing Webmaster Blog that there is no way to put in a Sitemap on those open blogging environments.
When I walked through their website, I saw they have a lot of good content, they are focused on a good topic, and they produce good, unique content. Unfortunately, they are using a free hosted blogging service that doesn’t give them the access needed to implement Webmaster Tools, and they don’t understand how to migrate out on their own.
There are many good ideas that end up dormant because their owners don’t understand how they should start off, what they should invest in, and the areas they need to think about for future growth. That’s something I hope to write about in the coming year on the Bing Webmaster Blog.
Performance and the User Experience
Eric Enge: We talked about how webmasters can protect themselves from the crawling performance hit, but performance can be a user-experience issue as well, right?
Duane Forrester: Yes. We look at it and say, “If we show up and we think it’s slow, what does that mean to a human being?” When the industry started to note that page load times matter, everybody started to freak out and ask, “What is a good page load time? How fast should I be?”
We saw websites strip features out because those features made them too slow. They ended up with slimmed-down user experiences that were hyper quick. The problem is that when you measure something at a tenth of a second or a hundredth of a second, you make gains by stripping things out, essentially by removing features. The issue then becomes: if the user is happy with the 3-second load time, then saving a tenth of a second is largely irrelevant.
Eric Enge: Plus you are taking away the content they were looking for.
Duane Forrester: Exactly, and it’s critical that people understand when we talk about content they have to open their minds. We are not simply talking about written words and text on a page. We are talking about videos, pictures, audio, apps they may have, calculators. We are talking about all these things which are the reason someone may come to your website.
For example, you start peeling those features back because you feel there is not much value in that calculator: not many people have used it, and it adds half a second of load time. Moving it off that page onto its own standalone page is not the best user experience. Leave it in place. If your page loads in ten seconds, a half second isn’t going to change your world; you have bigger issues than the half-second load time that function is causing.
Eric Enge: From the holistic perspective, you can optimize on any individual factor, in this case, we are talking about page performance. At the end of the day the issue that matters, and most likely the signal that matters to the search engines, is what user-experience signals are emitted based on what happens on the page. If there is a 4-second load time on a page with all the content they want, versus a half-second load time and they don’t find what they want, the 4-second load time is the better experience.
Duane Forrester: I feel like I am repeating myself because I constantly talk about it at conferences and seminars and everything I do. I talk about user-experience and wowing the user, about quality, and unique content. These are the filters everyone needs to judge their work against.
The nuts and bolts of SEO – title tags, meta descriptions, H1s – are the technical aspects, and they are important to get right. At this stage of the game, we have to assume you are going to get those pieces right. What does next-generation SEO look like; what is important? It’s not a laundry list of technical items. It is social media, link building, and content.
If you take that approach now, and every decision you make is viewed through the lenses of quality and user experience, you will end up with a much better product.
Going with the fastest load time is probably not the best user experience because it essentially means you have nothing on your page. Putting everything up on your webpage and then saying, “You tell me what you want” isn’t that useful either.
Users are not there to be wowed by how many services you offer, they are there to find the one service they need. When I look online for a bank loan calculator, I don’t want to go to a page that tells me to go to another page, which then tells me here is a list of the calculators we offer. No, I want the actual calculator.
Backlink Functionality in Webmaster Tools
Eric Enge: Let’s talk about the backlinks functionality in Webmaster Tools. I did a quick check using PerficientDigital.com. It’s interesting because the number of backlinks shown varies across tools. For example, Bing Webmaster Tools shows 6,926 backlinks, Yahoo Site Explorer says 14,195, and Google Webmaster Tools shows 68,285.
You can also run SEOmoz’s Open Site Explorer or Majestic SEO from the folks in England, and they all give different counts. Obviously, for tools like Open Site Explorer and Majestic SEO, that probably has something to do with their crawling infrastructure as compared to a search engine’s. Even among the search engines, it varies greatly.
One thing that is clear is that this relates to what you choose to show. As an example, Perficient Digital has approximately 4,400 pages on Web Pro News that link to it. I gather there are things you choose not to show. Perhaps one of those is a huge number of links from a single domain, where you might show only a sampling. Would you talk about what you are showing and what you are not showing?
Duane Forrester: What we are trying to do is give a reasonable representation of what these links look like. We don’t have a problem sharing multiple links coming from within a domain. Two pages within one domain running over to one or two pages within another domain, or your domain, is not a big deal for us to show.
There are probably many links we are not seeing simply because the pages those links exist on aren’t quality pages. I see this on my own websites all the time.
I join some affiliate program, and every instance of that affiliate program generates a link back to me. Their job is to replicate as many pages as possible. Each one of those is a unique product, so it makes sense on their end; however, I end up with a thousand links pointed at me from pages that essentially don’t provide any value.
If there are links pointing at you that are not providing value, they don’t show up in here. We do a bit of filtering up front. Some of this is also driven by our index size. Our index is growing by billions of URLs per day, but that’s not to say that it has every URL from the internet out there.
I think the real value in the data these tools provide is links and link management. Being able to export this data and work with it in a spreadsheet is a godsend.
Eric Enge: I think part of what you are indicating in the case of PerficientDigital.com is that there are 6,900 important links that the site has.
Duane Forrester: Yes, as viewed by us.
Eric Enge: Without arguing about the thousand or so at the bottom of that pile, which may or may not be better than the ones you didn’t bother to show, certainly the great majority of them are the best links the site has.
Duane Forrester: Right. As we continue to grow, we are constantly adding billions of URLs. If you want an indication of how well that index growth is moving, take a look at the improvements in relevancy over the last 2 years for Bing. Relevancy has improved to the point where it is simply amazing. Our ability to understand exactly what a regular searcher is looking for and bring back the right result is fantastic.
Eric Enge: Is there a way from the backlinks to understand which ones are the most important?
Duane Forrester: We don’t really show any data. We have our internal metric that folks may or may not be familiar with. It’s called Static Rank and this is where we judge the value of a particular URL as we perceive it. That data is not shown here but it’s something we continually have discussions about internally. Right now, we are opting not to show that.
That’s why it’s handy to be able to export this. You can parse through it, remove duplicate URLs from the same domain to narrow the list down to one entry per domain, then look at what the anchor text is telling you and plan how to reach out to people: how you make your changes, how you make your pitch, what you suggest to them. In terms of how these links are ordered, they are ordered as we found them. It’s very straightforward from our end. We put them in there, and off you go.
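The export-and-spreadsheet workflow Duane just described can also be scripted. Here is a sketch assuming a CSV export with source_url and anchor_text columns; the column names in an actual Bing Webmaster Tools export may differ.

```python
# A sketch of the link-export workflow Duane describes: collapse the
# export to one row per linking domain and tally the anchor text.
# Assumes "source_url" and "anchor_text" columns; the real export's
# column names may differ.
import csv
from collections import Counter
from urllib.parse import urlparse

domains: dict[str, str] = {}   # one representative link per linking domain
anchors: Counter = Counter()   # what anchor text do people use for you?

with open("bing_links_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domain = urlparse(row["source_url"]).netloc
        domains.setdefault(domain, row["source_url"])
        anchors[row["anchor_text"].strip().lower()] += 1

print(f"{len(domains)} unique linking domains")
for anchor, count in anchors.most_common(10):
    print(f"{count:5d}  {anchor}")
```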
Social Media
Eric Enge: Do you have plans to start showing social media data?
Duane Forrester: We have a good understanding of how social influences search and you can obviously see within Bing the integrations of social data into search. Ultimately, what it comes down to is can the data we have access to be applied in this particular area? That’s partly to do with policy, partly to do with partnerships, and partly to do with desire.
We would have to talk to partners to find out if we can share the social data. Then we would need to consider the privacy issues. There are many different layers we have to go through before we get to the stage where we can say you have this many Likes this week.
Eric Enge: Social media has become an important input to search.
Duane Forrester: Yes, in my view the priorities for websites are content, social media and link building, in that order. Think of it as a three-piece pie. At some point, social could be more important than content, but that assumes you have excellent content in place.
If you don’t have excellent content, all the social in the world will not save you.
Underneath this, your technical SEO has to be solid. We assume you have a well-optimized product, a platform that we can actually get through to find the content.
One of my biggest fears for the industry is that many people are new to this topic and are skipping the “crawl and walk” phases, moving directly to the “run flat out” phase. They are going straight into social media, they are going straight into hardcore optimization, they are chasing all of these things, and there is no depth of content. They are not offering anything new to the conversation, and there is not enough that differentiates them.
Then they end up in a failure position because they missed some major building blocks along the way. Content, social and links are all important and they are tied in with each other.
Eric Enge: You have my vote for including social data within Webmaster Tools.
Duane Forrester: We are always open to getting feedback on features that people think are important. We have a dedicated discussion forum within the Bing Webmaster forum specific for future ideas. Anyone can come in and provide their input.
I read the forums every day or so. If people have good ideas I will respond back to them and say, “Thanks, I think that’s a great idea. I am going to capture it and it will go into our planning cycle.”
A nugget about average impression
Eric Enge: Can you talk about the average impression data you provide?
Duane Forrester: The average impression position is where we tell you the average location we showed you in, and the average click position comes as a result of that. Most times these numbers will be closely tied to each other. Sometimes you will notice there is a gap. That gap means that when we tried you at a higher position, say number 1, you got clicks, and when we tried you at number 3, you got fewer clicks. What’s important is that average impression position, though. Let’s say you have an average impression position of 30, so you are at the top of the 4th page, right?
Eric Enge: Yes.
Duane Forrester: And you watch that for a few weeks, and then it becomes 21, then it becomes 16, then it becomes 12. Given the mathematics behind an average, we are obviously showing you across a range of positions that averages out to that number. This means we are showing you higher at times and lower at other times.
Eric Enge: Right, you are experimenting with placements for the content.
Duane Forrester: That’s what we are doing. As that position continues to improve, and the average click position shows a correlated improvement, you start to get an idea that we are increasing our trust in this result and in you for this query. We are not going to move you higher if we don’t think you are worth being higher.
If you start going backward, it is an indication of one of two things. Either you’ve changed something on your end that calls the page’s value into question, or, since in many cases you didn’t alter anything on the webpage, your competition is outpacing you and effectively pushing you down. So that movement of average impression position is a loose interpretation of how much we love and trust you.
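That trend is simple to track from the numbers Bing Webmaster Tools reports. Below is a sketch using Duane’s example progression; the values are hypothetical, and remember that a lower number means a higher (better) position.

```python
# A sketch that flags the trend Duane describes from a few weeks of
# average impression position data. The values are hypothetical
# (Duane's own example progression); a *lower* number is a *higher*,
# better position in the results.
weekly_avg_position = [30, 21, 16, 12]

def trend(positions: list[float]) -> str:
    if len(positions) < 2:
        return "not enough data yet"
    delta = positions[-1] - positions[0]
    if delta < 0:
        return f"improving by {abs(delta):.0f} positions: trust appears to be growing"
    if delta > 0:
        return "slipping: check your own changes, or competitors may be outpacing you"
    return "flat"

print(trend(weekly_avg_position))   # -> improving by 18 positions: ...
```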
Eric Enge: That makes sense. It can give people a measure of when they need to take action, if they are moving down for example, or of when they have an opportunity they might want to focus more attention on and capitalize on.
Duane Forrester: Exactly, between this and a good analytics package, you are able to make a decision on what has the biggest impact for you.
Eric Enge: Awesome. Thanks Duane!
Duane Forrester: Thanks for taking the time to pull this interview together, Eric. It’s always great to chat with folks across the industry. And if readers have feedback for us, please reach out. We listen and respond, whether it’s online or in person at the conferences. If you’re not running the Bing Webmaster Tools today, get crackin’! I can’t imagine an SEO out there who wants LESS relevant data to make decisions around.
About Duane
Duane Forrester is a Sr. Product Manager with Bing’s Webmaster Program. Previously, he was an in-house SEM running the SEO program for MSN. He’s also the founding co-chair of SEMPO’s In-House SEM Committee, was formerly on the Board of Directors for SEMPO, and is the author of two books: How To Make Money With Your Blog and Turn Clicks Into Customers.
Duane was a moderator at www.searchengineforums.com and maintains his own blog at www.theonlinemarketingguy.com. He has also written for www.searchengineland.com and www.searchenginewatch.com over the years.