3 Unique SEO Metrics to Investigate

Metrics can tell us more than just how we’re performing—good data can also provide background for proactive site improvement. There are already dozens of guides written on SEO metrics, so rather than rehash the basics, let’s focus on three important metrics that you won’t find in most SEO KPI reports:

1. Percentage of Traffic from SEO

Don’t you want SEO to be the largest source of traffic to your site? If you’re the SEO manager, the answer might be yes. If you’re the CEO, the answer is definitely no.

Generally speaking, the sites of strong brands get a lot of direct traffic and referral traffic. If more than 50 percent of your traffic is coming from SEO, it’s a sign that you may need to build a stronger brand before you can build your SEO traffic much higher. Think about it this way: If your site is providing something valuable, then presumably some of the people who visit will want to come back again.

If you have lots of organic traffic, but little direct traffic, it means you aren’t getting a lot of repeat visitors. Some sites may have good reasons for this to be the case, but for most sites it’s a red flag.
Those repeat visitors are going to be the ones who help create the signals that tell Google you’re a legit player in the marketplace. They will talk about you online, link to your site, write reviews about you, share your content and generally make your life much easier. If you’re getting a lot of one-and-done visitors, think about how you can make your site stickier, so that you develop a relationship with the people who visit.

Conversely, if your percentage of site traffic from SEO is on the low side (especially anything under 20 percent), it likely means that your site is under-optimized and that there is plenty of opportunity for you to improve.
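
To make the math concrete, here’s a minimal sketch in Python using made-up session counts by channel; in practice you’d pull these numbers from your analytics platform:

```python
# A minimal sketch with made-up session counts by channel; in practice you
# would pull these numbers from your analytics platform (e.g. a GA export).
sessions_by_channel = {
    "Organic Search": 42_000,
    "Direct": 31_000,
    "Referral": 12_000,
    "Social": 9_000,
    "Paid Search": 6_000,
}

total = sum(sessions_by_channel.values())
organic_pct = 100 * sessions_by_channel["Organic Search"] / total

# Thresholds from the article: over 50% suggests a brand-building gap,
# under 20% suggests the site is under-optimized.
if organic_pct > 50:
    verdict = "high: consider building brand/direct traffic first"
elif organic_pct < 20:
    verdict = "low: the site is likely under-optimized"
else:
    verdict = "in the 20-50% sweet spot"

print(f"Organic search is {organic_pct:.1f}% of sessions ({verdict})")
```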

2. Average Daily Crawl

The Crawl Stats report in Google Search Console shows the average number of pages crawled per day over the last 90 days. If you have fewer pages indexed than you would like, it’s possible that your average daily crawl is the limitation.

If the number of pages in your XML sitemaps (which should be the number of pages you are trying to get indexed) is more than six times the average-daily-crawl pages shown by GSC, it means that you may not have the crawl budget needed to get all those pages indexed.
If your XML sitemaps don’t accurately reflect your indexation goals, the principle still stands—in order to get pages indexed, you need them to be crawled on at least a semi-regular basis. (The number six is an approximation—the actual number may be higher or lower in your market, and likely depends on a number of factors that are unique to your site, but it’s a good benchmark for determining if crawl budget may be a limiting factor.)
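
Here’s a minimal sketch of that check in Python; the sitemap URL and the average-daily-crawl figure are placeholders you’d swap in from your own site and GSC’s Crawl Stats report:

```python
# A minimal sketch, assuming one sitemap file at a hypothetical URL and an
# average-daily-crawl figure copied by hand from GSC's Crawl Stats report.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical
AVG_DAILY_CRAWL = 1_500  # pages/day, read manually from GSC Crawl Stats

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

# Sitemap files use the sitemaps.org namespace; count the <url> entries.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
url_count = len(tree.findall("sm:url", ns))

ratio = url_count / AVG_DAILY_CRAWL
print(f"{url_count} sitemap URLs / {AVG_DAILY_CRAWL} crawled per day = {ratio:.1f}x")
if ratio > 6:
    print("Sitemap URLs exceed ~6x the average daily crawl; "
          "crawl budget may be limiting indexation.")
```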

If your daily crawl budget falls short of where you need it to be, it’s a sign that you either need to increase the number of links and positive brand signals pointing to your site, or that a technical issue is causing Google to curtail its crawling. Potential culprits include poor time to first byte, slow page load times, or a large number of duplicative and/or low-quality pages on your site.
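
If you want a quick first pass on the server-response side, here’s a rough check using Python’s requests library; the URL and the half-second threshold are assumptions of this sketch, not numbers from GSC:

```python
# A rough server-response check; the URL and the 0.5s threshold are
# assumptions of this sketch, not numbers from the article. requests'
# `elapsed` measures time from sending the request until the response
# headers are parsed, which approximates time to first byte.
import requests

resp = requests.get("https://www.example.com/", stream=True)  # stream=True skips the body download
ttfb = resp.elapsed.total_seconds()
print(f"Approximate TTFB: {ttfb:.3f}s")
if ttfb > 0.5:
    print("A slow server response may be depressing your crawl rate.")
```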

3. Percentage of Indexable Pages

This isn’t a metric you can just pull out of Google Analytics, but there are a number of crawlers on the market that should be able to give you at least a good approximation of this number (including Perficient Digital’s proprietary crawler that we use with our clients). Ideally, you’ll be using a crawler that follows robots.txt instructions for Googlebot and keeps track of URLs that would have been crawled but weren’t because they were blocked.

Although the need to feed Google primarily high-quality, unique pages is pretty well established, it often isn’t well planned for in site design. So we have to use a lot of solutions (noindex, robots.txt and canonicals are the primary suspects) that tell Google we don’t think those pages would be good choices for its index.

While each of these solutions is much better than nothing (assuming they are applied appropriately), they all cause inefficiencies in PageRank flow and/or crawl budget. If more than 50 percent of the URLs you’re exposing to Google fall into one of these categories, there’s a strong likelihood that your site architecture could be improved in terms of the pages it exposes to crawl. The closer you can get to 100 percent of the URLs exposed to Google being crawlable and indexable (provided they are unique, good-quality pages), the stronger your site’s organic search traffic will be.
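
If you don’t have a crawler handy, here’s a rough Python sketch of the classification logic for a small, hand-supplied list of URLs; it checks robots.txt rules for Googlebot, meta robots noindex, and off-page canonicals. The URLs and helper class are illustrative assumptions, and a real crawl would discover URLs by following links:

```python
# A rough sketch of the classification logic for a hand-supplied URL list;
# a real crawler would discover URLs by following links and handle far more
# edge cases. The URLs below and the helper class are illustrative only.
import urllib.robotparser
from urllib.request import Request, urlopen
from html.parser import HTMLParser

class HeadScanner(HTMLParser):
    """Collects meta-robots and canonical hints from a page's HTML."""
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.noindex = "noindex" in a.get("content", "").lower()
        elif tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

def classify(url, robots):
    if not robots.can_fetch("Googlebot", url):
        return "blocked by robots.txt"
    html = urlopen(Request(url)).read().decode("utf-8", "replace")
    scanner = HeadScanner()
    scanner.feed(html)
    if scanner.noindex:
        return "noindex"
    if scanner.canonical and scanner.canonical.rstrip("/") != url.rstrip("/"):
        return "canonicalized elsewhere"
    return "indexable"

urls = [  # illustrative; swap in your own crawl list
    "https://www.example.com/",
    "https://www.example.com/search?q=shoes",
]
robots = urllib.robotparser.RobotFileParser("https://www.example.com/robots.txt")
robots.read()

results = [classify(u, robots) for u in urls]
pct = 100 * results.count("indexable") / len(results)
print(f"{pct:.0f}% of sampled URLs are indexable")
```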

Summary

While it’s important to track metrics that tell you how you’re doing, it’s also useful to look at metrics that can inform and potentially diagnose where you can make improvements. These metrics are just a few examples of ways to look at your site from a different perspective.

Thoughts on “3 Unique SEO Metrics to Investigate”

  1. I agree that SEO-only traffic isn’t a great sign of a strong brand, but it is great to see organic traffic coming in. Most of our clients have built great social media pages for their local businesses and get plenty of referral traffic that way.

  2. “If you have lots of organic traffic, but little direct traffic, it means you aren’t getting a lot of repeat visitors.”
    It doesn’t have to mean that. A lot of direct/(none) traffic isn’t generated by repeat visitors; it can be traffic from plenty of places Googlebot can’t reach (like Messenger). If you use UTM tags efficiently, you can lower that number. We also have to remember how Google Analytics attribution works.

  3. Sure – repeat visitors can show up in different ways. The key is that they’re showing up somewhere.

  4. Thanks for this. Do you guys have any stats or resources on the average-daily-crawl-to-XML-sitemap ratio for specific industry niches?
    I’m specifically interested in software/technology and automotive (retail).
    Thanks!

  5. That’s an interesting point of view. You wrote that the SEO manager wants to generate traffic from SEO and the CEO doesn’t. So if I’m both the SEO and the CEO (self-employed), I’m a bit torn 😉 To be honest, I need to try to be a more brand-building CEO. Brian, you gave me food for thought :)

  6. The best SEO metrics are GSC and GA :) I don’t need other tools; OK, sometimes I use Ahrefs or Senuto, but that’s all for me. But it’s very important to look at what happens with users on our site every week. Maybe our website structure sucks, or we have an overly difficult menu. I know one thing: UX is SEO’s best friend right now.


Brian Weiss

Brian Weiss is a Managing Senior Consultant with over a decade of experience in digital marketing, and he has consulted for some of the highest-traffic websites in the world. Brian is a former writer for Search Engine Land and has spoken on the Advanced Technical SEO panel at SMX.
