SEOs shall not live by Google announcements alone…but they sure can be interesting! In today’s post, I’m going to share data from the team here at Perficient Digital, plus my comments on each.
Long ago, Google almost never talked to those of us in the SEO world at all. The big change came when Matt Cutts took on the role of "Google guy," informally at first and then more openly over time. A senior (actually, by title, "distinguished") Google engineer on the web spam team, Matt pioneered the idea that the web might be a better place if the world's largest search engine and the people who optimize for it actually communicated.
So Matt began speaking at conferences, accepting interviews (including several I did with him in 2007, 2008, 2010, 2012, and 2013), and eventually producing a now-legendary series of informative webmaster videos.
When Matt left Google to take a position with the United States Digital Service, his oversized mantle was picked up by several spokespeople, most visible among them Gary Illyes at conferences and on Twitter, and John Mueller with his weekly webmaster Google Hangouts. The Google Webmaster Central Blog is also a regular source of information about changes and updates to Google Search.
Since Google provides that level of communication with the search engine optimization world, we thought it would be interesting to track what they’ve been saying. So we asked one of our team members to curate Google’s most interesting search-related announcements and answers from the past year, and present them to you now for your education and enlightenment. I’m also going to add my commentary on the significance of each one.
A Year in Google Search Announcements
Note: We intentionally did not include alleged algorithm updates, as Google generally no longer confirms these, and says most updates are small, incremental, and take place almost daily.
January 16, 2017: Gary Illyes posted "What Crawl Budget Means for Googlebot" on the Google Webmaster Central Blog, stating that Google defines "crawl budget as the number of URLs Googlebot can and wants to crawl." He clarified that smaller sites (those with no more than a few thousand URLs) don't need to worry about crawl budget, and explained crawl rate limit and crawl demand, the two factors that determine how often and to what extent a site is crawled.
If you own a larger site, managing crawl budget is incredibly important. That's one reason why our crawler (we call it SEOCrawl™) has built-in functions to examine your log files, analyze which pages Google is crawling, and compare that with what we see in our own crawl.
To learn more about why average daily crawl is an important SEO metric, see 3 Unique SEO Metrics to Investigate.
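To give you a feel for this kind of analysis, here's a minimal Python sketch (not SEOCrawl itself; it assumes a standard combined-format Apache/NGINX access log, and the file name is illustrative) that tallies which URLs Googlebot is actually spending your crawl budget on:

```python
# Minimal sketch: count Googlebot hits per URL in a combined-format access log.
# File name and format assumptions are illustrative, not how SEOCrawl works.
import re
from collections import Counter

# Combined log format: IP - - [date] "METHOD /path HTTP/1.x" status size "referer" "user-agent"
LINE_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[\d.]+" \d{3} \S+ "[^"]*" "([^"]*)"')

hits = Counter()
with open("access.log") as log:
    for line in log:
        match = LINE_RE.search(line)
        if match and "Googlebot" in match.group(2):
            hits[match.group(1)] += 1  # tally crawls per URL

# The URLs Googlebot spends its budget on, most-crawled first
for url, count in hits.most_common(20):
    print(f"{count:6d}  {url}")
```

Note that user agents can be spoofed; for a rigorous audit you'd also verify the requesting IPs via reverse DNS before trusting the "Googlebot" label.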
February 2017: Google posted new industry standards for mobile page load times. Google discovered that 70% of pages took nearly 7 seconds to load visual content above the fold, noting that 53% of users abandon a site that takes longer than 3 seconds to load.
We've seen other data on performance as well. Amazon found that increasing page load time by 0.1 seconds decreased revenue by 1%, and Walmart found that a one-second decrease in page load time increased conversions by 2%. The bottom line: page load time is a huge factor in the profitability of your site!
See also Why Your SEO Has a Need for Speed!
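For a quick, rough read on your own pages, here's a simple Python sketch. One caveat: true above-the-fold timing requires a browser-based tool such as Lighthouse; this only measures how long the raw HTML takes to arrive, and the URL is a placeholder:

```python
# Rough sketch: a crude load-time proxy using server response timing.
# This measures network + server time for the HTML only, not rendering.
import time
import requests  # third-party: pip install requests

def rough_load_time(url: str) -> float:
    start = time.perf_counter()
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return time.perf_counter() - start

seconds = rough_load_time("https://www.example.com/")
print(f"HTML fetched in {seconds:.2f}s (3s is the abandonment threshold Google cites)")
```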
March 27, 2017: Gary Illyes explained why machine learning will not take over Google's algorithms. While Google uses machine learning in some algorithms, such as RankBrain, machine-learned algorithms are too difficult to debug to be used everywhere.
As someone who has developed machine learning algorithms, I can tell you that this explanation rings true. Machine learning is very powerful in many ways, but the resulting algorithms are opaque to humans; that is, you don't have a feel or intuition for what they're doing. In addition, you don't know how the algorithm will handle various edge cases, or even what those edge cases are. This is a big deal, and a strong reason for using human-crafted algorithms when you can.
Learn more in my post Machine Learning: Doing SEO When the Future is Now
March 30, 2017: Gary Illyes confirms that Google doesn’t use Facebook likes as a ranking signal.
The case of Facebook likes is an interesting one. It's pretty obvious that Google can't use them, because it has no way to get that data: Google can't crawl Facebook at scale to see which pages you've liked. Hence, there is no doubt that they don't use this signal.
See also “How Does Social Media Affect SEO?” and “Here’s Why Google Doesn’t Use Social Signals (video).”
April 25, 2017: Google announced Project Owl, a set of Google Search quality updates that use user feedback to go after fake news, offensive content, and poor results in search features such as auto-suggest and featured snippets. They also reiterated that they had been working hard to favor more "authoritative" sources, even for obscure or rare queries.
This was a critical step for Google. Problems with fake news and other quality issues really hurt them with users. People get turned off by these problems, and they will go elsewhere if Google has quality problems. Still doubtful about this argument? This issue of managing user satisfaction impacts Facebook as well. Check out their January 11, 2018 announcement about overhauling their news feed.
May 25, 2017: Gary Illyes posted on the Google Webmasters Blog about Google's disfavor toward large-scale article campaigns for link building. While Google does not discourage guest posting that "informs users, educates another site's audience or brings awareness to your cause or company," it objects when the main purpose of the content is to create links back to a particular site. Sites that routinely publish such content may have their search rankings lowered.
This was important reinforcement by Google. They've warned about poor guest posting practices for years, but my sense is that Google is now more willing to act on this policy. Whatever you choose to do, any guest posting should be focused on very high-quality content on sites that are highly relevant to your business.
May 31, 2017: John Mueller stated that various Google algorithms share data with each other. For example, if Panda determines that a site is low quality, the indexing algorithm might use that information to slow down its crawling rate for that site.
That makes sense if you think about it: why should Google waste resources on a site that isn't going to contribute much to the quality of its search results? I think it's likely that various quality signals are shared in many ways across their algorithms.
June 22, 2017: Gary Illyes tweeted that Google likes breadcrumb navigation on a site, and that Google treats such links as normal links for calculating PageRank flow.
Breadcrumbs are something every site should have. They help users better understand where they are on the site, and they reinforce the site structure in a clear way for search engines.
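If you want search engines to see the trail explicitly, you can also mark breadcrumbs up with schema.org's BreadcrumbList. Here's an illustrative Python sketch (the trail itself is made up) that generates the JSON-LD:

```python
# Sketch: emit schema.org BreadcrumbList JSON-LD for a page's breadcrumb trail.
# The trail below is an assumption for illustration.
import json

trail = [
    ("Home", "https://www.example.com/"),
    ("Blog", "https://www.example.com/blog/"),
    ("SEO", "https://www.example.com/blog/seo/"),
]

markup = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(trail, start=1)
    ],
}

# Paste the output into a <script type="application/ld+json"> tag on the page.
print(json.dumps(markup, indent=2))
```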
June 27, 2017: Google redesigns Google News “with a renewed focus on facts, diverse perspectives, and more control for users.” They also sought to improve readability (chiefly by use of cards) and user navigation.
Another move targeted at the all-important goal of very high user satisfaction.
June 27, 2017: Google showed that they consider branded anchor text unnatural in widget links. Jennifer Slegg reminded us that Google had warned in the past about using widgets for link building, but many assumed that a followed link on a widget would be OK as long as its anchor text was “branded” (i.e., the name of the business or site rather than a commercial term).
However, SEO Marie Haynes tweeted that Google included branded anchor text in the examples provided to a couple of her clients that received manual penalties for widget links.
The issue here is that many of the people who publish widgets on their sites just don’t care about the link, and that means that it’s not really an editorial endorsement. And, if this is a large part of your link profile, that’s a problem.
June 28, 2017: Gary Illyes stated that the Google Search algorithm does not look at Google Analytics data. In our own testing, we found it also unlikely that Google uses its Analytics product to discover new URLs.
This is a long-stated position of Google. Personally, it’s hard for me to believe that some aggregated level of info isn’t used. On the other hand, if they were actively using GA data, it could really hurt the adoption of GA in the marketplace.
July 7, 2017: Aaron Bradley annotated and linked to Google posts about updates to its Structured Data Developer Guides.
One of the more interesting developments of 2017 was that Google began to reinforce the importance of structured data. I'm a strong proponent of actively using it on your site: it helps Google better understand your content, and I anticipate that we will see more search features enabled by structured data in 2018.
See also Why You Need to Understand Structured Data, and Semantic Search and SEO: Everything You Need to Know.
July 19, 2017: Google changed how it reports impressions and other metrics in the Search Analytics section of Google Search Console, particularly for “results in lower positions.”
Here is exactly what Google said about it: “An incremental improvement in Google’s logging system now provides better accounting for results in lower positions. This change might cause increase in impressions, but also a decrease in average positions. This change only effects [sic] Search Console reporting, not your actual performance on Google Search.”
So the change was not a bug, but rather a fix for a historical bug, and it affects your data from July 13, 2017, onward.
July 19, 2017: Google introduced the feed, a personalized news stream on iOS and Android. The feed serves up news items based on your search history. Users can also add topics they wish to follow via the feed.
Very interesting move by Google, as it's designed to compete with Facebook's news feed functionality. I'm dubious that this is a major threat to Facebook, but even with a modest number of users (e.g., tens of millions), Google would still gather a tremendous amount of information about the users who engage with it.
August 2017: Google published new documents on rendering and debugging for developers. These docs describe what Googlebot can render, and how to debug rendering problems. They also updated the Google Search Quality Rater Guidelines.
Everyone should read the Search Quality Rater Guidelines. No, seriously, I mean it. And if you're a technical SEO, you should read the rendering and debugging docs too.
October 2017: The Chrome browser started labeling HTTP pages with forms “not secure.” This was a continuation of Google’s push to get all sites to adopt HTTPS.
Something to watch for: once HTTP sites become a bit rarer, Google could potentially start to attach a negative ranking weight to them.
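If you're mid-migration, a quick sanity check is making sure your HTTP URLs actually redirect to HTTPS. A minimal Python sketch, assuming the requests library and a placeholder URL:

```python
# Sketch: check whether a site's HTTP URLs end up on HTTPS after redirects.
import requests  # pip install requests

def redirects_to_https(http_url: str) -> bool:
    response = requests.get(http_url, allow_redirects=True, timeout=10)
    return response.url.startswith("https://")

print(redirects_to_https("http://www.example.com/"))  # True if migrated
```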
November 27, 2017: Google posted a reminder about "event" markup being misused for things like coupons or vouchers. John Mueller stated that structured data should match the content of the page.
This is consistent with the structured data analysis I provided earlier. Since Google wants to use structured data more, they also need to police it more carefully.
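For contrast, here's what event markup looks like when it's used as intended, for an actual event rather than a coupon. A Python sketch with illustrative values:

```python
# Sketch: schema.org Event markup used as intended -- for a real event.
# All values below are illustrative.
import json

event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "SEO Workshop",
    "startDate": "2018-03-15T09:00",
    "location": {
        "@type": "Place",
        "name": "Conference Center",
        "address": "123 Main St, Anytown",
    },
}

# A 20%-off coupon is not an Event; marking one up this way is the misuse
# Google warned about. Match the schema type to what's actually on the page.
print(json.dumps(event, indent=2))
```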
December 4, 2017: Google extended search results snippets to a maximum of 320 characters to “provide more descriptive and useful snippets” and “to help people better understand how pages are relevant to their searches.”
This is pretty interesting. I bet that this ties into Google’s interest in featured snippets and voice search. If they can get site owners to give them more data on a page via the meta description, they can potentially use that in their analysis of content quality.
See also The Value of Metadata for SEO, and How to Use It
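If you want to audit your own pages against the new limit, here's a rough Python sketch (the 320-character maximum comes from Google's change; the other threshold is my own judgment call, and the URL is a placeholder):

```python
# Sketch: flag meta descriptions that are missing, too long for the
# expanded snippet length, or too short to be useful.
import requests                # pip install requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

MAX_LENGTH = 320   # the new maximum Google cited
MIN_LENGTH = 50    # an assumed floor for a useful description

def check_meta_description(url: str) -> str:
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("meta", attrs={"name": "description"})
    if tag is None or not tag.get("content", "").strip():
        return "missing"
    length = len(tag["content"].strip())
    if length > MAX_LENGTH:
        return f"too long ({length} chars; may be truncated)"
    if length < MIN_LENGTH:
        return f"short ({length} chars; room for more detail)"
    return f"ok ({length} chars)"

print(check_meta_description("https://www.example.com/"))
```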
December 2017: Google updated its SEO Starter Guide, which covers basic guidelines and best practices for SEO.
A great primer on Google’s position on SEO. Not much else to say about it. ;->
Did we miss anything you considered a major Google announcement over the past year? Let me know in the comments!