The Google algorithm is evolving. Gone are the days when you could read a single patent, or a document like the PageRank paper written by Google's Larry Page and Sergey Brin, to figure out how it all works.
That PageRank paper described what was, at the time, a revolutionary algorithm, and it launched the dominant search engine of our generation. However, it also launched a generation of spammers, because anyone could read the paper, understand how the algorithm worked, and then make SEO decisions based on manipulating it.
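To see just how readable that paper was, here is the PageRank formula it published, where d is a damping factor (around 0.85) and C(T) is the number of outbound links on page T:

    PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

And here is a minimal Python sketch of the commonly used normalized variant of that iteration; the toy link graph, damping factor, and iteration count are illustrative only, not anything Google actually uses:

    # Minimal sketch of the normalized PageRank iteration.
    # Graph and parameter values are illustrative only.
    def pagerank(links, d=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = set(links) | {p for targets in links.values() for p in targets}
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - d) / len(pages) for p in pages}
            for page, targets in links.items():
                for target in targets:
                    # Each page passes an equal share of its rank
                    # to every page it links to.
                    new_rank[target] += d * rank[page] / len(targets)
            rank = new_rank
        return rank

    # Toy three-page web: "a" receives the most link equity, so it ranks highest.
    print(pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]}))

Anyone who understood this could see exactly what to manipulate: acquire more links, preferably from pages that themselves had high rank and few other outbound links.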
But no more. The major search engines will all continue to publish patents; in fact, they publish lots of them. As a result, too many new possibilities have arisen, and too many new potential ranking signals have been identified.
It becomes difficult to separate the wheat from the chaff. You can read through all the patents, but not all of them will be implemented. For example, is Google going to use SearchWiki to provide input for ranking purposes? If so, how? And how much weight will it get?
One possibility is to use it as validation of what the link data tells them. Imagine a publisher manages to spam its way to very high rankings in Google using some combination of methods for acquiring links, but the website experience is poor, and the products or services are perhaps even worse. Over time, the site accumulates lots of negative votes and few positive ones. Can you imagine a search engineer looking at that data and not lowering the rankings for that site?
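To make the idea concrete, here is a purely hypothetical sketch of how vote data could sanity-check a link-based score. The function, thresholds, and field names are all invented for illustration; nothing here reflects an actual Google signal:

    # Hypothetical sketch: use user-vote data to validate a link-based score.
    # All names and thresholds are invented for illustration.
    def adjusted_score(link_score, positive_votes, negative_votes,
                       min_votes=100, penalty=0.5):
        total = positive_votes + negative_votes
        if total < min_votes:
            return link_score  # too little vote data to act on
        approval = positive_votes / total
        if approval < 0.25:
            # Strong negative consensus: discount the link signal.
            return link_score * penalty
        return link_score

    # A heavily linked-to site with overwhelmingly negative votes gets demoted.
    print(adjusted_score(link_score=0.92, positive_votes=30, negative_votes=470))

The design point is that the votes never replace the link signal; they only override it when enough independent users disagree with what the links say.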
A Short Aside on Patents
Patents are their own competitive landscape and game. Major companies (such as the search companies) use them as negotiating tools. Businesses often file patents based on an idea they never intend to implement; it is the intellectual property equivalent of a land grab. The idea may be a good one, or even a great one, but that does not guarantee its implementation by the company that files the patent.
However, perhaps one of their large competitors will build something that makes use of concepts covered in the patent. That would be a big win, because the patent holder can then sue the competitor for infringement. Of course, this brings up the next layer of the patent game, which I will illustrate with an example.
Company A sues Company B for violating patent 1243. Company B then delves into its own patent portfolio to figure out which patents it can argue Company A is violating. If Company A and Company B both have large patent portfolios, chances are good that each of them is violating one or more of the other's patents. As long as there is a rough balance in the level of violations, the companies work out a cross-licensing agreement and move on. Why go to all this trouble? Because it creates a huge barrier to entry for new rivals.
The consequence of all this is that the search engine companies are publishing dozens of patents loaded with interesting ideas for ranking signals across one or more aspects of their respective algorithms, yet only some of those ideas will ever be implemented.
Bottom Line for SEOs
Links are still a huge signal, and they will be for a long time to come. But search engines are going to introduce more and more signals that will help them improve their algorithms over time. One of the most important aspects of this is that these new signals are not known to the public. It is much harder to spam something when you don't know how it is designed. Making the algorithm secret and unknown again is a strategic objective for Google, and finding a set of signals that offsets the inaccuracies introduced by the practice of buying links to influence search rankings is a must for them.
So while it may sound a bit trite, they are bound and determined to create a world where the winners are the ones that combine the best user experience with the best promotional plan. This is, after all, what is best for users.