
The Role of Trust in Rankings

Trust is one of the least understood factors in web rankings. This is partly because the simple days, when Google’s PageRank algorithm was all you needed to know, are long gone. Many other factors now enter into rankings, with the two major newer ones being relevance and trust.
One of the first ways publishers were exposed to the issue of trust was when people started noticing the Google “Sandbox”. This was a phenomenon related to new sites and how long it took before they were allowed to rank at the level their content and link profile deserved. What people began to see was that a new site would make some initial progress in Google and then simply stop progressing, even though its owner continued to add content and links.
It did not make sense unless some sort of braking action was being applied to the site. Personally, I have seen this happen in stages. A site builds up some initial traffic and then stops progressing. Plenty of time passes while new content and quality links are added, but no search traffic gains follow. Then the site takes a step-function leap upward. It freezes again while more content and links are added without any growth in search traffic, and then another leap occurs, and so forth. It is almost as if there are levels of trust that a site passes through until it finally achieves a high enough trust level that it responds to new link and content additions in a much more timely manner.
There are documents from the search engines that discuss trust, such as the 2004 Yahoo! and Stanford University paper titled Combating Web Spam with TrustRank. The basis of this paper was the notion of using manual human review to identify a small seed set of pages from sites deemed the most trusted and authoritative. Trust then propagates outward from those seeds along links, attenuating with each hop, so pages linked to directly by the seeds inherit a great deal of trust, pages two hops away inherit less, and so on.
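The mechanism described in the paper is essentially PageRank with the random-jump vector biased toward the hand-reviewed seed pages. Here is a minimal sketch of that propagation in Python; the link graph, seed set, damping factor, and iteration count are all made up for illustration, and a real search engine’s implementation would of course be far more elaborate:

```python
def trust_rank(links, seeds, damping=0.85, iterations=50):
    """TrustRank-style propagation: links maps each page to its outlinks."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    # All initial trust is concentrated on the hand-reviewed seed pages.
    seed_mass = 1.0 / len(seeds)
    teleport = {p: (seed_mass if p in seeds else 0.0) for p in pages}
    trust = dict(teleport)

    for _ in range(iterations):
        # Each round, a share of trust teleports back to the seeds...
        nxt = {p: (1 - damping) * teleport[p] for p in pages}
        # ...and the rest flows forward, split evenly across outlinks.
        # (Pages with no outlinks simply drop their mass -- a simplification.)
        for page, targets in links.items():
            if targets:
                share = damping * trust[page] / len(targets)
                for target in targets:
                    nxt[target] += share
        trust = nxt
    return trust

links = {
    "seed.example": ["a.example"],
    "a.example": ["b.example"],
    "b.example": ["c.example"],
    "c.example": [],
}
scores = trust_rank(links, seeds={"seed.example"})
# Along this chain, each page's trust is a damped share of the previous
# page's, so scores fall off with link distance from the seed.
```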
This certainly would provide a significant incentive to get very high-value links. Clearly, there is a deeper connection here between authoritative links and the ability of your site to receive lots of search engine traffic. If you are looking to build a site that receives tens of thousands of search referrals a day in a competitive space, you had best plan on getting some of these authoritative links; you will only be able to get so far with low- to medium-quality links.
Expanding on this slightly, there is the obvious incremental notion of “Reverse TrustRank”: if your site links to spammy sites, this should lower its TrustRank, and in fact your distance from spammy sites could be a factor. This should provide ample motivation to take care not to link to any bad sites, or even to sites that link to bad sites. You might want to check the outbound link profile of the sites you link to.
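There is no published “Reverse TrustRank” algorithm, but one way to picture it is distrust flowing backward along links: a page that links to a known spam site, or to sites that do, inherits a distance-decayed penalty. The sketch below is purely hypothetical; the bad-seed list, the decay factor, and the hop limit are all invented for illustration:

```python
from collections import deque

def distrust_scores(links, bad_seeds, decay=0.5, max_hops=3):
    """Penalize pages by their outbound-link distance to known-bad sites."""
    # Reverse the graph: an edge A -> B becomes B -> A, so a breadth-first
    # walk from the bad seeds climbs back up the chains of pages linking in.
    reverse = {}
    for page, targets in links.items():
        for target in targets:
            reverse.setdefault(target, []).append(page)

    scores = {seed: 1.0 for seed in bad_seeds}
    queue = deque((seed, 0) for seed in bad_seeds)
    while queue:
        page, hops = queue.popleft()
        if hops >= max_hops:
            continue
        for linker in reverse.get(page, []):
            penalty = decay ** (hops + 1)  # weaker penalty at each remove
            if penalty > scores.get(linker, 0.0):
                scores[linker] = penalty
                queue.append((linker, hops + 1))
    return scores  # higher score = closer to spam, bigger trust risk
```

A page that links straight to a bad seed gets the full one-hop penalty here, while a page that merely links to such a page gets a smaller one, which matches the intuition that your distance from spammy sites matters.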
The available evidence suggests that the search engines all use some form of trust measurement to evaluate websites and that this can be a driving factor in rankings. The simplest method for calculating trust is based on the theory laid out in the Yahoo! and Stanford paper – your proximity to authoritative sites.
Whether this is done manually (as suggested in the paper) or not doesn’t matter. What does matter is figuring out who those authoritative sites are and getting them to link to you.
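To make the proximity idea concrete, here is a minimal sketch that treats trust purely as link distance from a hand-picked authoritative seed set; the graph and site names are invented for the example, and a real trust calculation would weigh far more than raw hop count:

```python
from collections import deque

def hops_from_seeds(links, seeds, target):
    """Breadth-first search: link hops from the nearest seed to target."""
    queue = deque((seed, 0) for seed in seeds)
    seen = set(seeds)
    while queue:
        page, hops = queue.popleft()
        if page == target:
            return hops
        for nxt in links.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, hops + 1))
    return None  # unreachable: no trust flows to this page at all

links = {
    "authority.example": ["industry-hub.example"],
    "industry-hub.example": ["your-site.example"],
    "your-site.example": [],
}
print(hops_from_seeds(links, {"authority.example"}, "your-site.example"))  # 2
```

The closer that number gets to one, the better: a direct link from an authoritative seed does far more for you than a long chain of intermediaries.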


Eric Enge

Eric Enge is part of the Digital Marketing practice at Perficient. He designs studies and produces industry-related research to help prove, debunk, or evolve assumptions about digital marketing practices and their value. Eric is a writer, blogger, researcher, teacher, and keynote speaker and panelist at major industry conferences. Partnering with several other experts, Eric served as the lead author of The Art of SEO.
