
Majestic OSE Shootout

There are a number of tools available in the market today that provide backlink data for websites. The two most well-known ones are Open Site Explorer (OSE) and Majestic SEO. But which is better?

I decided to dig in and try to get an answer to that question. It then occurred to me that the best way to do that was to get Rand Fishkin (SEOMoz’s CEO) and Dixon Jones (Majestic’s Marketing Director) to help me out. Hey – great to offload some of the work, right?

So with their help, we built out a plan for what Dixon started to call “Pistols at Dawn with Rand”. In spite of the name, the whole experience was fascinating because of the tremendous mutual respect Rand and Dixon have for each other. There may have been pistols involved, but they were not doing much bodily harm given that they were not loaded. Honestly, it was a privilege to work with the two of them for that, and many other reasons.

Structure of the “Shootout”

Here are the main elements of how this was done:

  1. Dixon and Rand each named 2 people they know who they believed to be active users of both Majestic and OSE. These were subject to the approval of the other party, but the proposed parties were quickly accepted on the first pass. As a result, we ended up with 4 people who would participate in the Shootout.

The role of these people was to share their experiences with both tools, and address a number of questions that would also be picked by Dixon and Rand.

  2. In the same fashion, Dixon and Rand each picked 3 questions they wanted included in the Shootout. Of course, each party focused on questions that they thought would favor their product. These were also subject to the agreement of the other party, but as with the panelists, this happened on the very first pass.

  3. The panelists then received the questions and contributed their time to provide their answers. In many cases, they did additional research to determine what their thoughts were. This was a substantial effort, and we are grateful for their contributions!

  4. Dixon and Rand then provided their responses and comments on what was said.

The Panelists

Our panelists were well selected by Dixon and Rand, and they did an awesome job in providing their thoughts on the two products. Here they are:

Richard Baxter is the founder of SEO Gadget, an SEO agency specializing in large site architecture, conversion rate optimization (CRO), keyword research, technical SEO, infographics, content development and most of all, link building in competitive industries. Catch him speaking at Mozcon on the 26th of July.
Over the past 13 years, Wil Reynolds has dedicated himself to doing two things well: driving traffic to sites from search engines and analyzing the impact that traffic has on the bottom line of companies. Wil’s career began at a search marketing agency in 1999, where he spearheaded the SEO strategies for companies that included Barnes & Noble, Harman Kardon, and Mercedes Benz USA. Today he is the founder of SEER Interactive, a web marketing firm founded in 2002. Follow him on Twitter @wilreynolds to continue the conversation
Wiep Knol is a creative link marketer who lives in The Netherlands. From there he runs his company Gila Media, which offers link building services and consultancy. He is also the co-founder of a Dutch link building agency that offers link building consultancy, workshops, training sessions and campaign management for (mainly) Dutch websites.
Branko Rihtman has been optimizing sites for search engines since 2001 for clients and own web properties in a variety of competitive niches. Over that time, Branko realized the importance of properly done research and experimentation and started publishing findings and experiments at SEO Scientist. Branko is currently responsible for SEO R&D at RankAbove, provider of a leading SEO SaaS platform – Drive.

Scores and Panelist Comments:

1. Freshness of data (delay between link being detected and indexed)

Which program finds new links the fastest? One of the key factors driving this is how often each index is updated, and the panelists quickly zeroed in on this aspect of freshness.


Winner: Majestic
Score: 7.5 to 5.25

Richard Baxter: At the moment, SEOmoz’s index isn’t updated often in comparison to Majestic’s. Once every 30 days is the target, though the reality of the situation is that SEOmoz is far more intermittent than this. They’re open about the reasons behind this and seem confident in their aspiration to bulletproof their update frequency.

“Given Majestic tends to crawl a single domain more deeply than SEOmoz, I’d put my money on the Fresh index getting there first.”

Wil Reynolds: I think freshness for your own backlinks may be owned by the search engines, but for competitors, Majestic is the winner, with Moz improving every day.

Wiep Knol: As MJ and OSE have different index update rates, it’s hard to compare. MajesticSEO’s daily updated Fresh Index clearly has a huge advantage over OSE’s monthly updates. For smaller and/or local (European) websites, Majestic’s index is much more comprehensive than Moz’s index.

Branko Rihtman: In a lot of cases, especially when the links involved are of low quality, they will not be shown in OSE, only in Majestic’s Fresh Index.

Some excellent incremental analysis by Branko on OSE and Majestic vs. the search engine tools appears here.

Feedback from Rand: I think this is a totally fair assessment. Despite our goal of indices every 30 days, we’ve had several misses in the past 6 months, and we should absolutely be called to account for it. Our engineers and technical operations staff have been working hard to stop this from happening, but the truth is that until we move off Amazon’s unstable hardware and rebuild parts of our legacy code, we’re going to be continually plagued by hard drive failures, service interruptions and the delays those cause to processing.

The basic story is that crawling and indexing the web’s links are a challenge, but one we’ve largely solved (as have Majestic). Processing those hundreds of billions and sometimes trillions of rows of data is another matter. Particularly for complex processing operations like sorting anchor text counts for every page on the web, producing machine-learning-based metrics like Page/Domain Authority and running expensive calculations like mozRank and mozTrust, machines have to stay up and running perfectly for many uninterrupted hours. When that doesn’t happen, we need to roll back to the last save point and try again, hence index delays.

In the next 3 months, we’re going to be trying a large number of experiments to get indices out every 2 weeks (rather than monthly). However, given the historical challenges we’ve faced, I wouldn’t feel confident guaranteeing we’ll meet that. Sometime soon though, we will be at bi-weekly indices and then moving to weekly updates. Our own hardware, a new system of processing and more efficient code will make this possible. We just need time to put our new investment capital to work.

Feedback from Dixon: I would have been very concerned if we had not won this one! We have two indexes – one which updates faster than daily and is similar in size to SEOmoz’s index. To be honest, even our historic index (which is MUCH larger and goes back 5 years) is updating consistently every month. The daily update does mean that we are better placed to report on fresh data faster.

2. Predictability of updates (conformance to scheduled times)

Do updates come when expected? Some SEO marketers want to know when the update is coming, as it may impact when they update their own work.

Winner: Majestic
Score: 8 to 5.25

Richard Baxter: SEOmoz have been unpredictable recently, in that their target dates are missed, and usually explained with a later blog post. They’re pretty transparent on this (and let’s be fair, they have been striving to achieve some pretty lofty goals given the age and experience of the startup).

Some incremental commentary by Richard on domain diversity appears here.

Wil Reynolds: Moz is improving regularly, which means they are slightly less reliable to come in at the same time every time, but both tools update frequently enough that, to me, this is not a HUGE issue.

Wiep Knol: SEOmoz is very clear about their Mozscape updates; see the Linkscape schedule. Majestic’s fresh index updates daily and the historic index monthly, so that’s pretty predictable as well.

Branko Rihtman: There is no comparison here. The Majestic Fresh index is updated daily and is pushed to the historic index on a monthly basis. As I mentioned above, I am already finding links in Majestic to articles published just a few days ago, while OSE has no data for those URLs. Majestic even reports on links not yet reported by search engines, so Majestic takes this one by far.

Feedback from Rand: No excuses here. We need to become better and more consistent and we will. As I noted above, we’re taking a lot of steps and putting both talented people and financial resources to work to make this happen. Nothing’s a higher priority inside SEOmoz today.

Feedback from Dixon: Apples and oranges. We update daily. SEOMoz attempt to update monthly. Our monthly updates are also pretty regular… but on a MUCH larger index.

3. API performance (uptime and speed of response)

Extracting data via an API is a key feature when you offer your own tool. But if the performance is slow, it can make your tool look bad. The score on this one was exceedingly close!

Winner: Open Site Explorer (by a nose!)
Score: 8 to 7.5

Wil Reynolds: Uptime is fine for both. I think when there are issues, Majestic gets back to you more slowly than Moz from a customer service standpoint.

Branko Rihtman: They are both performing great for what I needed (and that was very small scale); however, the OSE API has a free version which, even though it is throttled, can help a lot for concept testing and small tasks.

Feedback from Rand: We made some recent investments in additional hardware capacity for the API, and are seeing good results here. This work will continue, as we plan to make more and more available via API both for free and in the paid version.

Feedback from Dixon: I am glad that the SEOMoz API seems to be running well. I don’t think we have ever had many complaints about speed, and we are built mostly for scale. As long as you use our GetTopBacklinks command in our API (which can return details on the top 50,000 links for any domain or URL in a fraction of a second), I think you are fine with our API at scale. I think that Branko’s point about SEOMoz having a free API is well made. I hope that’s costing you loads to run, Rand!
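For readers curious what working with a backlink API like this looks like in practice, here is a minimal sketch of building a request for the GetTopBacklinks command Dixon mentions. Only the command name comes from the article; the endpoint, parameter names, and datasource values are assumptions for illustration, so check the vendor’s API documentation before relying on any of them.

```python
from urllib.parse import urlencode

# Assumed endpoint for illustration only; the real endpoint may differ.
API_ENDPOINT = "http://api.majesticseo.com/api_command"

def build_top_backlinks_url(api_key: str, item: str, count: int = 100,
                            datasource: str = "fresh") -> str:
    """Construct a GetTopBacklinks request URL (no network call is made).

    All parameter names below are hypothetical except the command name,
    which the article references.
    """
    params = {
        "app_api_key": api_key,
        "cmd": "GetTopBacklinks",
        "item": item,              # domain or URL to query
        "Count": count,            # number of links to return
        "datasource": datasource,  # e.g. "fresh" or "historic" (assumed)
    }
    return f"{API_ENDPOINT}?{urlencode(params)}"

url = build_top_backlinks_url("MY_KEY", "example.com", count=50)
print(url)
```

Separating URL construction from the actual HTTP call, as above, also makes it easy to unit test request logic without hitting the (rate-limited) API.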

4. Percent of links reported by Google Webmaster Tools that Majestic/Moz know about from a variety of sites

Does the tool find all the links to be found? In all frankness, neither tool did, but for many marketers, more is better!

Winner: Majestic
Score: 6.5 to 4.5

Richard Baxter: I initially had no idea, so I went to check it out. Majestic Historic carries approximately two-thirds of the root domain diversity of the GWMT index; OSE and Majestic Fresh, around half.

Wiep Knol: I did a quick check of several websites with relatively small link profiles (as OSE exports are limited to 10k links). On average, Majestic scored 50% better than OSE, but both still missed quite a lot of (indexable) referrers.

Branko Rihtman: OSE has made great improvements in their percentage of coverage recently, and I believe they will continue to improve here. I took a random sample of 10 URLs from 7 domains in very different industries. As can be seen from the chart below, the results vary, although in a lot of cases MajesticSEO (blue columns) gives numbers higher than GWT, even if OSE (red columns) reports numbers lower than GWT. If we take into account the high decay rate of the Majestic Historic Index, discounting those links would probably bring the link numbers closer to the real number than OSE. This is something that I have noticed in real life situations as well.
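The kind of coverage check the panelists describe can be sketched in a few lines: export the linking URLs from the tool and from Google Webmaster Tools, reduce each to linking root domains, and compute the overlap. This is a simplified illustration of the general methodology, not the panelists’ actual scripts; the crude root-domain extraction here ignores multi-part TLDs like .co.uk.

```python
from urllib.parse import urlparse

def root_domain(url: str) -> str:
    """Crude root-domain extraction (ignores multi-part TLDs like .co.uk)."""
    host = urlparse(url).netloc.lower().removeprefix("www.")
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

def coverage(tool_links, gwt_links):
    """Percent of GWT linking root domains that the tool also reports."""
    tool_domains = {root_domain(u) for u in tool_links}
    gwt_domains = {root_domain(u) for u in gwt_links}
    if not gwt_domains:
        return 0.0
    return 100.0 * len(tool_domains & gwt_domains) / len(gwt_domains)

# Hypothetical example exports (not data from the study):
gwt = ["http://a.com/page", "http://www.b.org/x", "http://c.net/"]
tool = ["http://a.com/other", "http://b.org/y"]
print(round(coverage(tool, gwt), 1))  # 66.7
```

Comparing at the root-domain level, as Richard did, avoids penalizing a tool for crawling different individual pages on the same linking site.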

Feedback from Rand: I’d be careful about drawing conclusions from small sample sets here (say a few dozen or even a few hundred sites). As Branko’s data shows, the numbers on any given URL can bounce around a ton for who has more links or linking root domains, and it’s often a function of how crawling is prioritized and what each index chooses to keep vs. toss (based on quality/spam/depth metrics).

Richard’s guess that Majestic Fresh and OSE have around half of what Google Webmaster Tools reports for any given site strikes me as reasonable. We suspect Google maintains around 250-500 billion URLs in their main index at any given time these days (up from ~150 billion several years ago). Those are our goal numbers for the future.

Feedback from Dixon: I think it is fair to say that both of us can do better here. Can try harder… will try harder!

5. Percent of links reported by Moz & Majestic that still exist

How good is the tool at clearing out those old links? While it may be interesting to see historical links that have disappeared, some SEOs would prefer to only see those that are currently in place.
Note that Majestic was scored based on their Fresh index for this question.

Winner: Open Site Explorer
Score: 6.75 to 6.5

Richard Baxter: SEOmoz tend to filter for the higher quality links, so their accuracy over time isn’t bad, even if their updates are infrequent.

Fresh is better than historic at this; on that note, I’ll score Majestic higher than I would have if Fresh didn’t exist!

Wil Reynolds: I will say that Moz’s reported links are, much more often than not, actually still up, and Moz serves many fewer false positives.

Wiep Knol: OSE and Majestic’s Fresh index are comparable, but MJ’s Historic index contains *lots* of links that don’t exist anymore.

Branko Rihtman: I have actually done an extensive study on the percentages of live links reported by OSE, Majestic, and other tools. Majestic’s Historic index had the highest rate of decay, while the Fresh index had the lowest decay rate of all the tools tested.

Lots more data from Branko available here.
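The core check in a link-decay study like Branko’s is simple: fetch each page the tool reports as linking to you and verify the link is still there. Here is a minimal, offline sketch of that verification step using only the standard library; in a real study you would fetch each page over HTTP (e.g. with urllib.request) and feed the HTML into this check. This is an illustration of the general approach, not Branko’s actual methodology.

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collects href values from anchor tags as the HTML is parsed."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def link_still_exists(page_html: str, target: str) -> bool:
    """True if the page's HTML still contains a link pointing at `target`."""
    finder = LinkFinder()
    finder.feed(page_html)
    return any(target in href for href in finder.hrefs)

# Hypothetical page content, checked offline:
html = '<p>See <a href="http://example.com/page">this</a>.</p>'
print(link_still_exists(html, "example.com"))  # True
```

Running this across a sample of reported links, and dividing the live count by the total, gives the decay rate each panelist is describing.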

Feedback from Rand: We do our best to crawl pages and sites we’ve seen to be high quality and well-linked-to rather than include anything and everything the web puts out (which can vary widely in quality and reliability). As our indices get fresher, we expect these numbers to improve even more dramatically.

Feedback from Dixon: Fair enough that SEOMoz won. Well done SEOMoz. But again it depends a bit on where you look in our data. I absolutely accept that in our historic index, we just cannot check 3.8 trillion links going back over a 5-year cycle – but this is why we have the Fresh index. Every link in the fresh index has been verified within at least the last 60 days and we include the date it was last seen. The majority of these links were seen within a few days and here – I think we would fair very well with SEOMoz, which is more “apples with apples” as these indexes are of similar sizes.

6. Correlation of key metrics (e.g. PA/DA and ACRank) with SERPs

How do the metrics the tools provide help you evaluate which links are the most important? If you are looking for authoritative links, this is a key factor in working fast and efficiently.
The scores used Majestic’s ACRank as this is what the panelists judged. Majestic’s Citation Flow and TrustFlow metrics were noted as being much better, but the panelists did not have enough experience with them yet to use them in the scoring.

Winner: Open Site Explorer
Score: 7 to 3.33

Wil Reynolds: Without a doubt, this is where Moz shines (along with the user experience of the tool). ACRank is, for the most part, not something I would recommend any SEO use.

Wiep Knol: I’m not a fan of ACRank, but the recently released Citation Flow and Trust Flow are a big improvement, imho. However, I still prefer Page Authority & Domain Authority.

Branko Rihtman: If we compare PA/DA and ACRank, OSE metrics outperform ACRank by a huge margin.

A more extensive commentary by Branko of the correlation of these metrics to rankings can be found here.
“Majestic have recently released a new set of metrics which correlate with PR much better than any other metric (not that it means anything about correlation to rankings) and are slowly moving away from the ACRank.”

Feedback from Rand: Page Authority and Domain Authority continue to be huge investments for us, but ones we believe are worthwhile. The frustration and cost associated with producing them mean indices are harder to create and deliver, but we believe they’re an essential part of what marketers need to evaluate sites and pages. As index quality and the algorithms have improved, we’ve seen correlations with rankings as high as 0.42 for Page Authority – a great signal that the link measurements these metrics calculate are predictive of rankings.
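The correlation figure Rand cites is a rank correlation between a metric and SERP positions. For readers who want to reproduce this kind of analysis on their own data, here is a minimal, dependency-free Spearman correlation sketch; the authority and position values below are made-up illustrative numbers, not data from the study.

```python
def rank(values):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation computed on ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

# Hypothetical data: higher authority score, better ranking position.
authority = [62, 55, 48, 40, 33]  # made-up metric scores
position = [1, 2, 3, 4, 5]        # SERP positions (1 = best)
# Negate positions so "higher is better" aligns for both series.
print(round(spearman(authority, [-p for p in position]), 2))  # 1.0
```

A value near 0.42 over a large sample, as Rand describes, would mean the metric is a moderately useful (though far from deterministic) predictor of rankings.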

Feedback from Dixon: SEOMoz won that by a mile – but since this project started, we have made dramatic improvements to our key metrics, to the point that ACRank is now a legacy command and no longer easily available in our web product. Flow metrics are intrinsically different, and we are currently having some high-level independent research carried out as to whether they are “doing the job”. The results are extremely encouraging, and what is more – we know how to make flow metrics even better.

Summary Comments

Richard Baxter: Both tools have their own strengths and weaknesses. I tend to default to OSE for its usability and ease/speed of use, and I use it very often (several times daily). Majestic data is good for really large scale link analysis, but I always supplement it with Linkscape’s data (de-duping the resultant data set) – and then we supplement that with our own data! I also refer to Majestic less frequently – perhaps once or twice a month.

Branko Rihtman: I feel that MajesticSEO and OSE have focused, over time, on different target markets of SEOs. While OSE is a better tool for people who want to get a quick glance at link profiles, identify quality link sources according to proprietary SEOMoz metrics, and use the numbers that OSE provides, MajesticSEO is a better tool for those who prefer working with large amounts of raw backlink data, with higher granularity analysis capabilities and without someone else’s judgments applied over which links are of higher quality.

Feedback from Dixon: I think the reviewers were pretty fair in their assessments and thank you to them especially for their diligence and time. I can see where we clearly need to improve and where we are strong. I am sure neither SEOMoz or MajesticSEO will rest on their laurels – not for a second. I thoroughly enjoy having such a great product as Open Site Explorer to benchmark against and it’s clear that we drive each other to a better product for users – which is what it’s all about, I hope.

Feedback from Rand: Like many SEOs, I’m impressed by Majestic and I’m a user of both services (Mozscape & MJ). Both have obvious strengths and weaknesses. At Moz, we’re doing everything in our power to move forward on areas of strength (API, metrics, correlations, UX and flexibility of OSE, etc) and to shore up areas of weakness (frequency & reliability of updates, total size of indices).

Both our services are going to improve dramatically, as both are backed by passionate people who care about their products and customers. And for that, I’m grateful, and I believe everyone who consumes link data should be as well. We have the best of all worlds – competitors who work fiercely to build better and better products, but who do so without vitriol or negativity. The friendships between our two organizations and between many of the marketing industry’s best and brightest make ours an exceptional field to be in. I’m proud of the work of both companies and even prouder to be part of an industry where this type of friendly, personable and passionate rivalry is commonplace. May it always be thus.

Final Thoughts

I think that Branko summarizes the comparison very well: OSE for identifying which links are the most important, Majestic for getting a more comprehensive data set. As you can see above, both companies are working hard to shore up their weaknesses and improve.
In the meantime, they face additional market pressure from other companies offering tools, such as Link Research Tools and ahrefs. Bing Webmaster Tools now offers a way to pull link data on competition as well. One key market development would be if Bing decided to offer a comprehensive data set. That would certainly make their tool impossible to ignore.
Regardless, OSE and Majestic will remain major players in the market. Which one is best for you? Like many others, we use both!


Eric Enge

Eric Enge is part of the Digital Marketing practice at Perficient. He designs studies and produces industry-related research to help prove, debunk, or evolve assumptions about digital marketing practices and their value. Eric is a writer, blogger, researcher, teacher, and keynote speaker and panelist at major industry conferences. Partnering with several other experts, Eric served as the lead author of The Art of SEO.
