

Reviews, Rating Guidelines & Ranking
16th Nov 18

Do Reviews Directly Influence Rankings?

There’s been a fair bit of speculation of late around whether reviews on third party sites directly affect the rankings of the businesses they relate to.

Whilst the focus seems to have been on one service in particular, the America-centric Better Business Bureau, it’s probably more useful to consider the whole world of third party review sites. There are plenty, including Google’s own offerings, Trustpilot, Trusted Shops & many more.

So why the focus on the one service? Because it is the one mentioned in the Google search quality rating guidelines (PDF).

What are the Search Quality Evaluator Guidelines?

This is a topic that has been covered extensively in many places, and I’m sure many of you are familiar with them. In short, they are the manual given to someone tasked with judging how good a given set of results is, detailing the methods they should use to assess the quality and relevance of the individual sites returned in that search. It is, in essence, a quality control manual.

It was an internal-only Google document, but one that Google took the decision to share publicly, and one that was on its way into the public arena anyway via a leak.

Who are Search Quality Evaluators?

There’s a great YouTube video on this, filmed by Matt Cutts and published on the Google Webmasters YouTube channel.


In essence, they are part of the testing process Google uses to evaluate how good a set of results is. What they don’t do is give a given site or page a score, say 8 out of 10, which is then taken and used as part of the ranking process for the actual results returned for a given query in the public search engine.

The first few jobs I had were varied. I worked a summer job in a chocolate factory, then moved on to lab work developing ceramics for the aerospace industry. From there it was a company that milled minerals and products into fine powders.

As varied as they were, they all had a similar concept of quality evaluation. The chocolate was produced, then tested to make sure it met the standards of quality and safety. There was a manual dictating how these tests should be performed.

In the ceramics lab there was, as you can imagine, an even more extensive testing regime to make sure that the required tolerances, performance margins etc. were being met. Again, all with documented procedures to make sure these tests were carried out consistently. Much the same with the mineral milling company.

Naturally, the tests these documents detailed were as varied as the companies they belonged to, but they all had one thing in common: none of them defined the product. Reading through the manual from the chocolate producer would give you a great idea of how to make sure the strawberry crème centres remained consistently tasty, and free of bits of glass and salmonella. But it absolutely wasn’t a recipe; you couldn’t take it and make an exactly matching strawberry crème.

Likewise, the testing documents in the ceramics lab told you how to check whether a material was likely to withstand the heat and vibration at the core of a jet engine running full tilt, powering a Jumbo Jet full of passengers down the runway. But they wouldn’t tell you how that material was made.

In the same way, the evaluator guidelines don’t dictate the search algorithms; they are a measure, a test to see whether what Google wanted to happen with a change is indeed what has happened.

So, with that context, it’s important to understand that what appears in the guide may very well not relate, at all, to what Google uses in its algorithms to rank a site.

A barometer gives you an idea of the weather, turning the dial doesn’t turn a rainy day into a sunny one.

Are the Guidelines Useless?

No, I don’t believe they should be dismissed. There’s some great testing methodology in there to apply to your site, giving you a somewhat objective way of measuring how likely it is to satisfy a user’s needs. The focus should always be on that.

They are a useful measuring stick, just as they are for the evaluators Google employs, but you need to keep them in that context. They are not the recipe for Google’s ‘Secret Sauce’.

So what about those Review Sites?

So, I started this post with the topic of review and rating sites, and I admit I’ve wandered off a little onto the topic of the guidelines. That’s really down to the fact that I’ve seen lots of references to the mention of the BBB in them as evidence that those reviews are used in ranking a site.

Of course, that doesn’t mean they don’t use them. But here’s my reasoning as to why it wouldn’t make sense for them to do so.

Fundamentally, there is no one source of truth for reviews. The marketplace is fragmented, and the various services cover different areas of the world and have different criteria for judgment.

Many of the more visible ones also suffer from being pay-to-play to a certain extent. Many of the market leaders in the review space charge a not insignificant amount for the proper, full access a business needs to monitor and manage the feedback it receives on the service.

With the coverage of these services being fragmented, a great business may not be visible on any of them, and the same is true for a poor business too. Likewise, a really good business may have one particularly motivated, angry customer unfairly rating it across the board because it wouldn’t bend to unreasonable demands. The other, much happier clientele may represent an overwhelmingly bigger and more accurate picture, but if the business is either unaware of these services or unwilling to pay their fees, those happy customers may never be pushed towards them, leaving the picture these review sites offer of that company wildly and inaccurately skewed.

Adding to this, there is the fact that they are open to abuse. A whole cottage industry has sprung up of places where you can purchase positive reviews for your company and, even more nefariously, negative reviews for your competition.

This makes the signal so incredibly noisy that it just wouldn’t make sense for a search engine to rely on it as part of its ranking criteria. Google would have to sacrifice the integrity of its results to third parties whose business goal is to charge the companies being reviewed, rather than to ensure the overall fairness of those reviews.

A prime example of just how bad this ‘noise’ can be is John Lewis, who managed to be voted the 6th best site, with a 90% satisfaction rating, in a study by Which?, covered here by the BBC, and yet manages only 1.9 out of 10 on Trustpilot: John Lewis on Trustpilot.

Review sites do have merit, and can be a great way to help reassure your potential customers that you are a safe pair of hands to trust with their hard-earned money.

A bad reputation shouldn’t be ignored either: if you are consistently doing something wrong, you need to fix those business issues. Satisfying your customers and providing good service is business 101, but it isn’t a direct ranking factor.

UK travel giant Thomas Cook manages to hold a high-ranking position for ultra-competitive terms such as ‘package holiday’ (3rd organic result when I checked), yet if third party rating metrics were being relied upon, its 1.1 out of 10 score on Trustpilot (Thomas Cook on Trustpilot) would surely be a ticket to the bottom of the SERPs.

It comes back to that barometer: these rating sites can be a good general test of how your business is performing. But that is all it is, a measure, not a control.


About the Author:

Dave Smart

Technical SEO Consultant at Tame the Bots.
