
Reverse Engineering Human Rating to Predict the Future of Search

The key to remaining competitive in organic search is to always be two steps ahead of the search engines. Given the frequency and impact of search engine algorithm updates, those who don’t consider the long-term effects of their optimization efforts are subject to pitfalls similar to those many brands experienced after Google’s Panda and Penguin updates of 2012.


Part of the puzzle of predicting the direction search engines will take with their algorithms is first understanding their goals and aligning them with the goals of any brand’s SEO campaign. Google’s mission is to provide users with the most relevant and useful information for their searches. Therefore, in order to maintain organic performance, a brand’s content plan must take similar aim.

The other important puzzle piece in predicting how search engines will evolve is gaining a deep understanding of the current landscape and the quality signals the engineers at Google, Bing, etc. are closely monitoring. This allows search marketers to begin to understand the direction search engines are moving as they further tweak their algorithms. This direction was much easier to see in the days of Google Labs, which allowed public observation of some of the new products Google was testing; unfortunately, Google Labs was shut down in 2011.

Although the Google Labs project was discontinued, Google gave SEO professionals a very useful clue this year as to where its search algorithms are headed. This came with the leak of the company’s human rater handbook in early September. As part of its internal process to improve search results, Google recruits hundreds of individuals to manually assess the quality of content on specific URLs. Google then uses that data to modify its algorithms and (hopefully) provide a better user experience for its search engine users.

The handbook provides clear cues as to how Google assesses the quality of brands’ content.

By understanding the information Google is attempting to gain through this human-review process, SEO practitioners can begin to understand the direction search engine algorithms will be moving as they progress.

In order to translate this knowledge into tactical SEO operations, it’s important to think about how Google could algorithmically look for the signals it’s asking human reviewers about.

Of course, Google can’t possibly use human raters to assess every webpage in its index. It’s up to the company’s engineers, therefore, to develop ways for Google’s spiders to look for the same quality factors that humans use.

For example, Google asks, “Is it clear who is responsible for the content of the website?” and “Is it clear who is responsible for the content of the page?” By thinking about how these factors could be used algorithmically, it would make sense for search marketers to have a visible link on all content pages to an “About Us” page, which succinctly describes the company, brand, relevant individuals, and provides contact information for those who manage the website.

In order to understand who is responsible for the content of a specific page, search engines could look for a byline marked up with rel=author authorship markup. Therefore, those elements should be used wherever applicable.
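As a minimal sketch, a byline using rel=author markup might look like the following. The author name, article title, and profile URL are placeholders, not values from any real site:

```html
<!-- Hypothetical article byline; the name and profile URL are placeholders -->
<article>
  <h1>Example Article Title</h1>
  <p class="byline">
    By <a rel="author" href="https://plus.google.com/0000000000">Jane Author</a>
  </p>
  <p>Article content…</p>
</article>
```

The rel="author" attribute on the link tells crawlers which profile page identifies the person responsible for the content.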

Before a new page goes live, a good way to determine whether it has the potential to rank in search engines for a substantial amount of time is to test the content against Google’s quality guidelines by conducting a quality evaluation with real test subjects. Recruit target users to review the page(s) and ask them the same questions Google asks its human reviewers. If the content passes the human evaluation, chances are it will fit within Google’s quality guidelines and satisfy its algorithms.
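The review process above can be sketched as a simple tally. This is a hypothetical illustration, not Google’s actual scoring: the question list uses the two questions quoted earlier in this article, and the pass threshold is an assumed value you would tune yourself.

```python
# Minimal sketch of tallying a human quality review. Each recruited
# reviewer answers yes/no to rater-handbook-style questions; the page
# "passes" if the average share of yes answers meets a chosen threshold.

QUESTIONS = [
    "Is it clear who is responsible for the content of the website?",
    "Is it clear who is responsible for the content of the page?",
]

def page_passes(reviews, threshold=0.8):
    """reviews: list of dicts mapping question -> bool, one per reviewer.

    Returns True if the mean fraction of 'yes' answers across all
    reviewers meets the (illustrative) threshold."""
    if not reviews:
        return False
    scores = [
        sum(answers.get(q, False) for q in QUESTIONS) / len(QUESTIONS)
        for answers in reviews
    ]
    return sum(scores) / len(scores) >= threshold

# Example: one reviewer answers yes to both questions, one says yes/no.
reviews = [
    {QUESTIONS[0]: True, QUESTIONS[1]: True},
    {QUESTIONS[0]: True, QUESTIONS[1]: False},
]
print(page_passes(reviews))  # mean score 0.75, below the 0.8 threshold
```

The same structure extends naturally to the longer checklists in the leaked handbook; only the QUESTIONS list changes.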

In the modern era of limited positive impact from SEO tactics aimed at artificially enhancing search engine results, it’s time to start focusing on optimizing content for users first and search engines second, not the other way around. Optimizing content to engage actual users, and providing them with real value, will not only help build trust in a brand but also build long-lasting organic search performance.

Marc Purtell is Director of SEO at Matomy SEO, a search marketing and SEO consultancy that is part of the global performance-based marketing company Matomy Media Group (LSE:MTMY). He can be contacted at mpurtell@matomy.com.


4 thoughts on “Reverse Engineering Human Rating to Predict the Future of Search”

  1. I think a lot of it is common sense. If a company has been optimizing sites correctly, i.e., good content and no spam, their sites should weather the Google changes without a glitch. What’s changed is that Google likes to see a company web community, if you will, not solely thousands of irrelevant back-links and keyword spam.

  2. Marc,

    I agree with your thoughts on the future of SEO and the criticality of thinking beyond today. However, I am no longer sold entirely on the premise: “Google’s mission is to provide users with the most relevant and useful information relevant to a user’s search.”

    I say this because Google has, of late, demonstrated a lack of objectivity by returning many of their results first, even if they are not the best. Two examples would be marginal Google author and YouTube results gaining the top spot. I wrote about this a while back in a post called Google Then and Now: Has Google Lost Their Way…or Found It?: http://richardcummings.info/google-then-now-lost-way-found-it/

    Certainly, everybody optimizes for Google, but, as you mention, it’s important to keep an eye on what everyone is doing in case the landscape changes.

    Cheers,
    Richard