Why ‘Being Natural’ Is the First and Foremost Golden Rule of SEO
When it comes to updating its search algorithm, Google takes sharp steps to make sure that genuine white hat SEO is what actually gets rewarded. The sole motto behind this is to deliver better, more reliable search results to users. Continuing along this line, the EMD update that Google introduced alongside the Panda update will check and reduce “low quality” exact match domains in the search results. Disavow was yet another blow to traditional link building: site owners can now ask Google to ignore low quality inbound links pointing to their site.
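As a concrete illustration of what using Disavow involves, the file uploaded through Google’s Disavow Links tool is plain text, one entry per line: a `domain:` prefix disavows every link from that domain, a bare URL disavows a single linking page, and lines starting with `#` are comments (the domains below are placeholders):

```text
# Link removal requests sent 10/15/2012, site owners never responded
domain:spammy-directory.example
domain:paid-link-network.example
http://low-quality-blog.example/page-linking-to-us.html
```

Google treats disavowing as a last resort and recommends first trying to get bad links removed at the source.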
The traditional way to reach the top 10 rankings was to register a domain and build an authority site on it: the more exactly the domain name featured the keyword, the higher the chance of ranking well. This strategy has helped many B2B and B2C sites enjoy raised rankings by incorporating high-traffic keywords into their domain names. Such domains bear an exact relevance to the searches, and are thus called Exact Match Domains (EMDs). However, nothing remains ‘unregistered’ to Google’s eyes: the new EMD update will take action against these sites and offer users more natural, relevant search results.
Many exact match domains tend to be spammy, as most are devoid of quality content. Thin articles stuffed with keywords, unrelated links, and too many ads are what exact match domain websites usually indulge in, and most of them provide little or no value to users. The new EMD update will downgrade these websites and encourage sites that stick to white hat SEO techniques and provide value to users through quality content.
Getting a Better Understanding of Google’s New Update
Google’s Matt Cutts, then head of the webspam team, had tweeted “reduce low-quality ‘exact-match’ domains in search results”, which summed up the EMD update. The most notable feature of this algorithm update is how it works: it does not eliminate all exact match domain websites, but keeps a tight check on them to make sure that only sites with useful, quality content get promoted. The algorithm will run quite often to deal with sites that escaped the previous updates. Google has made the filtering process much harder to game and is now intolerant of shortcut SEO strategies and black hat optimization methods.
It is making search and ranking more natural, and this has worried many webmasters whose sites ranked highly before. Figures obtained so far show that almost 1,000 SERPs have seen sites drop out of the top 10 rankings, and many recent entrants have seen sharp declines. Strangely, the downgrading does not follow any clear pattern, and the penalty has taken a toll on many renowned top ranking websites. For instance, some sites that ranked at #1 are now sitting at #234, on the 24th page or so.
The question remains whether all exact match domain websites should worry about this. The answer is no: Google has not downgraded all exact match domains, since some of them have quality content and do not indulge in black hat marketing and SEO methods.
Overcoming the Updates
The answer to the ultimate question of how to achieve good rankings in Google lies in the criteria Google follows when ranking websites. Whatever the niche, Google looks for genuine, authoritative sites that have quality content, provide a rich experience, and above all look natural. Naturalness and authority are the principal factors that can catapult any website to the top of the search results. So what does ‘natural’ mean? In Google’s prying eyes, a natural website is one with high quality, unique, dependable content and a moderate commercial slant. The website should possess a blend of relevant and less relevant backlinks, acquired naturally through approved backlinking methods over a long period.
Any authentic, legitimate website in any niche will have high quality original content, a rich user experience, and a naturally acquired, diverse backlink profile. Ranking a website the natural way is time-consuming, but it is undoubtedly the most effective and straightforward way to update-proof the site. Here are some key points to consider while optimizing a website in the post-Panda, EMD, and Disavow scenario, all of which emphasize the importance of being natural:
- Take down all poor quality content and links from the website. Clean out content written solely for SEO rather than for readers; it is often worth outsourcing the writing to a professional ghostwriter, since a good writer can do wonders for a website. It is fine to leave a small portion of the less relevant links in place just to maintain the diversity of the site’s link structure, and thereby look natural to the search engines.
- Unnecessary link farming and keyword stuffing can get a website penalized in no time. The content of every article on a site should be top notch and information-oriented. Using keywords purely to chase higher rankings is not recommended, as it can prove counterproductive. Moreover, the content must be original and free of grammar, spelling, and other errors.
- If a majority of the contextual links, whether inbound or outbound, seem to be of low quality or spammy, the result can be poor search engine rankings and low PageRank. Try to provide outbound links related to the content of the page, directed to authority sites, for users who wish to read or find out more on a topic. Using outbound links well is one of the most effective ways to improve user experience.
- Internal linking is another white hat SEO strategy that can boost rankings. Try linking contextual keywords to related pages of the site. The best example of a website with an extensive internal link structure is Wikipedia: virtually every key term on a Wikipedia page links to a related page within Wikipedia itself. This improves the user experience of the site, apart from increasing the internal link count.
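The internal/outbound split described in the points above can be checked programmatically during a cleanup. Here is a minimal sketch using only the Python standard library (the `LinkAuditor` name and the sample URLs are illustrative, not part of any official tool):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAuditor(HTMLParser):
    """Collects <a href> targets and splits them into internal vs outbound."""

    def __init__(self, site_domain):
        super().__init__()
        self.site_domain = site_domain
        self.internal, self.outbound = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative URLs and same-host URLs count as internal links.
        if not host or host == self.site_domain:
            self.internal.append(href)
        else:
            self.outbound.append(href)

auditor = LinkAuditor("example.com")
auditor.feed('<a href="/about">About</a> '
             '<a href="https://en.wikipedia.org/wiki/SEO">SEO</a>')
print(len(auditor.internal), len(auditor.outbound))  # 1 1
```

Reviewing the outbound list by hand then shows whether those links actually point at authority sites, as recommended above.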
- Aim for a keyword density of around 2% and avoid keyword stuffing. During on-page optimization, avoid over-optimizing alt and meta tags, and use keywords that blend into the content. Never sacrifice readability for optimization. Also use more images and videos on the home page; Google loves content that provides a rich user experience.
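Keyword density is simple arithmetic: occurrences of the keyword divided by the total word count. A rough sketch (the 2% target above is this article’s guideline, not an official Google threshold):

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that are exactly `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

text = "Natural content mentions the keyword where the keyword fits naturally."
print(round(keyword_density(text, "keyword") * 100, 1))
# 20.0 -- far above the ~2% guideline, i.e. what stuffing looks like
```

A 500-word article would need roughly 10 occurrences or fewer of a keyword to stay near the 2% mark.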
- Let people know about the website. Sharing content on various social media sites can make a website visible to a ridiculously large number of people. Social media acceptance is proof that people like the website and find it useful. This has also turned into an important criterion for ranking. Social media are also a terrific source of targeted traffic.
- Link building is the best way to direct high traffic to a page. Most traditional backlinking strategies have been ruled out by Google. Stop link farming and spammy backlinking through article submission. Having spammy links within the content is a crime as far as Google is concerned. Such methods no longer work. To stay on the safe side, use backlinking methods approved by Google.
Guest posting is one of the quicker link building methods through which it is possible to reach out to a wider section of readers. Today, most authority sites in any niche allow a few backlinks for guest posts with high quality unique content. Opt for blogs in the same niche that have high page ranks. Guest posting also helps the writer to build a good reputation in the niche.
Authoring articles for Web 2.0 sites is another sure-shot strategy for earning high quality backlinks. Some Web 2.0 sites, such as Squidoo, have PageRanks as high as 7; make use of such sites to build an arsenal of high quality backlinks. Do not entirely rule out article submission either, for link diversity is essential to looking natural. Whichever method is followed, make sure the links get indexed gradually over time so that the link building program as a whole looks natural. Note that outsourcing the link building process may not always be a good idea.
It all started with the Google Panda update, followed by Penguin. Panda still keeps a check on poor quality content and keywords on the web, EMD checks low quality exact match domain names, and as all this happens, Disavow cleans up the link profiles of sites. It would be absurd to say that SEO is dead, but it is certainly tougher than ever. More Google updates in the same vein are expected, but chasing and dodging each update’s metrics is a losing game. All we have to do is follow the golden rule and be natural.