One of the most common problems encountered when trying to rank in Google is that your website is not being indexed correctly. If this is the case, Google is failing to access your web pages and index your site’s content effectively.
To check whether your site is being crawled and indexed properly, log into your Google Webmaster Tools and open the “Google Index” tab. There you will find the total number of pages the search engine has indexed. If you see a drop in that number, you are likely to see a decrease in traffic as well.
Finding the Reason Behind Your Indexing Issues
If you’ve taken a look at your Webmaster Tools and it’s clear that not all of your pages are being found by Google’s crawlers, it’s time to take a closer look at the problems Google may be experiencing with your website.
Does Your Site Have Crawler Errors?
To find out whether Google is indexing your site fully, begin by heading to your Google Webmaster Tools dashboard and checking your Crawler Error messages. The most likely error you will find is a 404 HTTP status code, which signals that the URL cannot be found.
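To see what a 404 means in practice, here is a minimal Python sketch (the `fetch_status` helper is my own name, not part of any Google tooling) that reports the HTTP status code a crawler would receive for a URL, demonstrated against a throwaway local server:

```python
import threading
import urllib.error
import urllib.request
from http.server import HTTPServer, SimpleHTTPRequestHandler

def fetch_status(url):
    """Return the HTTP status code a crawler would see for this URL."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 404 Not Found, 500 Server Error

# Throwaway local server standing in for your website.
server = HTTPServer(("127.0.0.1", 0), SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = "http://127.0.0.1:%d" % server.server_address[1]

print(fetch_status(base + "/no-such-page"))  # 404 -- a crawler error
server.shutdown()
```

Anything other than a 200 response here is what shows up as a crawl error in your Webmaster Tools reports.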
Other crawling errors include:
- Robots.txt – A poorly scripted robots.txt file can be detrimental to your Google indexing. This text file is a set of instructions telling search engine crawlers which parts of your website not to index. If it contains the lines “User-agent: *” followed by “Disallow: /”, it tells every single crawler that encounters it to ‘get lost’ – including Google.
- .htaccess – This hidden file can block crawlers or trigger unwanted redirects if incorrectly configured. Most FTP clients let you toggle the display of hidden files so that you can access it if required.
- Meta Tags – If you have pages that aren’t being indexed, be sure they don’t have the following meta tag in the source code, which tells crawlers not to index the page or follow its links: <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
- Sitemaps – If you receive a Sitemaps crawling error, it means your website sitemap is not updating properly; your old sitemap is being repeatedly sent to Google instead. When you’ve tackled any issues signalled by the Webmaster Tools, make sure you run a fresh sitemap and re-submit it.
- URL Parameters – Google allows the option to set URL parameters when it comes to dynamic links. However, incorrect configuration of these can result in pages that you do want picked up being dropped instead.
- DNS or Connectivity issues – If Google’s spiders simply can’t reach your server, you will encounter a crawler error. This can happen for a variety of reasons, such as your host being down for maintenance or suffering an outage of its own.
- Inherited Issues – If you have bought an old domain, or moved your website to a domain that previously hosted another site, it is possible the previous site carried a Google penalty. This will inhibit indexing of the new site, and you will have to file a reconsideration request with Google.
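To tie the robots.txt point above to something concrete, Python’s standard library can parse a robots.txt file and tell you whether a given crawler is allowed in. A small sketch (the rule sets here are made-up examples, not from any real site):

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that blocks every crawler from the whole site.
blocking = RobotFileParser()
blocking.parse([
    "User-agent: *",
    "Disallow: /",
])
print(blocking.can_fetch("Googlebot", "https://example.com/any-page"))  # False

# A permissive file (empty Disallow) lets every crawler in.
permissive = RobotFileParser()
permissive.parse([
    "User-agent: *",
    "Disallow:",
])
print(permissive.can_fetch("Googlebot", "https://example.com/any-page"))  # True
```

Running this kind of check against your own robots.txt is a quick way to confirm you aren’t telling Google to ‘get lost’ by accident.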
If you are considering using a historic domain for your site, be sure to take a look at its history before purchasing. You can make use of the Internet Archive’s Wayback Machine to see pages that were previously hosted on your domain.
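On the sitemap point above: when you regenerate and re-submit, the file Google expects follows the sitemaps.org XML format. A minimal example (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```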
Does Your Site Have Syntax Errors or Structural Complications?
Google is very tolerant when it comes to HTML mark-up mistakes within webpages, but it is possible that syntax errors can prevent indexing (in extreme cases). Check your site’s HTML with the W3C’s HTML Validator to see a report of errors you need to correct.
Does Your Site Have Inbound Links?
To be indexed by Google, your website needs at least one quality inbound link from another website already in the index. This is a common reason many new websites take a while to be indexed successfully.
One way to create some quick links is to update social networks with your website URL or add a link on an existing related website that you own. Social media profiles that carry high weight include: Facebook pages, Twitter profiles, Google+ profiles/pages, LinkedIn profiles, YouTube channels, and Pinterest profiles.
Offsite content is another excellent way to build links that will help your site get indexed properly. Offsite content is content relevant to your site that is hosted elsewhere, such as guest posts on other blogs in your niche. Just keep in mind that these external sites must be high quality, as links from ‘spammy’ sites will do your website harm instead of good. The best links are ‘natural links’: links that develop as part of the dynamic nature of the internet, where other sites link to content they find valuable.
See Google’s Webmaster Guidelines for a more in-depth understanding of what Google considers these to be.
Has Google Penalized You?
One of the most difficult obstacles to proper indexing is a Google penalty. There are a number of reasons why you might incur one, but if you do not deal with the issues it raises, you may be deindexed (removed from Google’s search results entirely).
Avoid Google Penalties by Steering Clear of The Following Techniques:
- Automatically generating content
- Link schemes
- Plagiarizing or duplicating content
- Sneaky redirects
- Hidden links & text
- Doorway pages
- Content scraping
- Affiliate programs with little content value
- Using irrelevant keywords
- Pages that install trojans, viruses, & other malware
- Abusing rich snippets
- Automating queries to Google
Recovering from Google penalties requires hard work and due diligence on your part to remove the offending links; you will then need to submit a reconsideration request before your site is indexed and ranked effectively once more.
Fix Your Indexing
Most of these checks are quick and easy to make, so don’t let your SEO and link-building efforts go to waste – make sure your website is indexed correctly by Google. It’s surprising how many websites make small mistakes that prevent them from being indexed correctly. In the end, that hurts their rankings, which hurts their traffic, which hurts their sales.