Why is Google not crawling my pages?

  • Post category: MarTech

We often find ourselves in a situation where it seems we have done everything right while developing a website, yet Google is still not crawling, and hence not indexing, the pages.

Here are the most likely causes:


Missing or misconfigured robots.txt

Robots.txt is a way to tell search engine spiders, including Googlebot, which pages of your site may be crawled and which should be ignored. It is therefore essential that you configure this file correctly.
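As a minimal sketch, a robots.txt that lets crawlers reach the whole site while blocking a hypothetical /admin/ area (the path and domain here are illustrative, not from any real site) might look like:

```text
# robots.txt — must live at the site root, e.g. https://example.com/robots.txt
User-agent: *                              # applies to all crawlers, including Googlebot
Disallow: /admin/                          # hypothetical private area to keep out of crawling
Sitemap: https://example.com/sitemap.xml   # tell crawlers where the sitemap lives
```

A common mistake is an accidental blanket `Disallow: /`, which blocks the entire site from crawling.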

Incorrect “.htaccess” file settings

The .htaccess file is mainly used to redirect your URLs properly, not only when your website is new, but also when you are redesigning an existing website with a change in URL strategy or a change of domain. If it is not configured correctly, you can lose your existing organic traffic.
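A minimal sketch, assuming an Apache server with mod_rewrite enabled; the domains and paths below are hypothetical placeholders:

```apache
# .htaccess — 301 (permanent) redirects preserve most ranking signals
RewriteEngine On

# Hypothetical example: the site moved to a new domain
RewriteCond %{HTTP_HOST} ^old-example\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-example.com/$1 [R=301,L]

# Hypothetical example: a single page moved under a new URL strategy
Redirect 301 /old-page.html /blog/new-page/
```

Using 302 (temporary) redirects instead of 301 here is a frequent mistake: a 302 tells Google the old URL will return, so the ranking signals are not consolidated on the new URL.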

Badly written title, meta & author tags

These tags add value to your content. They describe your pages and help associate a known author with them.
Follow the format below.

<title>Google typically displays the first 50-60 characters</title>
<meta name="description" content="Meta descriptions can be any length, but search engines generally truncate snippets longer than 160 characters. It is best to keep meta descriptions between 150 and 160 characters.">
<meta name="author" content="http://plus.google.com/{your-profile-id}">

Apart from this, make sure content is not excluded by a nofollow or noindex robots meta tag.
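For reference, the robots meta tag that blocks indexing looks like this; if it appears unintentionally (some CMS themes and "maintenance mode" plugins insert it), Google will drop the page:

```html
<!-- Keeps the page out of the index and tells crawlers not to follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- For pages you want indexed, simply omit the tag, or state the default explicitly -->
<meta name="robots" content="index, follow">
```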

Incorrectly configured URL parameters

You can configure URL parameters in Webmaster Tools to tell Google which dynamic links you do not want indexed. Configuring these incorrectly can exclude pages you actually want in the index.

Connectivity or DNS issues

These issues prevent Google's spiders from reaching your web server. There can be several reasons, such as your host being down for maintenance or a glitch in your own environment. Check your environment setup or contact your hosting service provider.

Domain with bad history

Domains that were used for link spam or other penalty-worthy schemes, cloaking, or pages that install trojans, viruses, and other adware will be de-indexed by Google.

In addition, take care of the following to help Google crawl your pages:

  1. Check the Crawl Errors report in Google Webmaster Tools and address the errors it lists.
  2. Be extra careful with AJAX applications that load content dynamically, and follow Google's guidelines for Making AJAX Applications Crawlable so the content is crawlable and indexable.
  3. Add a robots.txt file and make sure it is working.
  4. Add a sitemap to your site and submit it in Webmaster Tools.
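As a minimal sketch, a sitemap.xml for a hypothetical two-page site (the URLs and dates are placeholders) might look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
    <lastmod>2015-05-20</lastmod>
  </url>
</urlset>
```

Only `<loc>` is required per URL; the other elements are optional hints. Submit the file in Webmaster Tools and reference it from robots.txt with a `Sitemap:` line so crawlers can discover it on their own.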

