Enhancing Website Crawlability

Crawlability: it sounds like something that you don’t want to have.  Are you crawlable?  Good heavens, no!  But in the SEO world, crawlability is a desirable attribute.  It is the ease with which search bots can comb through your site, and it has a tremendous impact on organic search rankings.  This is especially true of large sites that have thousands of pages, but following a few good practices can benefit any site.

First, diagnose your crawlability problem, if you have one.  One indicator may be unusually low organic traffic to your site, but to be sure crawlability is the culprit, use the Fetch as Googlebot tool. This allows you to see your site from the perspective of a Google robot; you can determine which pages are problematic and whether you might be unintentionally blocking bots from crawling your site.

If you do have areas in which you need help, here are some steps you can take to make it easier for your site to be crawled:

  • Give the bots more high-quality content to crawl.
  • Go through existing content and see if you can consolidate or take down pages that do not add enough value by themselves. It is better to have 100 strong content pages than 1000 weaker ones.
  • Build an XML sitemap to give the spiders more information and a clear list of the pages you want crawled (see the sketch after this list).
  • Pare down your navigational structure and avoid excessive use of Flash, Ajax, JavaScript, and other heavy elements.  Not only will this help bots crawl your site more easily, it will appeal to the humans, too.
  • Make sure to use search-engine-friendly URLs, optimised with relevant keywords.  For example, /services/seo-audit is easier for both bots and humans to read than /page.php?id=142.
  • Use robots.txt to suggest which parts of your site the bots should and should not crawl (there is an example after this list). Google’s bots do not technically have to heed these requests, but they typically do, so it can be a worthwhile step.
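
To give a sense of what the spiders are looking for, here is a minimal XML sitemap sketch.  The domain and paths are placeholders; list your own pages, then reference the file from robots.txt or submit it to the search engines directly.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want the bots to find; <loc> is required,
           and optional tags such as <lastmod> can add extra signals -->
      <url>
        <loc>https://www.example.com/</loc>
      </url>
      <url>
        <loc>https://www.example.com/services/seo-audit</loc>
      </url>
    </urlset>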

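A robots.txt file is just a plain-text set of crawl directives that sits at the root of your domain.  The Disallow paths below are made-up examples; point them at the areas of your own site that add no search value.

    User-agent: *
    Disallow: /admin/     # illustrative: keep bots out of back-end pages
    Disallow: /search     # illustrative: internal search results add little value
    Sitemap: https://www.example.com/sitemap.xml

If you want to double-check how a crawler will read those rules, Python’s standard library includes a robots.txt parser.  This sketch assumes the hypothetical domain and rules shown above.

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the site's live robots.txt file
    parser = RobotFileParser()
    parser.set_url("https://www.example.com/robots.txt")
    parser.read()

    # can_fetch() reports whether a given user agent may crawl a URL
    print(parser.can_fetch("Googlebot", "https://www.example.com/admin/settings"))      # False under the rules above
    print(parser.can_fetch("Googlebot", "https://www.example.com/services/seo-audit"))  # True under the rules above
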
The next step is to realise that just because a bot can crawl your site doesn’t mean it has indexed it.  We’ll talk about that next time!
