So it’s been a couple weeks since you launched that flashy new website, and it’s still not indexed. What you may have is a nasty case of poor crawlability.


Crawlability refers to a search engine's ability to crawl the entire text content of your website, navigating easily to every one of your web pages without hitting an unexpected dead end. Poor crawlability means something on your site is restricting crawler access in one or more areas, eliminating any chance those pages will be scanned and indexed by search engines.

Since the only way people can find your site (other than typing the address in directly or clicking a link on another website) is via search engines, this is a pretty big issue. A good SEO should be able to catch this pretty quickly, but if you have a bad one…yikes!

Luckily, you have me. So if you’re having problems getting rankings, check to see if you’re guilty of doing one of the following 6 things.

#1. You’ve Disallowed Indexing via Your Robots.txt File

Your robots.txt file lives in your root directory, and you can use it to restrict access to certain areas or folders on your website (so that they won't be indexed). That's handy during a site build-out, for example, but if you've forgotten to change it back afterward, well, there's your problem.

You're restricting access to your entire website if your robots.txt file contains something like this:

User-agent: *
Disallow: /
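
If you want to verify what a robots.txt like that actually blocks, Python's standard-library robots.txt parser makes for a quick sketch (the Googlebot user agent and example.com URL here are just placeholders):

```python
from urllib.robotparser import RobotFileParser

# The blanket-disallow rules from the example above.
rules = ["User-agent: *", "Disallow: /"]

parser = RobotFileParser()
parser.parse(rules)

# Any crawler asking for any page on the site gets refused.
print(parser.can_fetch("Googlebot", "https://example.com/"))  # False
```

Point the same parser at your live robots.txt and you can check any URL on your site before a crawler has to find out the hard way.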

In order to fix this, remove the blanket Disallow rule (or leave its value empty) so crawlers can reach your pages:

User-agent: *
Disallow:

#2. You’ve Disallowed Indexing via the Noindex, Nofollow Tag

Sometimes, it’s as simple as a meta tag. Check your website’s <head> section to see if you’re using a potentially restrictive robots meta tag.

It will look like this:

<meta name="robots" content="noindex, nofollow">

Change it to this:

<meta name="robots" content="index, follow">
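
If you'd rather not eyeball the HTML by hand, here's a minimal sketch using Python's built-in HTML parser to flag a noindex directive (the sample HTML string is just an illustration):

```python
from html.parser import HTMLParser

class RobotsMetaChecker(HTMLParser):
    """Flag a restrictive robots meta tag in a page's HTML."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            if "noindex" in attrs.get("content", "").lower():
                self.noindex = True

html = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
checker = RobotsMetaChecker()
checker.feed(html)
print(checker.noindex)  # True
```

Feed it the source of any page you're worried about and `noindex` tells you whether you've locked the crawlers out.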

#3. Too Many URL Parameters To Pass

In this day and age, this shouldn’t be too much of a problem. Many websites use dynamic URLs for large inventories and the like. However, it is still something to check. If your URLs make a web crawler pass through more than 3 parameters, chances are they need to be shorter.

Example of a URL with too many parameters:

https://example.com/products?cat=12&id=483&sort=price&page=2

Personally, I recommend just using keywords in your URLs instead, as this is a better SEO practice.
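
Counting parameters is easy to automate with Python's standard URL utilities. This sketch uses a hypothetical dynamic URL that breaks the 3-parameter rule of thumb above:

```python
from urllib.parse import urlparse, parse_qsl

def parameter_count(url):
    # Count the query-string parameters a crawler has to pass through.
    return len(parse_qsl(urlparse(url).query))

# Hypothetical dynamic URL with four parameters -- one over the limit.
url = "https://example.com/products?cat=12&id=483&sort=price&page=2"
print(parameter_count(url))  # 4
```

Run it over a list of your site's URLs and anything returning more than 3 is a candidate for a shorter, keyword-based rewrite.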

#4. Poor Internal Link Structure

Maybe you have one page indexed and that’s it. If that’s you, better check your site’s internal link structure. See whether your homepage links to other pages on your website, and whether those pages in turn link to additional interior pages.

Remember, if you don’t link to your site’s internal pages then a webcrawler won’t be able to do its job (which is to crawl through your site’s links, scan the pages, and index them).
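
A quick way to sanity-check this is to pull the links out of your homepage's HTML and eyeball whether they actually point at your interior pages. A minimal sketch (the two-link snippet is just a stand-in for your real homepage source):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every anchor tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

homepage = '<a href="/about">About</a> <a href="/services">Services</a>'
collector = LinkCollector()
collector.feed(homepage)
print(collector.links)  # ['/about', '/services']
```

An empty list on your real homepage means a crawler has nowhere to go after page one, which matches the one-page-indexed symptom exactly.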

#5. Limiting Yourself With Technology

You would be surprised how many people have done this unknowingly. Ever see a website whose homepage forces a user to fill out a form to get through to the site’s internal pages? Ever see a website whose content shifts via AJAX or JavaScript, but the URL doesn’t change? Crawlers can’t fill out forms, and content that never gets its own URL can’t be indexed on its own, so to a search engine those pages may as well not exist.

#6. Server & Redirection Errors

This is what I like to think of as “Oh, Shit!” mode for your site. Ever seen a pesky 404 error page, or set up a 301 redirect? Time to double-check your site to see if this is the problem. Here are the server response code ranges and what type of problem they indicate…

100-199: Server response codes in this range confirm that a request was received and is being processed.

200-299: All is OK. What the client requested is available.

300-399: Request was not performed, a redirection is occurring.

400-499: Request is incomplete for some reason.

500-599: Errors have occurred in the server itself.

Source: Server Response Code Documentation
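
If you're scanning logs for trouble, the ranges above boil down to a tiny helper like this (a sketch, not an official classification):

```python
def status_category(code):
    # Bucket an HTTP status code into the ranges listed above.
    if 100 <= code <= 199:
        return "informational"
    if 200 <= code <= 299:
        return "ok"
    if 300 <= code <= 399:
        return "redirection"
    if 400 <= code <= 499:
        return "client error"
    if 500 <= code <= 599:
        return "server error"
    return "unknown"

print(status_category(404))  # client error
print(status_category(301))  # redirection
print(status_category(500))  # server error
```

Anything landing in the 400s or 500s across your site's pages is exactly the kind of dead end that stops a crawler cold.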

Make sure you get on top of your server issues fast. If you don’t know how to handle it on your own, go to your IT or in-house tech guy right away. They will probably know how to fix it.

Hope this helps. Happy Tuesday everyone!

Image credit: SEO Hacker