As a webmaster, you have done everything to make sure that your website catches Google's attention, from producing great SEO content to using all sorts of digital marketing tactics. Yet your website doesn't make it to the top of Google's search rankings. Why? Well, you might be running into what are known as crawl errors.

What are crawl errors?

These are errors that prevent the search engine bot from crawling your website, accessing all the webpages and indexing the entire content. As a result, your webpages end up only partially accessed, or not accessed at all. These errors are very common, but if they are not fixed in time, they will damage your website's health.

You definitely don't want these errors standing in the way of better rankings. All of these errors can be fixed by an expert; if you are looking for one, you can hire an SEO agency in Bangalore, or in any other city, to get the best solution to these errors. Let us classify them and deal with each one. Crawl errors are mainly of two types:

A. Site Errors
These include all the troublesome errors that prevent Google, or any other search engine, from accessing your website.

DNS Errors

DNS (Domain Name System) errors can seriously trouble your website. The DNS is a network of servers that resolves the alphanumeric name of each and every website to an IP address. A DNS error occurs when the Googlebot can't access your website because this lookup fails. Your internet connection is also worth checking here, as DNS resolution is the step that connects a domain name to an IP address.
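As a first check, you can test name resolution yourself before digging into Search Console. Below is a minimal sketch using Python's standard library; example.com is a placeholder for your own domain.

    # Minimal DNS check: does the domain resolve to an IP address?
    # "example.com" is a placeholder; substitute your own domain.
    import socket

    domain = "example.com"
    try:
        ip = socket.gethostbyname(domain)
        print(domain, "resolves to", ip)
    except socket.gaierror as err:
        print("DNS lookup failed for", domain, "-", err)

If the lookup fails here, the problem sits with your DNS configuration or provider rather than with Google.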

How to fix it
  • Open Google Search Console and use the Fetch as Google tool. It shows you how Google crawls your website.
  • Contact your DNS provider if the issue can't be rectified through Search Console.
  • Look out for 404 and 500 error codes on your server. These can help you pinpoint the problem.
  • Use the ISUP.me tool. It lets you know whether your site is down just for you or for everyone.

Server Errors

These errors indicate that your server isn't responding and the Googlebot's request for access times out. In other words, the Googlebot can see your URL and connect to it, but the webpages don't load. This can happen when too many people visit your website at the same instant.

How to fix it
  • Use the Fetch as Google tool to see if the Googlebot is able to access your website. If it displays your homepage without errors, the site is absolutely fine.
  • Identify the type of server error you are facing. There are various types: connection reset, timeout, truncated headers, truncated response, connection refused, connection failed, connect timeout and no response. The Google Search Console Help page provides a solution for each of these problems. (A quick status check from a script is sketched after this list.)
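As a rough first check, you can request your homepage yourself and watch for 5xx responses or timeouts. This sketch uses only Python's standard library; the URL is a placeholder.

    # Request the homepage and report server errors or connection problems.
    # The URL is a placeholder; substitute your own site.
    import urllib.request
    import urllib.error

    url = "https://example.com/"
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            print("Status:", response.status)  # 200 means the server answered normally
    except urllib.error.HTTPError as err:
        print("Server returned an error:", err.code)  # e.g. 500 or 503
    except urllib.error.URLError as err:
        print("Connection problem:", err.reason)  # e.g. timeout or connection refused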

Robots failure

These errors occur when the Googlebot is unable to retrieve your robots.txt file. This is what Google says about the robots.txt file:

“You need a robots.txt file only if your site includes content that you don't want search engines to index. If you want search engines to index everything in your site, you don't need a robots.txt file — not even an empty one. If you don't have a robots.txt file, your server will return a 404 when Googlebot requests it, and we will continue to crawl your site. No problem.”

If you are constantly uploading new content or making changes to your website, you need to rectify this error at the earliest. If you don't, the Googlebot won't crawl your website at all.

How to fix it
  • Check which webpages you don't want the Googlebot to crawl. You shouldn't have "Disallow: /" on its own in the file, otherwise your website won't be accessed by anyone.
  • Use a server header checker tool to see whether your robots.txt file returns a 200 or a 404 status code.
  • Make sure that your robots.txt file is appropriately configured and totally error free. Otherwise, don't have this file at all; all your webpages will then be accessible by default. (A sample file is sketched after this list.)
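For illustration, here is what a small, correctly formed robots.txt might look like. The paths are placeholders; the point is that only specific directories are disallowed, never the whole site.

    # robots.txt - placeholder paths for illustration
    User-agent: *
    Disallow: /admin/
    Disallow: /drafts/

    Sitemap: https://example.com/sitemap.xml

A lone "Disallow: /" under "User-agent: *" would block every crawler from the entire site, which is exactly what the first tip above warns against.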

B. URL Errors

These are specific errors that leave certain webpages on your site inaccessible to Google. A website can accumulate a great many errors of this kind. Google Search Console displays them in three categories: desktop, smartphone and feature phone.

A lot of errors of this sort may pop up, but they are ranked by priority. Click the 'Mark all as fixed' button once you have dealt with them, and look out for new errors regularly.

Soft 404 errors

This error occurs when a page displays a 'page not found' notice but fails to return an actual HTTP 404 status code. Many times, a non-existent page redirects visitors to some irrelevant page instead. In either case, the page is still crawled and indexed.

How to fix it
  • Use a 301 redirect to point each dead page to a relevant live page.
  • Make sure the page has a good amount of content; very thin pages can also trigger this error.
  • Don't redirect dead pages to the site's homepage.
  • If the page is genuinely gone, let it return a 404 or 410 status code, but never a 200. (A quick way to spot offenders is sketched after this list.)
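One rough way to spot soft 404s in bulk is to fetch suspect URLs and flag any that say 'not found' in the body while still returning HTTP 200. The URLs below are placeholders.

    # Flag pages that claim "not found" but still return HTTP 200 (soft 404s).
    # The URLs are placeholders; substitute pages you suspect.
    import urllib.request
    import urllib.error

    urls = ["https://example.com/old-page", "https://example.com/missing"]
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                body = response.read().decode("utf-8", errors="ignore").lower()
                if response.status == 200 and "not found" in body:
                    print("Possible soft 404:", url)
                else:
                    print(url, "->", response.status)
        except urllib.error.HTTPError as err:
            print(url, "->", err.code)  # a real 404 or 410 here is correct behaviour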

404 errors

This is an error message indicating that the Googlebot has tried to crawl a page that doesn't exist. It typically surfaces when other pages or sites still link to that dead URL.

How to fix it
  • Make sure that your page is really published and not deleted or saved as a draft.
  • Check that the 404 URL is the proper page and not some other version.
  • Check whether the error shows up on the www or the non-www version of your website.
  • Redirect the page with a 301 redirect to a relevant live page.
  • In Google Search Console, go to the URL Errors section under Crawl Errors and click on the URL you want to fix. Then remove all the links to that page from the pages that still point to it. (A script for auditing URLs in bulk follows this list.)
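Before setting up redirects, it can help to confirm which reported URLs actually return 404. Here is a small sketch using HEAD requests; the URLs are placeholders.

    # Audit a list of URLs with HEAD requests before deciding on 301 redirects.
    # The URLs are placeholders; substitute the ones reported in Search Console.
    import urllib.request
    import urllib.error

    urls = ["https://example.com/page-1", "https://example.com/page-2"]
    for url in urls:
        request = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(request, timeout=10) as response:
                print(url, "->", response.status)
        except urllib.error.HTTPError as err:
            if err.code == 404:
                print(url, "-> 404: consider a 301 redirect to a relevant page")
            else:
                print(url, "->", err.code)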

Access denied

This error shows up when the Googlebot is totally unable to access a particular webpage.

How to fix it
  • Use the robots.txt tester to view warnings about your robots.txt file.
  • Check your robots.txt file to see which pages you have actually blocked.
  • Use the Fetch as Google tool to check how your site is viewed by the Googlebot.
  • Use Screaming Frog to scan your website. (Or use the short script after this list to test individual URLs.)
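To test programmatically whether your robots.txt blocks Googlebot from a given URL, Python's standard urllib.robotparser module is enough. Both URLs below are placeholders.

    # Check whether robots.txt blocks Googlebot from a given URL.
    # Both URLs are placeholders; substitute your own.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    url = "https://example.com/some-page"
    if rp.can_fetch("Googlebot", url):
        print("Googlebot is allowed to crawl", url)
    else:
        print("Googlebot is blocked from", url, "- check your robots.txt rules")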

Author Bio:
Rtn Bala Kumaran is the CEO and Founder of BrandStory and writes for a variety of online publications. He loves writing blogs and promoting websites in the SEO, guest blogging, education, fashion, travel, health and technology sectors.