
Sitemap blocked by robots.txt – Resolve it now

Feb 20, 2021

Stuck with the error sitemap blocked by robots.txt? We can help you resolve it.

This error can occur for a number of reasons, including disallow rules set in the robots.txt file, a migration from HTTP to HTTPS, and so on.

Here at Bobcares, we receive requests from our customers to fix errors related to robots.txt as a part of our Server Management Services.

Today, let’s look at how our Support Engineers resolve the sitemap blocked by robots.txt error.

 

How do we fix the sitemap blocked by robots.txt error?

Now that we know the causes of this error message, let’s see how our Support Engineers fix it.

 

1. Setting HTTPS in the robots.txt file

One of our customers recently approached us with this error message, telling us that robots.txt was working fine until an SSL certificate was installed on the website.

For any robots.txt trouble, our Support Engineers normally start troubleshooting by checking the robots.txt file. We look for any disallow rules that could be causing the error.

On checking, we found that the robots.txt file contained the code below.

# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
# Website Sitemap
Sitemap: http://www.mydomain.com/sitemap.xml

# Crawlers Setup
User-agent: *

# Allowable Index
Allow: /index.php/blog/

# Directories
Disallow: /404/
Disallow: /app/

Here, the sitemap URL is set to HTTP. The customer had earlier mentioned that the website was working well before SSL was installed. So, after installing SSL on the website, the Sitemap URL must also be updated to HTTPS.
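For illustration, the relevant line of the corrected file would look like this (using the same placeholder domain as above):

# Website Sitemap
Sitemap: https://www.mydomain.com/sitemap.xml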

Once this was set, we suggested the customer wait for the search engine to re-crawl the website.

Finally, this fixed the error.

 

2. Telling Google to re-crawl the website

We had another customer with the same problem but a different solution.

Let’s take a look at it.

Our Support Engineers started troubleshooting the problem by checking the robots.txt file, and here it is:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

The file was fine, as the disallow rule applied to a different path than the one having the problem.

Also, we confirmed with our customer that there were no disallow rules blocking the sitemap.
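As a quick way to double-check this kind of rule, the live robots.txt can be tested programmatically with Python’s standard urllib.robotparser module. This is only a minimal sketch; the domain and sitemap path below are placeholders, not the customer’s actual URLs.

import urllib.robotparser

# Load the live robots.txt file (placeholder domain)
parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://www.mydomain.com/robots.txt")
parser.read()

# Check whether a generic crawler is allowed to fetch the sitemap URL
sitemap_url = "https://www.mydomain.com/sitemap.xml"
print(parser.can_fetch("*", sitemap_url))  # True means the URL is not blocked

If this prints True, the robots.txt rules are not the problem, and the next step is to ask Google to re-crawl the page.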

So we further continued troubleshooting by manually telling Google to crawl the website.

We did this by navigating to the Search Console property >> Crawl >> Fetch as Google >> Add, entering the URL path that Google was warning about, and clicking Fetch. Once the page reloaded, we clicked Request Indexing >> Crawl only this URL.

 

3. Using the robots.txt tester to fix the sitemap blocked error

We also suggest that our customers use Google’s robots.txt tester to check the warnings and error messages that are generated.

This will provide a detailed description of the error.
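Before pasting rules into the tester, it can also help to confirm exactly what the server returns for robots.txt, since that is what crawlers actually see. A minimal sketch, again using a placeholder domain:

import urllib.request

# Fetch the live robots.txt so we see exactly what crawlers see (placeholder domain)
with urllib.request.urlopen("https://www.mydomain.com/robots.txt") as response:
    print(response.read().decode("utf-8"))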

[Need any further assistance in fixing robots.txt error? – We’ll help you]

 

Conclusion

In short, the sitemap blocked by robots.txt error is generally caused by an improperly configured robots.txt file. Today, we saw how our Support Engineers fixed this error.

