'Sitemap contains URLs which are blocked by robots.txt.' warning - however the robots.txt file doesn't appear to be blocking anything

by Peter Rowlands   Last Updated July 04, 2017 09:04 AM

Our site is built on WordPress.

While the site was in development, we had the "Discourage search engines from indexing this site" option checked (Settings → Reading).

We have now made the site live and unchecked this option.

Yesterday I submitted a sitemap to Google's Search Console, but many of the sitemap paths are coming back with the warning:

Sitemap contains URLs which are blocked by robots.txt.

As far as I can tell, none of the site's URLs (apart from /wp-admin/) are being blocked by our robots.txt file. I have tested it in Search Console and it reports the file as fine.

I have read some articles saying that the robots.txt file can be cached for a while, but it has now been a day since I submitted the sitemap.
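For what it's worth, here is a quick sketch (Python standard library only, assuming the usual /robots.txt location on our domain and no CDN rewriting in play) that fetches the live file and prints its cache-related headers, so I can see exactly what is being served right now:

import urllib.request

# Standard robots.txt location for our domain (assumption: no unusual
# rewrites or CDN-level overrides are in play).
url = "https://www.justaccounts.com/robots.txt"

with urllib.request.urlopen(url) as resp:
    body = resp.read().decode("utf-8", errors="replace")
    # Cache-related headers give a rough idea of how long an old copy
    # could keep being reused.
    print("Cache-Control:", resp.headers.get("Cache-Control"))
    print("Expires:", resp.headers.get("Expires"))
    print(body)

The body it prints matches the robots.txt shown below.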

Is there anything I am missing, or anything I can do, to stop this warning from being thrown?

Sitemap - https://www.justaccounts.com/sitemap_index.xml

Our robots.txt file:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

A few examples of the URLs that are being flagged as blocked:

  • https://www.example.com/page-sitemap.xml
  • https://www.example.com/category-sitemap.xml
  • https://www.example.com/attachment-sitemap.xml
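To double-check the rules themselves, here is a quick sketch using Python's standard-library robots.txt parser, with the rules copied from above and the example.com placeholder URLs standing in for the flagged paths; it reports all three as allowed:

from urllib.robotparser import RobotFileParser

# Rules copied verbatim from the robots.txt above.
rules = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Placeholder URLs from the list above, standing in for the flagged sitemap paths.
flagged = [
    "https://www.example.com/page-sitemap.xml",
    "https://www.example.com/category-sitemap.xml",
    "https://www.example.com/attachment-sitemap.xml",
]

for url in flagged:
    # can_fetch() is True when the URL is allowed for the given user agent.
    status = "allowed" if parser.can_fetch("*", url) else "blocked"
    print(url, "->", status)

So by this check, nothing in the sitemap should be blocked, which makes the Search Console warning all the more confusing.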

