When using Google's fetch test on my WordPress site, it reports that my robots.txt is blocking pages and resources from being indexed.
Google reports that my robots.txt looks like this:
User-agent: *
Disallow: /
How can I resolve this issue?
You need to remove both lines from your robots.txt file. The robots.txt file is located in the root directory of your web hosting account, which is normally /public_html/, and you should be able to edit or delete it using your host's file manager or an FTP client.
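To see why Google reports every page as blocked, you can test the reported rules with Python's standard-library robots.txt parser. This is just an illustrative check; example.com stands in for your own domain:

```python
from urllib.robotparser import RobotFileParser

# The blocking rules Google reported for the site
blocking_rules = [
    "User-agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(blocking_rules)

# "Disallow: /" blocks every path for every crawler
print(parser.can_fetch("Googlebot", "https://example.com/"))           # False
print(parser.can_fetch("Googlebot", "https://example.com/some-page/")) # False
```

Both calls return False, which matches what Google's fetch test is telling you: nothing on the site may be crawled.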
You have to update the robots.txt file through cPanel, because when you install the Yoast SEO plugin, it automatically generates a robots.txt file for your site. If you really want to edit it with your own directives, log in to your hosting server, open the file manager, and edit the file there.
If this is a fresh install of WordPress, it could be because you have set the privacy settings to stop search engines from crawling the site. (In that case there will be no physical robots.txt on the server, as WordPress creates it on the fly.)
Go to Settings → Reading in WordPress and see if the "Discourage search engines from indexing this site" box is ticked.
If so, uncheck it, and the robots.txt should change to
User-agent: *
Disallow:
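You can confirm that this corrected file no longer blocks crawlers with the same standard-library check as before (a quick sketch; example.com is a placeholder for your domain):

```python
from urllib.robotparser import RobotFileParser

# An empty Disallow value means nothing is blocked
open_rules = [
    "User-agent: *",
    "Disallow:",
]

parser = RobotFileParser()
parser.parse(open_rules)

print(parser.can_fetch("Googlebot", "https://example.com/"))  # True
```

Once this prints True, Google's fetch test should stop reporting blocked pages on its next crawl.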
If you still have issues and the robots.txt is still set to block crawlers, then explore the other option as outlined by Facet.