Yoast SEO created a sitemap and robots.txt that did not suit my site. I have since replaced both and waited a few weeks, but my robots.txt is still blocking 13 pages. Can anyone help me figure out what I might be doing wrong?
Here are the paths of the pages that are being blocked (all under https://micronanalytical.com):

/sp/
/samples-submissions/
/cr/
/analytical-laboratory-directions/
/analytical-news/
/sem/
/forms-downloads/
/wp/wp-content/uploads/2018/08/2.FDA-License-2018.pdf
/laboratory-services/
Here is my robots.txt:
User-agent: *
Disallow: /quote/
Disallow: /forms-downloads/
Disallow: /MA/
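For anyone who wants to check this locally, here is a small sketch using Python's standard urllib.robotparser to test the reported paths against the rules above. (The path list and rules are copied from this post; whether your live robots.txt matches what the crawler cached is a separate question.) Only /forms-downloads/ should match a Disallow rule:

```python
import urllib.robotparser

# The rules from my current robots.txt, as posted above
ROBOTS_TXT = """\
User-agent: *
Disallow: /quote/
Disallow: /forms-downloads/
Disallow: /MA/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

base = "https://micronanalytical.com"

# The 9 paths reported as blocked
paths = [
    "/sp/",
    "/samples-submissions/",
    "/cr/",
    "/analytical-laboratory-directions/",
    "/analytical-news/",
    "/sem/",
    "/forms-downloads/",
    "/wp/wp-content/uploads/2018/08/2.FDA-License-2018.pdf",
    "/laboratory-services/",
]

for path in paths:
    # can_fetch("*", url) applies the rules for the wildcard user-agent
    allowed = rp.can_fetch("*", base + path)
    print(f"{path}: {'allowed' if allowed else 'BLOCKED'}")
```

If this script says a path is allowed but Search Console still reports it as blocked, the crawler is likely still working from a cached copy of the old robots.txt, or the live file differs from the one posted here.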
Here is my sitemap: https://micronanalytical.com/sitemap.xml
What am I doing wrong?