robots.txt execution

by JohnDoea   Last Updated September 07, 2019 00:04 AM

One can order the directives in robots.txt this way:

User-agent: DESIRED_INPUT
Sitemap: https://example.com/sitemap-index.xml
Disallow: /

instead of this way:

User-agent: DESIRED_INPUT
Disallow: /
Sitemap: https://example.com/sitemap-index.xml

I assume both are fine, since virtually all crawlers presumably parse the file in order and apply the directives correctly.
Is it best practice to put Disallow: before Sitemap:, to guard against the extremely unlikely bug of a crawler fetching the sitemap before it has processed the Disallow: rule?
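
For what it's worth, one quick way to compare the two orderings is to feed both to a robots.txt parser, for example Python's built-in urllib.robotparser. The sketch below just reuses the placeholder agent DESIRED_INPUT and example.com URLs from above; the /some-page path is made up for the test, and site_maps() assumes Python 3.8 or later:

import urllib.robotparser

SITEMAP_FIRST = """\
User-agent: DESIRED_INPUT
Sitemap: https://example.com/sitemap-index.xml
Disallow: /
"""

DISALLOW_FIRST = """\
User-agent: DESIRED_INPUT
Disallow: /
Sitemap: https://example.com/sitemap-index.xml
"""

for label, text in (("Sitemap first", SITEMAP_FIRST), ("Disallow first", DISALLOW_FIRST)):
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(text.splitlines())  # parse the raw lines, no HTTP fetch
    blocked = not parser.can_fetch("DESIRED_INPUT", "https://example.com/some-page")
    print(label, "-> blocked:", blocked, "| sitemaps:", parser.site_maps())

Both variants report the page as blocked and return the same sitemap list. At least with this parser, that matches the sitemaps.org protocol, which says the Sitemap: directive is independent of the User-agent line and can be placed anywhere in the file.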


