Can it be destructive to disallow all MediaWiki:Special pages in robots.txt?

by JohnDoea   Last Updated August 29, 2019 16:04

I am considering disallowing all of MediaWiki 1.33.0's special pages in robots.txt.
In Hebrew, מיוחד means "Special":

# robots.txt rules are prefix matches, so no trailing * is needed
Disallow: /מיוחד:
Disallow: /index.php?title=מיוחד:
# Same rule with the title percent-encoded; ? and = must stay literal to match real URLs
Disallow: /index.php?title=%D7%9E%D7%99%D7%95%D7%97%D7%93:
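
As far as I can tell, crawlers match these rules against the URL's path and query string, in which non-ASCII characters are percent-encoded but ? and = appear literally (the major engines also normalize the raw Hebrew in the rules to the encoded form, so the plain and encoded variants should behave the same for them). The rules therefore need to match URL shapes like these, where example.com and SomePage are placeholders:

https://example.com/%D7%9E%D7%99%D7%95%D7%97%D7%93:SomePage
https://example.com/index.php?title=%D7%9E%D7%99%D7%95%D7%97%D7%93:SomePage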

Doing so seems sensible in general, because many of these pages are useful only to staff rather than to the average visitor, yet some of them matter to both regular users and crawlers.
Blocking "RecentChanges" and "Categories" could be a real problem, though: both special pages act as small, dynamic pseudo-sitemaps that link to virtually every page on the site.
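
If I keep the blanket block, the exception list I have in mind would look something like this; a sketch assuming Google-style Allow support, where the longest matching rule wins, and assuming שינויים_אחרונים and קטגוריות are the Hebrew titles of RecentChanges and Categories (the exact local titles appear on Special:SpecialPages):

User-agent: *
# The longer, more specific Allow rules override the shorter Disallow rules
Allow: /מיוחד:שינויים_אחרונים
Allow: /מיוחד:קטגוריות
Allow: /index.php?title=מיוחד:שינויים_אחרונים
Allow: /index.php?title=מיוחד:קטגוריות
Disallow: /מיוחד:
Disallow: /index.php?title=מיוחד:

Crawlers that do not implement Allow would still skip these two pages, so the exceptions only help with the major engines.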

Would you remove the Disallow rules for MediaWiki special pages from robots.txt altogether?
Would you keep them, with a list of exceptions just for "RecentChanges" and "Categories", as sketched above?
Would you take a totally different approach?


