I have a multilanguage, multidomain site. It runs on a single CMS installation (Drupal), so I have a single root directory. As far as I know, with a static robots.txt I can therefore only serve the rules for one domain.
Could I put a line in .htaccess
Redirect 301 /robots.txt /robots.php
(or an equivalent instruction; please indicate which one, if this is allowed)
so that it redirects to a dynamic PHP file, where I can serve different content according to the requested domain?
And the same question for sitemap.xml, so that I can serve a dynamic sitemap.php that lists different links for each domain.
The problem with using static .txt and .xml files is, as mentioned, that all the domains share a single physical directory on the server.
Yes, in the same way that any request can be "dynamic".
However, you would not redirect (as in your example code); you should rewrite internally using mod_rewrite, the same as Drupal is probably already doing.
For example, in your root .htaccess file:
RewriteEngine On
RewriteRule ^robots\.txt$ robots.php [L]
The RewriteEngine directive only needs to occur once (although it doesn't really matter if it occurs multiple times).
You just have to make sure that it doesn't conflict with any other directives in your .htaccess file. So, this should probably be near the start of the file, certainly before your front controller.
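As an illustration, here is a minimal sketch of what robots.php could look like, assuming you branch on $_SERVER['HTTP_HOST']; the domain names and rules below are just placeholders, not your actual setup:

<?php
// Minimal sketch of robots.php -- the domains and rules below are placeholders.
// The .htaccess rewrite above sends requests for /robots.txt here.
header('Content-Type: text/plain; charset=utf-8');

$host = strtolower($_SERVER['HTTP_HOST'] ?? '');

switch ($host) {
    case 'example.com':
    case 'www.example.com':
        echo "User-agent: *\n";
        echo "Disallow: /admin/\n";
        echo "Sitemap: https://example.com/sitemap.xml\n";
        break;

    case 'example.es':
    case 'www.example.es':
        echo "User-agent: *\n";
        echo "Disallow: /admin/\n";
        echo "Sitemap: https://example.es/sitemap.xml\n";
        break;

    default:
        // Unknown host: block crawling as a conservative fallback.
        echo "User-agent: *\n";
        echo "Disallow: /\n";
}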
You can make any file dynamic. The best way to do so is not through redirects, but through rewrite rules.
RewriteRule ^robots\.txt$ /robots.php [L]
That way, you power it with a dynamic script, but the URL doesn't change. Most crawlers (including Googlebot) will follow redirects for robots.txt, but some crawlers will get confused if you introduce redirects.
Note that even if you power it with PHP, your robots.txt should appear to be static to each crawler for each domain. It is fine to serve different content for different domains, or even for different user agents. However, serving different content randomly, or based on the time of day, can really confuse search engine crawlers and mess up your SEO.
Sitemaps are fine to name however you want. You could redirect those, or use a rewrite rule to power them dynamically at the same URL. You can also give each domain's sitemap its own name and then refer to it in robots.txt (see the example below), or submit them to the search engines manually through their webmaster tools or search console.
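For example (the sitemap file names and domain are hypothetical), each domain's robots.txt could list its own sitemaps:

Sitemap: https://example.com/sitemap-main.xml
Sitemap: https://example.com/sitemap-news.xml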
Making the sitemap file dynamic is fine -- it's a good way to auto-update your sitemaps.
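As a rough sketch of that idea (the hosts and URLs are placeholders; in Drupal you would normally pull the links from the database or a sitemap module), a sitemap.php served via the rewrite rule above could look like:

<?php
// Minimal sketch of sitemap.php -- the hosts and URLs below are placeholders.
header('Content-Type: application/xml; charset=utf-8');

$host = strtolower($_SERVER['HTTP_HOST'] ?? '');

// Hypothetical per-domain URL lists; in practice, query them from the CMS.
$urlsByHost = [
    'example.com' => ['https://example.com/', 'https://example.com/about'],
    'example.es'  => ['https://example.es/', 'https://example.es/acerca-de'],
];
$urls = $urlsByHost[$host] ?? [];

echo "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
echo "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
foreach ($urls as $url) {
    echo "  <url>\n";
    echo "    <loc>" . htmlspecialchars($url, ENT_XML1) . "</loc>\n";
    echo "  </url>\n";
}
echo "</urlset>\n";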
Making the robots.txt file dynamic (for the same host -- doing this for separate hosts is essentially just a normal robots.txt file for each of them) would likely cause problems: it's not re-fetched every time a URL is crawled from the site, so it can happen that the "wrong" version is cached. For example, if you make your robots.txt file block crawling during business hours, it's possible that it's cached then and followed for a whole day -- meaning nothing gets crawled (or, alternately, it's cached when crawling is allowed). Google crawls the robots.txt file about once a day for most sites, for example.
There is no need to create a sitemap.php, because you can serve a separate sitemap.xml file for each language (or domain) and register each one in the search engines' consoles.
Dynamically generated sitemap files are only really needed when there is a large volume of data and a single sitemap file would exceed the 50 MB limit. In that case the sitemap files are created dynamically, but the actual sitemap1.xml, sitemap2.xml, ... files remain static: each new file is added to the list in the main (index) file, while the content of each file stays fixed (until the files are recreated, for example by cron).
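For reference, the "main file" here is a sitemap index in the standard sitemaps.org format, which looks roughly like this (URLs hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap2.xml</loc>
  </sitemap>
</sitemapindex>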
Also worth mentioning: once a search engine has accessed the file, it won't return to it again very quickly (unless a re-crawl is requested manually). This confirms that there is no need to create a dynamic sitemap.php in any case, because a normal sitemap.xml can itself be dynamic, updated with new content throughout the day or the week.
I can't think of any advantages to using a sitemap.php. It will do no good, as there are better and more standard ways to handle these files.