This question is a follow-up to this one:
Should one hide RTL encoded URLs in robots.txt or not?
Is it problematic to include both decoded and encoded versions of the same directives in
robots.txt, such as the following?
Disallow: /מדיה_ויקי:*
Disallow: /%D7%9E%D7%93%D7%99%D7%94_%D7%95%D7%99%D7%A7%D7%99:*
The rationale, if there really is one, is to cover any current or possible future standard from Google (or any other major search engine, for that matter). For example, today one major search engine might prefer the decoded version, and tomorrow another might prefer the encoded version. Hence the question: is it okay to just include both and be done with it?
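For what it's worth, the two directives do refer to the same path: percent-encoding the UTF-8 bytes of the Hebrew form produces the encoded form exactly. A minimal Python sketch (the `safe="/:"` argument is my assumption, to keep the slash and colon literal as they appear in the directives):

```python
from urllib.parse import quote, unquote

decoded = "/מדיה_ויקי:"
encoded = "/%D7%9E%D7%93%D7%99%D7%94_%D7%95%D7%99%D7%A7%D7%99:"

# Percent-encoding the decoded path's UTF-8 bytes (leaving '/' and ':'
# literal) yields the encoded directive, and decoding the encoded
# directive yields the decoded one -- the same path prefix either way.
assert quote(decoded, safe="/:") == encoded
assert unquote(encoded) == decoded
```

So including both should at worst be redundant, assuming the crawler treats the two spellings as equivalent.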