I manage a multilingual site where users are redirected to a local version of the site, like
myBurger.com/en // for users from the US, UK, etc.
myBurger.com/fr // for users from France, Switzerland, etc.
How should the robots.txt
file be organized in relation to the sitemap?
myBurger.com/robots.txt // with - Sitemap: http://myBurger.com/??/sitemap
OR
myBurger.com/en/robots.txt // with - Sitemap: http://myBurger.com/en/sitemap
myBurger.com/fr/robots.txt // with - Sitemap: http://myBurger.com/fr/sitemap
knowing that the en
and fr
sites are in fact independent entities that do not share common content, even though they look similar.
2 Answers
You need to put one robots.txt at the top level.
https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt
Put the
robots.txt
at the root: myBurger.com/robots.txt
and register your sitemaps in the robots.txt
file using the Sitemap:
directive (see an example I maintain if necessary).
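A minimal sketch of what that single root file could look like, assuming each language section publishes its own sitemap at a hypothetical /sitemap.xml path (adjust the paths to wherever your sitemaps actually live; Sitemap: lines must use absolute URLs):

# robots.txt served at http://myBurger.com/robots.txt
User-agent: *
Allow: /

# One Sitemap line per independent sitemap
Sitemap: http://myBurger.com/en/sitemap.xml
Sitemap: http://myBurger.com/fr/sitemap.xml

Crawlers only look for robots.txt at the root of the host, so a file at myBurger.com/en/robots.txt would simply be ignored; listing both sitemaps in the root file keeps the two independent sites discoverable from one place.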