
I have to manage a multilingual site where users are redirected to a local version of the site, like

myBurger.com/en // for users from the US, UK, etc.
myBurger.com/fr // for users from France, Switzerland, etc.

How should the robots.txt file be organized, in tandem with the sitemap?

myBurger.com/robots.txt // with - Sitemap: http://myBurger.com/??/sitemap
OR
myBurger.com/en/robots.txt  // with - Sitemap: http://myBurger.com/en/sitemap
myBurger.com/fr/robots.txt  // with - Sitemap: http://myBurger.com/fr/sitemap

knowing that the en and fr sites are in fact independent entities that do not share content, even if they look similar.

2 Answers


  1. You need to put one robots.txt at the top level.

    The robots.txt file must be in the top-level directory of the host,
    accessible through the appropriate protocol and port number.

    https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt

  2. Put the robots.txt at the root, myBurger.com/robots.txt, and register your sitemaps in that robots.txt file using the Sitemap: directive (see an example I maintain if necessary).
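    As a minimal sketch, a single robots.txt at myBurger.com/robots.txt might look like this (the sitemap file names below are assumptions; point them at wherever each site's sitemap actually lives):

    ```
    User-agent: *
    Allow: /

    # The Sitemap directive may appear multiple times and
    # requires absolute URLs.
    Sitemap: http://myBurger.com/en/sitemap.xml
    Sitemap: http://myBurger.com/fr/sitemap.xml
    ```

    Listing both sitemaps in the one root robots.txt keeps the two sites independent while still satisfying the requirement that robots.txt live at the top level of the host.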
