
I have a couple of web apps in Azure (same codebase, deployed in different regions) that I need to set up as endpoints in Traffic Manager.

One of those sites is already live. It is configured to support multiple domains, but all requests are 301-redirected to a single canonical domain for SEO reasons. The other site will, of course, need to work the same way within the Traffic Manager setup.

The issue is that Traffic Manager needs to ping the *.azurewebsites.net domain and receive a 200 response for the endpoint to be considered healthy, but the current redirect rule on the endpoints returns a 301 instead, so the health check fails.

If I remove the redirect rule then Traffic Manager will work, but requests for the sites at *.azurewebsites.net will no longer be redirected (which presents an SEO concern).

The solution I'm heading towards is serving a different robots.txt file (with a Disallow: / rule) when the request is for the azurewebsites.net domain. Is this feasible? How might I go about doing it?
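For what it's worth, if the app runs on Azure App Service (Windows) with the IIS URL Rewrite module, one way to sketch this is a rewrite rule that maps /robots.txt to a separate disallow-everything file whenever the request arrives on the *.azurewebsites.net host. The file name robots-disallow.txt is a placeholder of my choosing:

```xml
<!-- Sketch: web.config fragment, assuming the IIS URL Rewrite module.
     When the request host is *.azurewebsites.net, /robots.txt is
     internally rewritten to robots-disallow.txt (hypothetical file
     containing "User-agent: *" / "Disallow: /"). Other hosts still
     get the normal robots.txt. -->
<rewrite>
  <rules>
    <rule name="Block crawlers on azurewebsites.net" stopProcessing="true">
      <match url="^robots\.txt$" />
      <conditions>
        <add input="{HTTP_HOST}" pattern="\.azurewebsites\.net$" />
      </conditions>
      <action type="Rewrite" url="robots-disallow.txt" />
    </rule>
  </rules>
</rewrite>
```

Because this is a rewrite rather than a redirect, the response still comes back with a 200 status, so it wouldn't interfere with other rules that run afterwards.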

Are there any other ways I could make this work?

thanks

2 Answers


  1. Chosen as BEST ANSWER

    I'm going to rework the current redirect rule so that it doesn't redirect one particular path on the azurewebsites.net domain (*.azurewebsites.net/favicon.ico). That should let Traffic Manager ping the site, whilst keeping SEO intact for the rest of the URLs.
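    A minimal sketch of such a rule, assuming the IIS URL Rewrite module and a canonical host of www.example.com (a placeholder), would look something like this:

```xml
<!-- Sketch: web.config fragment (IIS URL Rewrite module assumed).
     Redirects every request that isn't on the canonical host to
     https://www.example.com/..., EXCEPT the Traffic Manager probe
     path /favicon.ico, which is served directly with a 200. -->
<rewrite>
  <rules>
    <rule name="Canonical host redirect" stopProcessing="true">
      <match url=".*" />
      <conditions>
        <add input="{HTTP_HOST}" pattern="^www\.example\.com$" negate="true" />
        <!-- Exempt the probe path so the health check sees a 200 -->
        <add input="{REQUEST_URI}" pattern="^/favicon\.ico$" negate="true" />
      </conditions>
      <action type="Redirect" url="https://www.example.com/{R:0}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```

    Traffic Manager's endpoint monitoring would then be pointed at /favicon.ico as the probe path.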


  2. Seven years and some months later, the answer seems to be in Traffic Manager's endpoint monitoring configuration: under "Expected status code ranges" you can add 301-302 to the list, which makes the endpoint health show as Online.
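     For anyone scripting this, the same setting can apparently be applied with the Azure CLI; the profile and resource group names below are placeholders, and the --status-code-ranges parameter reflects my understanding of the current CLI:

```shell
# Sketch: treat 301-302 (in addition to 2xx) as healthy probe responses.
# "MyProfile" and "MyResourceGroup" are placeholder names.
az network traffic-manager profile update \
  --name MyProfile \
  --resource-group MyResourceGroup \
  --status-code-ranges 200-299 301-302
```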
