I have a website on a production server, and I have changes to the site I’d like to test on another web server.

Is there a way to keep Google from crawling and indexing the test website? Maybe via a setting in web.config?

2 Answers


  1. It works like this: a robot wants to visit a Web site URL, say http://www.example.com/welcome.html. Before it does so, it first checks for http://www.example.com/robots.txt.

    So if you want to disallow all search engines on the test server, upload a robots.txt file to your web server containing the following:

    User-agent: *
    Disallow: /
    

    This will stop all search engines from crawling the site.

    When you put the site on your production server, change the robots.txt file to:

    User-agent: *
    Disallow:
    
    Sitemap: http://www.yourdomainname.com/sitemap.xml
    

    and also include a sitemap.xml file.
    Remember: “User-agent: *” means the section applies to all robots, and “Disallow: /” tells a robot not to visit any pages on the site. A quick way to verify both rule sets before deploying is sketched below.
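
    Because robots.txt is a plain-text convention, both rule sets can be checked before deploying, using the parser in Python’s standard library. A minimal sketch; the URLs are placeholders, not taken from the question:

    from urllib.robotparser import RobotFileParser

    # Rules for the test server: block every crawler from every path.
    TEST_RULES = ["User-agent: *", "Disallow: /"]

    # Rules for the production server: an empty Disallow allows everything.
    PROD_RULES = ["User-agent: *", "Disallow:"]

    def allowed(rules, url):
        # True if a generic crawler ("*") may fetch the given URL.
        parser = RobotFileParser()
        parser.parse(rules)
        return parser.can_fetch("*", url)

    print(allowed(TEST_RULES, "http://test.example.com/welcome.html"))  # False
    print(allowed(PROD_RULES, "http://www.example.com/welcome.html"))   # True

    The sitemap.xml mentioned above follows the sitemaps.org protocol; a minimal file listing one page looks like this (the URL is a placeholder):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.yourdomainname.com/welcome.html</loc>
      </url>
    </urlset>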

  2. Use this piece of code in your robots.txt file:

    User-agent: *
    Disallow: /
    

    This will stop search engines from crawling your site. If you also want the web.config option asked about in the question, see the sketch below.
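
    Since robots.txt is only advisory, an IIS-hosted test site can additionally send an X-Robots-Tag response header asking compliant crawlers not to index anything they do reach. A minimal sketch, assuming the test server runs IIS 7 or later; remove this on production:

    <configuration>
      <system.webServer>
        <httpProtocol>
          <customHeaders>
            <!-- Ask compliant crawlers not to index pages or follow links. -->
            <add name="X-Robots-Tag" value="noindex, nofollow" />
          </customHeaders>
        </httpProtocol>
      </system.webServer>
    </configuration>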
