I have a website on a production server, and I have changes to the site I'd like to test on another web server.
Is there a way to keep Google from indexing the test website? Maybe a setting in the web.config?
It works like this: when a robot wants to visit a URL on your site, say http://www.example.com/welcome.html, it first checks for http://www.example.com/robots.txt.
So if you want to disallow all search engines, upload a robots.txt file to the root of your test web server and include the following piece of code:
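A minimal robots.txt that blocks all compliant crawlers looks like this:

```
User-agent: *
Disallow: /
```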
This will stop all search engines from crawling the site.
When you move the site to your production server, change the robots.txt file so that crawling is allowed:
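For example (an empty Disallow permits everything; the Sitemap line assumes your sitemap lives at the site root):

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```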
Also include a sitemap.xml file so search engines can discover your pages.
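A minimal sitemap.xml, reusing the example.com URL from above, might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/welcome.html</loc>
  </url>
</urlset>
```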
Remember: "User-agent: *" means the section applies to all robots, and "Disallow: /" tells a robot that it should not visit any page on the site.
So on the test server, use this piece of code in your robots.txt file:
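That is, the same disallow-all rule described above:

```
User-agent: *
Disallow: /
```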
This will stop search engines from crawling your web pages.