
I have pages whose URLs end in /search?=keyword, and I want to prevent Google from crawling them. Example pages are below. I was wondering: if I put Disallow: /search in robots.txt, will that work?

Example URLs:

http://www.website.com/search?=neymar
http://www.website.com/search?=ronaldo
http://www.website.com/search?=kobe

So basically, I do not want Google to crawl any of the URLs that end in /search?=keyword. Thank you.
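
For reference, this is roughly what I am planning to put in robots.txt (just a sketch, assuming the rule should apply to all crawlers):

    User-agent: *
    Disallow: /search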

2 Answers


  1. It will work.

    Disallow: /search?=*
    
  2. you can simply use the "Disallow: /" tag along with the URLs. This will work

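A sketch of that second approach, with one Disallow: line per search URL (paths only, since robots.txt rules never include the hostname):

    User-agent: *
    Disallow: /search?=neymar
    Disallow: /search?=ronaldo
    Disallow: /search?=kobe

Note that Disallow rules are prefix matches, so the single Disallow: /search rule from the first answer already covers all of these URLs, plus any future keywords.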