
I am trying to run the SEO toolkit IIS extension on an application I have running but I keep getting the following error:

The request is disallowed by a Robots.txt rule

Now I have edited the robots.txt file in both the application and the root website so they both have the following rules:

User-agent: *
Allow: /

But this makes no difference and the toolkit still won’t run.

I have even tried deleting both robots.txt files and that still doesn’t make any difference.

Does anyone know any other causes for the SEO Toolkit being unable to run, or how to solve this problem?



  1. To allow all robots complete access, I would recommend using the following syntax:

    User-agent: *
    Disallow:

    (or just create an empty “/robots.txt” file, or don’t use one at all)

    The Allow directive is supported only by “some major crawlers”, so perhaps the IIS Search Engine Optimization (SEO) Toolkit’s crawler doesn’t support it.

    Hope this helps. If it doesn’t, you can also try going through IIS SEO Toolkit’s Managing Robots.txt and Sitemap Files learning resource.

    1. Check to make sure the DNS record is pointing to the correct server
    2. If you’re searching for the file, account for case sensitivity – robots.txt vs Robots.txt
    3. Verify that the Toolkit is actually attempting to visit the site. Check the IIS logs for the presence of the “iisbot” user-agent.
  2. The robots.txt may have been cached. Stop and restart IIS (or recycle the application) so the robots.txt is refreshed, then reload the file in a browser to confirm you are seeing the current version. You can even delete the file to be sure that IIS is not serving a cached copy.

  3. Basically, robots.txt is a file that tells crawlers such as Google not to crawl the pages an admin has disallowed. Google ignores those pages, so they never rank and Google never shows that data.

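One way to sanity-check the robots.txt rules discussed above is Python's standard-library parser. This is only a sketch of how a conforming parser reads the rules; the IIS SEO Toolkit's own crawler may parse them differently, which is the point of the first answer. The "iisbot" user-agent string is taken from the answer above about checking the IIS logs:

```python
# Sanity-check robots.txt rules with Python's stdlib parser.
from urllib.robotparser import RobotFileParser

def allowed(robots_txt: str, agent: str, url: str) -> bool:
    """Return True if `agent` may fetch `url` under the given robots.txt text."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)

# The questioner's rules ("Allow: /") and the canonical allow-all form
# ("Disallow:" with no value) should both permit everything:
allow_all_v1 = "User-agent: *\nAllow: /\n"
allow_all_v2 = "User-agent: *\nDisallow:\n"
print(allowed(allow_all_v1, "iisbot", "http://example.com/page"))   # True
print(allowed(allow_all_v2, "iisbot", "http://example.com/page"))   # True

# A Disallow rule blocks matching paths:
block_admin = "User-agent: *\nDisallow: /admin/\n"
print(allowed(block_admin, "iisbot", "http://example.com/admin/"))  # False
```

If both allow-all variants pass here but the Toolkit still reports "The request is disallowed by a Robots.txt rule", that supports the theories above: either the crawler's robots.txt handling differs, or it is fetching a stale or different robots.txt than the one you edited.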