I am trying to run the SEO toolkit IIS extension on an application I have running but I keep getting the following error:
The request is disallowed by a Robots.txt rule
Now I have edited the robots.txt file in both the application and the root website so they both have the following rules:
User-agent: *
Allow: /
But this makes no difference and the toolkit still won’t run.
I have even tried deleting both robots.txt files and that still doesn’t make any difference.
Does anyone know of any other causes that would prevent the SEO Toolkit from running, or how to solve this problem?
4 Answers
To allow all robots complete access, I would recommend using the following syntax (according to robotstxt.org):
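User-agent: *
Disallow:

(An empty Disallow value means nothing is disallowed, so every page may be crawled; this is the form robotstxt.org suggests rather than relying on an Allow line.)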
The Allow directive is supported only by “some major crawlers”, so perhaps the IIS Search Engine Optimization (SEO) Toolkit’s crawler doesn’t support it.
Hope this helps. If it doesn’t, you can also try going through IIS SEO Toolkit’s Managing Robots.txt and Sitemap Files learning resource.
The robots.txt may have been cached. Stop/restart/unload IIS (or the application) so the robots.txt is refreshed, then open a browser and reload the file to confirm you see the new version. You can even delete the file to be sure that IIS is not caching it.
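For example, from an elevated command prompt you could restart IIS or recycle the site’s application pool (the pool name “DefaultAppPool” below is just a placeholder; substitute the one your application actually uses):

iisreset /restart
%windir%\system32\inetsrv\appcmd recycle apppool /apppool.name:"DefaultAppPool"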
Basically, robots.txt is a file that tells crawlers such as Google not to crawl the pages the admin has disallowed. Google ignores those pages, which is why they never rank and Google never shows that data.
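For example, a rule like the following would block all crawlers from an admin area (the /admin/ path is only illustrative):

User-agent: *
Disallow: /admin/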