I created a sitemap using the Rank Math plugin. When I added the sitemap URL to Search Console, it shows a "couldn't fetch" error. I then used the URL Inspection tool to check my sitemap, and it says the URL is not on Google. After a live test, it shows 'noindex' detected in 'X-Robots-Tag' http header.
I want to fix this issue. If anybody knows how to remove the noindex X-Robots-Tag from the response header of the sitemap, please help me fix it.
2 Answers
The "couldn't fetch" issue in Google Search Console can happen for several reasons. I strongly suggest visiting the URL below to see if it resolves the issue for you:
https://rankmath.com/kb/couldnt-fetch-error-google-search-console/
As for the

'noindex' detected in 'X-Robots-Tag' http header

message you're seeing in the URL Inspection tool: it is logically correct, and you shouldn't be bothered by it, as the sitemap URL(s) are intentionally set to noindex. This is because sitemap URLs are not meant to be indexed by Google or shown on SERPs; rather, they act as a mechanism to tell Google which of your site's URLs to crawl. Thus, you should ignore that error in the URL Inspection tool.

Please be aware that you should not submit your sitemap URLs to the URL Inspection tool, but rather to the Sitemaps section of Google Search Console.
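If you want to confirm the header yourself, you can fetch just the HTTP response headers of your sitemap URL from the command line. This is a minimal sketch; `example.com` is a placeholder for your own domain:

```shell
# Fetch only the response headers of the sitemap (placeholder domain).
# -s silences progress output; -I requests headers only.
curl -sI https://example.com/sitemap_index.xml | grep -i '^x-robots-tag'
```

A line such as `X-Robots-Tag: noindex` in the output is the expected, intentional behavior for Rank Math sitemaps, not something you need to fix.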
I have been facing the same problem for the last few days. After doing some experiments with my Blogger website, I found 2 mistakes I had made:

1. I had turned off the "visible to search engines" button.
2. I had enabled custom robots header tags, which is fine in itself, but I had also turned on the noindex button in the custom robots header tags for the home page, which I did not need to do.
To answer this question for a Blogger website:

- Turn on the "visible to search engines" button.
- Turn off the noindex button in the custom robots header tags for the home page.
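After changing those two Blogger settings, you can check that the home page no longer sends a noindex directive in its headers. This is a sketch; the `example.blogspot.com` domain is a placeholder for your own blog:

```shell
# Look for a noindex directive in the home page's response headers (placeholder domain).
# If grep finds nothing, the fallback message is printed instead.
curl -sI https://example.blogspot.com/ | grep -i 'x-robots-tag.*noindex' \
  || echo "no noindex header found"
```

If the check prints "no noindex header found", the header-level block on the home page is gone; Search Console may still take some time to re-crawl and reflect the change.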
Dellso
Check all the blue links for the photos; I don't know how to show photos on this website.