Do you know if it is possible to force robots to crawl www.domaine.com instead of domaine.com? In my case, I have a web app that serves cached URLs via prerender.io (so crawlers can see the rendered HTML), but only on www.
So, when robots crawl domaine.com, they get no data.
The redirect (domaine.com → http://www.domaine.com) is automatic in Nginx, but it has no effect.
Note that the URLs in my sitemap all use www.
My Nginx redirect:

```nginx
server {
    listen *:80;
    server_name stephane-richin.fr;

    location / {
        if ($http_host ~ "^([^.]+)\.([^.]+)$") {
            rewrite ^/(.*) http://www.stephane-richin.fr/$1 redirect;
        }
    }
}
```
Do you have an idea?
Thank you!
2 Answers
Could you serve a robots.txt on domaine.com that disallows crawling, and a different one on http://www.domaine.com that allows it?
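For illustration, a minimal sketch of what the two files could look like (the domain names are the placeholders from the question):

```
# robots.txt served on domaine.com (non-www): block all crawlers
User-agent: *
Disallow: /
```

```
# robots.txt served on www.domaine.com: allow everything
User-agent: *
Disallow:
```

Note this only stops crawling of the non-www host; fixing the redirect itself is still the cleaner solution.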
If you submitted a sitemap with the correct URLs a week ago, it seems strange that Google keeps requesting the old ones.
Anyway – you’re sending the wrong status code in your non-www to www redirect: a 302 (temporary) where it should be a 301 (permanent). Philippe explains the difference in this answer:
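For reference, a minimal sketch of the same redirect returning a 301, using a dedicated server block instead of `if` (server_name taken from the config in the question; adjust to your setup):

```nginx
# Catch-all block for the bare domain: permanent redirect to www.
# A 301 tells crawlers the www host is the canonical one.
server {
    listen 80;
    server_name stephane-richin.fr;
    return 301 http://www.stephane-richin.fr$request_uri;
}

# The www host keeps the real configuration (prerender.io, app, etc.).
server {
    listen 80;
    server_name www.stephane-richin.fr;
    # ... existing application configuration
}
```

You can verify the change with `curl -I http://stephane-richin.fr/` and check that the response line reads `301 Moved Permanently` rather than `302`.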