I want to prevent Google bots from crawling a specific part of a page, say a div. My site is in Angular and I am using escaped-fragment pages for SEO.
I do not want to use an iframe for that. Any suggestions?
2 Answers
I can think of two ways this can be done:
1- Using jQuery or JavaScript

Wrap the part of the page that you don't want crawled in a div and give it an id. Apply display:none to that id in a CSS file, then switch it to display:block via jQuery or plain JavaScript on page load. This bit of HTML, CSS and JavaScript should do it:
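Something along these lines (the id no-crawl used below is just a placeholder name):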
HTML:
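```html
<!-- Placeholder markup: "no-crawl" is an assumed id, not a required name -->
<div id="no-crawl">
  Content that should not appear in the crawled/prerendered HTML.
</div>
```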
CSS:
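```css
/* Hidden by default, so the block stays invisible in the static markup */
#no-crawl {
  display: none;
}
```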
jQuery:
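```javascript
// Reveal the block once the page has loaded in a real browser
$(document).ready(function () {
  $('#no-crawl').css('display', 'block'); // or simply $('#no-crawl').show();
});
```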
2- Detect User Agent and Skip the Content
Another approach is to detect the user agent and wrap the part of the page that you don't want crawled in a conditional statement, like this:
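As a rough sketch in plain JavaScript (the id no-crawl and the user-agent test are placeholders; depending on how your escaped-fragment snapshots are generated, the same check may belong in the server or prerender step instead):

```javascript
// Rough sketch: remove the block when the visitor's user agent looks like Googlebot.
// "no-crawl" is a placeholder id; adjust the regular expression to the bots you care about.
document.addEventListener('DOMContentLoaded', function () {
  var isGooglebot = /Googlebot/i.test(navigator.userAgent);
  var block = document.getElementById('no-crawl');
  if (isGooglebot && block) {
    block.parentNode.removeChild(block); // leave the content out entirely for the bot
  }
});
```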
The best way today is using googleon/googleoff tags to exclude certain parts of the site. More on the topic can be found in Google Search Appliance Help.
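The tags are plain HTML comments wrapped around the block to be skipped, roughly like this (note they are documented for the Google Search Appliance indexer):

```html
<p>This paragraph is indexed normally.</p>
<!--googleoff: index-->
<div>
  This block is skipped by the indexer.
</div>
<!--googleon: index-->
<p>Indexing resumes here.</p>
```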