
I want to prevent Google's bots from crawling a specific part of a page, say a div. My site is in Angular and I am using escaped-fragment pages for SEO.
I do not want to use an iframe for that. Any suggestions?

2 Answers


  1. I can think of two ways to do this:

    1- Using jQuery or JavaScript

    Wrap the part of the page that you don’t want crawled in a div and give it an id. Then apply display:none to that id in a CSS file, and reveal it again via jQuery or plain JavaScript on page load. Crawlers that don’t execute JavaScript will only ever see the hidden state.

    This bit of HTML, CSS and JavaScript should do it:

    HTML:

    <div id="hide-from-bots">The content of this div will be hidden from bots</div>
    

    CSS:

    #hide-from-bots {
      display: none;
    }
    

    jQuery:

    <script type="text/javascript">
    $(document).ready(function () {
        $("#hide-from-bots").show();
    });
    </script>
    

    2- Detect the User Agent and Skip the Content

    Another approach is to detect the user agent on the server and wrap the part of the page that you don’t want crawled in a conditional, roughly:

    if (USER_AGENT != "Googlebot") {
      // this content is not rendered for Googlebot
    }
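The pseudocode above can be made concrete on the server. A minimal sketch in Node.js; the `isSearchBot` helper and its bot list are my own illustrations, not an official API:

```javascript
// Naive bot detection by User-Agent substring match.
// The bot list here is illustrative, not exhaustive, and Googlebot
// uses several different User-Agent strings in practice.
function isSearchBot(userAgent) {
  return /googlebot|bingbot|yandexbot/i.test(userAgent || "");
}

// Usage in an Express-style handler (express itself is an assumption):
// app.get("/page", function (req, res) {
//   var hide = isSearchBot(req.get("User-Agent"));
//   res.render("page", { showSensitiveDiv: !hide });
// });
```

Be aware that serving different content to Googlebot than to users (cloaking) can run afoul of Google’s webmaster guidelines, so use this approach with care.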
    
  2. The best way today is using googleon/googleoff tags to exclude certain parts of the page.

    More on the topic to be found in Google Search Appliance Help.
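These tags are HTML comments recognized by the Google Search Appliance crawler (note: the GSA, not the public Googlebot), for example:

```html
<p>This paragraph is indexed normally.</p>
<!--googleoff: index-->
<div>This content is excluded from the GSA index.</div>
<!--googleon: index-->
```

The `index` value keeps the enclosed words out of the index; per the Google Search Appliance documentation there are also `anchor`, `snippet`, and `all` variants.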
