
My application uses AngularJS for the frontend and .NET for the backend.

In my application I have a list view. On clicking a list item, it fetches a pre-rendered HTML page from S3.

I am using Angular states (UI-Router).

app.js

...
state('staticpage', {
    url: "/staticpage",
    templateUrl: function (){
        return 'http://xxxxxxx.cloudfront.net/staticpage/staticpage1.html';
    },
    controller: 'StaticPageCtrl',
    title: 'Static Page'
})

StaticPage1.html

<div>
Hello static world 1!
</div>

How do I do SEO here?

Do I really need to create HTML snapshots using PhantomJS or something similar?

7 Answers


  1. Yes, PhantomJS would do the trick, or you can use prerender.io; with that service you can also just use their open-source renderer and run your own server.
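
    If you go the self-hosted route, the open-source renderer behind prerender.io runs as its own small Node service. A minimal sketch, assuming the prerender npm package (your web server proxies crawler requests to it):

    // server.js - standalone prerender service (rough sketch)
    const prerender = require('prerender');

    // the service listens on process.env.PORT by default (assumption: check the package docs)
    const server = prerender();
    server.start();

    // Your web server then forwards bot requests to e.g.
    //   GET http://localhost:3000/https://your-site.example/staticpage
    // and returns the rendered HTML to the crawler.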

    Another way is to use the _escaped_fragment_ meta tag.

    I hope this helps; if you have any questions, add comments and I will update my answer.

  2. Do you know that Google renders HTML pages and executes the JavaScript code in the page, and does not need any pre-rendering anymore?
    https://webmasters.googleblog.com/2014/05/understanding-web-pages-better.html

    Also take a look at these:
    http://searchengineland.com/tested-googlebot-crawls-javascript-heres-learned-220157
    http://wijmo.com/blog/how-to-improve-seo-in-angularjs-applications/

  3. Yes, you need to pre-render the page for the bots; prerender.io can be used, and your page must have the meta tag:

    <meta name="fragment" content="!">
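
    For what it is worth, that meta tag is the variant of the scheme for HTML5-mode (pushState) URLs. If the app stays on hash URLs, the AngularJS side of the same scheme is the "#!" prefix; a minimal sketch, assuming the app module from the question's app.js:

    // app.js - opt in to "hashbang" URLs so crawlers map
    //   http://example.com/#!/staticpage
    // to
    //   http://example.com/?_escaped_fragment_=/staticpage
    app.config(['$locationProvider', function ($locationProvider) {
        $locationProvider.hashPrefix('!');
    }]);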

  4. My project's front-end is also built on top of Angular, and I decided to solve the SEO issue like this:

    1. I’ve created an endpoint for all search engines (SE) where all the requests with the _escaped_fragment_ parameter go;

    2. I parse the HTTP request for the _escaped_fragment_ GET parameter;

    3. I make a cURL request with the parsed category and article parameters and get the article content;

    4. Then I render the simplest (and SEO-friendly) template for the SE with the article content, or throw a 404 Not Found exception if the article does not exist (see the sketch below);
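
    The back-end described here is PHP; purely as an illustration of steps 1-4, a rough Node/Express sketch of the same flow might look like this (the internal API URL and the fetchArticle helper are hypothetical):

    // seo-endpoint.js - rough sketch of steps 1-4 above (Node 18+ for global fetch)
    const express = require('express');
    const path = require('path');
    const app = express();

    // hypothetical helper standing in for the cURL call to your own API
    async function fetchArticle(fragment) {
        const response = await fetch('http://api.example.com' + fragment); // assumed internal API
        return response.ok ? response.json() : null;
    }

    app.get('*', async (req, res) => {
        // 1-2. only requests carrying _escaped_fragment_ are treated as search engines
        const fragment = req.query._escaped_fragment_;
        if (fragment === undefined) {
            return res.sendFile(path.join(__dirname, 'index.html')); // normal users get the Angular app
        }

        // 3. fetch the article content for the requested category/article
        const article = await fetchArticle(fragment);

        // 4. render the simplest SEO-friendly template, or a 404 if the article does not exist
        if (!article) {
            return res.status(404).send('<h1>404 Not Found</h1>');
        }
        res.send('<html><body><h1>' + article.title + '</h1>' + article.body + '</body></html>');
    });

    app.listen(8080);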

    In total: I do not need to pre-render any HTML pages or use prerender.io, I have a nice user interface for my users, and search engines index my pages very well.

    P.S. Do not forget to generate a sitemap.xml and include in it all the URLs (with _escaped_fragment_) which you want to be indexed.

    P.P.S. Unfortunately my project's back-end is built on top of PHP, so I cannot show you an example that matches your stack. But if you want more explanation, do not hesitate to ask.

  5. Firstly, you cannot assume anything.
    Google does say that their bots can understand JavaScript applications very well, but that is not true for all scenarios.

    Start by using the "Fetch as Google" feature in Webmaster Tools for your link and see if the page is rendered properly. If yes, then you need not read further.

    In case you see just your skeleton HTML, this is because Googlebot assumes the page load is complete before it actually completes. To fix this you need an environment where you can recognize that a request is from a bot and return it a pre-rendered page.

    To create such an environment, you need to make some changes to your code.

    Follow the instructions in Setting up SEO with Angularjs and Phantomjs, or alternatively just write code in any server-side language like PHP to generate pre-rendered HTML pages of your application (PhantomJS is not mandatory).

    Create a redirect rule in your server config which detects the bot and redirects it to the pre-rendered plain HTML files (the only thing you need to make sure is that the content of the page you return matches the actual page content, else bots might not consider the content authentic).
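
    The redirect rule can live in the web server config (nginx/IIS rewrite rules) or in application code. As a rough illustration only, an Express-style middleware might look like this (the user-agent list and the snapshots directory are assumptions):

    // bot-redirect.js - serve pre-rendered snapshots to crawlers (sketch)
    const express = require('express');
    const path = require('path');
    const app = express();

    const BOT_UA = /googlebot|bingbot|yandex|baiduspider|facebookexternalhit|twitterbot/i;

    app.use(function (req, res, next) {
        const isBot = BOT_UA.test(req.headers['user-agent'] || '') ||
                      req.query._escaped_fragment_ !== undefined;
        if (!isBot) {
            return next(); // regular users fall through to the Angular app
        }
        // assumption: snapshots are stored as ./snapshots/<route>.html
        const page = req.path === '/' ? '/index' : req.path;
        res.sendFile(path.join(__dirname, 'snapshots', page + '.html'), function (err) {
            if (err) next(); // no snapshot for this route, fall back to the app
        });
    });

    app.use(express.static(path.join(__dirname, 'public')));
    app.listen(8080);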

    Note that you also need to consider how you will add entries to sitemap.xml dynamically when you add pages to your application in the future.

    In case you are not looking for such overhead and you are lacking time, you can simply use a managed service like Prerender.

    Eventually bots will mature enough to understand your application, and then you can say goodbye to your SEO proxy infrastructure. This is just for the time being.

  6. At this point in time, the question really becomes somewhat subjective, at least with Google — it really depends on your specific site, like how quickly your pages render, how much content renders after the DOM loads, etc. Certainly (as @birju-shaw mentions) if Google can’t read your page at all, you know you need to do something else.

    Google has officially deprecated the _escaped_fragment_ approach as of October 14, 2015, but that doesn’t mean you might not want to still pre-render.

    YMMV on trusting Google (and other crawlers) for reasons stated here, so the only definitive way to find out which is best in your scenario would be to test it out. There could be other reasons you may want to pre-render, but since you mentioned SEO specifically, I’ll leave it at that.

  7. If you have a server-side templating system (PHP, Python, etc.), you can implement a solution like prerender.io.
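
    If that server-side layer happens to be Node/Express, prerender.io also publishes a connect middleware (prerender-node). A minimal sketch, assuming you have a prerender.io token or a self-hosted prerender server:

    // app-server.js - wire prerender.io into an Express app (sketch)
    const express = require('express');
    const prerender = require('prerender-node');

    const app = express();

    // hosted service with your token (placeholder) ...
    app.use(prerender.set('prerenderToken', 'YOUR_TOKEN'));
    // ... or point it at a self-hosted prerender server instead (assumption):
    // app.use(prerender.set('prerenderServiceUrl', 'http://localhost:3000/'));

    app.use(express.static('public'));
    app.listen(8080);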

    If you only have AngularJS files hosted on a static server (e.g. Amazon S3), have a look at the answer in the following post: AngularJS SEO for static webpages (S3 CDN).
