I am building an Angular 2 app using the Angular-Meteor framework.
I would like to achieve fast and consistent indexing by Google and other search engines, and allow the Facebook sharer and other scrapers to generate previews of my JS-generated content.
Usually, SPAs use PhantomJS to render the page server-side and send the static HTML to the client.
Of course I can spawn PhantomJS myself when I intercept an _escaped_fragment_ URL or a Google or scraper user agent, but I have always experienced memory leaks and orphaned Phantom instances when spawning PhantomJS directly on high-traffic websites (I used NodeJS and this module).
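For context, this is roughly the interception I have in mind (a minimal Express sketch; the crawler regex is illustrative, and the render service on port 3030 is a hypothetical long-running renderer, not a real package):

```ts
import * as express from 'express';
import * as http from 'http';

// Illustrative list of crawler user agents; extend as needed.
const BOTS = /googlebot|bingbot|yandex|baiduspider|facebookexternalhit|twitterbot/i;

const app = express();

app.use((req, res, next) => {
  const wantsSnapshot =
    BOTS.test(req.headers['user-agent'] || '') ||
    req.query._escaped_fragment_ !== undefined;
  if (!wantsSnapshot) return next();

  // Hand the URL to one long-running render service instead of spawning
  // a PhantomJS process per request (per-request spawning is what leaks).
  http
    .get('http://localhost:3030/render?url=' + encodeURIComponent(req.url), snapshot => {
      res.set('Content-Type', 'text/html');
      snapshot.pipe(res);
    })
    .on('error', () => next());
});

app.listen(3000);
```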
For Angular 1 apps, I used to solve this with Angular modules like Angular-SEO, but it seems hard to port such a module to Angular 2.
I have not found any appropriate Angular 2 module for this yet. Should I build it myself, or is there another good way to achieve this as of today?
3 Answers
The great thing about Angular2 is that when it boots, all content inside your root app element goes away. This means that you can put whatever you want in there from the server, and crawlers will pick it up.
You can generate this content from a server-rendered version of the content in your app, or with custom logic.
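Concretely, the fallback can live directly in index.html (a minimal sketch, assuming your root selector is my-app; the content is made up):

```html
<!-- Everything inside <my-app> is served to crawlers and non-JS clients,
     then replaced as soon as Angular2 bootstraps. -->
<my-app>
  <h1>Product 42 - Example Store</h1>
  <p>Static, server-generated summary of what this route will render.</p>
  <a href="/products/42/reviews">Reviews</a>
</my-app>
```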
You can find some more information here: https://angularu.com/VideoSession/2015sf/angular-2-server-rendering
and here: https://github.com/angular/universal
I just created ng2-meta, an Angular2 module that can change meta tags based on the current route.
You can update meta tags from components, services, etc. as well.
While this caters to JavaScript-enabled crawlers (like Google), you can set fallback meta tags for non-JavaScript crawlers like Facebook and Twitter.
Support for server-side rendering is in progress.
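In short, you can set tags per route or imperatively (a condensed sketch; see the README for the full, authoritative API):

```ts
import { Component } from '@angular/core';
import { Routes } from '@angular/router';
import { MetaService } from 'ng2-meta';

@Component({ selector: 'home', template: '<h1>Home</h1>' })
export class HomeComponent {
  constructor(private metaService: MetaService) {}

  // Imperative updates from a component or service:
  updateTags(): void {
    this.metaService.setTitle('Dynamic title');
    this.metaService.setTag('og:image', 'http://example.com/image.png');
  }
}

// Declarative, route-level tags via the data.meta convention:
export const routes: Routes = [
  {
    path: 'home',
    component: HomeComponent,
    data: {
      meta: { title: 'Home', description: 'Landing page description' }
    }
  }
];
```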
Server-side rendering is not a requirement for a decent Google ranking …
I had a forum with about 33,000 entries in its Google sitemap files. The website was written in ASP.NET WebForms and had a decent stream of incoming requests from Google, but it had very bad mobile readability (something that is penalized by Google; it was actually flagged in my Google Search Console).
I rewrote everything with Angular (the deployed version is Angular 5). I am using the Title and Meta services to set my title and meta tags. All routes contain keywords extracted from the actual content. I also made sure that every element with a [routerLink] attribute is an <a> tag on which I also specify the href attribute (that is what a crawler looks for …). And of course I paid a lot of attention to mobile readability.
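For illustration, the Title/Meta part looks like this (a minimal sketch; the component and the tag values are made up):

```ts
import { Component, OnInit } from '@angular/core';
import { Meta, Title } from '@angular/platform-browser';

@Component({
  selector: 'app-thread',
  template: '<h1>{{ heading }}</h1>'
})
export class ThreadComponent implements OnInit {
  heading = '3ds Max on Threadripper';

  constructor(private title: Title, private meta: Meta) {}

  ngOnInit(): void {
    // Derive the title and description from the actual content of the route.
    this.title.setTitle(this.heading + ' - Example Forum');
    this.meta.updateTag({
      name: 'description',
      content: 'Benchmarks and discussion of 3ds Max rendering on Threadripper CPUs.'
    });
  }
}
```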
Result: I actually get more incoming traffic than before, and in the Search Console I clearly see that my indexed pages went up: of the 30K+ pages, only about 10K used to be included in the index. Now I have almost 25K pages in the index.
I am not saying that server-side rendering is irrelevant. Using Universal or other methods will result in faster download times, which will probably lead to a higher score. But Google is definitely able to properly index an Angular SPA.
Edit: some proof: if you google “3ds max threadripper”, you’ll see that it actually outranks one of the biggest hardware sites on the internet.