I am intending to use Angular Universal for server-side rendering (SSR), but this should only be done for crawlers and bots from selected search engines.
What I want is the following schema (image source: https://dingyuliang.me/use-prerender-improve-angularjs-seo/): requests from crawlers are answered with a pre-rendered page, while normal users get the regular client-side app.
After following the official instructions to set up SSR I can now validate that Googlebot (finally) “sees” my website and should be able to index it.
However, at the moment all requests are rendered on the server. Is there a way to determine whether incoming requests are coming from search engines and pre-render the site only for them?
3 Answers
You can achieve that with Nginx.
In Nginx you can forward the request to the Angular Universal application with a proxy_pass directive, assuming that you are serving Angular Universal via 127.0.0.1:5000. In case a browser user agent comes along, we serve the page from root /var/www/html instead.
So the complete config would be something like this:
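A minimal sketch (assuming the site lives at /var/www/html, example.com as the server name, and a short bot list that you would extend as needed): the set/if pattern flags known crawlers and proxies them to the Universal server, while everyone else falls through to the static client-side build.

    server {
        listen 80;
        server_name example.com;

        root /var/www/html;
        index index.html;

        location / {
            # Flag requests from known crawlers
            set $prerender 0;
            if ($http_user_agent ~* "googlebot|bingbot|yandexbot|duckduckbot") {
                set $prerender 1;
            }
            # Crawlers are proxied to the Angular Universal server
            if ($prerender = 1) {
                proxy_pass http://127.0.0.1:5000;
            }
            # Browsers get the static client-side build
            try_files $uri $uri/ /index.html;
        }
    }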
This is what I came up with for IIS:
In order to get rid of complex folder structures, change the line in server.ts that resolves the browser dist folder so that it points at the hosting folder instead. For example:
Build the project with the npm run build:ssr command. You will end up with the browser and server folders inside the dist folder.
Create a folder for hosting in IIS and copy the files that are in the browser and server folders into the created folder.
Add a new file to this folder named web.config with this content:
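A sketch of such a file, assuming the IIS URL Rewrite Module (installed in a later step below) together with Application Request Routing for the proxying; the rule names are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
      <system.webServer>
        <rewrite>
          <rules>
            <!-- Requests whose user agent contains googlebot or bingbot
                 are proxied to the Express SSR server on port 4000 -->
            <rule name="Bots to SSR" stopProcessing="true">
              <match url="(.*)" />
              <conditions>
                <add input="{HTTP_USER_AGENT}" pattern="googlebot|bingbot" />
              </conditions>
              <action type="Rewrite" url="http://localhost:4000/{R:1}" />
            </rule>
            <!-- All other non-file paths fall back to the client-side app -->
            <rule name="Angular fallback" stopProcessing="true">
              <match url=".*" />
              <conditions logicalGrouping="MatchAll">
                <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
                <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
              </conditions>
              <action type="Rewrite" url="/index.html" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>
    </configuration>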
Inside this folder, open a Command Prompt or PowerShell and run the following:
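Assuming the server bundle's entry file, main.js from the server folder, now sits in this folder:

    node main.js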
Now you should be able to view your server-side rendered website at localhost:4000 (if you haven't changed the port).
Install the IIS URL Rewrite Module.
IIS will redirect requests that have googlebot or bingbot in their user agent to localhost:4000, which is handled by Express and will return server-side rendered content.
You can test this with Google Chrome: open the Developer Console, select “More tools > Network conditions” from the menu, then in the User Agent section disable “Select automatically” and choose Googlebot.
I just managed to do what you wanted, but did not find any answer providing a detailed step-by-step guide with Angular Universal and an Express server.
So I post my solution here; any idea for improvement is welcome!
First, add this function to the server.ts:
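A sketch following the two detection modes described below; the isBot name and the substrings checked in the manual mode are placeholders to adapt:

    import * as express from 'express';
    // List of known crawler patterns, see the install command below
    const crawlerUserAgents = require('crawler-user-agents');

    function isBot(req: express.Request): boolean {
      const userAgent = req.headers['user-agent'];
      // No user-agent at all is treated as a bot
      if (!userAgent) {
        return true;
      }
      // Mode 1: simple manual check for a string within the user-agent
      const ua = userAgent.toLowerCase();
      if (ua.includes('googlebot') || ua.includes('bingbot')) {
        return true;
      }
      // Mode 2: match against the crawler-user-agents pattern list
      return crawlerUserAgents.some(
        (entry: { pattern: string }) => new RegExp(entry.pattern).test(userAgent)
      );
    }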
This function uses two modes to detect crawlers (and assumes that the absence of a user-agent means that the request comes from a bot): first, a 'simple' manual detection of a string within the header's user-agent, and second, a more advanced detection based on the package 'crawler-user-agents', which you can install into your Angular project like this:
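    npm install crawler-user-agents --save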
Second, once this function is added to your server.ts, just use it in each route of your Express server for which the response should differ based on bot detection (the 'whatever' route stands for any such route). Your 'server.get()' functions become:
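A sketch reusing isBot() from above and the names from the stock Angular Universal server.ts (indexHtml, distFolder, APP_BASE_HREF); '/whatever' is a placeholder route:

    server.get('/whatever', (req, res) => {
      if (isBot(req)) {
        // Crawlers get the server-side rendered page
        res.render(indexHtml, {
          req,
          providers: [{ provide: APP_BASE_HREF, useValue: req.baseUrl }],
        });
      } else {
        // Regular users get the client-side app shell
        res.sendFile(join(distFolder, 'index.html'));
      }
    });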
To further reduce the server load when a bot requests a page, I also implemented 'node-cache', because in my case SEO bots do not need the very latest version of each page. For this I found a good answer here:
#61939272
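A sketch of how that caching could be wired into the bot branch, reusing isBot() and the route above; the one-hour TTL is an arbitrary choice:

    import NodeCache from 'node-cache';

    // Rendered pages are cached for an hour, which is fine for crawlers
    const ssrCache = new NodeCache({ stdTTL: 3600 });

    server.get('/whatever', (req, res) => {
      if (!isBot(req)) {
        res.sendFile(join(distFolder, 'index.html'));
        return;
      }
      const cached = ssrCache.get<string>(req.originalUrl);
      if (cached) {
        res.send(cached);
        return;
      }
      res.render(indexHtml, { req }, (err, html) => {
        if (err) {
          res.status(500).send(err.message);
          return;
        }
        // Cache the freshly rendered page before sending it
        ssrCache.set(req.originalUrl, html);
        res.send(html);
      });
    });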