I am trying to implement a simple project that works like Medium.com. My project is built in ReactJS. My issue is with meta tags and SEO. I know that modifying meta tags on the client side (e.g. using react-helmet) is not advisable, because many web crawlers still do not execute JavaScript reliably and may never see those tags.
I am now exploring Gatsby and Next.js. I understand that I have two options:
- pre-render on build
- render on every request
With a website similar to Medium, it does not seem worth it to rebuild the entire site every time a post is created or edited, just to serve static content to viewers. Say I already have millions of blog posts: a full build would take a long time, and I cannot rebuild the application every time someone publishes a post.
Rendering a post on every request is also wasteful if the post is very popular, viewed millions of times, and modified only once.
Is there something that I am missing? I am currently leaning more toward rendering on every request.
EDIT:
I am now planning to use caching in combination with rendering on every request.
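For example, a sketch of the kind of caching I have in mind, using Next.js getServerSideProps and a CDN-facing Cache-Control header (fetchPost is a hypothetical data fetcher, not part of Next.js):

    // pages/posts/[slug].js — rendered on every request, cached at the CDN
    export async function getServerSideProps({ params, res }) {
      // Let the CDN serve a cached copy for 60 seconds, and a stale copy
      // for up to 5 more minutes while it revalidates in the background.
      res.setHeader(
        'Cache-Control',
        'public, s-maxage=60, stale-while-revalidate=300'
      );
      const post = await fetchPost(params.slug); // hypothetical
      return { props: { post } };
    }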
EDIT 2:
I am banned from answering questions. So I’ll add my answer here:
.. ANSWER BELOW ..
I have done a bit more research on Next.js. Apparently, Next.js has a fallback feature for the static generation of pages. This solves my issue of new pages being added at any point in time. See this link for details. As per the documentation (a minimal sketch of how this is wired up follows the quoted points):
- The paths returned from getStaticPaths will be rendered to HTML at build time.
- The paths that have not been generated at build time will not result in a 404 page. Instead, Next.js will serve a “fallback” version of the page on the first request to such a path (see “Fallback pages” below for details).
- In the background, Next.js will statically generate the requested path HTML and JSON. This includes running getStaticProps.
- When that’s done, the browser receives the JSON for the generated path. This will be used to automatically render the page with the required props. From the user’s perspective, the page will be swapped from the fallback page to the full page.
- At the same time, Next.js adds this path to the list of pre-rendered pages. Subsequent requests to the same path will serve the generated page, just like other pages pre-rendered at build time.
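As a rough sketch of the fallback setup (assuming a post page at pages/posts/[slug].js and hypothetical fetchPopularSlugs/fetchPost helpers, which are not part of Next.js):

    // pages/posts/[slug].js
    import { useRouter } from 'next/router';

    export async function getStaticPaths() {
      // Pre-render only the most popular posts at build time; everything
      // else is generated on the first request thanks to fallback: true.
      const slugs = await fetchPopularSlugs(); // hypothetical
      return {
        paths: slugs.map((slug) => ({ params: { slug } })),
        fallback: true,
      };
    }

    export async function getStaticProps({ params }) {
      const post = await fetchPost(params.slug); // hypothetical
      return { props: { post } };
    }

    export default function Post({ post }) {
      const router = useRouter();
      // Shown while Next.js generates the HTML/JSON in the background.
      if (router.isFallback) {
        return <div>Loading…</div>;
      }
      return (
        <article>
          <h1>{post.title}</h1>
          <div>{post.body}</div>
        </article>
      );
    }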
Also, I have learned that Next.js introduced a new feature called Incremental Static Regeneration. Whenever a page is invalidated, Next.js keeps serving the old page while the new one is generated in the background. This way, when a certain page is updated, I do not have to rebuild the entire application; only that page is regenerated. This is still in beta as of this post. Details can be found here.
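In current Next.js releases this is exposed through the revalidate key returned from getStaticProps. A minimal sketch, again assuming a hypothetical fetchPost helper:

    // pages/posts/[slug].js
    export async function getStaticProps({ params }) {
      const post = await fetchPost(params.slug); // hypothetical data fetcher
      return {
        props: { post },
        // Regenerate this page in the background at most once every 60
        // seconds; stale copies keep being served until the new one is ready.
        revalidate: 60,
      };
    }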
3 Answers
Creating a sitemap and updating it regularly (for example, once a day) is one of the easiest ways to make the site content search-engine friendly.
The sitemap needs to be created and updated by a server-side script.
‘sitemap-generator’ is a good utility available on npm for generating sitemaps.
It can be installed using:
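    npm install sitemap-generator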
More information:
https://www.npmjs.com/package/sitemap-generator
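A minimal usage sketch, following the generator/event API described on that page (the crawl URL and output path here are placeholders):

    const SitemapGenerator = require('sitemap-generator');

    // Create a generator pointed at the site to crawl.
    const generator = SitemapGenerator('https://example.com', {
      filepath: './sitemap.xml',
    });

    generator.on('done', () => {
      console.log('sitemap.xml written');
    });

    // Start crawling; run this on a schedule (e.g. a daily cron job).
    generator.start();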
You might have to use Helmet.
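For instance, a minimal sketch with react-helmet (the post prop and its fields are assumptions for illustration):

    import React from 'react';
    import { Helmet } from 'react-helmet';

    // post is an assumed shape: { title, excerpt, body }
    function PostPage({ post }) {
      return (
        <React.Fragment>
          <Helmet>
            <title>{post.title}</title>
            <meta name="description" content={post.excerpt} />
            <meta property="og:title" content={post.title} />
          </Helmet>
          <article>{post.body}</article>
        </React.Fragment>
      );
    }

    export default PostPage;

Note that, as the question points out, these tags only help crawlers reliably when the markup is rendered on the server.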
The other answers address your question about metadata, but I feel your question is primarily: does it make sense to pre-render and serve static HTML (the Jamstack approach), or to render pages dynamically on every request?
If you have a highly dynamic website that needs to be updated frequently (say, more often than once an hour), you should go for dynamic server-side rendering. It is much easier to get good performance with pre-rendering, though, so if infrequent updates work for you, it is the better solution.
Building a website with thousands of pages will take at least a few minutes. If you are OK with the site only updating daily, pre-rendering is a good approach. If you expect to ever need content to update more often than that, it is not.