I have an application using React, React Router, Redux, and Webpack for bundling. The problem is that the client says the SEO is bad because of the way React renders the app: if you view the page source, the content is not there. I understand that Googlebot can crawl an SPA, and I checked with Fetch as Google and the pages render fine, but the client still feels this is the problem.
I tested react-snapshot, prerender-spa-plugin, static-site-generator-webpack-plugin, and also Next.js, but none of these worked for me. I want to know if there is a way to resolve this without changing the project structure, or at least the reasons or an explanation for not doing so. Thanks.
2 Answers
Basically, you can do this with React’s server-side rendering feature.

In the Node.js server handler that responds to the request for the page:

1. `require(..)` your root component.
2. `require(..)` your Redux store.
3. `ReactDOMServer.renderToString` on it. Note that React Router has special handling for server-side rendering, so read up on that.
4. `ReactDOM.hydrate` on it with your store on the client side.

Alternatively, if your code has `window.x` all around and can’t run in Node.js, you can do something else just for SEO: detect crawlers and serve them prerendered HTML. Note that regular users still get the regular React site, as they are not detected as crawlers.
Basically, I agree with the idea above (Benjamin Gruenbaum’s answer).
The difference is that I use puppeteer directly.