
I have a React app. It is deployed, available, and has been registered with Google for a long time. Google sees the sitemap, and the pages in the sitemap are indexed, but all that Google sees is the page before it loads. The URL Inspection tool shows the same blank page.
I am not using create-react-app but a proper webpack build.

...
  <body>
    <div id="root"></div>
  </body>
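
For context, the client-side entry that fills that div looks roughly like this (a hypothetical index.js, assuming React 18; nothing appears in the DOM until the bundle downloads and executes):

  // index.js: hypothetical webpack entry point.
  // The #root div above stays empty until this script runs in the browser.
  import React from 'react';
  import { createRoot } from 'react-dom/client';
  import App from './App';

  createRoot(document.getElementById('root')).render(<App />);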

So what am I missing? Shouldn't Google's crawler be able to index a React app by now?

UPD: I understand that other frameworks with a different rendering approach may work. But does that mean that, after all these years, Google can't index the second most popular framework? Is React, at the end of the day, totally unusable by itself?

UPD2: React produces an HTML page with a main.js script that is executed on the page and renders all the content. React is not the only framework that works this way; others do the same. So is Google totally helpless in this situation?
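
To see exactly what the crawler receives, you can fetch the deployed page and inspect the raw HTML before any JavaScript runs (a minimal sketch using the built-in fetch in Node 18+; the URL is a placeholder):

  // Save as check.mjs and run with: node check.mjs
  // For a client-only React app, the printed HTML contains just the
  // empty #root div and the script tag, which is all the crawler gets
  // before rendering.
  const res = await fetch('https://example.com/');
  console.log(await res.text());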

2 Answers


  1. I think this is due to React being a SPA, so Google only catalogues the initial page. I believe Next.js solves this problem.

  2. No. Google cannot reliably index the contents of a client-side React application. The Google crawler does not wait for asynchronous requests to resolve, and because your pages are rendered on the users' clients, to the crawler they appear to be empty pages:

    <!-- This is Google's perspective -->
    <body>
      <div id="root"></div>
    </body>
    

    Modify your app to render on the server with something like Next.js. This will pre-render whatever is possible on the server while still letting you dip into client-side code.
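
    For illustration, a minimal server-rendered page in Next.js might look like this (file path and data are hypothetical; getServerSideProps runs on the server, so the crawler receives fully rendered HTML):

    // pages/index.js: hypothetical Next.js page (pages router).
    // getServerSideProps runs on the server on every request, so the
    // rendered <h1> is already in the initial HTML response.
    export async function getServerSideProps() {
      return { props: { message: 'Hello from the server' } };
    }

    export default function Home({ message }) {
      return <h1>{message}</h1>;
    }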
