
One of my clients has a website whose content comes entirely from a third-party API, i.e. the content originates on another site. He wants to do some SEO on that data. I wonder whether that is even possible, since the data is not stored in his own database, and I suspect Google's crawler is effectively led back to the third-party website when it crawls such pages. We asked the owner of that website for permission to store the API data on our end so we could do some SEO, but he refused our request.

It would be highly appreciated if you could suggest another approach that does not go against Google's policies and guidelines.

Thank You
Vikas S.

2 Answers


  1. It’s not recommended to take the same content and post it on your website: it’s duplicate content, and Google may penalize you for it.

    If you still want to post it on your website, you have to rewrite the original text substantially before publishing, so that it reads as original content.

    Also, if you want to keep it unchanged and still avoid penalties from Google, you have to link back to the original article from your website, or add a cross-domain canonical link like the example below:

    <link rel="canonical" href="https://example.com/original-article-url" />
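    For API-driven pages, that canonical tag is typically generated per page on the server, using the original article's URL returned by the API. A minimal sketch in Python (the helper name and the URL are illustrative, not from any specific framework):

    ```python
    from html import escape

    def canonical_tag(original_url: str) -> str:
        """Build a cross-domain canonical <link> element pointing at the
        third-party article this page's content came from."""
        return f'<link rel="canonical" href="{escape(original_url, quote=True)}" />'

    # Render the tag for a hypothetical API-sourced page.
    print(canonical_tag("https://example.com/original-article-url"))
    ```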
    
  2. Yes – with a huge BUT:

    Google explains in Search Console (Google Webmaster Tools) how URL parameters can be configured and how these settings affect the crawler’s behaviour.

    @Nadeem Haddadeen is right about using canonical links between duplicates. There’s also a problem if the same parameters don’t always return the same content: that kind of dynamic content is effectively un-indexable. If you are dealing with dynamic content, optimise a host page around popular queries rather than trying to get the dynamic content itself to rank.
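    One way to keep content consistent per URL is to normalise the query string before rendering or choosing a canonical, so that tracking parameters don’t multiply into duplicate indexable URLs. A sketch, assuming the (hypothetical) tracking parameter names below:

    ```python
    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    # Assumed list of parameters that change tracking, not page content.
    TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "ref"}

    def canonical_url(url: str) -> str:
        """Drop tracking parameters and sort the rest, so every variant of
        a page maps to one stable canonical URL."""
        parts = urlsplit(url)
        kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
        query = urlencode(sorted(kept))
        return urlunsplit((parts.scheme, parts.netloc, parts.path, query, ""))

    print(canonical_url("https://example.com/article?id=5&utm_source=news"))
    # → https://example.com/article?id=5
    ```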
