
I have the Firebase extension for streaming data to BigQuery installed: https://extensions.dev/extensions/firebase/firestore-bigquery-export.

Each month I run a job to import data into my Firestore collection in batches.
This month I imported 2706 rows, but only 2646 made it into BigQuery (60 fewer).

I got the following errors from the extension:
[![enter image description here][1]][1]

I contacted Firebase support and they suggested I upgrade to the latest firebase-admin and firebase-functions packages, but these have breaking changes; updating to the latest firebase-admin gave me errors. I have not received any further help from them, and it is still happening for multiple collections.

The options I see are:

  1. Update to the latest firebase-admin and firebase-functions packages
    and change my code to work with the breaking changes. I think this is
    unlikely to help.
  2. Update the Firebase extension from v0.1.24 to v0.1.29, which adds a
    flag called "Use new query syntax for snapshots" that can be turned
    on. I can’t find much information about this flag.
  3. Increase the BigQuery quota somehow.
  4. Slow down the data being entered into Firestore or add it daily/weekly rather than monthly.

Here is my code (Node.js):

  const platformFeesCollectionPath = `platformFees`;
  const limit = 500; // Firestore allows at most 500 writes per batch
  let batch = db.batch();
  let totalFeeCount = 0;
  let counter = 0;

  for (const af of applicationFees) {
    const docRef = db.collection(platformFeesCollectionPath).doc();
    batch.set(docRef, { ...af, dateCreated: getTimestamp(), dateModified: getTimestamp() });

    counter++;
    if (counter === limit) {
      await batch.commit();
      console.log(`Platform fees batch run for ${counter} platform fees`);
      batch = db.batch();
      totalFeeCount += counter;
      counter = 0;
    }
  }

  // Commit any writes left over after the last full batch
  if (counter > 0) {
    await batch.commit();
    totalFeeCount += counter;
  }

  console.log(`Platform fees batch run for ${totalFeeCount} platform fees`);
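
If slowing the writes down (option 4) does turn out to help, one way is to pause between batch commits. This is only a sketch under my own assumptions: `writeFeesThrottled` and the 2000 ms default delay are illustrative names/values, not anything from the extension's docs, and `db`/`getTimestamp` are assumed to be the same objects used above.

```javascript
// Throttling sketch: pause between batch commits so the extension's
// downstream BigQuery streaming inserts stay under the per-user API rate
// limit. The function name and 2000 ms default are illustrative only.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function writeFeesThrottled(db, applicationFees, getTimestamp, delayMs = 2000) {
  const limit = 500; // Firestore allows at most 500 writes per batch

  for (let i = 0; i < applicationFees.length; i += limit) {
    const batch = db.batch();
    for (const af of applicationFees.slice(i, i + limit)) {
      const docRef = db.collection("platformFees").doc();
      batch.set(docRef, { ...af, dateCreated: getTimestamp(), dateModified: getTimestamp() });
    }
    await batch.commit();
    console.log(`Committed fees ${i + 1}-${Math.min(i + limit, applicationFees.length)}`);
    await sleep(delayMs); // breathing room between bursts of writes
  }
}
```

The delay value would need tuning against the actual quota; the point is only that commits no longer arrive back-to-back.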

Update:
If I look in the GCP logs using the query:

    protoPayload.status.code = "7"
    protoPayload.status.message: ("Quota exceeded" OR "limit")

I can see many of these errors:
[![Errors][2]][2]


  [1]: https://i.stack.imgur.com/BAgTm.png
  [2]: https://i.stack.imgur.com/eswzI.png

Edit:
Added issue to the repo:
github.com/firebase/extensions/issues/1394

Update:
It is still not working with v0.1.29 of the BigQuery extension; I am getting the same errors.

2 Answers


  1. Would it be possible to provide the BigQuery extension version number for your current installation? This can be found on your Firebase console => Extensions tab.

    The error "Exceeded rate limits: too many api requests" was an error we hoped to resolve with a release in June 2022, so it may be resolved with an upgrade. At the very least, the example above gives the maintenance team a way of reproducing the bug.

    In addition, if you create an issue on the repository, it will be easier for maintainers to track.

  2. The error you are facing occurs because the maximum number of API requests per second per user per method is exceeded. BigQuery returns this error when you hit the rate limit for the number of API requests to a BigQuery API per user per method. For more information, see the "Maximum number of API requests per second per user per method" rate limit in All BigQuery APIs.
    To prevent this error you could try out the following:

    • Reduce the number of API requests or add a delay between multiple API
      requests so that the number of requests stays under this limit.

    • The streaming inserts API has costs associated with it and has its
      own set of limits and quotas. To learn about the cost of streaming
      inserts, see BigQuery pricing.

    • You can request a quota increase by contacting support or sales. For
      additional quota, see Request a quota increase. Requesting a
      quota increase might take several days to process. To provide more
      information for your request, we recommend that your request includes
      the priority of the job, the user running the query, and the affected
      method.

    • You can retry the operation after a few seconds. Use exponential
      backoff between retry attempts. That is, increase the delay between
      each retry.
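
The retry advice in the last bullet can be sketched generically. `withBackoff` and its defaults are my own illustrative names, not part of the extension or any Firebase/BigQuery API:

```javascript
// Exponential-backoff retry sketch for the "retry after a few seconds"
// advice above: double the delay after each failed attempt.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function withBackoff(operation, { retries = 5, baseDelayMs = 1000 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await operation(); // success: return the result immediately
    } catch (err) {
      if (attempt >= retries) throw err; // give up after the final retry
      const delay = baseDelayMs * 2 ** attempt; // 1s, 2s, 4s, ...
      console.warn(`Attempt ${attempt + 1} failed (${err.message}); retrying in ${delay} ms`);
      await sleep(delay);
    }
  }
}

// Example use: wrap a Firestore batch commit so a transient
// "Quota exceeded" error is retried instead of dropping rows.
// await withBackoff(() => batch.commit());
```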
