
I have two collections:
db.collection("groups")
db.collection("users")
The group doc stores a map of members.
I'd like to fire a function that iterates over the collection, storing aggregated data from the user documents (e.g. the number of activities last week).
How should I handle this if the collection contains many documents?

I would solve this with a promise, but I guess I'd run into the limits of collection.get(), or a timeout during the forEach loop.
Is a paginated read (.startAt()) the solution here? And what about the timeout?

import * as functions from "firebase-functions";
import { getFirestore } from "firebase-admin/firestore";

export const someMethod = functions.https.onRequest((req, res) => {
    let stuff: any[] = [];
    let db = getFirestore();
    db.collection("groups").get().then(snapshot => {

        snapshot.forEach(doc => {
            const newElement = {
                [`Members.${doc.id}.activities`]: doc.data().activities.length };
            // batched write for every doc
            // ...

        });
        res.send("ok")
        return "";
    }).catch(reason => {
        res.send(reason)
    })
});

2 Answers

  1. On Cloud Functions v1, a single function invocation can run for up to 9 minutes; on 2nd gen it can run for up to 60 minutes. If that isn’t enough for your workload, you will have to split it out over multiple invocations, for example by having your main function create a number of tasks, as in the sketch below.

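    As a rough sketch of that approach (the task-queue function name processGroupsPage, the page size, and the retry settings here are illustrative, not part of this answer), each invocation could process one page of groups and then enqueue a follow-up task carrying a cursor; this also covers the .startAt()-style pagination asked about in the question:

    import * as functions from "firebase-functions";
    import { FieldPath, getFirestore } from "firebase-admin/firestore";
    import { getFunctions } from "firebase-admin/functions";

    // Illustrative page size; tune it so one page comfortably fits the timeout.
    const PAGE_SIZE = 200;

    export const processGroupsPage = functions.tasks
        .taskQueue({ retryConfig: { maxAttempts: 3 } })
        .onDispatch(async (data: { cursor?: string }) => {
            const db = getFirestore();

            // Page through "groups" ordered by document ID, using the
            // cursor-based reads (.startAfter()) the question mentions.
            let query = db.collection("groups")
                .orderBy(FieldPath.documentId())
                .limit(PAGE_SIZE);
            if (data.cursor) {
                query = query.startAfter(data.cursor);
            }

            const page = await query.get();
            if (page.empty) {
                return; // no more groups: the run is complete
            }

            // ... aggregate user data and write it for this page of groups ...

            // Enqueue the next page as a fresh invocation, so no single
            // invocation has to fit the whole run into the 9/60 minute limit.
            const lastId = page.docs[page.docs.length - 1].id;
            await getFunctions()
                .taskQueue("processGroupsPage")
                .enqueue({ cursor: lastId });
        });

    Your HTTP function would then only enqueue the first task (with no cursor) and return immediately, instead of doing the whole run itself.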
  2. You seem to be returning and sending back a response right after the forEach, before the asynchronous writes inside it have completed. This leaves the remaining work running in the background, where execution can’t be guaranteed:

    https://cloud.google.com/functions/docs/troubleshooting

    Background activity (anything that happens after your function has terminated) can cause issues, so check your code. Cloud Functions does not guarantee any actions other than those that run during the execution period of the function, so even if an activity runs in the background, it might be terminated by the cleanup process.

    You might want to use async/await syntax here. Promises inside a forEach don’t execute the way you might expect, so your best bet is to iterate over the groups with a for..of loop. Also, a batched write only allows 500 operations at a time, so you’ll need some logic, such as slice(0, 500) or the counter below, to cap each batch at 500 operations.

    // Since the Firestore batch write limit is 500, divide the operations
    // into multiple batches if needed.
    import * as functions from "firebase-functions";
    import { getFirestore, WriteBatch } from "firebase-admin/firestore";

    export const someMethod = functions.https.onRequest(async (req, res) => {
        const db = getFirestore();
        const groupsSnapshot = await db.collection("groups").get();

        const batches: WriteBatch[] = [];
        let currentBatch = db.batch();
        let operationCounter = 0;

        for (const groupDoc of groupsSnapshot.docs) {
            if (operationCounter === 500) {
                batches.push(currentBatch);
                currentBatch = db.batch();
                operationCounter = 0;
            }

            const newElement = {
                [`Members.${groupDoc.id}.activities`]: groupDoc.data().activities.length,
            };

            // ... add newElement to the batch write operation,
            // assuming we have a document reference, docRef:
            // currentBatch.update(docRef, newElement);

            operationCounter++;
        }

        // Push the last batch if it has operations.
        if (operationCounter > 0) {
            batches.push(currentBatch);
        }

        // Execute all batch writes, and only respond once they are done.
        try {
            await Promise.all(batches.map((batch) => batch.commit()));
            res.send("ok");
        } catch (error) {
            res.status(500).send(String(error));
        }
    });
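    Note that each committed batch is atomic on its own (its up-to-500 operations succeed or fail together), but the batches commit independently of each other, so a failure partway through the Promise.all can leave some batches applied and others not.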
    