
I am trying to map through an array of nearly 1,200 objects with JavaScript’s .map() method to individually save each object (enriched with extra data) to a MongoDB database.
The first 270 or so get saved to MongoDB without problems, but subsequent attempts to save fail with this error:

    Request failed:  Error: Timeout of 60000ms exceeded
    at RequestBase._timeoutError (/home/i8mi/Desktop/work/projects/s/mailer/node_modules/superagent/lib/request-base.js:722:13)
    at Timeout.<anonymous> (/home/i8mi/Desktop/work/projects/s/mailer/node_modules/superagent/lib/request-base.js:736:12)
    at listOnTimeout (internal/timers.js:554:17)
    at processTimers (internal/timers.js:497:7) {
  timeout: 60000,
  code: 'ECONNABORTED',
  errno: 'ETIME',
  response: undefined
}

The whole operation, both the successful saves and the failures, finishes in less than 2 minutes.

I understand this is a timeout error from Node.js (superagent). I can’t seem to find any information on rate limits for MongoDB Cloud.

At the end there are errors like these:

(node:13061) UnhandledPromiseRejectionWarning: MongoNetworkTimeoutError: connection timed out
    at connectionFailureError (/home/i8mi/Desktop/work/projects/s/mailer/node_modules/mongodb/lib/core/connection/connect.js:342:14)
    at TLSSocket.<anonymous> (/home/i8mi/Desktop/work/projects/s/mailer/node_modules/mongodb/lib/core/connection/connect.js:310:16)
    at Object.onceWrapper (events.js:420:28)
    at TLSSocket.emit (events.js:314:20)
    at TLSSocket.Socket._onTimeout (net.js:484:8)
    at listOnTimeout (internal/timers.js:554:17)
    at processTimers (internal/timers.js:497:7)

Any idea what could possibly be done to solve this? Do I need something like Redis, Kafka or RabbitMQ for this?

Stack versions:

- Node: v12.19.1
- Mongoose: ^5.12.2
- Database: MongoDB Cloud (free tier)

Update:

Actual code:

    function GetRequestResourceIds(data) {
        console.log(data[0].tagList)
        TagList.find({ _id: data[0].tagList }, (err, result) => {
            console.log(result)
            return new Promise(async (resolve, reject) => {
                if (!err && result) {
                    // Now create mxLead for each person in this Tag List
                    // find the greeting info to use
                    const allRecipients = result[0].jsonList
                    // console.log(result)
                    await Greeting.find({ _id: data.greeting }, (err, result) => {
                        if (!err && result) {
                            // do this for each person in the list
                            const avartarGreeting = result.avartarGreeting
                            const initialTextGreeting = result.textGreeting
                            allRecipients.map(async (recipient) => {
                                // construct recipient (baseRecipient) data here
                                const baseRecipient = {
                                    firstName: recipient.data[0],
                                    lastName: recipient.data[1],
                                    emailAddress: recipient.data[2],
                                    avartarUrl: recipient.data[3],
                                }
                                // const textGreeting = initialTextGreeting + baseRecipient.firstName
                                const textGreeting = initialTextGreeting
                                const avData = {
                                    avartarGreeting,
                                    textGreeting,
                                    avartarUrl: baseRecipient.avartarUrl,
                                }

                                return personalizedAvatar.personalizedAvatar(avData)
                                    .then(newAvatar => {
                                        const newAvatars = new Avatar({
                                            recipientEmail: baseRecipient.emailAddress,
                                            recipientFirstName: baseRecipient.firstName,
                                            recipientLastName: baseRecipient.lastName,
                                            newAvatar,
                                            status: "done",
                                        })
                                        newAvatars.save()
                                        // console.log("success")
                                        resolve('success')
                                    })
                            })
                        }
                    })
                    // return result
                } else {
                    console.log("failed")
                    resolve("failed")
                }
            })
        })
    }

    module.exports.GetRequestResourceIds = GetRequestResourceIds

This is intended for a use case where game developers can generate avatars for players automatically as they achieve ranks, join a group, or just for seasonal greetings. So they could be sending an array of one, hundreds, or thousands of users.

3 Answers


  1. Your error’s stack trace does not mention MongoDB at any point. superagent is an HTTP client, which has nothing to do with MongoDB – so that timeout is coming from somewhere else.

    The second error, which I suppose comes from the actual piece of code that does the writing, indicates a connection timeout – either when attempting to connect (are you establishing a new connection for each write?) or when writing the documents due to connection starvation. In either case, you are probably suffering from unbounded concurrency – a condition where arbitrarily many asynchronous operations are executing at the same time (read more: Promise.allpocalypse) – though I cannot be sure without seeing some code.
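
    For illustration, here is roughly what the unbounded pattern looks like next to a bounded alternative that processes the array in fixed-size chunks (the chunk size and the saveOne() helper are placeholders for whatever per-document work your code does):

        // Unbounded: .map() starts all ~1200 async operations at once
        // await Promise.all(items.map(item => saveOne(item)));

        // Bounded: process the array in chunks so only a limited
        // number of operations are in flight at any given time
        async function processInChunks(items, chunkSize = 25) {
            for (let i = 0; i < items.length; i += chunkSize) {
                const chunk = items.slice(i, i + chunkSize);
                await Promise.all(chunk.map(item => saveOne(item)));
            }
        }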

    When writing large numbers of documents to MongoDB, you should probably be using collection.insertMany (see code: MongoDB usage examples), which enables you to pass whole arrays of documents to save.
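
    As a minimal sketch with Mongoose (the Avatar model and field names are taken from the code in the question; recipients stands in for the prepared array), a single insertMany call replaces the per-document save():

        // Build plain objects first, then write them in one round trip
        const docs = recipients.map(r => ({
            recipientEmail: r.emailAddress,
            recipientFirstName: r.firstName,
            recipientLastName: r.lastName,
            status: "done",
        }));
        await Avatar.insertMany(docs, { ordered: false });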

    You do not need a message queue for writing large numbers of documents to MongoDB, unless each of them requires special processing by workers.

    EDIT: Based on the now-included code, here are some possible improvements (a rough sketch follows the list):

    • Iterate over the recipient list first, compute everything you need and store it in a variable
    • Save the new documents in bulk – here, we assume they’re all new, so insertMany should do
    • Handle all promises in a way that doesn’t lose rejections (.catch())
    • Avoid mixing new Promise(), .then() and async functions – settle on one style, preferably async/await if possible (this lets you skip .catch() and use try {} catch {} instead)
    • Do not resolve with "failure" if an error occurs – that’s what exceptions and throw are for
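
    A rough sketch of how the function could look with those changes applied, assuming the same models (TagList, Greeting, Avatar) and the personalizedAvatar helper from the question; treat it as a starting point, not a drop-in replacement:

        async function GetRequestResourceIds(data) {
            // 1. Load the tag list and the greeting up front
            const tagList = await TagList.findById(data[0].tagList).exec();
            const greeting = await Greeting.findById(data.greeting).exec();
            if (!tagList || !greeting) throw new Error('tag list or greeting not found');

            // 2. Compute everything first and collect plain objects;
            //    the sequential loop keeps concurrency bounded
            const docs = [];
            for (const recipient of tagList.jsonList) {
                const [firstName, lastName, emailAddress, avartarUrl] = recipient.data;
                const newAvatar = await personalizedAvatar.personalizedAvatar({
                    avartarGreeting: greeting.avartarGreeting,
                    textGreeting: greeting.textGreeting,
                    avartarUrl,
                });
                docs.push({
                    recipientEmail: emailAddress,
                    recipientFirstName: firstName,
                    recipientLastName: lastName,
                    newAvatar,
                    status: 'done',
                });
            }

            // 3. Save all new documents in one bulk insert
            await Avatar.insertMany(docs, { ordered: false });
            return 'success';
        }

    If avatar generation itself is slow, the loop can be replaced with a chunked Promise.all as shown earlier; the important part is that the writes end up in a single insertMany.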
  2. The way you are trying to accomplish this task doesn’t sound healthy or db/network-friendly. You should rather research and implement MongoDB bulkWrite operations – or at the very least find a way of using insertMany(), updateMany(), or one of their peers.
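
    For example, a Mongoose bulkWrite call batches many writes into a single command; the docs array, the filter and the field names here are illustrative only (hypothetical upserts keyed by recipient email):

        await Avatar.bulkWrite(
            docs.map(doc => ({
                updateOne: {
                    filter: { recipientEmail: doc.recipientEmail },
                    update: { $set: doc },
                    upsert: true,
                },
            }))
        );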

  3. You are probably sending too many requests for execution at once; try putting the code inside a forEach and executing one request at a time.

    // assumes `mongoose` has been required and `Job` is a Mongoose model defined elsewhere
    jobs.forEach(a => {
        mongoose.set("strictQuery", false);
        mongoose.connect('your string')
            .then(() => {
                const job = new Job({
                    _id: new mongoose.Types.ObjectId(),
                    title: a.title,
                    location: a.location,
                });
                job
                    .save()
                    .catch(err => {
                        console.log(err);
                    });
            });
    });
    

    The example above is from my own code, so treat it just as inspiration 🙂
