
I’m currently developing an application with React and Mongoose (MongoDB) where I upload a text file of roughly 9,000+ lines, each containing a JSON object. Every line is important, and I need to store each one in the database so it can be accessed later.

I have set up two functions: one that reads the file line by line, and one that creates a model from each line and saves it to MongoDB.

Function to read each line:

const fs = require('fs');
const readline = require('readline');

const createAndReadFile = async (filePath) => {
    // create a read stream of the file
    let fileStream = fs.createReadStream(filePath, 'utf8');

    const rl = readline.createInterface({
        input: fileStream,
        crlfDelay: Infinity
    });

    for await (const line of rl) {
        let parsedJSONLine = JSON.parse(line);
        // note: this async call is not awaited, so the loop moves on
        // before the save has finished
        createAndAddToDB(parsedJSONLine);
    }
};

Function to save each line to the database:

const createAndAddToDB = async (jsonRequest) => {
    try {
        let request = new Request(jsonRequest);

        await request.save();
    } catch (error) {
        // errors are currently swallowed silently
    }
};

My question is: what’s the most effective way to wait until all the lines have been converted to models and saved in the database?

2 Answers


  1. Use Promise.all to await the resolution of all promises that are created in the for-await loop:

    const promises = [];
    for await (const line of rl) {
      const parsedJSONLine = JSON.parse(line);
      // collect the pending save instead of awaiting it here
      promises.push(createAndAddToDB(parsedJSONLine));
    }
    return Promise.all(promises);
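
    Because createAndAddToDB catches its own errors, none of these promises will reject, so Promise.all resolves once every save attempt has settled. If the loop stays inside createAndReadFile and the function returns the Promise.all result, the caller can simply await it (a minimal usage sketch; the file path is illustrative):

    // hypothetical caller, inside an async function, assuming
    // createAndReadFile returns the Promise.all(promises) result
    await createAndReadFile('./requests.txt');
    console.log('all lines saved');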
    

    Alternatively, you can perform the conversion in the for-await loop and write to the database in a batch:

    const promises = [];
    for await (const line of rl) {
      const parsedJSONLine = JSON.parse(line);
      promises.push(convertToDBFormat(parsedJSONLine));
    }
    return Promise.all(promises).then(function (converted) {
      return addToDBBatch(converted);
    });
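
    Here convertToDBFormat and addToDBBatch are placeholders for your own code. A minimal sketch of the batch write, assuming the Mongoose Request model from the question, could use insertMany, which issues one bulk write instead of 9,000+ individual save() calls:

    // sketch of the hypothetical addToDBBatch helper
    const addToDBBatch = (converted) => Request.insertMany(converted);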
    
  2. const createAndReadFile = async (filePath) => {
        // create a read stream of the file
        let fileStream = fs.createReadStream(filePath, 'utf8');

        const rl = readline.createInterface({
            input: fileStream,
            crlfDelay: Infinity
        });

        // for await plus await inside the loop: each line is saved
        // before the next one is read
        for await (const line of rl) {
            let parsedJSONLine = JSON.parse(line);
            await createAndAddToDB(parsedJSONLine);
        }

        return 'file is processed';
    };
    

    This is essentially what you wrote, with await added inside the loop: each save completes before the next line is read, so once the loop exits, every line of the file has been processed.

    If any lines fail to save, you could push them into a failed array and save those in batches afterwards.

    Suggestion: saving every line individually means 9,000+ separate requests to the database; use a batch insert instead. Once you are out of the loop, check whether any lines failed to save, re-run the batch save function for those, and you are good to go.
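
    A minimal sketch of that idea, assuming the Mongoose Request model from the question and an illustrative chunk size of 500:

    const BATCH_SIZE = 500; // illustrative chunk size

    const saveInBatches = async (docs) => {
        const failed = [];
        for (let i = 0; i < docs.length; i += BATCH_SIZE) {
            const chunk = docs.slice(i, i + BATCH_SIZE);
            try {
                await Request.insertMany(chunk); // one bulk write per chunk
            } catch (error) {
                failed.push(...chunk); // keep the chunk for a later re-run
            }
        }
        // re-run saveInBatches(failed) for anything that did not save;
        // a partially inserted chunk may need a unique index to avoid duplicates
        return failed;
    };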
