
I have a tiny server I’m trying to fetch a bunch of files from, around 100 in total. I’m fetching them in JavaScript with a forEach loop. Unfortunately, firing all ~100 requests at once knocks the server over.

    const fs = require("fs");
    const http = require("http");

    // All ~100 requests start immediately; nothing waits for the
    // previous download to finish.
    links.forEach((link) => {
        const name = link.split("/")[5];
        const file = fs.createWriteStream(name);
        http.get(link, (res) => {
            res.pipe(file);
            file.on("finish", () => {
                file.close();
            });
        });
    });

I’m trying to write synchronous blocking JavaScript; I want the file writing to complete before beginning the next fetching action. I’ve been experimenting with generators, while loops, and fs.writeFileSync. I’ve also been using setTimeout to emulate a slow network call (just so I don’t have to reset the server after knocking it over each time).

It seems like I’m missing some concept. Here’s my very naive experiment that I thought should take 3 seconds, but only takes 1. After thinking about it, it’s clear all the timeouts are happening at once.

    const fs = require("fs");

    function writeSlow(path) {
        // Each call schedules its own independent 1-second timer, so all
        // three timers fire at roughly the same time.
        setTimeout(function () {
            console.log("write now");
            fs.writeFileSync(path, "lorem");
        }, 1000);
    }
    writeSlow("junk/one");
    writeSlow("junk/two");
    writeSlow("junk/three");

I did some reading and was convinced using Promises was the way to go, but this doesn’t appear to work either:

    const fs = require("fs");

    function sleep(ms) {
        return new Promise((resolve) => setTimeout(resolve, ms));
    }

    async function run() {
        // `arr` is an array of file name strings
        arr.forEach(async (str) => {
            await sleep(1000);
            fs.writeFileSync("junk/" + str + ".txt", "lorem");
        });
    }

What I’m trying to get to with this experimental code is the point where I can watch the filesystem and see a new file appear every second.

(edit)
The actual end result I’m looking for is for the next HTTP request to fire only once the previous one completes.

2 Answers


  1. You could write an asynchronous loop:

    function loop(links, i) {
        if (i >= links.length) return; // all done
        const link = links[i];

        const name = link.split("/")[5];
        const file = fs.createWriteStream(name);
        http.get(link, (res) => {
            res.pipe(file);
            file.on("finish", () => {
                file.close();
                loop(links, i + 1); // continue with the next link
            });
        });
    }
    // Start the asynchronous loop:
    loop(links, 0);
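
    Because each `loop(links, i + 1)` call happens inside the "finish" handler, after the previous call has long since returned, the recursion never grows the call stack, even with hundreds of links.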
    
  2. forEach doesn’t behave the way you might expect with async/await. Take a look at this post on SO for more information: Using async/await with a forEach loop

    Instead, in your case (as explained in the post above), you can use a standard for loop:

    function sleep(ms, fileName) {
      return new Promise(resolve => setTimeout(() => resolve(fileName), ms))
    }
    
    const files = ["file1", "file2", "file3"];
    
    async function run() {
      for (let i = 0; i < files.length; i++) {
        const fileName = await sleep(1000, files[i])
        console.log(fileName, new Date().toLocaleTimeString())
      }
    }
    
    run();
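
    To connect this back to the original download code: a rough sketch (untested; the `download` and `downloadAll` helper names are illustrative, and `links` is assumed to be the same array of URLs from the question) is to wrap the `http.get` call in a Promise and await it inside the loop:

    const fs = require("fs");
    const http = require("http");

    // Wrap one download in a Promise that resolves once the file
    // has been fully written to disk.
    function download(link) {
        return new Promise((resolve, reject) => {
            const name = link.split("/")[5];
            const file = fs.createWriteStream(name);
            http.get(link, (res) => {
                res.pipe(file);
                file.on("finish", () => {
                    file.close();
                    resolve(name);
                });
            }).on("error", reject);
        });
    }

    async function downloadAll(links) {
        for (const link of links) {
            await download(link); // only one request in flight at a time
        }
    }

    downloadAll(links).then(() => console.log("done"));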
    