I have a tiny server I’m trying to fetch a bunch of files from (roughly 100 of them). I’m using JS to fetch the files with a forEach loop. Unfortunately, firing all ~100 requests at once knocks over the server.
const fs = require("fs");
const http = require("http");

// Every request is started immediately -- nothing waits for the previous file to finish.
links.forEach((link) => {
  const name = link.split("/")[5];
  const file = fs.createWriteStream(name);
  http.get(link, (res) => {
    res.pipe(file);
    file.on("finish", () => {
      file.close();
    });
  });
});
I’m trying to write synchronous, blocking JavaScript; I want the file writing to complete before beginning the next fetching action. I’ve been experimenting with generators, while loops, and fs.writeFileSync. I’ve also been using setTimeout to emulate a slow network call (just so I don’t have to reset the server after knocking it over each time).
It seems like I’m missing some concept. Here’s my very naive experiment that I thought should take 3 seconds, but only takes 1. After thinking about it, it’s clear all the timeouts are happening at once.
function writeSlow(path) {
  setTimeout(function () {
    console.log("write now");
    fs.writeFileSync(path, "lorem");
  }, 1000);
}

writeSlow("junk/one");
writeSlow("junk/two");
writeSlow("junk/three");
I did some reading and was convinced using Promises was the way to go, but this doesn’t appear to work either:
function sleep(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

async function run() {
  arr.forEach(async (str) => {
    await sleep(1000);
    fs.writeFileSync("junk/" + str + ".txt", "lorem");
  });
}
What I’m trying to get to with this experimental code is the point where I can watch the filesystem and see a new file appear every second.
Edit: The actual end result I’m looking for is for the next HTTP request to fire only after the previous one completes.
2 Answers
You could write an asynchronous loop:
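For instance, here is a minimal sketch of a callback-driven loop, reusing the http.get / createWriteStream code from the question and assuming links is the array of URLs (the helper name downloadNext is just for illustration):

const fs = require("fs");
const http = require("http");

// Each call starts exactly one request; the next one is only scheduled
// from the "finish" handler, so at most one download is in flight at a time.
function downloadNext(index) {
  if (index >= links.length) return; // all files fetched

  const link = links[index];
  const name = link.split("/")[5];
  const file = fs.createWriteStream(name);

  http.get(link, (res) => {
    res.pipe(file);
    file.on("finish", () => {
      file.close();
      downloadNext(index + 1); // move on once this file has finished writing
    });
  });
}

downloadNext(0);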
forEach doesn’t behave the way you might expect with async/await. Take a look at this post on SO for more information: Using async/await with a forEach loop
Instead, in your case (as the post linked above explains), you can use a standard for loop:
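Applied to the sleep experiment from the question, that looks roughly like this; because a plain for...of loop respects await, each iteration finishes before the next one starts, and a new file should appear about once per second:

const fs = require("fs");

function sleep(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

async function run(arr) {
  for (const str of arr) {
    await sleep(1000);                              // wait one second
    fs.writeFileSync("junk/" + str + ".txt", "lorem"); // then write the file
  }
}

run(["one", "two", "three"]);

The same idea carries over to the downloads: wrap each http.get in a Promise that resolves when the file’s finish event fires, then await it inside the for loop so the next request only starts once the previous file is fully written.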