When downloading small 25 MB chunks from a big file, the chunks are actually much larger than 25 MB. Can someone please help?
```js
const fs = require('fs')

const files = fs.readdirSync('../in/')
files.map(fileName => {
    const readable = fs.createReadStream('../in/' + fileName)
    let size = 0
    let temp_chunk;
    let count = 0
    readable.on("data", (chunk) => {
        temp_chunk += chunk
        const byteSize = new Blob([chunk]).size;
        size += byteSize
        let amount = size / 1000000
        amount = amount.toString()
        amount = amount.substring(0, 5) + 'mb'
        console.log(amount)
        if (amount > 24.5) {
            console.log(fileName + ' ' + count + ' downloaded')
            fs.writeFileSync('../out/' + fileName + count, temp_chunk)
            temp_chunk = ''
            size = 0
            count++
        }
    })
})
```
I tried reading the size of the file from temp_chunk; this worked but made the download significantly slower.
2 Answers
Use a Buffer to concatenate chunks. Concatenating with += on a string (as in temp_chunk += chunk) coerces each Buffer to a string, which corrupts binary data and inflates its byte size; note also that temp_chunk starts out undefined, so the result even begins with the literal text "undefined". Separately, amount > 24.5 compares a string like "24.53mb" against a number, which coerces the string to NaN and always evaluates to false.
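For example, a minimal sketch of the loop above rewritten with Buffer.concat, keeping the question's ../in/ and ../out/ layout and comparing raw byte counts instead of a formatted string (the 25 MB threshold is illustrative):

```js
const fs = require('fs')

const LIMIT = 25 * 1000 * 1000 // ~25 MB, same threshold as the question

const files = fs.readdirSync('../in/')
files.forEach(fileName => {
    const readable = fs.createReadStream('../in/' + fileName)
    let parts = [] // collect Buffer chunks instead of string-concatenating them
    let size = 0
    let count = 0

    readable.on('data', (chunk) => {
        parts.push(chunk)
        size += chunk.length // .length is the byte size of a Buffer

        if (size >= LIMIT) {
            // Buffer.concat joins the chunks without corrupting the bytes
            fs.writeFileSync('../out/' + fileName + count, Buffer.concat(parts))
            console.log(fileName + ' ' + count + ' written (' + (size / 1e6).toFixed(2) + ' MB)')
            parts = []
            size = 0
            count++
        }
    })

    // flush whatever is left over when the stream ends
    readable.on('end', () => {
        if (size > 0) {
            fs.writeFileSync('../out/' + fileName + count, Buffer.concat(parts))
        }
    })
})
```

Each 'data' chunk from createReadStream is at most highWaterMark bytes (64 KiB by default), so each output piece overshoots the threshold by at most one chunk.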
Solution
You can use WHATWG standard streams (available in Node since v17) to read and buffer bytes before emitting. This pattern also makes your code portable to other environments (browsers, Deno, Bun, etc.).
A TransformStream (Node, MDN) which buffers up to a parameterized byte length before emitting might look like this:

buffered_byte_stream.mjs:
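A sketch of such a transform; the class name BufferedByteStream and its size parameter are illustrative choices:

```js
// buffered_byte_stream.mjs
import { TransformStream } from 'node:stream/web'

// Accumulates incoming Uint8Array chunks and only emits once at least
// `size` bytes have been buffered; the remainder is flushed when the
// source stream closes.
export class BufferedByteStream extends TransformStream {
    constructor(size) {
        let buffered = []
        let byteLength = 0

        super({
            transform(chunk, controller) {
                buffered.push(chunk)
                byteLength += chunk.byteLength
                if (byteLength >= size) {
                    controller.enqueue(concat(buffered, byteLength))
                    buffered = []
                    byteLength = 0
                }
            },
            flush(controller) {
                // emit whatever is left (may be smaller than `size`)
                if (byteLength > 0) controller.enqueue(concat(buffered, byteLength))
            },
        })
    }
}

// join an array of Uint8Arrays into a single Uint8Array
function concat(arrays, totalLength) {
    const result = new Uint8Array(totalLength)
    let offset = 0
    for (const arr of arrays) {
        result.set(arr, offset)
        offset += arr.byteLength
    }
    return result
}
```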
And then using it in your main application code might look like this:

main.mjs:
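A sketch of the consuming side, assuming the same ../in/ and ../out/ layout as the question:

```js
// main.mjs
import { createReadStream } from 'node:fs'
import { readdir, writeFile } from 'node:fs/promises'
import { Readable } from 'node:stream'
import { BufferedByteStream } from './buffered_byte_stream.mjs'

const LIMIT = 25 * 1000 * 1000 // emit pieces of roughly 25 MB

for (const fileName of await readdir('../in/')) {
    // bridge the classic Node stream into a WHATWG ReadableStream
    const source = Readable.toWeb(createReadStream('../in/' + fileName))

    let count = 0
    // Node's ReadableStream is async-iterable: each iteration yields one
    // buffered piece of at least LIMIT bytes (except the final remainder)
    for await (const piece of source.pipeThrough(new BufferedByteStream(LIMIT))) {
        await writeFile('../out/' + fileName + count, piece)
        console.log(fileName + ' ' + count + ' written (' + piece.byteLength + ' bytes)')
        count++
    }
}
```

Readable.toWeb (Node 17+) bridges the filesystem stream into the WHATWG world, so the same BufferedByteStream could also sit behind, say, a fetch response body in a browser.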
Code in TypeScript Playground