Context:
I have some code like this in my application:
let blob = new Blob([JSON.stringify(json)], {type: "application/json"});
However, it sometimes fails because in Chrome the maximum string length is ~500MB, and json
can sometimes be larger than this.
Question:
I’m looking for a way to go straight from my json variable (i.e. a POJO) to a Blob, probably via some sort of streaming stringification that saves to an ArrayBuffer as it goes. Or any other way to get a large json object into a Blob without running into a ‘maximum string length’ error.
Notes:
- Solution must work in the browser.
- If an existing library is proposed in an answer, it must not be one that expects the json to simply be an array, since that case is very easy to handle. It must instead expect an arbitrarily nested JSON object where e.g. 90% of the data could be in foo.bar.whatever rather than spread evenly over the top-level keys, or whatever.
- I am not looking for a solution that expects a stream as an input and results in a stream of string chunks as the output, like json-stream-stringify or streaming-json-stringify, for example. Instead, I’d like to input an already-in-memory POJO, and get a Blob out which contains the stringified JSON.
Related:
- How to use JSONStream to stringify a large object – OP seems to have a similar problem, but is specifically asking about JSONStream, which is for Node.js rather than the browser, and I think the solution given just saves to a file key-by-key, rather than in a "fully nested" manner? If there’s a way to get this working in the browser in a way that results in an ArrayBuffer that contains the larger-than-maximum-string-length JSON string for arbitrarily nested objects, then that would definitely qualify as an answer.
- How to use streams to JSON stringify large nested objects in Node.js? – same as above.
2 Answers
Here's a possible solution (note: written via some back and forth with ChatGPT-4). I've yet to rigorously test it, but it seems to make sense and has held up in my testing so far.
✅ Test 1: (screenshot omitted)
✅ Test 2: (screenshot omitted)
Will update this answer if I find any errors/problems when this gets to production.
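The answer's own code isn't reproduced above, but a minimal sketch of one chunked approach is given below. The function name jsonToBlob and the 8 MB flush threshold are my own illustrative choices, not taken from the original answer:

```javascript
// Sketch: recursively stringify a POJO into an array of string chunks,
// flushing the working buffer into the parts list periodically so no
// single string ever approaches the engine's maximum string length.
function jsonToBlob(value) {
  const parts = [];
  let buffer = "";
  const FLUSH_AT = 8 * 1024 * 1024; // flush every ~8 MB (arbitrary choice)

  function push(str) {
    buffer += str;
    if (buffer.length >= FLUSH_AT) {
      parts.push(buffer);
      buffer = "";
    }
  }

  function write(v) {
    if (v === null || typeof v !== "object") {
      push(JSON.stringify(v)); // primitives are small; stringify directly
    } else if (Array.isArray(v)) {
      push("[");
      v.forEach((item, i) => {
        if (i > 0) push(",");
        write(item === undefined ? null : item); // match JSON.stringify
      });
      push("]");
    } else {
      push("{");
      let first = true;
      for (const key of Object.keys(v)) {
        if (v[key] === undefined) continue; // JSON.stringify skips these
        if (!first) push(",");
        first = false;
        push(JSON.stringify(key) + ":");
        write(v[key]);
      }
      push("}");
    }
  }

  write(value);
  parts.push(buffer);
  return new Blob(parts, { type: "application/json" });
}
```

Since no intermediate string is ever larger than the flush threshold plus one value, the ~500MB string limit is never hit even for very large objects.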
We can actually work around that limitation by generating the Blob through chunks of strings. Given that the Blob constructor also accepts other Blobs in its blobParts input, we can even reuse a simple recursive stringifier and replace all the parts that would join() the inner temporary list of values to instead produce a list of Blob objects, interleaved with the DOMString separator. So at the end we produce something like
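For illustration, the assembled value has roughly this shape (the literal contents here are made up, not from the original answer):

```javascript
// The final Blob is built from smaller Blobs and short DOMString
// separators rather than one giant string:
const blob = new Blob(
  [
    new Blob(['{"foo":']),  // each part stays well under the string limit
    new Blob(['"bar"']),
    ",",                    // DOMString separators interleaved with Blobs
    new Blob(['"baz":[1,2]']),
    "}",
  ],
  { type: "application/json" }
);
```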
And we’re safer from the 500MiB limit.
Here I quickly patched together this implementation, but I didn’t perform any serious testing against it, so you might want to test it more thoroughly yourself:
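The patched implementation itself isn't reproduced here; what follows is only a minimal sketch of the approach described above, recursing through the object and returning Blob parts instead of joined strings (the function names toBlobParts and jsonToBlob are illustrative):

```javascript
// Sketch: a recursive stringifier that returns a Blob instead of a
// string. Objects and arrays become Blobs built from smaller Blobs and
// separator strings, so no intermediate string can exceed the limit.
function toBlobParts(value) {
  if (value === null || typeof value !== "object") {
    return JSON.stringify(value); // small primitive -> plain string part
  }
  if (Array.isArray(value)) {
    const parts = ["["];
    value.forEach((item, i) => {
      if (i > 0) parts.push(",");
      parts.push(toBlobParts(item === undefined ? null : item));
    });
    parts.push("]");
    return new Blob(parts);
  }
  const parts = ["{"];
  let first = true;
  for (const key of Object.keys(value)) {
    if (value[key] === undefined) continue; // JSON.stringify skips these
    if (!first) parts.push(",");
    first = false;
    parts.push(JSON.stringify(key), ":", toBlobParts(value[key]));
  }
  parts.push("}");
  return new Blob(parts);
}

function jsonToBlob(value) {
  return new Blob([toBlobParts(value)], { type: "application/json" });
}
```

Note this sketch doesn't handle toJSON methods, replacer functions, or circular references; a production version would need to address those the same way JSON.stringify does.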