In Node, I see there are options for configuring the space available to the "Young generation" and "Old generation" sections of memory, i.e. --max-semi-space-size and --max-old-space-size, but is there a limit to V8's large object space, and if so, can it be configured through Node options?

I tried running a memory test application that stored large objects (>10 MB each) and noticed my Node process was able to use 10 GB+ of memory when I viewed the process in top.
2 Answers
The limit for standard 64-bit systems is 1.4 GB, but you can raise it with:
node --max-old-space-size=2000 app.js
(The value is in MB, so this raises the limit from 1.4 GB to 2.0 GB.)

The practical ceiling depends on how much memory your machine has. If you raise the space sizes beyond the default 1.4 GB, always monitor the application's memory usage.
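For example (just a sketch, not tied to any particular setup), you can log process and heap memory periodically with process.memoryUsage():

// Sketch: print process and V8 heap memory every 10 seconds.
setInterval(() => {
  const { rss, heapTotal, heapUsed } = process.memoryUsage();
  console.log(
    `rss=${Math.round(rss / 1e6)} MB, ` +
    `heapUsed=${Math.round(heapUsed / 1e6)} MB of heapTotal=${Math.round(heapTotal / 1e6)} MB`
  );
}, 10_000);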
But to answer your question: it really depends on how much memory you have available for the application.

(Side note: why do you need that much?)
(V8 developer here.)
The size of Large Object Space cannot be configured separately. For limit purposes, it is considered part of the old generation; in other words, the value of --max-old-space-size configures the maximum combined size of regular and large old space. Here's a simple demo:
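A rough sketch of such a demo, assuming a 100 MB limit (--max-old-space-size=100) and arrays of 1 << 17 small integers each, i.e. about 1 MB per array in Node:

// Run with: node --max-old-space-size=100 <this script>
// Build one packed array of 1 << 17 small integers. At 8 bytes per element
// (Node does not enable pointer compression), its backing store is ~1 MB,
// far above the 128 KB threshold, so it is allocated in Large Object Space.
const template = [];
for (let i = 0; i < (1 << 17); i++) template.push(1);

const arrays = [];
while (true) {
  // slice() copies the backing store, so every copy is another ~1 MB
  // large object that counts against --max-old-space-size.
  arrays.push(template.slice());
  console.log(arrays.length);  // crashes with a heap OOM a little before 100
}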
This will manage to allocate just under 100 (about 95-98) arrays of 1 MB each before running out of memory. (If you want to test this in d8, use 1<<18 as the array size, because d8 uses pointer compression by default, so objects are only half as big as they are in Node.) The size threshold for objects getting allocated in Large Object Space is 128 KB, so what you see here is Large Object Space running into the --max-old-space-size limit.

The overall process memory consumption can be a lot more than the maximum size of the managed heap. For Node in particular, I guess the easiest way to create that situation is to use off-heap objects such as Buffers. Large Object Space has nothing to do with this, and no part of V8's heap configuration can control/limit off-heap allocations done by its embedder (i.e. Node).
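For illustration (a hypothetical sketch, not a claim about any particular application): run something like the following with a small heap limit and watch rss climb far past --max-old-space-size while heapUsed stays small, because Buffer memory is allocated outside the V8 heap.

// Run with: node --max-old-space-size=100 <this script>
const buffers = [];
for (let i = 0; i < 20; i++) {
  // Each Buffer is 100 MB of zero-filled, off-heap memory.
  buffers.push(Buffer.alloc(100 * 1024 * 1024));
  const { rss, heapUsed } = process.memoryUsage();
  console.log(`rss=${Math.round(rss / 1e6)} MB, heapUsed=${Math.round(heapUsed / 1e6)} MB`);
}
// rss grows by roughly 100 MB per iteration (about 2 GB total here) without
// triggering a V8 heap out-of-memory error, because none of this memory
// counts against --max-old-space-size.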