
If I try to instantiate a large two-dimensional std::array on our CentOS 7.6 system, compiled with g++ 7.3.1, I get a segmentation fault:

#include &lt;array&gt;

int main()
{
    const unsigned NUM_BUFFERS = 200;
    const unsigned BUFFER_SIZE_BYTES = 1024 * 1024;

    std::array&lt;std::array&lt;unsigned char, BUFFER_SIZE_BYTES&gt;, NUM_BUFFERS&gt; buffer;
}

Running it gives:

$ ./main
Segmentation fault (core dumped)

Is there a size limit on the length of a std::array?

I have declared this array on the stack but presumably the storage is on the free store. What is failing here?


Answers


  1. You’re getting a stack overflow.

    std::array stores its elements in-place, and stack space is usually very limited (e.g. the default is 1 MB on Windows and typically 4-8 MB on Linux). Your object is 200 * 1 MiB = 200 MiB, far beyond any default stack size.

    Use std::vector instead, which allocates on the heap.
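
    A minimal sketch of that suggestion, keeping the same buffer dimensions as the question (the variable names are carried over from the question's code):

    ```cpp
    #include <vector>

    int main()
    {
        const unsigned NUM_BUFFERS = 200;
        const unsigned BUFFER_SIZE_BYTES = 1024 * 1024;

        // Each inner vector allocates its 1 MiB of storage on the heap;
        // only small bookkeeping objects (pointer, size, capacity) live
        // on the stack, so no stack overflow occurs.
        std::vector<std::vector<unsigned char>> buffer(
            NUM_BUFFERS, std::vector<unsigned char>(BUFFER_SIZE_BYTES));

        buffer[0][0] = 42; // elements are zero-initialized before this write
    }
    ```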

  2. Is there a size limit on the length of a std::array?

    The limit on the size of an object is implementation-defined. The standard only recommends a minimum quantity (quoted below), and most implementations probably support much larger objects, up to available memory:

    [implimits]

    Size of an object ([intro.object]) [262 144].


    I have declared this array on the stack but presumably the storage is on the free store

    You’ve presumed wrong. std::array doesn’t allocate anything from the free store; it stores its elements directly inside the object itself, so an automatic std::array variable keeps all of its elements on the stack.

    What is failing here?

    Stack overflow.

    The size of the execution stack depends on the implementation. On desktop systems, the default stack size is one to a few megabytes. All automatic variables of every active function call must fit into this space.

    To avoid this limitation, allocate the array dynamically (for example with std::make_unique or std::vector).
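
    One way to sketch the std::make_unique approach, assuming C++14 or later (the type alias Buffers is just a convenience name introduced here, not from the question):

    ```cpp
    #include <array>
    #include <memory>

    int main()
    {
        constexpr unsigned NUM_BUFFERS = 200;
        constexpr unsigned BUFFER_SIZE_BYTES = 1024 * 1024;

        using Buffers =
            std::array<std::array<unsigned char, BUFFER_SIZE_BYTES>, NUM_BUFFERS>;

        // The 200 MiB array object lives on the free store; only the
        // unique_ptr (a single pointer) occupies automatic storage.
        auto buffer = std::make_unique<Buffers>(); // value-initialized to zeros

        (*buffer)[0][0] = 42;
    } // the array is freed automatically when buffer goes out of scope
    ```

    This keeps the fixed-size std::array interface while moving the storage off the stack, at the cost of one indirection.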
