
I have to read all rows from a MySQL table and store them all in a PHP array.

Example:

$array = [];
$res = mysqli_query( $con, "SELECT * FROM myTable WHERE 1 ORDER BY type ASC, added DESC" );
while( $row = mysqli_fetch_assoc( $res ) ) {
  // group each row by its cat, then by its type
  $array[$row['cat']][$row['type']][] = $row;
}

The table has 14300 rows and a size of 10.8 MB.

But when I check memory usage I get about 74 MB. Why?

Example:

echo memory_get_usage();      // prints roughly 74.6 MB
echo memory_get_peak_usage(); // prints roughly 74.6 MB

Is there a way to speed up / optimize the creation of the above array?

PS: in the MySQL table, the type, cat and added columns are indexed.

2 Answers


  1. when I check memory usage I get about 74MB, why?

    1. PHP, like most programs with a variable memory footprint, starts with a small allocation and, as it approaches that capacity, asks the OS for more. There is an execution-cost overhead to this, and PHP doesn’t know what its peak memory usage will be, so it asks for a chunk at a time. So when it’s running happily it will always hold more memory than it actually uses.
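
       You can see this chunked allocation from inside a script: memory_get_usage(true) reports what the memory manager has reserved from the OS, while memory_get_usage(false) reports what your variables actually occupy. A minimal sketch:

         <?php
         // Build a throwaway array, then compare "used" vs "allocated" memory.
         $rows = [];
         for ( $i = 0; $i < 100000; $i++ ) {
             $rows[] = [ 'id' => $i, 'name' => "row$i" ];
         }
         printf( "used:      %.1f MB\n", memory_get_usage( false ) / 1048576 );
         printf( "allocated: %.1f MB\n", memory_get_usage( true )  / 1048576 );

       The allocated figure is normally the larger of the two, because it grows in chunks.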

    2. In MySQL, every record has the same structure, so the attribute names only need to be stored once. PHP, however, represents the result as an array of arrays, and each record element must also carry a representation of the record structure.
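
       One way to see (and shrink) that per-row structure cost is to fetch numerically indexed rows, which drop the column-name keys entirely. A sketch, reusing the question’s $con and myTable:

         <?php
         // mysqli_fetch_row() returns a numerically indexed array, so each
         // row carries packed integer keys instead of a hash-table entry
         // per column name.
         $res  = mysqli_query( $con, "SELECT cat, type, added FROM myTable" );
         $rows = [];
         while ( $row = mysqli_fetch_row( $res ) ) {
             $rows[] = $row; // $row[0] = cat, $row[1] = type, $row[2] = added
         }
         printf( "peak: %.1f MB\n", memory_get_peak_usage( true ) / 1048576 );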

    3. Each data item in PHP is stored in a structure called a zval. This bundles a lot of metadata; of particular significance are the data type and the reference count (so PHP knows when the value can be discarded).
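
       That per-value overhead is easy to measure roughly:

         <?php
         // Each integer element costs far more than its 8 bytes of payload,
         // because of the zval and hash-table bookkeeping around it.
         $before = memory_get_usage();
         $a = [];
         for ( $i = 0; $i < 100000; $i++ ) {
             $a[] = $i;
         }
         printf( "~%.0f bytes per element\n", ( memory_get_usage() - $before ) / 100000 );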

    4. If PHP simply filled up its memory with names, zvals and data, the footprint would be a lot smaller. But every time it wanted to retrieve a value it would need a linear search through the data, and that would take an EXORBITANT amount of time. So it uses indexes and hashing to speed up de-referencing of variables.
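
       The same trade-off is visible in userland: a keyed lookup is a hash probe, while searching by value is exactly the linear scan PHP avoids. A toy illustration:

         <?php
         // Keyed access uses the hash index: effectively constant time.
         $byKey = [ 'cat42' => 'books' ];
         echo $byKey['cat42'];                 // hash lookup

         // Searching by value has no index to use, so PHP walks every element.
         $values = [ 'books', 'music', 'film' ];
         echo array_search( 'film', $values ); // linear scan, prints 2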

    5. In addition to the data (actually, before even considering the data) PHP needs to store the PHP source code in memory and convert the human-readable text into something closer to what the hardware understands. The last time I checked, this averaged around 3 times the size of the original code.

    Is there a way to speed up

    The memory usage isn’t going to affect performance much unless your machine is already swapping. However, there is very likely scope for a significant reduction in the memory footprint based on what you’ve NOT told us: presumably there is some reason for reading all this data, i.e. you want to perform some sort of transformation on it or send it somewhere other than PHP’s memory. We don’t know what that is, so we can’t advise specifically, but if each row can be handled as it arrives, you may not need the big array at all.
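
    A sketch of that idea, assuming the question’s $con and query, using an unbuffered result and a generator:

      <?php
      // Stream rows one at a time instead of buffering the whole result set.
      // MYSQLI_USE_RESULT leaves rows on the server until each is fetched.
      function rows( mysqli $con ): Generator {
          $res = mysqli_query(
              $con,
              "SELECT * FROM myTable ORDER BY type ASC, added DESC",
              MYSQLI_USE_RESULT
          );
          while ( $row = mysqli_fetch_assoc( $res ) ) {
              yield $row; // only one row lives in PHP memory at a time
          }
          mysqli_free_result( $res );
      }

      foreach ( rows( $con ) as $row ) {
          // process / transform / send the row here instead of storing it
      }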

  2. One way, as I see it based on the information provided, is to use GROUP_CONCAT().

    Your line,

    $array[$row['cat']][$row['type']][] = $row; 
    

    tells me that you are only concerned with grouping all the relevant types under each cat. So you can instead rewrite your query as below, and then explode on the types column to get your desired result.

    <?php

    $array = [];
    $res = mysqli_query( $con, "
               SELECT cat,
                      GROUP_CONCAT(DISTINCT type ORDER BY type ASC SEPARATOR ',') AS types
                      -- any other columns you need must be aggregated too, e.g. MAX(added)
               FROM myTable
               GROUP BY cat
               ORDER BY cat ASC" );

    while( $row = mysqli_fetch_assoc( $res ) ) {
      // one row per cat: split the concatenated list back into its types
      foreach (explode(',', $row['types']) as $t) {
        $array[$row['cat']][$t] = $row;
      }
    }

    The above pulls far fewer rows than 14300, since we are grouping by cat, and the remaining work on the PHP side is a cheap explode() per row.
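
    One caveat if you try this: MySQL silently truncates GROUP_CONCAT() output at the group_concat_max_len system variable (1024 bytes by default), so a cat with many distinct types may need the limit raised for the session first:

      <?php
      // GROUP_CONCAT() output is truncated at group_concat_max_len
      // (default 1024 bytes); raise it before running the grouped query.
      mysqli_query( $con, "SET SESSION group_concat_max_len = 1000000" );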
