
I am trying to generate and count a large number of small JSON files at once (millions of them) using Laravel’s Storage class.

I’ve tried count(Storage::files(PATH)), but that did not work because there are millions of files.

At first it returned an error saying the 30-second execution time had been exceeded, so I tried raising max_execution_time, but now, after running for a while, it returns a blank page with no clue as to what went wrong.

I am hoping for some kind of clever approach to count all the files efficiently, perhaps in chunks (which I don’t think Storage offers as a function), or any optimization that might keep the process from crashing.

Any help, please?

2 Answers


  1. I would do this with PHP’s exec: your OS already knows how many files are in the directory (it indexes them), so why not use it? Super fast and clean.

    <?php
    
    // Let the OS count: list long-format entries, keep regular files, count lines.
    exec('ls -l /path/to/your/json/files | grep ^- | wc -l', $out);
    echo $out[0]; // exec appends one array element per line of output
    

    Note that exec returns its output as an array, so you need to read index 0 of $out.

    UPDATE: I used the following script to create 3,000,000 files; the PHP exec call above then ran in under 20 seconds.

    #!/bin/bash
    # Create $max numbered JSON files for the timing test.
    max=3000000
    for i in $(seq 1 $max)
    do
        echo "$i" > "$i.json"
        echo "$i"   # progress output
    done
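    The same count can also be taken with find, which skips the ls formatting and the grep pass; a minimal sketch against a scratch directory (the paths and file names are illustrative):

    ```shell
    # Create a scratch directory with a few sample files (illustrative only).
    dir=$(mktemp -d)
    for i in 1 2 3 4 5; do echo "$i" > "$dir/$i.json"; done

    # -maxdepth 1 limits the count to this directory; -type f matches only
    # regular files, so each match contributes exactly one line to count.
    find "$dir" -maxdepth 1 -type f | wc -l

    rm -rf "$dir"
    ```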
    
  2. Even if you increase max_execution_time to 1 hour, the request will almost certainly still time out, often within about 10 minutes, because the web server enforces its own limits on top of PHP’s; that is why you are getting a blank page.

    @Zak’s solution is your best bet.
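    One way to sidestep both PHP’s limit and the web server’s timeout entirely is to run the count from a shell session or cron job instead of a web request. A minimal sketch, assuming the JSON files live at a path reachable from the shell (both paths below are placeholders):

    ```shell
    # Writing the result to a file lets the job run unattended (e.g. under
    # nohup or cron) and be inspected whenever it finishes.
    find /path/to/your/json/files -maxdepth 1 -type f | wc -l > /tmp/json_count.txt
    cat /tmp/json_count.txt
    ```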
