
I am using a JavaScript fetch POST call to submit form data that is over 1 MB.
A backend Perl/CGI script writes the data to a file.

When submitted via Chrome/IE, the data is truncated at 1 MB and doesn't save.
A Postman call using the raw data from the browser runs successfully and saves.

The Content-Length header shows the correct length for both Postman and the browser.

When I return the input data in the response so I can see what is happening, this is what I get from Postman:

{6 mbs of data}
------WebKitFormBoundarylYiQSbqkKI68Lzxk--OK6887007

When I return the input data in the response, this is what I get from the browser:

{1 mb of data}
OK6887007

I have tested with data just under 1 MB and it uploads/saves fine in the browser.

I really cannot be 100% sure whether it is fetch truncating it, the browser truncating it, or the Perl script cutting it off before the transmission finishes.

In the Network tab I can see the entire 6 MB request, but I cannot tell whether it actually gets there. From reading, there doesn't appear to be any limit I should be hitting.

My JavaScript function:

async function serverSave(data,access) {
    // Data to send in the POST request
    const formData = new FormData();
    formData.append('access', access);
    formData.append('data', data);
  
    // URL endpoint to send the POST request to
    const url = 'http://www.mywebsite.com/cgi-bin/upload.pl';
    // Options for the fetch request
    const options = { method: 'POST', body: formData };
    
    // Send the POST request using fetch
    try{
        const response = await fetch(url, options);
        const result = await response.text();
        console.log('Response:', result);
    }
    catch (error) {
        // Handle errors
        console.error('There was a problem saving to the server', error);
    };  
}

My Perl script:

#!/usr/bin/perl
use strict;
use warnings;

my $buffer = '';
if($ENV{CONTENT_LENGTH}) { read(STDIN,$buffer,$ENV{CONTENT_LENGTH}); }

my ($data_value) = $buffer =~ /name="data"\s*?\n\s*?\n(.+?)\n-{5}/s;
my ($access_value) = $buffer =~ /name="access"\s*?\n\s*?\n(.+?)\n-{5}/s;

$access_value =~ s/[\r\n]+$//;
$data_value =~ s/[\r\n]+$//;

my $fn = "../zfiles/".$access_value.".json";

open(FH, '>', $fn) or die $!;
print FH $data_value;
close(FH);

print "Content-type: text/plainnn";
print $buffer;
print "OK";
print $ENV{CONTENT_LENGTH};
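
To try to narrow down where the data is being cut off, I could strip the script down to a diagnostic that only reports how many bytes it actually receives. This is just a sketch, and it assumes that a single read() call may return fewer bytes than CONTENT_LENGTH (so it reads in a loop):

#!/usr/bin/perl
# Diagnostic sketch: compare CONTENT_LENGTH with the bytes actually read.
# Assumption: read() on STDIN is not guaranteed to return the whole body
# in one call, so keep reading until we have it all or hit EOF.
use strict;
use warnings;

binmode(STDIN);    # read the body as raw bytes

my $expected = $ENV{CONTENT_LENGTH} || 0;
my $buffer   = '';
my $got      = 0;

while ($got < $expected) {
    my $n = read(STDIN, $buffer, $expected - $got, $got);   # append at offset $got
    last unless $n;    # 0 = EOF, undef = read error
    $got += $n;
}

print "Content-type: text/plain\n\n";
print "expected: $expected, received: $got\n";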

2 Answers


  1. Chosen as BEST ANSWER

    Since I had full control over the request, I switched it to plain-text data, devised my own delimiter format, and added my own headers.

    It appears that the form-data headers were causing the problem.

    async function serverSave(data,access) {
        // Data to send in the POST request
     
        var body = access + '-----' + data;
    
        // URL endpoint to send the POST request to
        const url = '/cgi-bin/upload.pl';
        // Options for the fetch request
        const options = { method: 'POST', headers: {
            'Content-Type': 'text/plain',
            'Content-Length': body.length,
            'Accept': '*/*',
            'Accept-Encoding': 'gzip, deflate, br',
            'Connection': 'keep-alive'
        }, body: body };
        
        // Send the POST request using fetch
        try{
            const response = await fetch(url, options);
            const result = await response.text();
            console.log('Response:', result);
        }
        catch (error) {
            // Handle errors
            console.error('There was a problem saving to the server', error);
        };  
    }
    
    #!/usr/bin/perl
    use strict;
    use warnings;
    
    my $buffer = '';
    if($ENV{CONTENT_LENGTH}) { read(STDIN,$buffer,$ENV{CONTENT_LENGTH}); }
    
    my ($access_value) = $buffer =~ /^(.*?)-----/;
    my ($data_value) = $buffer =~ /-----\s*(.*)/;
    
    $access_value =~ s/[\r\n]+$//;
    $data_value =~ s/[\r\n]+$//;
    
    my $fn = "../zfiles/".$access_value.".json";
    
    open(FH, '>', $fn) or die $!;
    print FH $data_value;
    close(FH);
    
    print "Content-type: text/plainnn";
    print $buffer;
    print "OK";
    print $ENV{CONTENT_LENGTH};
    

  2. I’ve read that uploading as a Blob mitigates the issue with post size limits in some browsers, but my personal preference is to split the data up and upload it in chunks, then put it back together on the back end.

    Here’s a JS library I wrote to do just that: chunk-uploader. It includes a PHP back-end script, which should be easy to convert to Perl; a rough sketch of what that conversion might look like is at the end of this answer. (BTW, if you do, feel free to send me a PR 😉)

    let cu = new ChunkUploader(file, '../chunk-uploader.php', {
        progress: function(pct){
            result.innerHTML = `${pct}% completed`;
        },
        complete: function(response){
            result.innerHTML = `Done.`;
            console.log(response);
        },
        error(error){
            result.innerHTML = `Error: ${error.message}`;
        },
        chunk_size: 500,
        // optional data to be passed to the back end
        data: {
            some: 'arbitrary',
            data: 123
        }
    });
    

    Also worth noting that if you’re using a text field in the browser, it has a limit of 524288 characters.
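
    For the Perl conversion, here is a rough sketch of what the reassembly side could look like. The field names (id, index, total, chunk) and the append-in-order behaviour are assumptions for illustration, not the actual chunk-uploader protocol; check the bundled PHP script for the real parameter names:

    #!/usr/bin/perl
    # Hypothetical reassembly sketch: the field names below are illustrative
    # assumptions, not the real chunk-uploader protocol.
    use strict;
    use warnings;
    use CGI;

    my $q     = CGI->new;
    my $id    = $q->param('id');      # upload identifier sent by the client
    my $index = $q->param('index');   # 0-based chunk number
    my $total = $q->param('total');   # total number of chunks
    my $chunk = $q->param('chunk');   # this chunk's payload

    # Only allow simple identifiers so the filename cannot be abused.
    die "bad id\n" unless defined $id && $id =~ /^\w+$/;

    my $part = "../zfiles/$id.part";

    # Append this chunk to the partial file (assumes chunks arrive in order).
    open(my $fh, '>>', $part) or die $!;
    print $fh $chunk;
    close($fh);

    # When the last chunk has arrived, promote the partial file to its final name.
    if ($index + 1 == $total) {
        rename($part, "../zfiles/$id.json") or die $!;
    }

    print "Content-type: text/plain\n\n";
    print "OK $index/$total";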
