I am running an app with React and Node that uploads CSV files via React; Node then converts them to JSON, processes the data, and finally outputs an Excel file using the npm package exceljs. The whole thing takes a little time in my local environment, but it works well.
However, when I put it in production I get a 502 error while it is processing the Excel file.
There is not much information other than the 502; in the nginx error log I get:
`*271 upstream prematurely closed connection while reading response header from upstream`
This is what I have in the server config:
nginx.conf
sendfile on;
keepalive_timeout 65;
client_max_body_size 12000M;
client_body_buffer_size 1024k;
client_header_timeout 3000;
client_body_timeout 3000;
fastcgi_read_timeout 3000;
fastcgi_buffers 8 1024k;
fastcgi_buffer_size 1024k;
fastcgi_connect_timeout 3000;
fastcgi_send_timeout 3000;
proxy_buffer_size 1024k;
proxy_buffers 4 1024k;
proxy_busy_buffers_size 1024k;
app.conf
location /opSizing/process {
    proxy_pass http://localhost:5006/opSizing/process;
}
2 Answers
In the end, in my case the problem was that I was running out of memory on the server: when Node used exceljs to write the .xlsx it used up to 2.5 GB of memory, exceeding the maximum RAM available on my Linode.
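If anyone hits the same memory ceiling: exceljs also provides a streaming workbook writer that flushes committed rows to disk instead of holding the whole workbook in memory. A minimal sketch, where `writeLargeXlsx`, the `rows` input, and the sheet name are placeholders for your own processing code:

```js
const ExcelJS = require('exceljs');

async function writeLargeXlsx(rows, filename) {
  // Streaming writer: each committed row is written out to the file
  // rather than kept in the in-memory workbook model.
  const workbook = new ExcelJS.stream.xlsx.WorkbookWriter({
    filename,
    useStyles: false,        // skip the style cache
    useSharedStrings: false, // skip the shared-string cache
  });
  const sheet = workbook.addWorksheet('data');

  for (const row of rows) {
    sheet.addRow(row).commit(); // commit frees the row from memory
  }

  sheet.commit();
  await workbook.commit(); // finalises and closes the file
  return filename;
}
```

Disabling styles and shared strings avoids caches that grow with the amount of data, so memory use stays much flatter for large exports.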
I had a similar issue using pm2 with --watch: every time I uploaded a file, the upload folder changed, so the app restarted, hence the error. So if you are using pm2 with --watch and a file is somehow being changed, it will restart your Node app and cause that upstream error. You have to ignore the file or folder that is being changed and is causing the app to restart.
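If the --watch restart is the cause, you can keep watching enabled and exclude the directories that change at runtime. A minimal sketch of a pm2 ecosystem file, assuming the entry point is server.js and that uploads/output are the folders your app writes to (adjust the names to your project):

```js
// ecosystem.config.js
module.exports = {
  apps: [
    {
      name: 'opSizing',          // arbitrary process name
      script: './server.js',     // assumed entry point
      watch: true,
      // Folders the app writes to at runtime; changes here should
      // not trigger a restart.
      ignore_watch: ['node_modules', 'uploads', 'output'],
    },
  ],
};
```

Then start the app with `pm2 start ecosystem.config.js` instead of passing --watch on the command line.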