My web server processes a huge file and then sends a response. I have tried various nginx timeout options with no luck, including the parameters recommended in this question, but I still see a timeout page, with this error in the nginx error logs:
upstream prematurely closed connection while reading response header from upstream, client: 10.0.42.97, server:
Here is my nginx.conf:
http {
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    access_log /var/log/nginx/access.log;

    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;

    keepalive_timeout 65;
    client_header_timeout 600;
    client_body_timeout 600;
    send_timeout 600;
    proxy_read_timeout 600;

    fastcgi_buffers 8 16k;
    fastcgi_buffer_size 32k;
    fastcgi_read_timeout 600;

    gzip on;
    gzip_http_version 1.0;
    gzip_comp_level 2;
    gzip_proxied any;
    gzip_types text/plain text/html text/css application/x-javascript text/xml application/xml application/xml+rss text/javascript application/javascript application/json;

    server_names_hash_bucket_size 64;

    include /etc/nginx/conf.d/*.conf;
    include /etc/nginx/sites-enabled/*;
}
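To show where I have been putting the timeout directives, here is a simplified sketch of the kind of site config I have under sites-enabled (the server_name and upstream address are placeholders, not my real values):

server {
    listen 80;
    server_name app.example.com;            # placeholder

    location / {
        # placeholder upstream; the app that processes the CSV listens here
        proxy_pass http://127.0.0.1:8080;

        # location-level timeouts are the kind of options I have been raising
        proxy_connect_timeout 600;
        proxy_send_timeout    600;
        proxy_read_timeout    600;
        send_timeout          600;
    }
}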
I still see 502 Bad Gateway pages from time to time, with the error shown above. My input is a CSV file, if that helps. Any pointers to what might be wrong, or recommendations?
How can I fix this? How do I increase the waiting time?
nginx
Vinodh