Make php-fpm serve pages, no matter how slow.
Hi, I have a situation where there is a bad PHP script: it has everything bad about it, i.e. it is poorly coded, poorly implemented, and on top of that it makes a third-party connection on every execution. So it generally takes a long time to finish, and if the third party is slow it is even slower.
I have an nginx + php-fpm setup; however, it mostly shows "upstream timeout" in the nginx logs even though I set:
in php.ini:
    max_execution_time = 300
in the php-fpm pool:
    request_terminate_timeout = 300
in the nginx config, both under location ~ \.php$ {} and http {}:
    fastcgi_read_timeout 300;
But on a traffic spike things go haywire and I start seeing 504 errors ("upstream timeout" in the logs), i.e. nginx is not able to get a reply in time. I think 5 minutes is more than enough. The surprising thing is that if I switch this setup to Apache + CGI, the pages are slow but they do get served eventually (with high load on the server). With nginx + php-fpm it is just 504s for everyone and almost no load on the server.
So, does anyone have advice on what I need to tweak so that pages keep being delivered even when server load is high, no matter how slow they are?
Comments
How complicated is the PHP script that you're running? Maybe it's a better option to rewrite it than to work around the issue?
It's complicated, and the developer's attitude is that if it works on the test machine it should work anywhere (even under high traffic). So working on the PHP script is not an option at the moment.
Sure it is; there is no problem with nginx.
You set the timeouts to exactly the same value, so when PHP times out, nginx times out at the same moment.
Set the nginx timeout higher than PHP's, in the hope that nginx learns (or PHP informs it) when the PHP process was terminated.
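In concrete terms, the suggestion above is to stagger the three timeouts so the layers fail in a predictable order: the script limit first, the pool kill slightly later, and nginx last so it can still relay PHP's own error response. A sketch (the values are illustrative, not recommendations):

    ; php.ini – script-level limit fires first
    max_execution_time = 300

    ; php-fpm pool – pool-level kill slightly later
    request_terminate_timeout = 310

    # nginx – waits longest of the three
    fastcgi_read_timeout 320;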
Why not use Apache + CGI then?
Did you check whether the timeout really occurs after 5 minutes? Maybe you hit another limit earlier... fastcgi_read_timeout seems to be the right one, though.
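To that point, nginx has several FastCGI timeouts besides fastcgi_read_timeout, and any of them can fire first. A sketch of the relevant directives (socket path and values are assumptions for illustration):

    location ~ \.php$ {
        fastcgi_pass unix:/run/php/php-fpm.sock;  # socket path is an assumption
        fastcgi_connect_timeout 60;   # establishing the connection to php-fpm
        fastcgi_send_timeout    300;  # between two successive writes to php-fpm
        fastcgi_read_timeout    300;  # between two successive reads from php-fpm
    }

Timing the failure (does the 504 arrive after 60 s, 300 s, or some other value?) usually tells you which limit is actually being hit.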
Have you checked whether you can reach the third-party sites from your server at all? Maybe your IP is blocked by one of them.
People don't like people scraping or proxying their stuff, so maybe... just maybe... :-D
What about the ports those connections are made on? Is there any firewall that might block outgoing traffic? You could also try to figure out which individual connections are made and whether a specific one is the issue, and so on.
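A quick way to check this from the server is to time each phase of an outbound call with curl. A sketch (the endpoint below is hypothetical; substitute the real third-party URL):

```shell
# Print where the time goes on an outbound request: DNS, TCP connect, total.
probe() {
    curl -o /dev/null -sS --connect-timeout 5 --max-time 10 \
         -w 'dns=%{time_namelookup}s connect=%{time_connect}s total=%{time_total}s\n' \
         "$1"
}

# Usage (hypothetical third-party endpoint):
#   probe "https://api.example.com/v1/data"
# Check that the outbound port is not blocked by a firewall:
#   nc -zv -w 5 api.example.com 443
```

If `connect` hangs until the 5-second limit, a firewall or blocked port is the likely culprit; if `connect` is fast but `total` is large, the third party itself is slow.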
You should probably get a better developer first.
All browsers have a built-in connection-timeout setting, so it may not be related to the server.
https://stackoverflow.com/questions/39751124/increase-timeout-limit-in-google-chrome However, it is not a good idea to tell every one of your visitors to raise their timeout limit or use a different browser.
You need to fix it in code: use queues and/or events, e.g. with socket.io.
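A minimal sketch of the queue idea in plain PHP, using a file-based spool (the directory, payload shape, and endpoint are assumptions for illustration). The web request only enqueues the job and returns at once; a worker outside php-fpm makes the slow third-party call:

```php
<?php
// --- Web request: enqueue instead of calling the third party inline ---
$job = ['url' => 'https://api.example.com/endpoint', 'params' => $_GET];
file_put_contents(
    '/var/spool/myapp/' . uniqid('job_', true) . '.json',  // spool dir is an assumption
    json_encode($job),
    LOCK_EX
);
// Respond immediately with cached or placeholder data.

// --- Worker: run from cron or a loop, not under php-fpm ---
foreach (glob('/var/spool/myapp/*.json') as $file) {
    $job = json_decode(file_get_contents($file), true);
    // ...perform the slow third-party call here, store the result...
    unlink($file);
}
```

This keeps php-fpm workers free for fast page rendering; only the worker process ever waits on the third party.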
Certainly you are not seeing it right
Of course, but right now I need to figure out a way at the server level for the time being,
because I want to be able to handle more traffic without upgrading hardware.
I can raise the timeouts further, but on closer observation I see it gets into a pile-up: requests queue up and then pm.max_children runs out.
Compared with Apache + CGI: CGI just keeps trying no matter how slow it is, while php-fpm just gives up. In a nutshell, I don't want php-fpm to give up; I want it to serve, no matter how slowly, but without choking the server.
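For reference, the pool settings that govern that pile-up live in the php-fpm pool config; a sketch with illustrative numbers:

    pm = dynamic
    pm.max_children = 50       ; hard cap on concurrent requests; size this to RAM per worker
    pm.start_servers = 10
    pm.min_spare_servers = 5
    pm.max_spare_servers = 15
    listen.backlog = 1024      ; how many waiting connections may queue behind the workers

The arithmetic explains the 504s: if each request can hold a worker for up to 300 seconds, a pool of 50 children completes at most 50/300 ≈ 0.17 requests per second, so during a spike the backlog fills and nginx times out on everything behind it.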
Yes, I can reach the third party; there is no issue there. Furthermore, the third-party data is used only with their consent, and these are API calls rather than scraping.
I do see that the third party also gets slow when there are lots of requests, but I don't see a way to prove it. That is also contributing to the mess during traffic spikes.
At the top of the script, add
ignore_user_abort(TRUE);
Even if you close the browser, the script will continue executing in the background.
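Putting that together at the top of the script, a sketch:

```php
<?php
ignore_user_abort(true);   // keep running even if the client disconnects
set_time_limit(300);       // per-script limit; overrides max_execution_time for this run
// ...slow third-party call follows...
```

One caveat: request_terminate_timeout in the php-fpm pool kills the worker process regardless of ignore_user_abort, so it must be at least as large as the script limit for this to help.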
Lastly, if your PHP code is slow, why not use a different execution model like PHP Swoole:
https://www.swoole.co.uk/
If your website takes more than 90 seconds to respond (in Firefox), then it's the browser's limit you're hitting; otherwise it's something on your server.