Okay, here's the detail.
I have an Ubuntu VM with nginx as the web server. Here are all the apps:
- backend: Laravel
- frontend: Laravel
- websocket: Socket.IO on Express (runs on port 8015, reverse-proxied to domain-ws.com)
- some hardware/Raspberry Pis that stay connected to the websocket (domain-ws.com)
The frontend has a feature that does a handshake to check service status. Here's the flow:
- frontend: sends an AJAX request to an FE route
- FE controller: sends an HTTP request to a BE API endpoint
- BE API: connects to the websocket (domain-ws.com) using the ElephantIO client
- WS: receives the data and broadcasts it to the Raspberry Pi
- Raspberry Pi: sends feedback to the WS
- WS: sends the feedback back to the BE API (from step 3)
- BE API: returns true (true means the hardware is online)
- FE controller: returns the BE API's output
- frontend: receives the output from the AJAX call
Yeah, that's a lot of steps. But here's the problem: there are 3 Raspberry Pis with different functionality, and the frontend has a dashboard that asynchronously checks the status of all of them. It's fine when only one user does it, but if two browser tabs refresh the dashboard page at the same time, the server (I don't know whether it's the BE, FE, or WS) goes down for a second and the dashboard page redirects to a 504 Gateway Timeout.
My first thought was that it comes from the nginx reverse-proxy configuration for the websocket. Here's the conf:
location / {
    proxy_pass http://127.0.0.1:8015/;
    proxy_http_version 1.1;
    proxy_set_header Host $host;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;

    # Buffer size adjustments
    proxy_buffer_size 128k;
    proxy_buffers 4 256k;
    proxy_busy_buffers_size 256k;
}
Note: the rest of the configuration is the usual stuff (SSL settings, the domain, and a redirect from HTTP to HTTPS).
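For completeness, the HTTP-to-HTTPS redirect part is just the standard pattern, something like this (names illustrative):

```nginx
server {
    listen 80;
    server_name domain-ws.com;
    return 301 https://$host$request_uri;
}
```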
And here's the nginx conf for the Laravel project:
server {
    listen 443 ssl;
    server_name domain-fe.com;

    ssl_certificate /etc/ssl/certs/domainfe.crt;
    ssl_certificate_key /etc/ssl/certs/domainfe.key;

    # Set maximum upload size to 100MB
    client_max_body_size 100M;

    root /home/domainfe/public;
    index index.php index.html index.htm;

    location / {
        try_files $uri $uri/ /index.php?$query_string;

        # Buffer size adjustments
        proxy_buffer_size 128k;
        proxy_buffers 4 256k;
        proxy_busy_buffers_size 256k;
    }

    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/var/run/php/php8.2-fpm.sock;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        include fastcgi_params;
    }

    location ~ /\.ht {
        deny all;
    }
}
At some point I wanted to increase the timeouts, but I'm afraid a bigger timeout would make the loading take even longer, so I haven't done that.
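If I ever do raise them, the directives I'd be touching would be something like this (values illustrative, I haven't applied this):

```nginx
# In the Laravel server block: give PHP-FPM longer before nginx returns 504
location ~ \.php$ {
    fastcgi_read_timeout 120s;   # default is 60s
}

# In the websocket server block: keep long-lived Socket.IO connections open
location / {
    proxy_read_timeout 300s;     # default is 60s
    proxy_send_timeout 300s;
}
```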
Thanks