PHP script doesn't quit when browser connection is closed


NOTE: this is not a duplicate of question 25363635. I'm using php-fpm + apache mod_fcgi + proxy (see below), and connection_aborted() and ignore_user_abort() do not work in my case. TL;DR: how do I make them work with this setup (fcgi + proxy)?

I'm making a log proxy using Server-Sent Events. I'm using these headers:

header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
header('Connection: close');

Then I receive the log lines through a socket and output them like this:

foreach($lines as $line) {
  echo "data: $line\n\n";
}
flush();
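
As far as I understand, connection_aborted() only reports the disconnect after a write has actually been flushed all the way to the client, so the minimal pattern I would expect to work looks roughly like this (it does not help with my fcgi + proxy setup, presumably because of buffering somewhere in between):

<?php
// Minimal disconnect-detection sketch - what I'd expect to work, not what I observe behind mod_proxy_fcgi
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

while (true) {
  // SSE comment line used as a heartbeat; browsers ignore lines starting with ':'
  echo ": heartbeat\n\n";
  flush();
  // should become true once a flushed write fails because the browser is gone
  if (connection_aborted()) {
    exit;
  }
  sleep(2);
}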

The problem is that when the browser is closed (I also tried with curl) the PHP script continues running. That consumes RAM and also prevents a new server socket from being opened to receive logs. I'm using php-fpm (8.1) and fcgi with Apache 2.4. The Apache config looks like this:

  <FilesMatch "\.php$">
     SetHandler "proxy:unix:/usr/local/php81/var/run/website.sock|fcgi://localhost/"
  </FilesMatch>
  <Proxy "fcgi://localhost/">
    ProxySet timeout=600
  </Proxy>

Is it possible to either make the PHP script quit when the browser closes the connection, or to find another (relatively simple) way to forward a log to the browser in real time? I don't want to write a WebSocket server or install additional software unless there is really no other way.

I also like the idea of long-polling for other purposes. But if scripts keep hanging in the background, not knowing that the browser closed the connection, they will eventually fill the allowed number of FPM processes and the server will no longer respond.
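
As a crude safety net (just a stopgap idea, not a fix) I could add a hard wall-clock limit inside the loop so an abandoned worker at least frees its FPM slot eventually, roughly like this:

$deadline = time() + 600; // give up after 10 minutes, same as the ProxySet timeout
while (time() < $deadline) {
  if (connection_aborted()) {
    break;
  }
  // ... receive and forward log lines as in the full script below ...
  usleep(800 * 1000);
}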

Also, for some reason the process doesn't quit when max_execution_time is reached; it continues forever. That might be due to how I use sockets to receive the log:

<?php

$host = "0.0.0.0";
$port = 1234;

header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
header('Connection: close');


echo "data: creating socket. pid ".getmypid().", time limit: ".ini_get('max_execution_time')."\n\n";
flush();

$socket = socket_create(AF_INET, SOCK_STREAM, 0);
if(empty($socket)) {
  $error = socket_strerror(socket_last_error());
  die("data: could not create socket: $error\n\n");
}
// Set the SO_REUSEADDR option
if (!socket_set_option($socket, SOL_SOCKET, SO_REUSEADDR, 1)) {
  die('data: socket_set_option() failed: ' . socket_strerror(socket_last_error($socket)) . "\n\n");
}

$success = @socket_bind($socket, $host, $port);

if(!$success) {
  $error = socket_strerror(socket_last_error());
  die("data: could not bind: $error\n\n");
}
echo "data: listening for connections\n\n";
flush();
socket_listen($socket);
$clientSocket = socket_accept($socket);

echo "data: got a connection\n\n";
flush();
socket_set_nonblock($clientSocket);

$read = [$clientSocket];
$write = null;
$except = null;
$buffer = "";

ignore_user_abort(false); // doesn't help
while(true) {
  if (connection_aborted()) { // doesn't help either
    break;
  }

  $changedSockets = $read;
  $numChanged = socket_select($changedSockets, $write, $except, 0, 800*1000); // 800 ms timeout

  if($numChanged === false)
    break;
  if ($numChanged <= 0)
    continue;


  $input = socket_read($clientSocket, 16484);
  if($input === false) { // the script only quits when the connection with the log client is broken
    break;
  }
  $buffer .= $input;

  $lines = explode("\n", $buffer);

  if(count($lines) > 1) {
    $buffer = array_pop($lines); //last line may be incomplete
    foreach($lines as $line) {
      echo "data: $line\n\n";
    }
    flush();
  }

  if (strpos($buffer, "\n") !== false) { // note: after the explode() above $buffer never contains "\n", so this block never runs
    list($line, $buffer) = explode("\n", $buffer, 2);
    if(empty($line))
      echo "data: empty line\n\n";
    echo "data: $line\n\n";
    flush();
//    fastcgi_finish_request();
  }
}

socket_close($clientSocket);
socket_close($socket);

Any hints are appreciated!

1 Answer

It seems the proxy or fcgi does some buffering which prevents PHP from knowing that the browser has closed the connection. I couldn't find any info on how to make proxy/mod_fcgi flush more frequently.

When running PHP as CGI directly the problem does not exist, but CGI is inefficient. However, a Server-Sent Events script is usually started once and then runs for a long time, which reduces the penalty of starting a new process for every request.

So as a workaround I did this in my vhost config:

<FilesMatch "\.php$">
  SetHandler "proxy:unix:/usr/local/php81/var/run/xmlgen-php-fpm.sock|fcgi://localhost/"
</FilesMatch>
<Proxy "fcgi://localhost/">
  ProxySet timeout=600 flushpackets=on
</Proxy>

# CGI Configuration for scripts named sse-*.php
<FilesMatch "^sse-.*\.php$">
  Action php81-cgi /cgi-bin/xmlgen/php81
  SetHandler php81-cgi
</FilesMatch>

This way all PHP scripts whose names start with 'sse-' will be run through CGI.
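
To double-check which SAPI a given script actually ends up on, a quick php_sapi_name() probe helps (this is just a debugging aid, not part of the setup above):

<?php
// Prints the SAPI serving this script:
// 'cgi-fcgi' means the CGI handler, 'fpm-fcgi' means it still goes through php-fpm.
header('Content-Type: text/plain');
echo php_sapi_name(), "\n";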

And because I lost a lot of time figuring out why CGI didn't work, here is a bonus hint: if you are using suexec (SuexecUserGroup in Apache 2.4), php-cgi must be located under the suexec document root (suexec -V will show it), and it must be owned by the same user/group as in SuexecUserGroup. One way to achieve this is to use a wrapper script:

cat /var/www/cgi-bin/<vhostname>/php81
#!/bin/bash
PHP_CGI=/usr/local/php81/bin/php-cgi
export PHPRC="/webroot/picasse/xmlgen/etc/php.ini"
# this line is irrelevant for plain CGI - you can skip it:
export PHP_FCGI_MAX_REQUESTS=10000
exec "$PHP_CGI" -c "$PHPRC"

and make sure to chown/chgrp the file to the proper user/group and chmod +x it.

I'd love to skip this configuration complexity and get rid of the buffering wherever it happens. So I'm still waiting for an answer on how to configure Apache so that PHP running as an FCGI service knows when the browser closes the connection.