@bytesnz

Jack Farley, Web Application Engineer

Run PHP Run!

I recently switched Gallery Hierarchy's image scanning from WordPress cron jobs to disconnecting AJAX requests, to try and get a more stable and reliable scanning system.

One of the problems that didn't go away with this change was dealing with the maximum execution time of a PHP script. This is set by (among other things) the max_execution_time setting in the PHP configuration file (php.ini).
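
For example, a php.ini capping scripts at five minutes (the value my server uses, as the tests below show) would contain:

; php.ini
; Maximum execution time of each script, in seconds (0 = unlimited)
max_execution_time = 300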

I came up with a solution that checked whether the scan job had been running for longer than this execution time and, if so, restarted it. When I found that it wasn't working, I ran a couple of tests.

My first test was this simple PHP script:

<?php
$file = 'test1.log';

// Record the start time and the relevant PHP limits
file_put_contents($file, "New test started at " . time() . "\n", FILE_APPEND);
file_put_contents($file, "max_execution_time: " . ini_get('max_execution_time') . "\n", FILE_APPEND);
file_put_contents($file, "max_input_time " . ini_get('max_input_time') . "\n", FILE_APPEND);

// Log an incrementing counter once a second until the script is killed
$i = 0;
while (1) {
    file_put_contents($file, "" . $i++ . "\n", FILE_APPEND);
    sleep(1);
}
?>

After running the script in the console while monitoring the test1.log file (and having it run for more than 400s, which is to be expected, as max_execution_time is 0 (unlimited) in the console), I tried running it in the browser and got the following in test1.log:

New test started at 1424797632
max_execution_time: 300
max_input_time 60
0
1
...
300
301
302
...
500
...

After seeing this, I wondered if the maximum execution time was the maximum CPU time rather than wall-clock time, so I adjusted my test script to the following:

<?php
$file = 'test1.log';

// Record the start time and the relevant PHP limits
file_put_contents($file, "New test started at " . time() . "\n", FILE_APPEND);
file_put_contents($file, "max_execution_time: " . ini_get('max_execution_time') . "\n", FILE_APPEND);
file_put_contents($file, "max_input_time " . ini_get('max_input_time') . "\n", FILE_APPEND);

// Busy-wait instead of sleeping, so the script is actually using CPU
// time, and log the counter once per second of wall-clock time
$i = 0;
$time = 0;
while (1) {
    if (time() > $time) {
        file_put_contents($file, "" . $i++ . "\n", FILE_APPEND);
        $time = time();
    }
}
?>

After running it in the browser again, the output was more in line with what I expected, and it confirmed my suspicion.

New test started at 1424797725
max_execution_time: 300
max_input_time 60
0
1
...
300
301
302
...
371
372

Even though it didn't get killed bang on 300 seconds, it did get killed; presumably the script simply wasn't getting 100% of a CPU for those 372 seconds.

So, having found that judging whether the scan job had been killed based simply on its running time and max_execution_time was a bad idea, I changed my approach. Instead of checking how long the job has been running, I check how long it has been since its status was last saved, something that should happen every 10 seconds. If there hasn't been a status update in a while (30 seconds), I assume the scan job has been killed and try to restart it. It should be noted that if the job dies due to an error, it will not be restarted. I have also set the execution limit to 0 (using set_time_limit) in the hope that it is running on a server where that will work.
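
A rough sketch of the idea (simplified, with an illustrative file name and format rather than the actual Gallery Hierarchy code):

<?php
// Ask for no execution limit; some servers will ignore this
set_time_limit(0);

$statusFile = 'scan-status.json'; // illustrative status store

// In the scan job: save a timestamped status at least every 10 seconds
function save_status($statusFile, $progress) {
    file_put_contents($statusFile, json_encode(array(
        'updated'  => time(),
        'progress' => $progress,
    )));
}

// In the monitor (e.g. the next AJAX request): if the status is more
// than 30 seconds old, assume the scan job was killed and restart it
function scan_needs_restart($statusFile) {
    $status = json_decode(file_get_contents($statusFile), true);
    return $status && (time() - $status['updated']) > 30;
}
?>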

One function that may be of use here is getrusage, which reports CPU time among a few other useful stats.
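
For example, something like this adds up the user and system CPU time the script has used so far:

<?php
$usage = getrusage();

// tv_sec holds whole seconds, tv_usec the remaining microseconds
$cpu = $usage['ru_utime.tv_sec'] + $usage['ru_utime.tv_usec'] / 1e6   // user time
     + $usage['ru_stime.tv_sec'] + $usage['ru_stime.tv_usec'] / 1e6;  // system time

echo "CPU time used: " . $cpu . "s\n";
?>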

Some other approaches that I considered were:

  • using getrusage in the PHP script to monitor its own CPU time, so that when it got close to the limit it could die gracefully or signal that it was about to die. This would, however, add processing overhead.
  • storing the PID of the PHP script and then restarting once that process goes (see the sketch after this list). On a super busy machine, this could miss the PHP script dying - if the server happened to have lots of PID reuse, a new process could pick up the PID of the old PHP script and then run indefinitely… unlikely(ish), but it could happen.
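
For completeness, a sketch of that second idea (assuming the posix extension is available; this isn't what I ended up using):

<?php
$pidFile = 'scan.pid'; // illustrative location

// In the scan job: record our PID at startup
file_put_contents($pidFile, getmypid());

// In the monitor: signal 0 tests that the process exists without
// actually sending a signal (but says nothing about PID reuse)
$pid = (int)file_get_contents($pidFile);
if (!posix_kill($pid, 0)) {
    // Process is gone - assume the scan job died and restart it
}
?>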