Friday, May 6, 2011

Does a cron job kill last cron execution?

Hey All,

I have a cron job that executes a PHP script. The cron is set up to run every minute; this is only for testing purposes. The PHP script it executes converts videos uploaded to the server by users to a Flash format (e.g. .flv). The script runs fine when invoked manually from the command line, but when executed via cron it starts fine and then just stops after one minute.

It seems that when the next cron run starts, it "kills" the previous execution. I added the following PHP function:

ignore_user_abort(true);

in the hope that it would keep the previous execution from being aborted. I also tried setting the cron to run every 5 minutes, which works fine; however, converting a video may take over 5 minutes, so I need to figure out why it's stopping when another cron run starts.

Any help would be appreciated.

Thank you!

EDIT: My cron looks like:

*/1 * * * * php /path_to_file/convert.php
From stackoverflow
  • cron itself won't stop a previous instance of a job running so, if there's a problem, there's almost certainly something in your PHP doing it. You'll need to post that code.

  • I don't think cron kills any processes. However, cron isn't really suitable for long running processes. What may be happening here is that your script tramples all over itself when it is executed multiple times. For example, both PHP processes may be trying to write to the same file at the same time.

    First, make sure you not only look in the PHP error log but also try to capture output from the PHP file itself. E.g.:

    */1 * * * * php /path/to/convert.php >> /var/log/convert.log 2>&1
    

    You could also use a simplistic lockfile to ensure that convert.php isn't executed multiple times. Something like:

    // Bail out if a previous run is still in progress.
    if (file_exists('/tmp/convert.lock')) {
        exit();
    }
    
    touch('/tmp/convert.lock');
    // convert here
    // Note: if the script dies before this line, the stale lock
    // file will block all future runs until removed by hand.
    unlink('/tmp/convert.lock');
    
    
    Michael Kohne: cron WILL NOT kill a process. The script (or something the script is calling) is getting in its own way.
  • No, it will not. You can keep a second process from running by creating a lock file that the script checks for on each run. If the file exists, it does not run. This should also, if appropriate, be used in conjunction with a maximum execution time so that one process does not stall future executions indefinitely. The lock file can just be an empty plain text file called /tmp/foo.lock.
