How to run 2 PHP script simultaneously (non-blocking) to monitor a popen command?

I have a script launching a command like this:

7za a -t7z -mx9 backup.7z "H:\Informatique\*"

And I would like to display the progress of the compression on a page using jQuery and PHP.

The PHP script running this command looks like this:

    // Run 7za and read its output as it is produced
    if( ($fp = popen("7za a -t7z ".$GLOBALS["backup_compression"]." \"".$backuplocation.$backupname."\" \"".$pathtobackup."\"", "r")) ) {
        while( !feof($fp) ){
            // Read a chunk of 7za's output and use the number of lines in it
            // as a rough count of the files processed so far
            $fread = fread($fp, 256);
            $line_array = preg_split('/\n/', $fread);
            $num_lines = count($line_array);
            $_SESSION['job'][$jobid]['currentfile'] = $_SESSION['job'][$jobid]['currentfile'] + $num_lines;
            $num_lines = 0;
            flush();
        }
        pclose($fp);
    }

jQuery calls the 7za script, then jQuery calls the listener (listener.php) every 1000 ms. The listener.php page contains the following code:

session_start();
$jobid = $_GET['jobid'];

if(!isset($_SESSION['job'][$jobid])) { $arr = array("error"=>"Job not found"); echo json_encode($arr); exit(); };
$arr = array(
    "curfile" => $_SESSION['job'][$jobid]['currentfile'],
    "totalfiles" => $_SESSION['job'][$jobid]['totalfiles'],
);
echo json_encode($arr);
$jobid = null;
$arr = null;
exit();

After the jQuery call to the listener completes and we get a response from the server, we display the information with something simple like: $("currentfile").text(data['curfile']);

The problem is that the listener request blocks until the first script has completed... which defeats its purpose: by the time it answers, everything is already finished. A listener is supposed to tell you what is happening while it happens. :P
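
One thing I suspect, but have not verified, is PHP's session locking: session_start() keeps the session file locked for the whole request, so a second request using the same session (the listener) has to wait until the first one finishes or calls session_write_close(). If that is the cause, the script running 7za would have to release the lock before the loop and re-open the session only for each progress update, roughly like this (untested sketch; $cmd stands for the 7za command line built above):

    session_start();
    $_SESSION['job'][$jobid]['currentfile'] = 0;
    session_write_close();              // release the session lock so listener.php is not blocked

    if( ($fp = popen($cmd, "r")) ) {
        while( !feof($fp) ){
            $fread = fread($fp, 256);
            $num_lines = count(preg_split('/\n/', $fread));

            // Re-open the session just long enough to update the counter
            // (@ because output has already been flushed, so session_start() may emit header notices)
            @session_start();
            $_SESSION['job'][$jobid]['currentfile'] += $num_lines;
            session_write_close();
        }
        pclose($fp);
    }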

Do you have any idea what's going on and how I can fix this problem? Or maybe you can help me with a new approach to it?

As always, any suggestions will be welcome. Thank you.

EDIT

jQuery script:

function backup_launch(jobid) {
    x('jobid: '+jobid+' on state '+state);

    x('Listener launched');
    listen(jobid);

    timeout = setTimeout("listen('"+jobid+"')", 500);
    $.ajax({
        url:'backup.manager.php?json&jobid='+jobid+'&state='+state,
        dataType:'json',
        success:function(data)
        {
            state = 3;
        }
    });
}
function listen(jobid) {

    $.ajax({
        url:'backup.listener.php?json&jobid='+jobid,
        dataType:'json',
        success:function(data)
        {
            var curfile = data['curfile'];
            var totalfiles = data['totalfiles'];
            var p = curfile * 100 / totalfiles;
            x('File '+curfile+' out of '+totalfiles+' progress%: '+p);
            timeout = setTimeout("listen('"+jobid+"')", 500);

        }
    });
}

EDIT 2

I found Gearman (http://gearman.org/), but I don't know at all how to implement it, and it needs to be portable/standalone... I'll try to investigate that.
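
From what I have read so far, a Gearman setup would mean running a gearmand server plus a separate PHP worker process, something along these lines (untested sketch using the pecl gearman extension; the function name "backup_7za" is just a placeholder):

    // worker.php - long-running process started from the command line
    $worker = new GearmanWorker();
    $worker->addServer('127.0.0.1', 4730);
    $worker->addFunction('backup_7za', function (GearmanJob $job) {
        $params = json_decode($job->workload(), true);
        // ... run 7za here for $params['path'] and record progress as it goes
    });
    while ($worker->work());

    // backup.manager.php - queue the job and return immediately
    $client = new GearmanClient();
    $client->addServer('127.0.0.1', 4730);
    $client->doBackground('backup_7za', json_encode(array('jobid' => $jobid, 'path' => $pathtobackup)));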

EDIT 3

Full code for the backup.manager.php page. The script sends the response right away, but does the job in the background.

The listener.php page still waits for the command to finish before returning any results.

$jobid = isset($_GET['jobid']) ? $_GET['jobid'] : 0;

//Make sure jobid is specified
if($jobid == 0) { return; } 

header("Connection: close");
@ob_end_clean();
ignore_user_abort();
ob_start();

echo 'Launched in backgroud';

$size = ob_get_length();
header("Content-Length: ".$size);
ob_end_flush();
flush();

$_SESSION['job'][$jobid]['currentfile'] = 0;

// 3. When the app approves the backup,
//  - Write infos to DB
//  - Zip all files into 1 backup file
$datebackup = time();
$bckpstatus = 1; //In progress

$pathtobackup = $_SESSION['job'][$jobid]['path'];

/*
$query = "INSERT INTO backups (watchID, path, datebackup, checksum, sizeori, sizebackup, bckpcomplete)
        VALUES ($watchID, '{$path}', '{$datebackup}', '', '{$files_totalsize}', '', '{$bckpstatus}')"; 

$sth = $db->prepare($query);

$db->beginTransaction();
$sth->execute();
$db->commit();
$sth->closeCursor();
*/

$backupname = $jobid.".".$GLOBALS["backup_ext"];
$backuplocation = "D:\\";

if( ($fp = popen("7za a -t7z ".$GLOBALS["backup_compression"]." \"".$backuplocation.$backupname."\" \"".$pathtobackup."\"", "r")) ) {
    while( !feof($fp) ){

        $fread = fread($fp, 256);
        $line_array = preg_split('/\n/',$fread);
        $num_lines = count($line_array);
        $_SESSION['job'][$jobid]['currentfile'] = $_SESSION['job'][$jobid]['currentfile']+$num_lines;
        $num_lines = 0;
        sleep(1);
        flush();
    }
    pclose($fp);
}
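
If the session turns out to be the bottleneck, a fallback I am considering is to keep the progress counter out of $_SESSION entirely and write it to a small per-job file that listener.php can read without touching the session. Rough, untested sketch (the file naming is just an example, and $jobid would need sanitising first):

    // backup.manager.php - write progress to a per-job temp file instead of the session
    $progressfile = sys_get_temp_dir()."/backup_progress_".$jobid.".txt";
    file_put_contents($progressfile, 0);

    if( ($fp = popen($cmd, "r")) ) {       // $cmd = the same 7za command line as above
        $count = 0;
        while( !feof($fp) ){
            $fread = fread($fp, 256);
            $count += count(preg_split('/\n/', $fread));
            file_put_contents($progressfile, $count);
        }
        pclose($fp);
    }

    // backup.listener.php - no session needed to report progress
    $curfile = (int) @file_get_contents(sys_get_temp_dir()."/backup_progress_".$jobid.".txt");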


Jeremy,

A couple of months ago, I answered a similar question about running server-side batch jobs in a *NIX/PHP environment. If I understand correctly, your requirement is different, but there might be something in that answer which will help.

Run a batch file from my website
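
In general, that kind of approach boils down to launching the long-running command detached from the web request, with its output redirected to a log file, so the request returns immediately and a poller can inspect the log later. A rough *NIX-flavoured sketch (your paths look like Windows, so this would need adapting):

    // Launch 7za detached: nohup + & + redirection let the PHP request finish immediately
    $logfile = "/tmp/backup_".$jobid.".log";
    $cmd = "7za a -t7z ".escapeshellarg($backuplocation.$backupname)." ".escapeshellarg($pathtobackup);
    exec("nohup ".$cmd." > ".escapeshellarg($logfile)." 2>&1 &");

    // A listener can then estimate progress from the log, for example:
    $done = substr_count((string) @file_get_contents($logfile), "\n");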

EDIT

Here's a modified version of your client-side code. You will see that the main things I have changed are:

  • to move listen(jobid); inside backup_launch's success handler.
  • to add error handlers so you can observe errors.

Everything else is just a matter of programming style.

function backup_launch(jobid) {
    x(['jobid: ' + jobid, 'on state', state].join(' '));
    $.ajax({
        url: 'backup.manager.php',
        data: {
            'json': 1,
            'jobid': jobid,
            'state': state
        },
        dataType: 'json',
        success:function(data) {
            state = 3;
            x("Job started: " + jobid);
            listen(jobid);
            x("Listener launched: " + jobid);
        },
        error: function(jqXHR, textStatus, errorThrown) {
            x(["backup.manager error", textStatus, errorThrown, jobid].join(": "));
        }
    });
}
function listen(jobid) {
    $.ajax({
        url: 'backup.listener.php',
        data: {
            'json': 1,
            'jobid': jobid
        },
        dataType: 'json',
        success: function(data) {
            var curfile = data.curfile;
            var totalfiles = data.totalfiles;
            var p = curfile * 100 / totalfiles;
            x(['File', curfile, 'out of', totalfiles, 'progress%:', p].join(' '));
            timeout = setTimeout(function() {
                listen(jobid);
            }, 500);
        },
        error: function(jqXHR, textStatus, errorThrown) {
            x(["Listener error", textStatus, errorThrown, jobid].join(": "));
        }
    });
}