Can I use file locking in this scenario?


I have a conceptual question. I have two processes that each output a file with the same name in the same directory and then execute/read that file from a Perl script. The two processes can run concurrently, so I'm afraid the file could be overwritten by the other process, causing one of the processes to execute an incorrect source file. Essentially the lines look like this (the two lines are common to both processes):

$cmd = `my_script.pl`;                      ## This generates the runx file
$cmd = `source runx | grep -i "Error"`;     ## This sources runx and checks for errors

NOTE: I don't have control over the name of the file, and I must run both processes in the same directory.

Can I use an exclusive file lock right after the first command to prevent a collision?


1 Answer


The file you lock doesn't have to be the file you want to protect.

use Fcntl qw( LOCK_EX );

my $output;
{
   my $script_qfn = 'my_script.pl';
   my $lock_qfn   = $script_qfn . '.lock';

   # Any file can serve as the lock; here, a dedicated ".lock" file.
   open(my $fh, '>', $lock_qfn)
      or die("Can't create lock file \"$lock_qfn\": $!\n");

   # Blocks until no other process holds the lock.
   flock($fh, LOCK_EX)
      or die("Can't lock \"$lock_qfn\": $!\n");

   system { $script_qfn } $script_qfn;        # Generates runx.
   $output = `source runx | grep -i Error`;
}  # $fh goes out of scope here, closing it and releasing the lock.
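For illustration only, the same protocol could be factored into a small helper so that both processes are guaranteed to use an identical lock file. The name with_lock is hypothetical and not part of the code above; this is just a sketch of how the pattern might be shared:

use Fcntl qw( LOCK_EX );

# Hypothetical helper: runs $code while holding an exclusive lock on "$name.lock".
sub with_lock {
   my ($name, $code) = @_;
   my $lock_qfn = $name . '.lock';
   open(my $fh, '>', $lock_qfn)
      or die("Can't create lock file \"$lock_qfn\": $!\n");
   flock($fh, LOCK_EX)
      or die("Can't lock \"$lock_qfn\": $!\n");
   return $code->();   # Lock is released when $fh goes out of scope.
}

# Each process would wrap its two lines in the same call:
my $output = with_lock('my_script.pl', sub {
   system { 'my_script.pl' } 'my_script.pl';
   return `source runx | grep -i Error`;
});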

But why not have my_script.pl send its output to STDOUT? Then there would be no need to lock. You could simply use the following:

my $output = `my_script.pl | sh | grep -i Error`;

There are three ways of calling system:

  • system($shell_command).
  • system($prog, @args). Has to have at least one arg.
  • system({ $prog } $arg0, @args). 0+ args.

In this case, we didn't have a shell command and we didn't have any args, so we had to use the third approach (or build a shell command, say, using String::ShellQuote's shell_quote). Using system($script_qfn) would have introduced a code injection bug.
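To make the three forms concrete, here is a small sketch; the commands and filenames are just placeholders borrowed from the question:

# 1. One string: handed to the shell, so metacharacters (and anything
#    interpolated into the string) are interpreted by the shell.
system('my_script.pl > runx 2>&1');

# 2. Program plus at least one argument: no shell is involved.
system('grep', '-i', 'Error', 'runx');

# 3. Indirect-object form: the block gives the program to execute,
#    the list gives argv (zero or more arguments after the name).
system { 'my_script.pl' } 'my_script.pl';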