I am using a combination of the find and cp commands in my backup script. It runs on a fairly large amount of data:
- first, out of 25 files, it needs to find all the files older than 60 minutes
- then copy those files to a temp directory; each of these files is 1.52 GB to 2 GB
One of these 25 files has data being appended to it continuously.
I have learnt from googling that a tar operation will fail if the file being tarred is updated while tar is reading it. Is the same true of find and cp as well? I am trying something like this:
/usr/bin/find $logPath -mmin +60 -type f -exec /bin/cp {} $logPath/$bkpDirectoryName \;
After this I have a step where I tar the files copied to the temp directory mentioned above ($bkpDirectoryName), using the command below:
/bin/tar -czf $bkpDir/$bkpDirectoryName.tgz $logPath/$bkpDirectoryName
and this also fails.
The same backup script had been running fine for many days, and it has suddenly started failing and causing me a headache. Can someone please help me with this?
Can you try these steps, please:
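A minimal sketch of such steps, assuming the idea is to move (mv) rather than copy the files older than 60 minutes; the paths and file names below are made-up stand-ins for your $logPath and $bkpDirectoryName, and -maxdepth 1 is my own addition to keep find out of the backup directory itself:

```shell
# Stand-in paths for illustration -- substitute your real $logPath
# and $bkpDirectoryName.
logPath=$(mktemp -d)
bkpDirectoryName=bkp_temp
mkdir -p "$logPath/$bkpDirectoryName"

# Simulate one finished log and one actively appended log.
touch -t 202001010000 "$logPath/old.log"      # mtime well past 60 min
echo "still writing" > "$logPath/active.log"  # fresh mtime

# mv instead of cp: -mmin +60 only matches files last modified more
# than 60 minutes ago, so the continuously appended file (which always
# has a fresh mtime) is skipped and never moved. -maxdepth 1 stops
# find from descending into the backup directory.
find "$logPath" -maxdepth 1 -mmin +60 -type f \
    -exec mv {} "$logPath/$bkpDirectoryName" \;
```

Because the files are moved rather than copied, the later tar step works on files that nothing is writing to any more.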
If you do the above, the file which is continuously appended to will not be moved.
In case any of your other 24 files might be updated after 60 minutes, you can do the following:
If nothing works due to some custom requirement on your side, try doing an rsync first and then perform the same operations on the rsynced files (i.e. find and tar, or just tar).