mget in a Bash script doesn't work on huge files because of a timeout error


I am trying to execute a bash script that runs mget *.* to download all the files in a directory. It downloads a couple of files from File1, but it skips the File2 part, probably due to a timeout error. I believe it is a timeout error because:

1) I tried the same approach on other directories and it worked perfectly, possibly because they contain fewer files than File2.

2) When I ran the same commands interactively:

sftp username@hostname 
cd file2
mget *.* 

it took 40 seconds to a minute to respond, but it did download all the files eventually.

So I guess the bash script stops because of a timeout while executing. Please suggest a workaround. Below is my bash script.

#!/bin/bash
# test purpose only
export Src_Dir=/path

File1=/path/*.*
File2=/path/Archive/*.*
DATE=`date +"%Y-%m-%d"`
Pass_Pwd='password'
PORT=22

cd "$Src_Dir" || { echo "Failed to chdir into $Src_Dir"; exit 1; }

/usr/bin/expect <<EOD
spawn /usr/bin/sftp -o Port=${PORT} username@host
expect "password:"
send "$Pass_Pwd\r"
expect "sftp>"
send "lcd ${Src_Dir}\r"
expect "sftp>"
send "mget ${File1}\r"
expect "sftp>"
send "mget ${File2}\r"
expect "sftp>"
send "bye\r"
EOD
echo "Download done"

There are 2 answers below.

BEST ANSWER

I added set timeout -1 above the spawn command, and then it worked perfectly :)

Thank you guys :)
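For reference, applying that fix to the question's heredoc looks like this. This is a sketch: the username, host, password, and path variables are the placeholders from the original script, and it still needs a reachable SFTP server to run.

```shell
/usr/bin/expect <<EOD
# Disable expect's default 10-second timeout so long-running
# mget transfers are not abandoned mid-download.
set timeout -1
spawn /usr/bin/sftp -o Port=${PORT} username@host
expect "password:"
send "$Pass_Pwd\r"
expect "sftp>"
send "lcd ${Src_Dir}\r"
expect "sftp>"
send "mget ${File1}\r"
expect "sftp>"
send "mget ${File2}\r"
expect "sftp>"
send "bye\r"
EOD
```

With `set timeout -1`, each `expect "sftp>"` waits indefinitely for the prompt instead of giving up after 10 seconds, which is what caused the File2 transfer to be skipped.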


As an alternative to sftp/expect, and assuming you are not able to set up public/private keys, consider using lftp. It is much easier to script, and there is no need to deal with changing prompts, etc.

It also supports transferring multiple files in parallel, which speeds up the data transfer. Also look at increasing the transfer block size and parallelism to make things more efficient.

lftp -u username,password -p port sftp://host/
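A minimal lftp script along those lines might look like this. It is a sketch, not a drop-in replacement: HOST, PORT, USER, PASS, and the directory names stand in for the question's placeholders, and the parallelism value is just an example.

```shell
#!/bin/bash
# Sketch: batch-download with lftp instead of sftp+expect.
# HOST, PORT, USER, PASS, and the paths are placeholders.
lftp -u "$USER,$PASS" -p "$PORT" "sftp://$HOST" <<'EOF'
set net:timeout 30
set net:max-retries 3
lcd /path
mget *.*
cd Archive
lcd /path/Archive
mget -P 4 *.*
bye
EOF
```

Because lftp reads commands from stdin, the whole session fits in a plain heredoc with no prompt-matching, and `mget -P 4` fetches up to four files concurrently.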