How to do a file transfer inside an ssh command line?


I have to transfer a file from server A to server B and then trigger a script on server B. Server B is a load-balancer address that redirects you to either server B1 or B2; we don't know which.

I have achieved this as follows:

sftp user@Server
put file
exit
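
(For reference, the interactive session above can also be scripted non-interactively with sftp batch mode, which reads commands from stdin when the batch file is -:

echo "put file" | sftp -b - user@Server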

and then I execute the command below to trigger the target script:

ssh user@Server "script.sh"

But the problem, as I said, is that it is a load-balanced server: sometimes I put the file on one server and the script gets triggered on the other. How can I overcome this?

I am thinking of a solution like the one below:

ssh user@server "Command for sftp; sh script.sh"

That is, if the put and the trigger both happen within the same server call, the problem mentioned above will not occur. How can I do sftp inside an ssh connection? Otherwise, any other suggestions?
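
A minimal shape of that idea is to stream the file over the same connection's stdin instead of using sftp at all; this sketch assumes the file should land in the remote home directory under the same name:

ssh user@Server 'cat > file; sh script.sh' < file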

There are 4 answers below.

BEST ANSWER

If you're just copying a file up and then executing a script, and it can't happen as two separate commands, you can do:

gzip -c srcfile | ssh user@remote 'gunzip -c >destfile; script.sh'

This gzips srcfile, sends it through ssh to the remote end, gunzips it on that side, then executes script.sh.
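
If script.sh needs to know where the upload landed, one way is to pass the path as an argument; the path /tmp/upload.dat and the argument convention are assumptions here, not part of the original:

gzip -c srcfile | ssh user@remote 'gunzip -c >/tmp/upload.dat; script.sh /tmp/upload.dat'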

If you want more than one file, you can use tar rather than gzip:

tar czf - <srcfiles> | ssh user@remote 'tar xzf -; script.sh'
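
For example, with a directory and a single file (the names are purely illustrative):

tar czf - data/ config.yml | ssh user@remote 'tar xzf -; script.sh'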

If you want to get the results back from the remote end and they're files, you can just replicate the tar after the script:

tar czf - <srcfiles> | ssh user@remote 'tar xzf -; script.sh; tar czf - <remotedatafiles>' | tar xzf -

That is, create a new pipe from ssh back to the local environment. This only works if script.sh doesn't generate any output on stdout. If it does, you have to redirect it, for example to /dev/null, to prevent it from corrupting the tar stream:

tar czf - <srcfiles> | ssh user@remote 'tar xzf -; script.sh >/dev/null; tar czf - <remotedatafiles>' | tar xzf -
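
A slightly more defensive sketch chains the steps with &&, so script.sh only runs if the unpack succeeded and the return tar only runs if the script did (same placeholders as above):

tar czf - <srcfiles> | ssh user@remote 'tar xzf - && script.sh >/dev/null && tar czf - <remotedatafiles>' | tar xzf -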
ANSWER

You can use the scp command first to upload your file and then call the remote command via ssh.

$ scp filename user@machine:/path/to/file && ssh user@machine 'bash -s' < script.sh

This example uploads a local file, but there is no problem running it from server A.
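
One caveat: scp and ssh here open two separate TCP connections, so behind a load balancer they can still land on different backends. A sketch using OpenSSH connection multiplexing to force both over a single connection (the socket path /tmp/ssh_mux is illustrative):

ssh -M -S /tmp/ssh_mux -fN user@machine                    # open a background master connection
scp -o ControlPath=/tmp/ssh_mux filename user@machine:/path/to/file
ssh -S /tmp/ssh_mux user@machine 'bash -s' < script.sh
ssh -S /tmp/ssh_mux -O exit user@machine                   # close the master connection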

ANSWER

You could create a FIFO (named pipe) on the server and start a program that tries to read from it. The program will block without eating any CPU.

From sftp, try to write to the pipe. The write itself will fail, but it will wake the listening program, which can then check for uploaded files.

# ls -l /home/inp/alertme
prw------- 1 inp system 0 Mar 27 16:05 /home/inp/alertme
# date; cat /home/inp/alertme; date
Wed Jun 24 12:07:20 CEST 2015
<waiting for 'put'>
Wed Jun 24 12:08:19 CEST 2015
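
A minimal sketch of what that listening program could look like; process_uploads.sh is a hypothetical stand-in for your own check step:

mkfifo /home/inp/alertme             # once, if the pipe does not exist yet
while true; do
  cat /home/inp/alertme >/dev/null   # blocks here, using no CPU, until sftp writes to the pipe
  sh process_uploads.sh              # now check for the uploaded files
done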
ANSWER

Transfer testing with tar gzip compression versus ssh's own compression, using pv as a pipe meter (apt-get install pv). The test ran on a site folder of about 80k small images, total folder size about 1.9 GB, over the non-standard ssh port 2204.

1) tar gzip, no ssh compression

tar cpfz - site.com | pv -b -a -t | ssh -p 2204 -o cipher=none root@remoteip "tar xfz - -C /destination/"

The pv meter started at 4 MB/s and degraded to 1.2 MB/s by the end. pv shows about 1.3 GB of bytes transferred (out of the folder's 1.9 GB total size).

2) tar without gzip, ssh compression (-C):

tar cpf - site.com | pv -b -a -t | ssh -p 2204 -C root@remoteip "tar xf - -C /destination/"

The pv meter started at 8-9 MB/s and degraded to 1.8 MB/s by the end.
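
For comparison, the same tree can also be pushed with rsync, which compresses in-stream and can resume interrupted transfers; a sketch under the same host and port assumptions as above:

rsync -az -e 'ssh -p 2204' site.com/ root@remoteip:/destination/site.com/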