I am trying to use GitLab CI to move a static website (an index.html, a style.css, and a couple of media files) to a server (just testing this out).
I basically got it working, but it only works when I trigger the pipeline by pushing from the command line.
Here is my .gitlab-ci.yml:
deploy_stage:
  image: node:18
  stage: deploy
  script:
    - apt-get update -qq && apt-get install -y -qq lftp
    - lftp -e "set sftp:auto-confirm yes; mirror --reverse --verbose --delete . karlanton/stage; quit" -u $FTP_USERNAME,$FTP_PASSWORD sftp://ssh.strato.de -p 22

deploy_live:
  when: manual
  only:
    - tags
  stage: deploy
  image: node:18
  script:
    - apt-get update -qq && apt-get install -y -qq lftp
    - lftp -e "set sftp:auto-confirm yes; mirror --reverse --verbose --delete . karlanton/live; quit" -u $FTP_USERNAME,$FTP_PASSWORD sftp://ssh.strato.de -p 22
When I git push from the command line, the pipeline goes through as expected and the website is deployed to the stage directory.
However, when I go to GitLab > Repository > Tags and create a new tag, the pipeline triggered by this (which should deploy the website first to stage and then manually to live) hangs on the lftp step and just "loads forever", never terminating or timing out.
Can anyone explain this very odd behaviour?
Cheers from Bavaria, Max :)
Found the problem: the variables I used were protected, and no error is thrown when a job cannot access them. Protected variables are only exposed to pipelines on protected branches and protected tags, and the tag I created was not protected, so $FTP_USERNAME and $FTP_PASSWORD were simply empty. The FTP client then presumably waits forever for credentials that never come.
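To fail fast instead of hanging in this situation, one option (a sketch, not part of my original config) is a small guard function run before the lftp call that aborts the job if the credentials are empty. The variable names match the ones used above; the function name check_ftp_vars is my own invention.

```shell
#!/bin/sh
# Abort early if required CI variables are empty, e.g. because they
# are protected variables and the pipeline runs on an unprotected tag.
check_ftp_vars() {
  for var in FTP_USERNAME FTP_PASSWORD; do
    # Indirect lookup of the variable named in $var (POSIX sh compatible).
    val=$(eval echo "\$${var}")
    if [ -z "$val" ]; then
      echo "ERROR: $var is empty - protected variable on an unprotected ref?" >&2
      return 1
    fi
  done
}
```

In the job, calling `check_ftp_vars` as the first script line before lftp would make the pipeline fail immediately with a clear message rather than waiting on a credential prompt.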