I have a very specific way I need my project to work. I have .bat files being committed to project A with a collection of arguments that need to be actioned by a Python script on project B.
I need project A to be triggered when a new file is committed, and then to trigger project B to run the script using the variables from the .bat file (sys.argv[1] to sys.argv[4]).
Once project B has been triggered, I would like the file in project A to be deleted, so that when the next file is committed to project A there aren't any conflicting files — there could be up to 100 .bat files committed in one day.
I have tried simply using Power Automate to trigger a pipeline once the .bat file is ready, but due to organisational restrictions I wasn't able to do so. I have gone through the GitLab documentation and it seems possible, but it's not clear how the .gitlab-ci.yml should be structured in each project for my specific need.
For me it's not entirely clear how the .bat and Python scripts should interact and how the variables should be defined.
But generally you could use a downstream pipeline: retrieve the .bat files from the artifacts of the parent pipeline, and after the job has finished, delete the .bat files from the repository.
1. Save the .bat files as pipeline artifacts in project-a:
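A minimal sketch of the project-a job (the job and stage names here are placeholders, and this assumes the .bat files sit in the repository root):

```yaml
# .gitlab-ci.yml in project-a
stages:
  - collect

save-bats:
  stage: collect
  script:
    - echo "Collecting committed .bat files"
  artifacts:
    paths:
      - "*.bat"
```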
In the project-b pipeline you could retrieve the artifacts like this:
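One way to do this is `needs:project`, which downloads artifacts from a job in another project. Note that cross-project artifact download via `needs` is a GitLab Premium feature; on the Free tier you would have to fetch the artifacts through the Jobs API instead. The project path, job name, and ref below are assumptions:

```yaml
# .gitlab-ci.yml in project-b
run-python:
  stage: run
  needs:
    - project: my-group/project-a   # assumed project path
      job: save-bats                # the artifact-producing job in project-a
      ref: main
      artifacts: true
  script:
    - python process.py             # hypothetical script name
```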
Now your Python script can use the .bat files, because they are downloaded into the pipeline's working directory.
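How the Python script reads the four variables depends on how the .bat files are written. As a sketch, assuming each .bat file contains a single command line such as `python main.py a1 a2 a3 a4`, the script could parse out the arguments like this (file layout and function name are hypothetical):

```python
import shlex
from pathlib import Path


def parse_bat_args(bat_path):
    """Extract the four arguments from a one-line .bat file of the
    assumed form 'python main.py arg1 arg2 arg3 arg4'."""
    first_line = Path(bat_path).read_text().strip().splitlines()[0]
    tokens = shlex.split(first_line)
    # Drop the interpreter and script name; keep the four arguments.
    return tokens[2:6]


if __name__ == "__main__":
    # Process every .bat file downloaded into the pipeline workdir.
    for bat in sorted(Path(".").glob("*.bat")):
        print(bat.name, parse_bat_args(bat))
```

Alternatively, the trigger job could parse the .bat file itself and pass the values to project-b as pipeline variables, so the script receives them directly via `sys.argv`.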
2. You could trigger the project-b pipeline from project-a like this:
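A minimal trigger job in project-a (the project path is an assumption):

```yaml
# .gitlab-ci.yml in project-a
trigger-project-b:
  stage: trigger
  trigger:
    project: my-group/project-b   # assumed project path
    strategy: depend              # optional: mirror the downstream result
```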
The `strategy: depend` makes the job wait for the result of the downstream pipeline, if you want this. Finally, you could define a job to delete the .bat files, but you would need to create a token to use the GitLab API, because the `CI_JOB_TOKEN` has no access to modify the repository.
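A sketch of such a cleanup job using the repository files API, assuming a project access token with `api` scope is stored in a CI/CD variable named `PROJECT_TOKEN` and the .bat files sit in the repository root (file paths in this API should be URL-encoded; the loop below keeps them simple):

```yaml
# .gitlab-ci.yml in project-a
delete-bats:
  stage: cleanup
  script:
    - |
      for f in *.bat; do
        curl --request DELETE \
          --header "PRIVATE-TOKEN: ${PROJECT_TOKEN}" \
          "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/repository/files/${f}?branch=${CI_COMMIT_BRANCH}&commit_message=Remove%20processed%20${f}"
      done
```

Each DELETE call creates a commit on the branch, so with up to 100 .bat files a day you may prefer to delete them all in one commit via the commits API instead.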
By using `stages` or the `needs` keyword, you would have to ensure that the jobs run in the right order.