Setup:
- Upon merge to master, a Codefresh build job builds the image and pushes it to the Docker registry
- A Codefresh test-run job picks up the new image and runs the tests
- At the end of the test-run CF job, the Allure report building step runs

Result: the 3rd step fails with the message in the title, but only when the job has run all the way through the pipeline. It passes fine if I rerun the job manually (steps 1 and 2 are not executed in that case).

Note: manually adding that tag does not help.
Test execution pipeline:
stages: - "clone" - "create" - "run" - "get_results" - "clean_up" steps: clone: title: "Cloning repository" type: "git-clone" repo: "repo/repo" # CF_BRANCH value is auto set when pipeline is triggered revision: "${{CF_BRANCH}}" git: "github" stage: "clone" create: title: "Spin up ec2 server on aws" image: mesosphere/aws-cli working_directory: "${{clone}}" # Running command where code cloned commands: - export AWS_ACCESS_KEY_ID="${{AWS_ACCESS_KEY_ID}}" - export AWS_SECRET_ACCESS_KEY="${{AWS_SECRET_ACCESS_KEY}}" - export AWS_DEFAULT_REGION="${{AWS_REGION}}" - aws cloudformation create-stack --stack-name yourStackName --template-body file://cloudformation.yaml --parameters ParameterKey=keyName,ParameterValue=qaKeys stage: "create" run: title: "Wait for results" image: mesosphere/aws-cli working_directory: "${{clone}}" # Running command where code cloned commands: # wait for results in s3 - apk update - apk upgrade - apk add bash - export AWS_ACCESS_KEY_ID="${{AWS_ACCESS_KEY_ID}}" - export AWS_SECRET_ACCESS_KEY="${{AWS_SECRET_ACCESS_KEY}}" - export AWS_DEFAULT_REGION="${{AWS_REGION}}" - chmod +x ./wait-for-aws.sh - ./wait-for-aws.sh # copy results ojbects from s3 - aws s3 cp s3://${S3_BUCKETNAME}/ ./ --recursive - cp -r -f ./_result_/allure-raw $CF_VOLUME_PATH/allure-results - cat test-result.txt stage: "run" get_results: title: Generate test reporting image: codefresh/cf-docker-test-reporting tag: "${{CF_BRANCH_TAG_NORMALIZED}}" working_directory: '${{CF_VOLUME_PATH}}/' environment: - BUCKET_NAME=yourName - CF_STORAGE_INTEGRATION=integrationName stage: "get_results" clean_up: title: "Remove cf stack and files from s3" image: mesosphere/aws-cli working_directory: "${{clone}}" # Running command where code cloned commands: # wait for results in s3 - apk update - apk upgrade - apk add bash - export AWS_ACCESS_KEY_ID="${{AWS_ACCESS_KEY_ID}}" - export AWS_SECRET_ACCESS_KEY="${{AWS_SECRET_ACCESS_KEY}}" - export AWS_DEFAULT_REGION="${{AWS_REGION}}" # delete stack - aws cloudformation delete-stack --stack-name stackName # remove all files from s3 # - aws s3 rm s3://bucketName --recursive stage: "clean_up"```
Adding CF_BRANCH_TAG_NORMALIZED as a tag won't help in that case.
CF_BRANCH_TAG_NORMALIZED needs to be set as an environment variable for this step.
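For example, here is a minimal sketch of the reporting step with the variable supplied explicitly through its environment block (falling back to ${{CF_BRANCH}} is my assumption; substitute whatever branch value is actually available in your pipeline):

```yaml
get_results:
  title: Generate test reporting
  image: codefresh/cf-docker-test-reporting
  working_directory: '${{CF_VOLUME_PATH}}/'
  environment:
    # Assumption: pass the branch name explicitly so the reporting image
    # sees CF_BRANCH_TAG_NORMALIZED even when the trigger does not inject it.
    - CF_BRANCH_TAG_NORMALIZED=${{CF_BRANCH}}
    - BUCKET_NAME=yourName
    - CF_STORAGE_INTEGRATION=integrationName
  stage: "get_results"
```

Note that ${{CF_BRANCH}} is the raw branch name, so this shortcut only works if your branch names contain no characters that would need normalizing.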
Taking a look at the source code of codefresh/cf-docker-test-reporting (https://github.com/codefresh-io/cf-docker-test-reporting/blob/master/config/index.js), you can see that CF_BRANCH_TAG_NORMALIZED is read directly from the environment.
My assumption is that whatever normally triggers your build does not set this environment variable. It is usually set automatically when you have a git trigger, e.g. from GitHub. When you start the pipeline manually, you probably set the variable yourself, and that's why it passes then.
You should check how your pipelines are usually triggered and whether the variable is set in each case (automatically or manually).
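One quick way to check is a throwaway debug step (just a sketch; the step name and image are arbitrary) that prints the variables for both a triggered and a manual run:

```yaml
debug_trigger_vars:
  title: "Print trigger variables"
  image: alpine
  commands:
    # If the trigger did not provide the variable, the value will be empty
    # or the placeholder will appear unsubstituted in the step log.
    - echo "CF_BRANCH=${{CF_BRANCH}}"
    - echo "CF_BRANCH_TAG_NORMALIZED=${{CF_BRANCH_TAG_NORMALIZED}}"
  stage: "get_results"
```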
Here's some more documentation about these variables: https://codefresh.io/docs/docs/codefresh-yaml/variables/#system-provided-variables