How to iterate through an array in Concourse


As far as I've searched, I haven't found any documentation on loops or arrays for Concourse CI.

I'm trying to migrate a job from Jenkins to Concourse CI. Here is the relevant snippet of my Jenkinsfile:

def folders = [
    "roller",
    "auth",
    "Django",
    "gitlab",
    "Drone",
]

stage('tests & conv') {
    when {
        beforeAgent true
        not {
            branch 'master'
        }
    }
    steps {
        script {
            // Name the closure parameter explicitly; relying on the implicit
            // `it` breaks inside the nested stage closures.
            parallel folders.collectEntries { folder ->
                [
                    "tests ${folder}" : {
                        stage("Test ${folder}") {
                            sh "make ${folder}"
                        }
                    },
                    "conv ${folder}" : {
                        stage("Conv ${folder}") {
                            sh "make run ${folder}"
                        }
                    },
                ]
            }
        }
    }
}

How can I replicate this in a Concourse pipeline?

I can define an array like the one below, but I'm unsure how to iterate through it.

folders:
  - roller
  - auth
  - Django
  - gitlab
  - Drone

Best Answer

This may not be the complete answer, but it should definitely start you off in the right direction. You can do what you need using Carvel ytt, which automates the YAML generation. There is also a very handy playground on that site where you can check your work; it shows you what the generated YAML will look like.
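
To answer the iteration question directly: in ytt you loop with a Starlark for statement over your data values, emitting one YAML fragment per item. A minimal sketch, assuming the folders data values defined just below:

#@ load("@ytt:data", "data")

folders:
#@ for folder in data.values.folders:
- #@ folder.name
#@ end

This renders to a plain YAML list of the folder names; the pipeline below uses the same idiom to stamp out one job per folder.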

Example vars.yaml:

#@data/values
---
folders:
 - name: roller
 - name: auth
 - name: Django
 - name: gitlab
 - name: Drone-runtime

Example schema.yaml would then be:

#@data/values-schema
---
folders:
 - name: ""

Your starting Concourse ytt pipeline.yaml could be:

#@ load("@ytt:data", "data")
#@ load("@ytt:struct", "struct")

#@ folders = []
#@ for f in data.values.folders:
#@   folders.append(struct.make(name = f.name))
#@ end

#@ def task_make(folder):
#! Concourse identifiers must be lowercase, hence .lower()
task: #@ folder.name.lower() + "-make-task"
image: alpine
config:
  platform: linux
  params:
    NAME: #@ folder.name
  run:
    path: /bin/sh
    user: root
    args:
      - -exc
      - |
        make "$NAME"
#@ end

#@ def task_make_run(folder):
task: #@ folder.name.lower() + "-make-run-task"
image: alpine
config:
  platform: linux
  params:
    NAME: #@ folder.name
  run:
    path: /bin/sh
    user: root
    args:
      - -exc
      - |
        make run "$NAME"
#@ end

#@ def job(folder):
name: #@ folder.name.lower()
plan:
  - get: alpine
  - #@ task_make(folder)
  - #@ task_make_run(folder)
#@ end

resources:
  - name: ytt
    type: registry-image
    source:
      repository: taylorsilva/carvel-ytt
tag: '0.36'
  - name: cicd-source
    type: git
    icon: bitbucket
    source:
      uri: ((git-source-ssh-url))/((this-repo)).git
      branch: main
      private_key: ((key.git-key))
  - name: alpine
    type: registry-image
    icon: docker
    source:
      repository: alpine
      tag: '3.15.5'

jobs:
  - name: configure-self
    plan:
      - in_parallel:
          - get: ytt
          - get: cicd-source
            trigger: true
      - task: generate-jobs
        image: ytt
        config:
          platform: linux
          inputs:
- name: cicd-source
          outputs:
            - name: pipeline
          run:
            path: /bin/sh
            user: root
            args:
              - -exc
              - |
                ytt -f ./cicd-source --output-files pipeline
      - set_pipeline: job-manager
        file: pipeline/pipeline.yaml
  #@ for folder in data.values.folders:
  - #@ job(folder)
  #@ end
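
For reference, once ytt renders the template, each folder produces a job shaped roughly like this (shown for roller; the make-run task follows the same pattern):

- name: roller
  plan:
    - get: alpine
    - task: roller-make-task
      image: alpine
      config:
        platform: linux
        params:
          NAME: roller
        run:
          path: /bin/sh
          user: root
          args:
            - -exc
            - |
              make "$NAME"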

Now, there is a lot going on in this pipeline.yaml that is out of scope of your OP, but it's pretty neat. We use taylorsilva's carvel-ytt image to render the template, and the resulting pipeline.yaml is what the set_pipeline step points at to generate all the jobs. This self-configure job will automatically trigger and update itself whenever its SCM is updated. If you have any secret management set up, you can substitute the ((variable references)); otherwise, replace them directly with what you need.
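
The self-configuring loop only works once the pipeline exists, so you have to set it by hand the first time. A sketch of that bootstrap, assuming a fly target named main (the target name is hypothetical):

ytt -f schema.yaml -f vars.yaml -f pipeline.yaml > bootstrap.yaml
fly -t main set-pipeline -p job-manager -c bootstrap.yaml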

  • Note that you may need to change the commands run in the tasks depending on your needs.
  • Note that you should update your images to pull from a local registry rather than directly from Docker Hub each time; a sketch follows below.
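
For that last point, a minimal sketch of the alpine resource pointed at an internal mirror (the registry host here is hypothetical):

  - name: alpine
    type: registry-image
    icon: docker
    source:
      repository: registry.internal.example.com/mirror/alpine
      tag: '3.15.5'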