How to put an Argo webhook trigger parameter into an artifact?

I want to POST a large piece of data to a webhook in Argo Events. In my Sensor definition I take the data from the request body and put it into a "raw" artifact on the Workflow. Since the data is base64-encoded, I use a Sprig template to decode it.
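
For reference, the request I send looks roughly like this (the service name, port and endpoint are placeholders that depend on how the webhook EventSource is exposed; basedata is the JSON field my Sensor reads):

# service name, port and endpoint are placeholders for my setup
curl -X POST "http://webhook-datapost-eventsource-svc:12000/datapost" \
  -H "Content-Type: application/json" \
  -d "{\"basedata\": \"$(base64 -w0 input.file)\"}"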

Unfortunately, when I POST a large amount of data, Kubernetes refuses to process the generated Workflow definition.

Example with raw data

This example works for small amounts of data.

apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: webhook
spec:
  template:
    serviceAccountName: argo-events-sa
  dependencies:
    - name: input-dep
      eventSourceName: webhook-datapost
      eventName: datapost
  triggers:
    - template:
        name: webhook-datapost-trigger
        k8s:
          group: argoproj.io
          version: v1alpha1
          resource: workflows
          operation: create
          source:
            resource:
              apiVersion: argoproj.io/v1alpha1
              kind: Workflow
              metadata:
                generateName: webhook-datapost-
              spec:
                entrypoint: basefile
                imagePullSecrets:
                  - name: regcred
                arguments:
                  artifacts:
                  - name: filecontents
                    raw:
                      data: ""
                templates:
                - name: basefile
                  serviceAccountName: argo-events-sa
                  inputs:
                    artifacts:
                    - name: filecontents
                      path: /input.file
                  container:
                    image: alpine:latest
                    command: ["ls"]
                    args: ["/input.file"]
          parameters:
            - src:
                dependencyName: input-dep
                dataTemplate: "{{ .Input.body.basedata | b64dec }}"
              dest: spec.arguments.artifacts.0.raw.data
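
For completeness, the Sensor above depends on a webhook EventSource roughly like the sketch below; only the EventSource name webhook-datapost and the event name datapost come from the Sensor, the port and endpoint are placeholders:

apiVersion: argoproj.io/v1alpha1
kind: EventSource
metadata:
  name: webhook-datapost
spec:
  service:
    ports:
      - port: 12000
        targetPort: 12000
  webhook:
    # event name referenced by the Sensor dependency (eventName: datapost)
    datapost:
      port: "12000"
      endpoint: /datapost
      method: POST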

Error with larger dataset

When I trigger the example above with a small dataset, it works as expected. But when I use a large dataset, I get an error:

Pod "webhook-datapost-7rwsm" is invalid: metadata.annotations: Too long: must have at most 262144 bytes

I understand that this is because the entire raw payload is copied into the Workflow template; Argo then copies the template into an annotation on the generated pod, which exceeds the 262144-byte annotation limit and is rejected by Kubernetes.

I am looking for a way to get the data from a webhook POST request into an artifact without the entire payload being copied into the Workflow template. Is there a way to do this with Argo?
