Databricks DBX pass parameters to notebook job


Given a standard deployment.yaml file for dbx on Databricks, as below:

workflows:
  - name: "your-job-name"

    job_clusters:
      - job_cluster_key: "basic-cluster"
        <<: *basic-static-cluster
      - job_cluster_key: "basic-autoscale-cluster"
        <<: *basic-autoscale-cluster

    tasks:
      - task_key: "task1"
        job_cluster_key: "basic-cluster"
        python_wheel_task:
          package_name: "some-pkg"
          entry_point: "some-ep"
          parameters: ["param1", "param2"]

      - task_key: "your-task-03"
        job_cluster_key: "basic-cluster"
        notebook_task:
          notebook_path: "/Repos/some/project/notebook"
        depends_on:
          - task_key: "task1"
Is there a way to pass parameters to the notebook task, as is done for the wheel task? How would I do that, and how would I read the parameters inside the notebook?


There are 2 answers below.

Accepted answer:

You can define notebook parameters in your deployment.yaml via the base_parameters field of notebook_task, as below:

- task_key: "your-task-03"
  job_cluster_key: "basic-cluster"
  notebook_task:
    notebook_path: "/Repos/some/project/notebook"
    base_parameters:
      param1: "param1-value"
      param2: "param2-value"
  depends_on:
    - task_key: "task1"
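
To read these parameters inside the notebook, use the Databricks widgets API: base_parameters are exposed to the notebook as widgets and retrieved with dbutils.widgets.get. A minimal sketch of the notebook side, assuming the param1/param2 names from the YAML above (dbutils is available implicitly in Databricks notebooks):

# Databricks notebook (Python) cell
# Declaring widgets with defaults lets the notebook also run interactively;
# when launched as a job, the base_parameters values take precedence.
dbutils.widgets.text("param1", "default-1")
dbutils.widgets.text("param2", "default-2")

param1 = dbutils.widgets.get("param1")
param2 = dbutils.widgets.get("param2")
print(param1, param2)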
Another answer:

You can also pass the parameters at launch time with the dbx CLI; for keys that also appear in base_parameters, the value supplied at launch wins:

dbx ..... --parameters='{"base_parameters": {"key1": "value1", "key2": "value2"}}'
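
Hand-writing that nested JSON inside shell quotes is easy to get wrong; if you script the launch, one option is to build the payload with json.dumps and splice it into the command. A small sketch (the key names just mirror the example above):

import json

# Build the --parameters payload programmatically so the JSON stays valid;
# the keys mirror the base_parameters example in this answer.
payload = json.dumps({"base_parameters": {"key1": "value1", "key2": "value2"}})
print(f"--parameters='{payload}'")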

Ref - https://dbx.readthedocs.io/en/latest/guides/general/passing_parameters/#dynamic-parameter-passing