How to pass a custom job ID for a Spark job on a Google Dataproc cluster using the Dataproc client

I am using the following code snippet but have not had any luck. Can anyone help me pass a custom job ID?

from google.cloud import dataproc_v1

# job_client is a JobControllerClient pointed at the regional Dataproc endpoint.
job_client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

job = {
    "placement": {"cluster_name": cluster_name},
    "spark_job": {
        "main_class": "org.example.App",
        "jar_file_uris": ["gs://location.jar"],
        "args": [],
    },
}

operation = job_client.submit_job_as_operation(
    request={"project_id": project_id, "region": region, "job": job}
)

Thanks in Advance :)

There is 1 answer below.

This problem can be solved by adding a reference attribute to the job definition, like this:

"reference": {
  "job_id": "test101",
  "project_id": "1553sas207"
   }
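
For context, here is a minimal sketch of how the reference block fits into the snippet from the question (assuming cluster_name, project_id, region, and job_client are defined as in the question; test101 is just the example ID from above):

job = {
    # reference.job_id sets the custom job ID shown in the Dataproc console and logs.
    # It must be unique within the project; submitting another job with the same ID fails.
    "reference": {"job_id": "test101", "project_id": project_id},
    "placement": {"cluster_name": cluster_name},
    "spark_job": {
        "main_class": "org.example.App",
        "jar_file_uris": ["gs://location.jar"],
        "args": [],
    },
}

operation = job_client.submit_job_as_operation(
    request={"project_id": project_id, "region": region, "job": job}
)
response = operation.result()  # blocks until the Spark job completes

If reference is omitted, Dataproc generates a job ID automatically, so the attribute is only needed when you want to control the ID yourself.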