Reducing Dataproc Serverless CPU quota


Aim: I want to run spark jobs on Dataproc Serverless for Spark.

Problem: The minimum CPU requirement is 12 cores per Spark app. That doesn't fit into the default regional CPU quota we have and would require us to request an increase. 12 cores is overkill for us; we don't want to expand the quota.

Details: This link lists the minimum requirements for Dataproc Serverless for Spark: https://cloud.google.com/dataproc-serverless/docs/concepts/properties

They are as follows: (a) 1 driver and 2 executor nodes (b) 4 cores per node

Hence, a total 12 CPU cores is required.
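For reference, this floor corresponds to submitting a batch with the smallest allowed values of the resource properties. The following is a sketch, not a working job: the bucket, script name, and region are placeholders, but `spark.driver.cores`, `spark.executor.cores`, and `spark.executor.instances` are the documented Spark properties that control this sizing.

```shell
# Hypothetical minimal-footprint batch submission.
# gs://my-bucket/job.py and us-central1 are placeholders.
gcloud dataproc batches submit pyspark gs://my-bucket/job.py \
    --region=us-central1 \
    --properties=spark.driver.cores=4,spark.executor.cores=4,spark.executor.instances=2
```

Even with every property at its minimum, the total is 1 driver × 4 cores + 2 executors × 4 cores = 12 cores.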

Can we bypass this and run Dataproc Serverless for Spark with fewer CPU cores?


There is 1 best solution below


Right now a Dataproc Serverless for Spark workload requires 12 CPU cores to run; this is a hard minimum that you cannot bypass.

We are working on relaxing this requirement, but it will not be available until at least Q3 2023.