Running more than one Spark streaming job in Google Dataproc


How do I run more than one Spark streaming job in a Dataproc cluster? I created multiple queues using capacity-scheduler.xml, but with that approach I would need 12 queues to run 12 different streaming aggregation applications. Is there a better way?
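For context, the queue-per-job setup described above roughly looks like the sketch below. The queue names (streaming1, streaming2), capacities, cluster name, bucket path, and job class are placeholder assumptions, not values from the question; adjust them to your own configuration. First, the queues are declared in capacity-scheduler.xml (queue capacities under root must sum to 100):

    <!-- capacity-scheduler.xml: two example streaming queues plus the default queue -->
    <property>
      <name>yarn.scheduler.capacity.root.queues</name>
      <value>default,streaming1,streaming2</value>
    </property>
    <property>
      <name>yarn.scheduler.capacity.root.streaming1.capacity</name>
      <value>30</value>
    </property>
    <property>
      <name>yarn.scheduler.capacity.root.streaming2.capacity</name>
      <value>30</value>
    </property>
    <property>
      <name>yarn.scheduler.capacity.root.default.capacity</name>
      <value>40</value>
    </property>

Each streaming job is then submitted to its own queue by setting spark.yarn.queue, for example:

    # Submit one streaming job to the queue "streaming1" (cluster, jar, and class are placeholders)
    gcloud dataproc jobs submit spark \
        --cluster=my-streaming-cluster \
        --region=us-central1 \
        --class=com.example.StreamingAggregate \
        --jars=gs://my-bucket/streaming-aggregate.jar \
        --properties=spark.yarn.queue=streaming1

With 12 streaming applications this pattern requires 12 such queues, which is exactly the scaling concern raised in the question.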


There is 1 answer below.
