How to restrict mongo-spark-connector from making too many connections to a mongo cluster?


I've built a simple data-exporter job in Spark using mongo-spark-connector. The job reads collections from MongoDB and writes them to S3. However, for very large collections (2B documents), it opens far too many connections, irrespective of the number of executors and executor cores. I checked the official documentation for a connector config that restricts the number of connections, but couldn't find one. Is there a way to control the number of connections spawned by my Spark application?
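For context, the standard MongoDB driver knob for capping connections per client is the `maxPoolSize` connection-string option, which the connector's underlying Java driver honours. A minimal sketch of threading it into the read URI (the `spark.mongodb.read.connection.uri` option name assumes connector 10.x; host, database, and collection names here are placeholders, and the Spark calls are commented out since they need a live cluster). Whether this bounds the *total* connections across all executors is exactly the open question:

```python
# Sketch: cap the per-client connection pool via the connection string.
# maxPoolSize is a standard MongoDB URI option; note each executor JVM
# creates its own client, so the effective cluster-wide ceiling is
# roughly (number of executor JVMs) * maxPoolSize.

def build_read_uri(host: str, db: str, coll: str, max_pool_size: int) -> str:
    """Build a MongoDB URI that caps the driver's connection pool."""
    return f"mongodb://{host}/{db}.{coll}?maxPoolSize={max_pool_size}"

uri = build_read_uri("mongo-host:27017", "exports", "events", 10)

# With a live cluster, the job body would look roughly like:
# spark = SparkSession.builder \
#     .config("spark.mongodb.read.connection.uri", uri) \
#     .getOrCreate()
# df = spark.read.format("mongodb").load()
# df.write.parquet("s3://bucket/exports/events/")
```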
