Apache Spark - passing jdbc connection object to executors


I am creating a JDBC connection object in the Spark driver and using it in the executors to access the database. My concern: is it the same connection object everywhere, or does each executor get its own copy, so that there is a separate connection per partition?
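For context on why this matters, here is a minimal sketch (`FakeConnection` and `canSerialize` are hypothetical names, not real JDBC classes): JDBC `Connection` implementations are generally not `Serializable`, so an object created on the driver cannot be shipped inside a task closure to the executors via Java serialization, which is what Spark uses for captured objects.

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Stand-in for a JDBC Connection: like typical driver implementations,
// it does not implement java.io.Serializable.
class FakeConnection

// Returns true if Java serialization (used by Spark for objects
// captured in task closures) can serialize `obj`.
def canSerialize(obj: AnyRef): Boolean =
  try {
    new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(obj)
    true
  } catch {
    case _: NotSerializableException => false
  }
```

A driver-side connection captured in a closure would hit exactly this failure, typically surfacing as a `Task not serializable` error.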


1 Answer

BEST ANSWER
  • You don't state whether you are using PySpark, Java, or Scala. In Scala, you can create an `object` holding a ConnectionPool, as per the link below.
  • That `object` is instantiated once per executor JVM and shared by the cores (tasks) running on that executor. It is not a single pool shared across every `foreachPartition` call cluster-wide.
  • This addresses your concern: executors do not reuse the driver's connection object; they create and manage their own connections.

See https://medium.com/@ravishankar.nair/implementing-a-connectionpool-in-apache-sparks-foreachpartition-4add46dc8ae2 for a worked example of a ConnectionPool used with foreachPartition.
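A rough sketch of the pattern described above (the names `ConnectionPool`, `borrow`, `giveBack`, and the stand-in factory are hypothetical, not taken from the linked article): a Scala `object` is initialized lazily, once per JVM, so each executor gets its own pool, shared by the tasks on that executor.

```scala
import java.util.concurrent.ConcurrentLinkedQueue

// One instance of this object exists per JVM, i.e. per Spark executor.
// With real JDBC, `factory` would be something like
//   () => java.sql.DriverManager.getConnection(url, user, password)
// Here it is a stand-in so the sketch is self-contained.
object ConnectionPool {
  @volatile var factory: () => AnyRef = () => new AnyRef
  private val pool = new ConcurrentLinkedQueue[AnyRef]()

  // Reuse a pooled connection if one is free, otherwise create one.
  def borrow(): AnyRef = Option(pool.poll()).getOrElse(factory())

  // Return a connection to the pool for later reuse.
  def giveBack(conn: AnyRef): Unit = pool.offer(conn)
}

// Usage inside a Spark job (sketch; `rdd` is an assumed RDD):
//
// rdd.foreachPartition { rows =>
//   val conn = ConnectionPool.borrow()  // per-executor pool, not the driver's object
//   try rows.foreach { row => /* write row via conn */ }
//   finally ConnectionPool.giveBack(conn)
// }
```

Because the `object` lives in each executor's JVM rather than being serialized from the driver, every executor ends up with its own connections, which is the behavior the question asks about.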