Instrumenting Spark JDBC with javaagent


I am attempting to instrument JDBC calls using the Kamon JDBC Kanela agent in my Spark app.

I am able to successfully instrument JDBC calls in a non-Spark test app by passing -javaagent:kanela-agent-1.0.1.jar on the command line when I run the app from the JAR. When I do this, I see the Kanela banner displayed in the console, and I can see that my failed-statement processor is called when there is a SQL error.

From my research, I should be able to inject a javaagent into the executors of a Spark app by passing the following to spark-submit: --conf "spark.executor.extraJavaOptions=-javaagent:kanela-agent-1.0.1.jar". However, when I do this, although the Kamon banner IS displayed in the console when I call Kamon.init(), my failed-statement processor is NOT called when there is a SQL error.
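One thing worth checking: spark.executor.extraJavaOptions assumes the agent jar already exists at that exact path on every worker node. A common pattern is to ship the jar with --files so it lands in each executor's working directory, and to attach the agent to the driver separately. A sketch of such an invocation (paths, class name, and app jar are illustrative assumptions, not from the original question):

```
# Sketch: stage the agent jar on every executor with --files, then point
# the executor JVMs at the staged copy (relative path = executor working dir).
# The driver JVM gets the agent via its own local path.
spark-submit \
  --class com.example.MyApp \
  --files /local/path/kanela-agent-1.0.1.jar \
  --conf "spark.driver.extraJavaOptions=-javaagent:/local/path/kanela-agent-1.0.1.jar" \
  --conf "spark.executor.extraJavaOptions=-javaagent:./kanela-agent-1.0.1.jar" \
  my-spark-app.jar
```

If the jar path in spark.executor.extraJavaOptions does not resolve on a worker, the executor JVM may fail to start or start without the agent, which would match the symptom of instrumentation silently not firing.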

Things I'm wondering:

  1. Is there something about the way that spark-jdbc makes these JDBC calls that would prevent a javaagent from "seeing" them?
  2. Does my call to Kamon.init() somehow only apply to code in the Spark driver, and not the executor?
  3. Any other reason that you can think of that would be preventing this from working?
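On question 2: the driver and each executor are separate JVMs, so a Kamon.init() call in the driver's main method only initializes Kamon in the driver process. A minimal sketch of forcing initialization on the executor side as well, assuming Kamon 2.x's public Kamon.init() API (class and app names here are hypothetical):

```scala
import kamon.Kamon
import org.apache.spark.sql.SparkSession

object ExecutorInitSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("jdbc-instrumentation-test").getOrCreate()

    // Initializes Kamon in the driver JVM only.
    Kamon.init()

    spark.sparkContext.parallelize(1 to 4).foreachPartition { _ =>
      // This closure runs inside an executor JVM. Note that Kamon.init()
      // here configures Kamon's reporters/metrics in that JVM, but it does
      // NOT attach the Kanela agent -- the agent must already be loaded at
      // executor JVM startup via spark.executor.extraJavaOptions for the
      // JDBC bytecode instrumentation to take effect.
      Kamon.init()
    }
  }
}
```

So even with this pattern, the agent flag on the executors is still required; the executor-side init only covers Kamon's runtime configuration, not bytecode weaving.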