Is there any way to use a framework for enabling Dependency Injection in a Spark application?
Is it possible to use Guice, for instance?
If so, is there any documentation, or samples of how to do it?
I am using Scala as the implementation language, Spark 2.2, and SBT as the build tool.
At the moment, my team and I are using the Cake Pattern. It has, however, become quite verbose, and we would prefer Guice, which is more intuitive and already familiar to other team members.

I've been struggling with the same problem recently. The main finding is that you'll run into serialization issues: Spark serializes the closures it sends to executors, and an injector (and often the injected object graph) is not serializable.
I found a nice solution with Guice here: https://www.slideshare.net/databricks/dependency-injection-in-apache-spark-applications
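A common workaround is to run the injector only on the driver and make sure that whatever the closures capture is `Serializable`. Here's a minimal sketch of that idea with Guice; the `Formatter`/`UpperFormatter` names and the `AppModule` binding are my own illustrative inventions, and the Spark call is shown as a comment since it needs a `SparkContext`:

```scala
import com.google.inject.{AbstractModule, Guice}

// Hypothetical service: extends Serializable so instances can be
// captured by Spark closures and shipped to executors.
trait Formatter extends Serializable {
  def format(s: String): String
}

class UpperFormatter extends Formatter {
  override def format(s: String): String = s.toUpperCase
}

// Guice module wiring the trait to its implementation.
class AppModule extends AbstractModule {
  override def configure(): Unit =
    bind(classOf[Formatter]).to(classOf[UpperFormatter])
}

object Main {
  def main(args: Array[String]): Unit = {
    // Build the injector on the driver only; never reference it
    // inside an RDD/Dataset closure, as it is not serializable.
    val injector = Guice.createInjector(new AppModule)
    val formatter = injector.getInstance(classOf[Formatter])

    // Only the serializable `formatter` instance is captured:
    // rdd.map(formatter.format)
    println(formatter.format("spark"))
  }
}
```

For dependencies that genuinely cannot be serialized (connections, clients), the usual pattern is to instantiate them lazily inside the executor, e.g. via a `lazy val` in a companion object or inside `mapPartitions`, rather than injecting them on the driver.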