This requirement may sound weird, or maybe I am just not sure how to solve this problem; I am giving my best shot here at explaining it with the following diagram.
- I have an existing application (Legacy Application) which runs as a single process. The Legacy Application has very old dependencies on the Java version, slf4j, Spring, etc.
- In order to make it fault-tolerant and add some supervision to it, I wrapped it inside an Akka Actor and start it remotely in a different JVM and Actor System.
- I start this legacy app from my new application as
processor = context.actorOf(Props[Processor], "processor")
and my configuration looks like:
deployment {
  /newApplication/processor/ {
    remote = "akka.tcp://[email protected]:2552"
  }
}
But since Processor has all the old dependencies, I get them in my new system as well :(
What am I looking for?
- Is there a way I can start the legacy application inside its own ActorSystem (and JVM), where it stays encapsulated, but have the supervisor in the new application?
Bottom line: I want the old dependencies not to leak out, but since the other JVM can throw an OOM, I would like to supervise it from outside (the new ActorSystem).

About dependencies: just use a Group Router, which sends messages via ActorSelection and does not require the presence of the target classes on the sender's side. Note, however, that the presence of actors at those paths is not managed by the Group Router. More details at: http://doc.akka.io/docs/akka/2.4-M2/scala/remoting.html
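A minimal sketch of that idea, assuming the legacy Processor is already running on the remote node at /user/processor (that path, the actor-system names, and the DoWork message type are hypothetical):

    import akka.actor.ActorSystem
    import akka.routing.RoundRobinGroup

    object NewApplication extends App {
      // Hypothetical message protocol shared by both JVMs; only these small
      // message classes need to be on the new application's classpath,
      // not the legacy Processor class or its old dependencies.
      final case class DoWork(payload: String)

      val system = ActorSystem("newApplication")

      // Path of the legacy actor already running in its own JVM/ActorSystem
      // (address taken from the question's configuration).
      val paths = List("akka.tcp://[email protected]:2552/user/processor")

      // Group router: it routes via ActorSelection to actors that already
      // exist, so no Processor class is deployed or instantiated locally.
      val router = system.actorOf(RoundRobinGroup(paths).props(), "processorRouter")

      router ! DoWork("some input")
    }

The trade-off is exactly the one noted above: the group router only routes to whatever is alive at those paths; it does not create or restart the remote actor for you.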
About supervising: just resolve an ActorSelection (via the resolveOne method call) for each path used in the Group Router, pass the resulting ActorRef objects to context.watch of an actor in your new application, and then handle Terminated messages.
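A rough sketch of that supervision pattern, assuming the same hypothetical remote path as above. resolveOne is asynchronous, so the resolved ActorRef is piped back to the actor before calling context.watch:

    import akka.actor.{Actor, ActorRef, Status, Terminated}
    import akka.pattern.pipe
    import scala.concurrent.duration._

    class ProcessorWatcher extends Actor {
      import context.dispatcher

      // Hypothetical path of the remote legacy actor (the same one the group router uses).
      private val processorPath = "akka.tcp://[email protected]:2552/user/processor"

      private case object Retry

      override def preStart(): Unit = resolveProcessor()

      private def resolveProcessor(): Unit =
        // resolveOne returns a Future[ActorRef]; pipe it to self so we touch
        // the actor's state only from its own message-processing thread.
        context.actorSelection(processorPath).resolveOne(10.seconds).pipeTo(self)

      def receive: Receive = {
        case ref: ActorRef =>
          context.watch(ref) // remote death-watch on the legacy processor

        case Terminated(_) =>
          // The legacy JVM died (e.g. OOM). React here: restart the process,
          // raise an alert, or try to re-resolve once it comes back up.
          resolveProcessor()

        case Status.Failure(_) =>
          // resolveOne timed out or found no actor; retry after a delay.
          context.system.scheduler.scheduleOnce(5.seconds, self, Retry)

        case Retry =>
          resolveProcessor()
      }
    }

How the legacy process is actually restarted after a Terminated is up to you (a process manager, a script, or the legacy node re-registering itself); the watcher above only gives the new ActorSystem the notification it needs to react.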