How to run a task on a different server than the Spring Cloud Data Flow Local server

I want to host a Spring Cloud Data Flow Local server for monitoring and executing my various Spring Boot batch projects.

The infrastructure I want to achieve is this: my Spring Cloud Data Flow server is hosted on Server A, and it should be able to execute Spring Boot batches/tasks on Server B.

Is the configuration I am trying to achieve possible? If not, how should I achieve it? I have a few Spring Boot batch applications that run on different servers.


There are 2 best solutions below

This is not how SCDF works, so I do not think it is possible. If you want to monitor your batch jobs, you need to register them with the SCDF server.
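In practice, registering a batch job with SCDF means packaging it as a Spring Cloud Task application and registering that application with the server. A minimal sketch, assuming Spring Batch 4.x and the spring-cloud-starter-task dependency; all class, job, and step names here are illustrative:

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableTask              // lets SCDF track executions of this app as a task
@EnableBatchProcessing   // enables the Spring Batch job infrastructure
public class MyBatchTaskApplication {

    @Bean
    public Step myStep(StepBuilderFactory steps) {
        return steps.get("myStep")
                .tasklet((contribution, chunkContext) -> {
                    // the actual batch work goes here
                    return RepeatStatus.FINISHED;
                })
                .build();
    }

    @Bean
    public Job myJob(JobBuilderFactory jobs, Step myStep) {
        return jobs.get("myJob").start(myStep).build();
    }

    public static void main(String[] args) {
        SpringApplication.run(MyBatchTaskApplication.class, args);
    }
}
```

The built jar is then registered with the SCDF server (for example with the shell's `app register --type task` command) so its executions show up in the dashboard.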

It depends on how you launch and configure your batch applications. You can have a custom task application (call it batch-launcher) that launches your batch job on an external cluster. In terms of monitoring, though, SCDF can monitor only the task application (the batch-launcher) used to launch your actual batch, not the job running on the external cluster (unless you have a mechanism to feed the batch application's metrics back into the batch-launcher). A sketch of the batch-launcher idea follows.
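As a rough illustration, here is a minimal sketch of such a batch-launcher task. The HTTP endpoint on Server B is purely hypothetical and stands in for whatever mechanism actually starts the batch there; SCDF only records this launcher's execution:

```java
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;
import org.springframework.web.client.RestTemplate;

@SpringBootApplication
@EnableTask   // registered with SCDF as a task; its start, end, and exit code are recorded
public class BatchLauncherApplication {

    @Bean
    public CommandLineRunner launchRemoteJob() {
        return args -> {
            RestTemplate rest = new RestTemplate();
            // Hypothetical endpoint on Server B that kicks off the real batch job.
            String response = rest.postForObject(
                    "http://server-b:8080/jobs/run", null, String.class);
            System.out.println("Remote job triggered, response: " + response);
            // Unless Server B reports metrics back, SCDF only sees this launcher finish.
        };
    }

    public static void main(String[] args) {
        SpringApplication.run(BatchLauncherApplication.class, args);
    }
}
```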

Launching a Spark compute job on a Spark cluster from an SCDF task (via the Spark client) is one such example. In that case, you register the task with SCDF and monitor only the Spark client task app, not the Spark compute job itself.
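A sketch of that pattern, assuming the spark-launcher dependency (org.apache.spark:spark-launcher) is on the classpath; the master URL, jar path, and main class are placeholders:

```java
import org.apache.spark.launcher.SparkLauncher;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableTask   // SCDF monitors this client task, not the Spark job it submits
public class SparkClientTaskApplication {

    @Bean
    public CommandLineRunner submitSparkJob() {
        return args -> {
            Process spark = new SparkLauncher()
                    .setMaster("spark://spark-master:7077")      // placeholder master URL
                    .setAppResource("/path/to/spark-job.jar")    // placeholder job jar
                    .setMainClass("com.example.SparkJob")        // placeholder main class
                    .setDeployMode("cluster")
                    .launch();
            int exitCode = spark.waitFor();
            System.out.println("spark-submit exited with code " + exitCode);
        };
    }

    public static void main(String[] args) {
        SpringApplication.run(SparkClientTaskApplication.class, args);
    }
}
```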