Spring Boot: thread pool for serving requests on the management port


We are using the Spring Boot Actuator to expose liveness and readiness endpoints when running in a Kubernetes cluster. By default, the actuator exposes its endpoints on the standard HTTP server port, where requests are served by the Tomcat/Jetty acceptor and worker thread pools. We recently ran into an issue during stress testing where all threads in the worker pool were busy and new requests were being queued. This caused the pod to be restarted in the Kubernetes cluster, as the liveness probes started failing.
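The failure mode above can be reproduced in miniature with a plain `ThreadPoolExecutor`: when every worker is busy, a newly submitted "health check" task just sits in the queue and times out. This is a hedged, self-contained sketch (the pool size, latch, and timeout values are illustrative, not taken from the original setup):

```java
import java.util.concurrent.*;

public class PoolSaturationDemo {
    public static void main(String[] args) throws Exception {
        // A tiny pool standing in for the servlet container's worker pool:
        // 2 workers, plus an unbounded queue for requests that cannot run yet.
        ExecutorService workers = Executors.newFixedThreadPool(2);
        CountDownLatch release = new CountDownLatch(1);

        // Two "slow requests" occupy both workers until the latch is released.
        for (int i = 0; i < 2; i++) {
            workers.submit(() -> {
                try {
                    release.await();
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }

        // A "health check" submitted now can only wait in the queue...
        Future<String> health = workers.submit(() -> "UP");
        try {
            health.get(200, TimeUnit.MILLISECONDS);
            System.out.println("health check answered in time");
        } catch (TimeoutException e) {
            // ...and times out, just like a liveness probe against a saturated port.
            System.out.println("health check timed out");
        }

        release.countDown();
        workers.shutdown();
    }
}
```

Serving the probes from a separate pool (the management port, below) removes the health check from this shared queue.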

I am considering exposing the actuator on the management port instead, and wanted to check the following:

a) Are the requests on the management port served by a separate worker thread pool (distinct from that of the standard server port)?

b) If the answer to a) is no, is there a way I can configure Spring Boot to use a separate thread pool for the management port? (We use Tomcat/Jetty and reactive Netty servers across our different microservices.)


Yes. If you specify a different management port, Spring/Tomcat will use a separate thread pool for serving requests on that port.

E.g. if you specify something like this in your configuration:

server.port=8080
management.server.port=8081
server.tomcat.threads.max=10

Regular requests on port 8080 will be served by threads from the standard thread pool, which has a total of 10 threads (server.tomcat.threads.max). You will see the thread names in your log like this:

... nio-8080-exec-<number from 1 to 10>..

Management/health-check requests will be served by threads from a different thread pool, which will also have a size of 10. You will see those threads in your log file like this:

... nio-8081-exec-<number from 1 to 10>..

NOTE: doing this may solve your issue with failing health checks causing the pods to be restarted; however, it does not address the root cause of all worker threads being occupied when there is a burst of traffic. Perhaps you need to look into something like rate limiting, so your service does not receive more traffic than it can handle.
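One simple form of that protection is a bulkhead: cap the number of concurrently handled requests and reject the overflow immediately instead of letting it queue. A minimal sketch, assuming a hypothetical `handle()` request handler and an illustrative limit of 3 permits (libraries such as Resilience4j provide production-grade versions of this pattern):

```java
import java.util.concurrent.Semaphore;

public class BulkheadDemo {
    // Illustrative limit: at most 3 requests handled concurrently; extras fail fast.
    private static final Semaphore permits = new Semaphore(3);

    // Hypothetical request handler returning an HTTP-style status line.
    static String handle() {
        if (!permits.tryAcquire()) {
            return "429 Too Many Requests"; // shed load instead of queueing
        }
        try {
            return "200 OK"; // real request processing would happen here
        } finally {
            permits.release();
        }
    }

    public static void main(String[] args) {
        // With permits available, the request is served...
        System.out.println(handle());
        // ...but once the permits are exhausted, callers are rejected immediately.
        permits.drainPermits();
        System.out.println(handle());
    }
}
```

Rejected requests return quickly, so worker threads stay available and probes on the main port keep answering even under load.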