I am writing a REST server in Tornado. I use a ProcessPoolExecutor with a configurable max_workers parameter.
The problem is that it does not seem to limit the number of concurrent processes the way I want.
The code is:

def post(self):
    ...
    self.process_pool_executor.submit(_execute_scenario_optimization,
                                      self.project_name, self.scenario_name)
    self._generate_output_json_from_dict({"execution_status": "RET_OK"})
    return
I need submit to raise an exception when more than a certain number of processes (for example, 4) are already active in the pool. Do you have any ideas?
The pool executors limit the number of processes that can run at once, but if you submit more tasks than there are workers, the extra tasks are placed on an internal queue rather than rejected with an exception. There doesn't appear to be a way to limit the size of this queue, so you should probably use a semaphore to limit the number of tasks you submit.
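Here is a minimal sketch of that idea, using a threading.BoundedSemaphore around submit. The class name BoundedProcessPoolExecutor and the max_waiting parameter are just illustrative names, not part of concurrent.futures:

import concurrent.futures
import threading

class BoundedProcessPoolExecutor:
    # Illustrative wrapper: rejects work once max_workers + max_waiting
    # tasks are running or queued, instead of queueing without bound.
    def __init__(self, max_workers, max_waiting=0):
        self._executor = concurrent.futures.ProcessPoolExecutor(max_workers=max_workers)
        self._semaphore = threading.BoundedSemaphore(max_workers + max_waiting)

    def submit(self, fn, *args, **kwargs):
        # Non-blocking acquire: fail immediately when every slot is taken.
        if not self._semaphore.acquire(blocking=False):
            raise RuntimeError("process pool is full")
        try:
            future = self._executor.submit(fn, *args, **kwargs)
        except Exception:
            self._semaphore.release()
            raise
        # Free the slot when the task finishes, whatever its outcome.
        future.add_done_callback(lambda _: self._semaphore.release())
        return future

    def shutdown(self, wait=True):
        self._executor.shutdown(wait=wait)

In your post handler you would then catch the exception from submit and respond with an error status (for example HTTP 503) instead of {"execution_status": "RET_OK"} when the pool is full.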