Limit the code a Python service is loading into RAM


I'm working on a system written in Python (3.9) that runs on Linux. The system is a monolith: it has about 20 services that all share the same code base.

Every service, even the simplest, uses at least 50 MB of RAM. With 20 services this adds up to quite a bit, which I'd very much like to reduce (in this system, RAM is the bottleneck).

Doing some experiments, I found that the following service uses 50 MB of RAM:

from .alert_mgr import alert_types_for_obj, alert_types_for_objtype, EventLogThread, Event
import time

while True:
    time.sleep(2)

While this service takes only 10 MB of RAM (i.e. with the first import commented out):

# from .alert_mgr import alert_types_for_obj, alert_types_for_objtype, EventLogThread, Event
import time

while True:
    time.sleep(2)

As far as I can tell, the reason for this is that the import is pulling Python code into RAM. Many of the imported files import other files in turn, and we end up with a ton of Python code in RAM for each service. The service isn't using most of this code, but it gets pulled in anyway because the Python source files aren't broken up properly.
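To see exactly which modules a single import drags in, you can diff `sys.modules` before and after it. A minimal sketch, using the standard-library `json` package as a stand-in for the real `alert_mgr` import:

```python
import sys

before = set(sys.modules)

# Stand-in for `from .alert_mgr import ...`; json itself pulls in
# several submodules (json.decoder, json.encoder, json.scanner, ...).
import json  # noqa: F401

pulled_in = sorted(set(sys.modules) - before)
print(f"{len(pulled_in)} new modules loaded by this import:")
for name in pulled_in:
    print(" ", name)
```

Running the same diff around the `alert_mgr` import should show the full transitive set of files that end up resident in each service.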

Of course, the right solution would be to break the Python source code up into smaller files. But since that's a lot of work and time is limited, are there any easier solutions (like loading the source code into RAM on demand)?
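One low-effort way to defer the loading is the `LazyLoader` recipe from the `importlib` documentation: the module object is created immediately, but its source is only executed on first attribute access. A sketch, again using `json` as a hypothetical stand-in for the heavy `alert_mgr` import:

```python
import importlib.util
import sys


def lazy_import(name):
    """Return a module whose code only runs on first attribute access.

    This is the LazyLoader recipe from the importlib docs; until the
    module is actually touched, only a lightweight stub is in sys.modules.
    """
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)
    return module


# Nothing heavy has been loaded at this point.
json = lazy_import("json")

# The first attribute access triggers the real import.
print(json.dumps({"ok": True}))  # → {"ok": true}
```

An even simpler variant, which needs no machinery at all, is moving the `import` statements from module level into the functions that actually use them: services that never call those code paths then never load the code.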

1 Answer

You can use something like Linux control groups (cgroups) to limit the memory available to a specific process.
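A sketch of both approaches, assuming a cgroup v2 (unified) hierarchy, root privileges for the manual variant, and `myservice` as a placeholder for the real service entry point. Note this caps memory rather than reducing it: once the limit is hit, the kernel reclaims pages (or, failing that, OOM-kills the process).

```shell
# One-off cap via systemd, which manages cgroups under the hood;
# MemoryMax is a hard limit for the resulting scope unit.
systemd-run --scope -p MemoryMax=64M python3 -m myservice

# Or by hand with the cgroup v2 filesystem (requires root):
mkdir /sys/fs/cgroup/myservice
echo 64M > /sys/fs/cgroup/myservice/memory.max
echo $$ > /sys/fs/cgroup/myservice/cgroup.procs  # move this shell into the group
python3 -m myservice                             # child inherits the limit
```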

Needless to say, this will slow down your services: hitting the limit forces the kernel to reclaim pages, it doesn't shrink the code the interpreter loads.

There are also numerous tools that can help you find unused code, which in turn helps you trim the memory footprint of your services.
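As one such tool (an assumption: any dead-code finder works here), `vulture` statically scans a package and reports functions, classes and imports that nothing references, which are good candidates for splitting out of the shared modules. `src/myservice/` below is a placeholder path:

```shell
pip install vulture
vulture src/myservice/   # lists likely-dead code with confidence percentages
```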