I'm developing a web application with Apache and Django in which users interact with a data model (a C++ implementation wrapped in Python).
To avoid loading/saving the data in a file or database after each user operation, I'd prefer to keep the data model in memory as long as the user is connected to the app. Until now, data models have been stored in a variable attached to the web service. Since Python running under Apache sometimes behaves strangely, I'd prefer to execute user operations in a separate Python process: today on the same server, maybe tomorrow on a different node.
My impression is that distributed computing libraries (dispy, dask.distributed) don't let you keep memory attached to a node. Does anyone have a solution or an idea about which libraries I could use?
Simple answer: stop wasting your time trying to do complicated things that will never work right with a typical web server, and store your data in a database (it doesn't have to be a MySQL database, FWIW).
Longer answer: in a production environment you typically have several parallel (sub)processes handling incoming requests, and any of those processes can serve any user at any time, so keeping your data in memory in one process will never work reliably. This is by design, and it is a sane design, so fighting against it is just a waste of time and energy. Web server processes are not meant to persist data between requests; that's what your database is for, so use it.
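To make that concrete, here's a minimal sketch of the pattern: each request loads the user's model state from shared storage, operates on it, and writes it back, so it doesn't matter which worker process handles the request. It uses `sqlite3` and `pickle` from the standard library purely for illustration; in production you'd point it at your real database, and note that pickling a C++-backed object only works if the Python wrapper supports pickling (otherwise serialize to some explicit format the wrapper does expose).

```python
import pickle
import sqlite3

# In-memory DB for the example; use a real shared database in production.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS models (user_id TEXT PRIMARY KEY, state BLOB)"
)

def save_state(user_id, state):
    """Serialize the user's data-model state and persist it."""
    conn.execute(
        "INSERT OR REPLACE INTO models (user_id, state) VALUES (?, ?)",
        (user_id, pickle.dumps(state)),
    )
    conn.commit()

def load_state(user_id):
    """Load and deserialize the user's state, or None if absent."""
    row = conn.execute(
        "SELECT state FROM models WHERE user_id = ?", (user_id,)
    ).fetchone()
    return pickle.loads(row[0]) if row else None

# Any worker process can now serve the next request for this user.
save_state("alice", {"counter": 3})
print(load_state("alice"))  # {'counter': 3}
```

If deserializing on every request turns out to be too slow, the usual next step is a cache layer (e.g. Redis or Memcached behind Django's cache framework) in front of the database, which is still shared across processes, rather than per-process memory.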