Sharing memory between multiprocessing processes and Gunicorn workers

I have a Python application that uses a dictionary as shared memory between several processes:

from multiprocessing import Manager

manager = Manager()
shared_dict = manager.dict()

The REST API is implemented with Flask. When I started the server with pywsgi or plain Flask.run, everything worked fine. Then I decided to put Gunicorn in front of it. Now, when I access this shared dict from any of the workers (even with only one worker running), I get the error:

    message = connection.recv_bytes(256)  # reject large message
IOError: [Errno 35] Resource temporarily unavailable

I also looked into mmap and multiprocessing's Listener and Client, but they all seemed like a lot of overhead.

python multiprocessing gunicorn
1 answer

I don't know the specific error, but the most likely cause is that once a web server is added, worker processes are initialized on demand, so the Manager-backed shared_dict is not carried across requests. If the dict is not too big and you can pay the serialization/deserialization cost, using Redis as an in-memory data store via the redis-py library is quite simple.
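A minimal sketch of that approach (assuming a local Redis server and the redis-py package, neither of which is given in the question); since Redis stores bytes, values are JSON-serialized on the way in and out:

```python
import json

def save_shared(client, key, value):
    # client is a redis.Redis instance; Redis stores bytes,
    # so the value is serialized to JSON first.
    client.set(key, json.dumps(value))

def load_shared(client, key):
    raw = client.get(key)
    return None if raw is None else json.loads(raw)

# Hypothetical usage from any Gunicorn worker (names are illustrative):
#   import redis
#   r = redis.Redis(host="localhost", port=6379, db=0)
#   save_shared(r, "shared_dict", {"hits": 1})
#   state = load_shared(r, "shared_dict")
```

Because every worker opens its own connection to the same Redis server, this sidesteps the fork-time pipe sharing that breaks the Manager proxy under Gunicorn.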

