Returning large objects from child processes in Python multiprocessing

I am working with Python multiprocessing to call some workers. Each of them should return an array of several MB in size.

  • Is it right that since my returned array is created in a child process, it needs to be copied back to the parent's memory when the process finishes? (This seems to take some time, but it could be a PyPy issue.)
  • Is there a mechanism that allows parent and child objects to access the same object in memory? (synchronization is not a problem, since only one child will access each object)

I'm afraid I have a few gaps in my understanding of how Python implements multiprocessing, and trying to convince PyPy to play well does not help. Thanks!

1 answer

Yes: an object returned from a worker is pickled in the child, sent back through a Pipe, and unpickled in the parent, so it is copied rather than shared. This is how it works on both CPython and PyPy. The copy may well be slower on PyPy; I have not measured it, but pickling and multiprocessing in general tend to be less optimized on PyPy than on CPython.
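As a minimal sketch of that copy (the function names and the 4 MB size are my own choices, not from the question):

```python
import multiprocessing as mp

def make_blob(size_mb):
    # Built in the child; the return value is pickled, pushed
    # through a pipe, and unpickled in the parent -- a full copy.
    return b"x" * (size_mb * 1024 * 1024)

def main():
    with mp.Pool(processes=1) as pool:
        blob = pool.apply(make_blob, (4,))
    return len(blob)

if __name__ == "__main__":
    print(main())  # 4194304
```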

On CPython you can avoid the copy by allocating the data in shared memory as ctypes arrays via multiprocessing.sharedctypes. PyPy supports the same API. The caveat is that ctypes itself has historically been slower on PyPy, so measure before committing to it.
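A sketch of the shared-memory route, using mp.Array (a convenience wrapper over multiprocessing.sharedctypes). Function names and the buffer size are illustrative:

```python
import multiprocessing as mp

def fill(shared, value):
    # The child writes straight into the shared buffer;
    # nothing is pickled or copied back to the parent.
    for i in range(len(shared)):
        shared[i] = value

def main():
    # 100k doubles (~800 KB) allocated in shared memory.
    # lock=False is safe here because only one child writes to it,
    # matching the question's "only one child per object" setup.
    arr = mp.Array("d", 100_000, lock=False)
    p = mp.Process(target=fill, args=(arr, 1.5))
    p.start()
    p.join()
    return arr[0], arr[-1]

if __name__ == "__main__":
    print(main())  # (1.5, 1.5)
```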

There is also multiprocessing.Manager, which keeps shared dict/list objects in a separate manager process; parent and children talk to them through a Proxy. Every read/write on a proxy is a round trip to the manager process, so it is slow for moving large amounts of data, and for big arrays multiprocessing.sharedctypes is the better fit.
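For completeness, a sketch of the Manager route (function names are mine). It is convenient, but each proxy operation is an IPC round trip, so it does not scale to element-wise access on large arrays:

```python
import multiprocessing as mp

def produce(results, n):
    # One append = one message to the manager process.
    # Fine for handing over a whole object in one go;
    # element-wise access through the proxy is far slower.
    results.append(list(range(n)))

def main():
    with mp.Manager() as mgr:
        results = mgr.list()
        p = mp.Process(target=produce, args=(results, 5))
        p.start()
        p.join()
        return list(results[0])

if __name__ == "__main__":
    print(main())  # [0, 1, 2, 3, 4]
```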

