Python multiprocessing: sharing a dict inside a class

I want to understand how multiprocessing and Manager work with shared memory.

I have a class with a dictionary created in the class's __init__. I want to use multiprocessing to call a class method that populates the dictionary (each process adds a different key).

import multiprocessing as mp
from multiprocessing import Process, Manager

class num:
    def __init__(self):
        manager = Manager()
        d = manager.dict()

        # Setup list of processes
        processes = [mp.Process(target=self.f, args=(d,i)) for i in range(5)]

        #Run processes 
        for p in processes:
            p.start()

        #Exit the completed processes
        for p in processes:
            p.join()

        print d

    def f(self,d,i):
        d[str(i)] = []
        d[str(i)].append(i)


if __name__ == '__main__':      
    test = num()

result:

{'1': [], '0': [], '3': [], '2': [], '4': []}

Shouldn't the list inside f() also be shared? Why is each list empty, and how do I fix it?

2 answers

You need to change the line in the function f from

d['i'] = [i]

to something like

d[str(i)] = i

so that your processes do not overwrite each other's entries in the shared dict. After that, it works fine for me (in Python 2.7.3), printing

{'1': 1, '0': 0, '3': 3, '2': 2, '4': 4}

(Tested with your example as posted, including the import multiprocessing as mp line.)
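As a sanity check, here is a runnable sketch of the corrected version. It uses Python 3 syntax (print as a function), and the class is renamed Num purely for PEP 8 style; neither change is from the original answer:

```python
import multiprocessing as mp
from multiprocessing import Manager

class Num:
    def __init__(self):
        manager = Manager()
        d = manager.dict()
        processes = [mp.Process(target=self.f, args=(d, i)) for i in range(5)]
        for p in processes:
            p.start()
        for p in processes:
            p.join()
        # Copy into a plain dict so the data survives manager shutdown
        self.result = dict(d)

    def f(self, d, i):
        d[str(i)] = i  # assign directly instead of mutating a fetched copy

if __name__ == '__main__':
    print(Num().result)  # maps '0'..'4' to 0..4 (key order may vary)
```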

Edit: if you want the values in the shared dict to be lists, you can assign them as

d[str(i)] = [i]

Note, however, that a plain list stored this way is not itself shared: appends made to it in one process are not visible in the others. If you want the lists themselves to be shared, create them with manager.list() and store those in the dict, for example:

    count = 5
    lists = [manager.list() for i in range(count)]
    for i in range(count):
        d[i] = lists[i]
    processes = [mp.Process(target=self.f, args=(d,i, lists)) for i in range(count)]

[...]

def f(self,d,i, lists):
    for j in range(i):       # just an example to show 
        lists[j].append(i)   # that the lists are shared between processes

The dict then holds references to the shared lists, so changes a process makes to a list are visible in all processes. The result:

{0: [1, 2, 3, 4], 1: [2, 3, 4], 2: [3, 4], 3: [4], 4: []}
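For completeness, a runnable version of this approach might look like the following. It uses Python 3 syntax; the class name Num and the final copy of the proxies into plain lists are my additions, not part of the original answer:

```python
import multiprocessing as mp
from multiprocessing import Manager

class Num:
    def __init__(self):
        manager = Manager()
        d = manager.dict()
        count = 5
        # Create the lists through the manager so they are shared
        lists = [manager.list() for _ in range(count)]
        for i in range(count):
            d[i] = lists[i]
        processes = [mp.Process(target=self.f, args=(d, i, lists))
                     for i in range(count)]
        for p in processes:
            p.start()
        for p in processes:
            p.join()
        # Convert the list proxies to plain lists for printing
        self.result = {k: list(v) for k, v in d.items()}

    def f(self, d, i, lists):
        for j in range(i):       # just an example to show that
            lists[j].append(i)   # the lists are shared between processes

if __name__ == '__main__':
    # Each key k ends up with the values k+1 .. 4, though the order
    # within a list can vary with process scheduling.
    print(Num().result)
```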

Change the line:

d[str(i)].append(i)

to:

d[str(i)] += [i]

and it will work.

Result:

{'1': [1], '0': [0], '3': [3], '2': [2], '4': [4]}

The += fetches the list, extends it, and assigns it back into the managed dict, so the change is sent to the manager process. Plain append only modifies a local copy that is never sent back, which is why the lists stayed empty.
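The difference is easy to reproduce in a single process. A small standalone sketch (Python 3 syntax):

```python
from multiprocessing import Manager

if __name__ == '__main__':
    manager = Manager()
    d = manager.dict()

    d['a'] = []
    d['a'].append(1)   # mutates a local copy fetched from the manager
    print(d['a'])      # []  -- the change was lost

    d['a'] += [1]      # fetch, extend, and reassign: the change is sent back
    print(d['a'])      # [1]
```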

