Here is an example of a program in which I use multiprocessing. Calculations are performed with `multiprocessing.Process`, and the results are collected with `multiprocessing.Queue`.
    # THIS PROGRAM RUNS WITH ~40 GB RAM. (You can reduce a, b, c for less RAM,
    # but then it only works for smaller values.)
    # THE PROBLEM OCCURS ONLY FOR HUGE DATA.
    from numpy import *
    import multiprocessing as mp

    a = arange(0, 3500, 5)
    b = arange(0, 3500, 5)
    c = arange(0, 3500, 5)
    a0 = 540.  # random values
    b0 = 26.
    c0 = 826.

    def rand_function(a, b, c, a0, b0, c0):
        Nloop = 100.

        def loop(Nloop, out):
            res_total = zeros((700, 700, 700), dtype='float')
            n = 1
            while n <= Nloop:
                rad = sqrt((a - a0)**2 + (b - b0)**2 + (c - c0)**2)
                res_total += rad
                n += 1
            out.put(res_total)

        out = mp.Queue()
        jobs = []
        Nprocs = mp.cpu_count()
        print "No. of processors : ", Nprocs
        for i in range(Nprocs):
            p = mp.Process(target=loop, args=(Nloop / Nprocs, out))
            jobs.append(p)
            p.start()

        final_result = zeros((700, 700, 700), dtype='float')
        for i in range(Nprocs):
            final_result = final_result + out.get()

        p.join()

    test = rand_function(a, b, c, a0, b0, c0)
Here is the error message:
    Traceback (most recent call last):
      File "/usr/lib/python2.7/multiprocessing/queues.py", line 266, in _feed
        send(obj)
    SystemError: NULL result without error in PyObject_Call
I read here that this is a known bug, but I can't understand it. Can someone suggest a way to compute huge data sets with multiprocessing?
Many thanks
python numpy queue multidimensional-array multiprocessing
geekygeek