Python: Making a global variable accessible from every process

I am new to Python and have started using a genetic algorithm (GA) to do a kind of curve fitting. For this GA I use the (amazing) pyevolve library ( http://pyevolve.sourceforge.net/ ), which can significantly reduce computation time by using multiprocessing.

This is where my problem arises: the curve I want to approximate is an array that is read from an Excel file and stored as a global variable at the start of my program. When the Python multiprocessing module is used, each process creates its own Python instance with its own copy of the global variables. As a result, every individual in every generation of the algorithm (that is, every process) opens and reads the Excel file again. Opening large Excel files takes a lot of time, so it would be nice to open the file only once and make the array readable by every process/individual.

Multiprocessing is initiated inside the pyevolve library, and I do not want to modify the library, so that updating it stays simple. Unfortunately, that means simply passing the variable to the processes, for example via

p = Process(target=my_func, args=(my_array,))

is not an option for me, and that is the only solution I have found so far.

Does anyone know a different way to make my_array accessible from each process?

Thanks in advance!

python multiprocessing global pyevolve
2 answers

I just wanted to share how I solved this problem, in case someone else runs into it:

My solution does not address the general Python problem, but it helps when using pyevolve, which was enough in my case. I did not know that in pyevolve you can attach parameters to your genome or to your genetic algorithm instance via

my_genome.setParams(xyz=my_array) or my_ga.setParams(xyz=my_array)

and these parameters can be retrieved via

my_genome.getParam('xyz') and my_ga.getParam('xyz')

These parameters are available in every process, so my problem was solved and I did not need to deal with the general Python multiprocessing issue. I hope this helps someone!
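
For illustration, here is a minimal sketch of this pattern, assuming pyevolve 0.6. The xyz parameter name is arbitrary, the fixed list stands in for the array read from the Excel file, and the fitness logic is left as a comment, so treat it as a sketch rather than a drop-in solution:

from pyevolve import G1DList, GSimpleGA

# The target curve is loaded only once, in the parent process.
# A fixed list stands in for the array read from the Excel file.
my_array = [0.0, 1.5, 3.0, 2.0, 1.0]

def eval_func(chromosome):
    # The array attached with setParams() can be read back here via getParam()
    target = chromosome.getParam('xyz')
    score = 0.0
    # ... compare the chromosome against the target curve and build a score ...
    return score

genome = G1DList.G1DList(len(my_array))
genome.evaluator.set(eval_func)
genome.setParams(xyz=my_array, rangemin=0, rangemax=10)

ga = GSimpleGA.GSimpleGA(genome)
ga.setMultiProcessing(True)
ga.evolve(freq_stats=10)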


Check out mmap, the Python interface for creating memory-mapped files that can be shared between processes. You probably want something like the following:

# Python 2 example: an anonymous memory map shared with a child after fork()
import mmap
import os
import ctypes

# Create an anonymous memory-mapped region and write some data into it
mm = mmap.mmap(-1, 13)
mm.write('Hello world!')

# In CPython, id() gives the address of the mmap object; save it to a file
mm_addr = id(mm)
with open('shared_id', 'w') as f:
    f.write(str(mm_addr))

pid = os.fork()
if pid == 0:
    # In the child process: read the address back and cast it to a Python object
    id_from_file = long(open('shared_id').read())
    loaded_mm = ctypes.cast(id_from_file, ctypes.py_object).value
    loaded_mm.seek(0)
    print loaded_mm.readline()
    loaded_mm.close()

I used this question to figure out how to get the memory address of the shared memory map and convert it back into a Python object.

I suppose you could also do this with any object in memory instead of mmap, but I have not tried it.
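
To make that last idea concrete, here is a minimal, untested sketch of the same id()/ctypes.cast() trick applied to a plain list. It assumes CPython (where id() returns the object's memory address) and os.fork, which gives the child a copy-on-write copy of the parent's address space, so the child can read the object but changes are not shared back:

import ctypes
import os

data = [1, 2, 3, 4]   # any ordinary Python object
addr = id(data)       # CPython detail: id() is the object's memory address

pid = os.fork()
if pid == 0:
    # Child process: the forked address space still contains the object,
    # so casting the stored address back yields the child's copy of it.
    same_obj = ctypes.cast(addr, ctypes.py_object).value
    print(same_obj)   # [1, 2, 3, 4]
    os._exit(0)
else:
    os.wait()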

