I have a Python script that uses multiprocessing.Pool to process a large number of files individually. I usually have a CPU limit of 8. My problem is this: I always get "IOError: [Errno 24] Too many open files". Each child process opens several files for reading only, with open(). These file handles are passed to several functions to retrieve data, and at the end of each child process the files are closed with file.close(). I also tried the with statement, but it did not solve the problem. Does anyone know what is going wrong? I googled around but found no answers. I do close the files, and the functions return properly without holding on to the file handles.
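For reference, here is a minimal sketch of how the per-process limit on open file descriptors can be checked with the standard resource module (assuming this limit is the one being hit; the printout is only diagnostic):

import resource

# Query the soft and hard limits on open file descriptors
# (RLIMIT_NOFILE) for the current process.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print "soft limit:", soft
print "hard limit:", hard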
My setup: Mac OS X 10.5 with Python 2.6.
Thanks,
Ogan
from custom import func1, func2
import multiprocessing

def Worker(item):
    # map_async() calls Worker with one (key, values) pair
    # from Data.items() at a time.
    key, values = item
    f1 = open("db1.txt")
    f2 = open("db2.txt")
    for each in values:
        X = func1(f1)
        Y = func2(f2)
    f1.close()
    f2.close()

if __name__ == "__main__":
    Data = {1: [2], 2: [3]}
    jobP = multiprocessing.Pool(8)
    jobP.map_async(Worker, Data.items())
    jobP.close()
    jobP.join()
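For completeness, this is roughly the with-statement variant I tried (a sketch; Python 2.6 does not allow multiple context managers in a single with statement, so they are nested):

def Worker(item):
    key, values = item
    # Nested with blocks close both files automatically,
    # even if func1 or func2 raises an exception.
    with open("db1.txt") as f1:
        with open("db2.txt") as f2:
            for each in values:
                X = func1(f1)
                Y = func2(f2)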