Too many open files when using multiprocessing.Pool

I have a Python script that uses multiprocessing.Pool to process a large number of files individually, usually with a CPU limit of 8. My problem is this: I always get "IOError: [Errno 24] Too many open files". Each child process opens several files for reading only with open(). These file handles are passed to several functions that retrieve data from them. At the end of each child process the files are closed with file.close(). I also tried the with statement, but it did not solve the problem. Does anyone know what is going wrong? I googled around but did not find any answers. The files are closed and the functions return properly, so I don't see where the file handles are being kept open.

My setup: Mac OS X 10.5 with Python 2.6.

Thanks,

Ogan

    from custom import func1, func2
    # func1 and func2 only seek, read and return values from the file;
    # they do not close the file
    import multiprocessing

    def Worker(args):
        # args is a (key, value) pair from Data.items()
        f1 = open("db1.txt")
        f2 = open("db2.txt")
        for each in args[1]:
            # do the actual work with the open file handles
            X = func1(f1)
            Y = func2(f2)

        f1.close()
        f2.close()
        return

    Data = {1: [2], 2: [3]}
    jobP = multiprocessing.Pool(8)
    jobP.map_async(Worker, Data.items())
    jobP.close()
    jobP.join()
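
A quick way to see whether the workers are actually leaking descriptors is to check the process's open-file count and limit from inside Worker. This is a minimal diagnostic sketch, not part of the original script; it only uses the standard resource module and the /dev/fd directory, which exists on both macOS and Linux:

    import os
    import resource

    def report_fd_usage(tag=""):
        # soft/hard limits on open file descriptors for this process
        soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
        # each entry in /dev/fd is an open descriptor of the current process
        # (the listing itself briefly uses one extra descriptor)
        open_fds = len(os.listdir("/dev/fd"))
        print("%s open fds: %d (soft limit %s, hard limit %s)"
              % (tag, open_fds, soft, hard))

Calling report_fd_usage() at the start and end of Worker shows whether handles accumulate from one task to the next.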

Are you sure this is happening on a Mac and not on Linux? On Linux, the per-user open-file limit can be raised in /etc/security/limits.conf.
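
If editing limits.conf is not an option, the soft limit can also be raised from inside the script, up to the hard limit the process already has. A minimal sketch using the standard resource module (the 4096 value is just an example):

    import resource

    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    # raise the soft limit toward the hard limit (it cannot exceed it without root)
    target = 4096  # example value
    if hard != resource.RLIM_INFINITY:
        target = min(target, hard)
    resource.setrlimit(resource.RLIMIT_NOFILE, (target, hard))

Child processes created by multiprocessing.Pool inherit the limit, so this has to run before the pool is created.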


On Yosemite (OS X 10.10), you can raise the open-file limit with:

    sudo launchctl limit maxfiles [number-of-files] unlimited
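
Running the command without a value should print the current soft and hard limits, which makes it easy to verify the change:

    launchctl limit maxfiles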
