I am developing a long-running Python script that makes many connections to different serial ports. The script is meant to run for several hours, but it fails partway through its execution, citing "Too many open files."
I tracked the issue down to the serial module, where the .close() method does not seem to reduce the number of file descriptors used by Python. I check this with lsof | grep python | wc. I'm using Debian 7.2 and Python 2.7.3.
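To watch the count from inside the script instead of re-running lsof by hand, I can also read /proc/self/fd (Linux-only; a minimal sketch, nothing pyserial-specific):

import os

def open_fd_count():
    # /proc/self/fd holds one entry per open descriptor of this process,
    # so its length tracks the same number lsof reports for the PID.
    return len(os.listdir('/proc/self/fd'))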
In the example below, more and more file descriptors are slowly used up until the limit is reached. Why is this, and how can I avoid it?
#!/usr/bin/env python
import serial   # Used to communicate with pressure controller
import logging
import time
from time import gmtime, strftime

logging.basicConfig(filename="open_files_test.log")

# Write unusual + significant events to logfile + stdout
def log(message):
    timestamp = strftime("%Y-%m-%d %H:%M:%S", gmtime())
    logging.warning(timestamp + " " + message)
    print(message)

for i in range(2000):
    for n in range(1, 12):
        com = None
        try:
            port_name = "/dev/tty" + str(n + 20)
            # Positional args: baudrate, bytesize, parity, stopbits, timeout,
            # xonxoff, rtscts, writeTimeout, dsrdtr, interCharTimeout
            com = serial.Serial(port_name, 9600, serial.EIGHTBITS,
                                serial.PARITY_NONE, serial.STOPBITS_ONE,
                                0.0, False, False, 5.0, False, None)
            com.open()
            com.flushInput()
            com.flushOutput()
            log("Opened port: " + port_name)
        except serial.SerialException:
            log("could not open serial port: " + port_name)
        if com is not None:
            com.close()
            log("Closed port: " + port_name)
        time.sleep(1)

log("Finished Program")
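For reference, here is a stripped-down version of the loop that prints the descriptor count on each pass, using the open_fd_count() helper above. The port name is just a placeholder for whatever device exists on the test machine, and the double open mirrors what my script does (depending on the pyserial version, the second open() may raise or silently reopen):

import time
import serial

PORT = "/dev/ttyS0"  # placeholder; substitute a device that exists on the test box

for i in range(20):
    com = serial.Serial(PORT, 9600, timeout=5.0)  # passing a port name opens it immediately
    com.open()   # explicit open() on top of that, exactly as my script does
    com.close()
    print(i, open_fd_count())  # watch whether this number keeps climbing
    time.sleep(1)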
Thanks
python linux
Shiftee