Fopen problem - too many open files

I have a multi-threaded application running on Win XP. At a certain point, one of the threads cannot open an existing file using the fopen function. The _get_errno function returns EMFILE, which means too many open files. No more file descriptors. FOPEN_MAX for my platform is 20. _getmaxstdio returns 512. I checked this with WinDbg and I see that about 100 files are open:

    788 Handles
      Type           Count
      Event          201
      Section        12
      File           101
      Port           3
      Directory      3
      Mutant         32
      WindowStation  2
      Semaphore      351
      Key            12
      Thread         63
      Desktop        1
      IoCompletion   6
      KeyedEvent     1

What is the reason for the failure of fopen?
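
For reference, this is roughly the diagnostic I could add around the failing call site. The helper name is made up; _get_doserrno is the MSVC CRT call that exposes the underlying OS error code:

    #include <cstdio>
    #include <cstdlib>   // _get_errno, _get_doserrno

    // Hypothetical helper: open a file and, on failure, report the CRT error,
    // the underlying OS error, and the current stdio stream limit.
    FILE* open_with_diagnostics(const char* path, const char* mode)
    {
        FILE* f = fopen(path, mode);
        if (f == 0)
        {
            int crtErr = 0, osErr = 0;
            _get_errno(&crtErr);     // EMFILE here means the CRT stream table is full
            _get_doserrno(&osErr);   // last OS error code seen by the CRT
            fprintf(stderr,
                    "fopen(\"%s\") failed: errno=%d, _doserrno=%d, _getmaxstdio()=%d\n",
                    path, crtErr, osErr, _getmaxstdio());
        }
        return f;
    }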


EDIT:

I wrote a simple single-threaded test application. It can open 510 files. I do not understand why this single-threaded application can open so many more files than the multi-threaded one. Could the multi-threaded application be leaking file descriptors?

    #include <cstdio>
    #include <cstdlib>   // _get_errno
    #include <cassert>
    #include <cerrno>

    int main()
    {
        int counter(0);
        while (true)
        {
            char buffer[256] = {0};
            sprintf(buffer, "C:\\temp\\abc\\abc%d.txt", counter++);
            FILE* hFile = fopen(buffer, "wb+");
            if (0 == hFile)
            {
                // fopen failed: capture the error code and the stdio limit,
                // then break into the debugger via the failing assert
                int err(0);
                errno_t ret = _get_errno(&err);
                assert(0 == ret);
                int maxAllowed = _getmaxstdio();
                (void)maxAllowed; // inspected in the debugger
                assert(hFile);    // always fires here, on purpose
            }
        }
        return 0;
    }
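
One way to test the leak theory in the multi-threaded application would be to route every open/close through counting wrappers and watch whether the balance keeps growing. A rough sketch; the wrapper names are made up, and the Win32 Interlocked functions are assumed for thread-safe counting:

    #include <cstdio>
    #include <windows.h>   // InterlockedIncrement / InterlockedDecrement

    // Hypothetical wrappers: count how many CRT streams are currently open.
    // If g_openStreams keeps growing in the real application, streams are leaking.
    static volatile LONG g_openStreams = 0;

    FILE* counted_fopen(const char* path, const char* mode)
    {
        FILE* f = fopen(path, mode);
        if (f != 0)
            InterlockedIncrement(&g_openStreams);
        return f;
    }

    int counted_fclose(FILE* f)
    {
        int rc = fclose(f);
        if (rc == 0)
            InterlockedDecrement(&g_openStreams);
        return rc;
    }
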
2 answers

I think that on Win32 every CRT function eventually ends up calling the Win32 API underneath, so in this case fopen is most likely implemented on top of CreateFile/OpenFile. The CreateFile/OpenFile APIs are not only for files: they also open directories, communication ports, pipes, mailslots, disk volumes, and so on. So in a real application the maximum number of files you can open may differ, depending on how many of these other resources are in use. Since you have not described anything about the application, this is my first guess. If time allows, go through http://blogs.technet.com/b/markrussinovich/archive/2009/09/29/3283844.aspx
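
One way to see which table is actually filling up is to log both numbers at the point of failure. A rough sketch, assuming GetProcessHandleCount is available (Windows XP SP1 and later); the function name dump_limits is just for illustration:

    #define _WIN32_WINNT 0x0501   // GetProcessHandleCount needs XP SP1 or later
    #include <windows.h>
    #include <cstdio>

    // Compare the CRT's stdio stream limit with the number of kernel handles
    // the process currently holds (files, events, sections, threads, ...).
    void dump_limits()
    {
        DWORD handleCount = 0;
        if (GetProcessHandleCount(GetCurrentProcess(), &handleCount))
            printf("kernel handles in use: %lu\n", (unsigned long)handleCount);
        printf("CRT stdio stream limit (_getmaxstdio): %d\n", _getmaxstdio());
    }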


I assume this is a limitation of your operating system. This can depend on many things: the way the file descriptors are represented, the memory they use, etc.

And I suppose you cannot change it easily, though perhaps there is some parameter for raising this limit.
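
For the CRT side there is such a parameter: _setmaxstdio raises the number of FILE* streams that may be open at once (the call returns -1 if the requested value exceeds the CRT's cap). A minimal sketch:

    #include <cstdio>   // _setmaxstdio / _getmaxstdio live in the MSVC CRT

    int main()
    {
        // Ask the CRT for more simultaneously open stdio streams.
        int requested = 1024;
        if (_setmaxstdio(requested) == -1)
            printf("could not raise the stdio limit to %d\n", requested);
        printf("stdio stream limit is now %d\n", _getmaxstdio());
        return 0;
    }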

The real question is: do you really need to open so many files at once? Even if you have 100 threads trying to read 100 different files, they probably cannot all read at the same time anyway, and you would probably not get better throughput than with, say, 50 threads.
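
If the real goal is just to keep the number of simultaneously open files bounded, a counting semaphore around the open/close pair is enough. A sketch using the Win32 semaphore API; the cap of 50 and the wrapper names are arbitrary:

    #include <cstdio>
    #include <windows.h>

    // Allow at most 50 files to be open at the same time across all threads.
    static HANDLE g_fileSlots = CreateSemaphore(NULL, 50, 50, NULL);

    FILE* throttled_fopen(const char* path, const char* mode)
    {
        WaitForSingleObject(g_fileSlots, INFINITE);  // take a slot (blocks if 50 are open)
        FILE* f = fopen(path, mode);
        if (f == 0)
            ReleaseSemaphore(g_fileSlots, 1, NULL);  // open failed, give the slot back
        return f;
    }

    void throttled_fclose(FILE* f)
    {
        fclose(f);
        ReleaseSemaphore(g_fileSlots, 1, NULL);      // free the slot for another thread
    }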

It is difficult to be more precise, since we do not know what you are trying to achieve.

