We have an application that can potentially allocate a large number of small objects (depending on user input). Sometimes the application runs out of memory and effectively crashes.
However, if we knew that memory was becoming scarce, we could destroy some lower-priority objects and thus degrade the user's results gracefully instead of failing outright.
What is the best way to detect that memory is running low for a process before calls to new actually start failing? We could call API functions like GetProcessWorkingSetSize() or GetProcessMemoryInfo(), but how do you know when the limits are reached on a given machine (for example, at 80% of the maximum that can be allocated)?
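For illustration, here is a minimal sketch (not from the original question) of how one might poll the process's memory usage with GetProcessMemoryInfo() and compare it against a self-chosen budget. The 512 MB budget and the 80% threshold are assumptions picked for the example, not values anyone has recommended.

```cpp
// Minimal sketch: compare the process's private memory usage against a
// caller-chosen budget. Link with psapi.lib. The budget and the 0.8
// threshold below are illustrative assumptions.
#include <windows.h>
#include <psapi.h>
#include <cstdio>

// Returns true if private memory usage exceeds the given fraction of the budget.
bool MemoryPressureHigh(SIZE_T budgetBytes, double threshold = 0.8)
{
    PROCESS_MEMORY_COUNTERS_EX pmc = {};
    if (!GetProcessMemoryInfo(GetCurrentProcess(),
                              reinterpret_cast<PROCESS_MEMORY_COUNTERS*>(&pmc),
                              sizeof(pmc)))
    {
        return false;  // query failed; no pressure information available
    }
    return pmc.PrivateUsage >= static_cast<SIZE_T>(budgetBytes * threshold);
}

int main()
{
    const SIZE_T kBudget = 512ull * 1024 * 1024;  // hypothetical 512 MB budget
    if (MemoryPressureHigh(kBudget))
    {
        std::printf("Memory pressure high: release low-priority objects\n");
    }
    return 0;
}
```

This only answers "how much am I using", not "how much can I still get", which is exactly the hard part of the question: the real limit depends on the machine, on address-space fragmentation, and on other processes, so any fixed percentage is a heuristic.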