I am profiling my application and trying to understand how it will behave in a low-memory situation.
I know I can simulate a low-memory warning in the Simulator, but I don't want to simulate only how the application reacts when a memory warning is triggered; instead, I want to see how it behaves when available memory is actually at its lowest.
To increase memory usage, I created a method that calls
char *a = malloc(1024 * 1024);
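For reference, here is a minimal sketch of that kind of method (the name is illustrative, and I'm assuming it gets called repeatedly to drive the allocation total up):

#include <stdlib.h>

/* Illustrative sketch, not the exact code from the app: each call allocates
 * one 1 MB block and never frees it, so the total allocated memory keeps
 * growing for as long as this is called in a loop. */
static void grow_memory_usage(void)
{
    char *a = malloc(1024 * 1024);
    (void)a; /* intentionally leaked; the buffer is never written to */
}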
The result is quite interesting; I am probably reading it wrong, and that is where I need help.
In the Allocations instrument, at some point the application was using more than 1 GB according to the "Live Bytes" column, yet it did not crash on the real device and kept working fine.

Looking at the Resident/Dirty sizes in VM Tracker, I got a different result: about 134 MB resident size and 78 MB dirty size.

What am I missing here? According to the docs:
Live Bytes: The number of bytes that have been allocated but not released.
Resident Size: The amount of actual memory used.
Why are the results different? Thanks