Exact documentation is not available, as far as I know. NSString and (IIRC) NSNumber are implemented as class clusters, i.e. when you request a new object, you may actually get an object of some undocumented subclass.
It also means that everything can change without warning when your program runs on a different version of the OS, so do not rely on exact numbers.
Now let's try a rough estimate. A pointer takes 4 bytes on all current Apple platforms, including iOS.
Objects are allocated on the heap; at the lowest level, heap allocation is done using malloc. I assume that the iOS malloc implementation is derived from the one used on Mac OS; see this article for details: http://cocoawithlove.com/2010/05/look-at-how-malloc-works-on-mac.html
The most important point is that the allocation quantum for small objects is 16 bytes, i.e. small objects will occupy a multiple of 16 bytes.
Each Objective-C object contains a pointer to its class.
So for an NSNumber containing an int, I would estimate 4 bytes for your pointer plus 16 bytes for the object (a 4-byte class pointer, a 4-byte int, and 8 bytes of wasted space).
There are different NSString subclasses for different situations. A string literal like @"2" will point to a statically allocated string, while a string created at runtime will likely have a different representation. In general, I would estimate 4 bytes (your pointer) + 16 bytes (the NSString object) + the number of characters * 2 (sizeof(unichar)), rounded up to a multiple of 16.
To summarize: I estimate that NSNumbers require about five times as much memory as ints, and that the same number represented as an NSString takes about ten times as much memory as an int.
Also note that allocating an Objective-C object is much slower than defining a local variable of type int. However, keep in mind that this often does not matter, and that premature optimization is the root of all evil.
wolfgang