How to determine the number of bytes used by UIImage?

I would like to be able to calculate the total number of bytes of a UIImage in memory.

I can make a rough estimate by multiplying the width by the height and then by the number of bytes per pixel, but I would like to calculate the size as accurately as possible.

+4
3 answers

In general, objects do not have a single meaningful “size”, since they can allocate and free any number of other objects as needed. sizeof(*myObj) gives you the size of the top-level structure, not a very useful number. If you want the full memory impact of allocating and using an object, run it under Instruments and watch the Allocations instrument.

With a UIImage, its practical size is the size of whatever backs it, usually either an NSData containing a PNG or a CGImageRef, plus the object overhead. (There is also a pixel buffer when the image is drawn to the screen or into another context, but that buffer belongs to the view or context in question, not to the UIImage. If a UIView is rendering it, that buffer is probably GL texture memory anyway.)

[UIImage imageWithData:[NSData dataWithContentsOfFile:@"foo.png"]] gives you a UIImage that is roughly the same size as the foo.png file, plus some minor overhead. [UIImage imageNamed:@"foo.png"] does the same, except that the class keeps a cache table with one object per file name and will make that object dump its copy of the PNG data in low-memory situations, reducing its “size” to little more than the bookkeeping.
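As a quick illustration of the first case (a minimal sketch; the file name foo.png is carried over from the answer and is assumed to exist in the app bundle):

    // Sketch: the length of the backing NSData is the practical in-memory
    // size of the compressed image data, plus some object overhead.
    NSString *path = [[NSBundle mainBundle] pathForResource:@"foo" ofType:@"png"];
    NSData *pngData = [NSData dataWithContentsOfFile:path];
    UIImage *image = [UIImage imageWithData:pngData];
    NSLog(@"Backing data: %lu bytes (plus object overhead)", (unsigned long)pngData.length);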

imageWithCGImage: and its variants give you a UIImage that uses a CGImage reference as its backing store, and CGImages can be any number of things depending on where they came from. If you drew into one, it is probably an uncompressed pixel buffer; calculate its size exactly as you suggest above. If you want to know what its size “would be” if it came from a file, check the length of the result of UIImagePNGRepresentation or UIImageJPEGRepresentation.
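A sketch of both measurements, assuming image is a UIImage backed by a CGImage (for instance one you drew into); the 0.8 JPEG quality is an arbitrary choice for illustration:

    CGImageRef cgImage = image.CGImage;
    // Uncompressed pixel buffer: bytes-per-row already includes any row padding.
    size_t pixelBufferBytes = CGImageGetBytesPerRow(cgImage) * CGImageGetHeight(cgImage);
    // What the image "would be" on disk as a PNG or JPEG file.
    NSUInteger pngBytes  = [UIImagePNGRepresentation(image) length];
    NSUInteger jpegBytes = [UIImageJPEGRepresentation(image, 0.8f) length];
    NSLog(@"Pixel buffer: %zu bytes, as PNG: %lu bytes, as JPEG (0.8): %lu bytes",
          pixelBufferBytes, (unsigned long)pngBytes, (unsigned long)jpegBytes);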

+6

Width * height * 4 will get you close. I am not sure there is a way to get the exact size, since the width is rounded up to some arbitrary, undocumented boundary (at least 4 pixels or 16 bytes, I would guess), and there are several extra internals of the object you would need to account for. Plus, there are probably internal attributes that may or may not hang off the object depending on how it is used.
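A sketch of that estimate, assuming image is the UIImage in question; the scale factor is added here so that point dimensions become pixel dimensions on Retina devices, and the row padding mentioned above is not accounted for:

    CGFloat scale = image.scale;                // points-to-pixels factor
    size_t approxBytes = (size_t)(image.size.width  * scale)
                       * (size_t)(image.size.height * scale)
                       * 4;                     // assume 4 bytes per pixel (RGBA)
    NSLog(@"Estimated pixel buffer: %zu bytes", approxBytes);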

0

I had to solve this for a Twitter app I was writing. Twitter rejects images larger than 3 MB, so I needed to compress the image enough to get below the 3 MB limit. Here is the code snippet I used:

    float compression = 1.0f;
    NSData *data = UIImageJPEGRepresentation(photo, compression);
    while (data.length > 3145728 && compression > 0.0f) // 3 MB; stop before quality goes negative
    {
        compression -= 0.1f;
        NSLog(@"Compressing Image to: %f", compression);
        data = UIImageJPEGRepresentation(photo, compression);
        NSLog(@"Image Bytes: %lu", (unsigned long)data.length);
    }

The compression algorithm used is not optimized.

So what is it doing?

Good question! The UIImageJPEGRepresentation method returns an NSData object containing the encoded bytes. To get the size, just check the length of that data!

There is also a UIImagePNGRepresentation method. Keep in mind that these methods have to build byte buffers and re-encode the image data, which can take some time. Fortunately, in my case most images taken with the iPhone were already under 3 MB, and compression was only needed for shots with a wide range of colors; still, calling UIImageJPEGRepresentation repeatedly (which can happen in the code above) may be slow.
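If the repeated calls ever become a bottleneck, one possible variant (an untested sketch, not the code the poster shipped) is to bisect the quality so UIImageJPEGRepresentation runs only a handful of times:

    // Sketch: probe the JPEG quality by bisection instead of stepping down
    // 0.1 at a time. Returns nil if even the lowest quality probed is too big.
    static NSData *JPEGDataUnderLimit(UIImage *photo, NSUInteger maxBytes)
    {
        CGFloat lo = 0.0f, hi = 1.0f;
        NSData *best = nil;
        for (int i = 0; i < 6; i++) {           // six probes narrow quality to ~0.015
            CGFloat quality = (lo + hi) / 2.0f;
            NSData *data = UIImageJPEGRepresentation(photo, quality);
            if (data.length <= maxBytes) {
                best = data;                    // fits; try a higher quality
                lo = quality;
            } else {
                hi = quality;                   // too big; lower the quality
            }
        }
        return best;
    }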

0
