When loading images into the application with [UIImage imageNamed:fileName], the system caches the images and therefore improves performance when the same image is reused.
Is there something similar for images created with Core Graphics? I mean images created from a context via UIImage *image = UIGraphicsGetImageFromCurrentImageContext();.
My current approach is to perform the drawing once and save the resulting UIImage to disk, so that the next time I need the same drawing I can load the cached image from disk instead of redrawing it. I'm looking for a better way to cache images produced with Core Graphics, as this approach feels cumbersome.
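For reference, the disk-based approach looks roughly like the sketch below. This is only an illustration of the idea, not my actual code; the function name and the block-based drawing API are hypothetical.

    // Minimal sketch of the current disk-caching approach (names are illustrative).
    #import <UIKit/UIKit.h>

    static UIImage *CachedImageForKey(NSString *key, CGSize size, void (^drawBlock)(CGContextRef ctx))
    {
        NSString *cachesDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) firstObject];
        NSString *path = [cachesDir stringByAppendingPathComponent:[key stringByAppendingPathExtension:@"png"]];

        // Reuse the image rendered on a previous run, if it exists on disk.
        UIImage *cached = [UIImage imageWithContentsOfFile:path];
        if (cached) {
            return cached;
        }

        // Otherwise draw into an offscreen image context and snapshot it.
        UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
        drawBlock(UIGraphicsGetCurrentContext());
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        // Persist the result so the drawing code doesn't have to run next time.
        [UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
        return image;
    }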
Could I even save the CGContextRef with the whole drawing in some caching data structure? I'm not sure whether that's possible.
My goal is to use only Core Graphics so that my application bundle stays smaller and I get resolution independence, but I'd like to improve performance, because complex drawing routines can take a lot of time to execute.
UPDATE: after running some performance tests, here are my results. Each figure is an average over more than 100 runs, with each run drawing either 19 or 25 distinct views. The rendered views included rectangles, circles, and text (as UILabels); fills, strokes, gradients, and shadows were used.
Caching was implemented as described in the answer, with an NSDictionary storing UIImage objects. Each run had its own cache, which was used for the duration of that run, but not all views could benefit from it (out of the 25 views there were 2 sets of 8 identical views, and out of the 19 views there were 2 sets of 6, which could be cached).
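The cache in the tests was essentially a dictionary keyed by a string describing the drawing, roughly along these lines (a simplified sketch; the class and method names are made up for illustration, not the actual test code):

    // Rough sketch of the in-memory cache used in the tests (illustrative names).
    #import <UIKit/UIKit.h>

    @interface DrawingCache : NSObject
    - (UIImage *)imageForKey:(NSString *)key size:(CGSize)size drawing:(void (^)(CGContextRef ctx))drawBlock;
    @end

    @implementation DrawingCache {
        NSMutableDictionary *_images; // key -> rendered UIImage
    }

    - (instancetype)init
    {
        if ((self = [super init])) {
            _images = [NSMutableDictionary dictionary];
        }
        return self;
    }

    - (UIImage *)imageForKey:(NSString *)key size:(CGSize)size drawing:(void (^)(CGContextRef ctx))drawBlock
    {
        UIImage *cached = _images[key];
        if (cached) {
            return cached; // identical views share one rendered image
        }

        UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
        drawBlock(UIGraphicsGetCurrentContext());
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        _images[key] = image;
        return image;
    }

    @end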
Here are the results:

iOS Simulator

19 views
No caching - average run time 11.667 ms
Caching - average run time 10.321 ms

25 views
No caching - average run time 14.304 ms
Caching - average run time 13.509 ms

Device

19 views
No caching - average run time 82.785 ms
Caching - average run time 77.831 ms

25 views
No caching - average run time 107.977 ms
Caching - average run time 100.094 ms
So caching gives only a modest improvement (around 8%).