I am writing an application in which the user takes a picture of themselves and then steps through a series of views to adjust the image, using a navigation controller. This works fine when the user takes the photo with the front camera (which my code selects by default on devices that have one), but when I repeat the process with the rear camera, I get about halfway through and the app crashes after a memory warning.
After profiling in Instruments, I see that my application's memory footprint sits at about 20-25 MB when using the low-resolution front camera image, but with the rear camera each change of view adds roughly another 33 MB until the app crashes at around 350 MB (on a 4S).
Below is the code I use to save the photo to the documents directory and then read that file back to set the image on a UIImageView. The "read" part of this code is repeated across several view controllers (in viewDidLoad) to set the saved image as the background image of each view as I move through them.
I removed all of the image-manipulation code to boil this down to a minimal case while trying to isolate the problem, and I cannot find it. As it stands, the entire application takes one picture up front and then uses that photo as the background image for 10+ views as the user moves through the navigation stack.
Now, obviously, higher-resolution photos will use more memory, but what I don't understand is why the low-resolution photo does not seem to use progressively more memory as I go, while the high-resolution photo keeps consuming more and more until the crash.
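As a sanity check, here is my own back-of-the-envelope math (an assumption on my part: the 4S rear camera produces 3264 x 2448 images, decoded at 4 bytes per pixel):

```objc
// 3264 * 2448 pixels * 4 bytes/pixel ≈ 32 MB per decoded bitmap,
// which is suspiciously close to the ~33 MB jump I see per view.
size_t bytesPerImage = 3264 * 2448 * 4;   // 31,961,088 bytes
```

So it looks as though each view controller is holding its own fully decoded copy of the photo.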
How I save and read the image:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    jpgData = UIImageJPEGRepresentation(image, 1);
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsPath = [paths objectAtIndex:0];
    filePath = [documentsPath stringByAppendingPathComponent:@"image.jpeg"];
    [jpgData writeToFile:filePath atomically:YES];

    [self dismissModalViewControllerAnimated:YES];
    [disableNextButton setEnabled:YES];

    jpgData = [NSData dataWithContentsOfFile:filePath];
    UIImage *image2 = [UIImage imageWithData:jpgData];
    [imageView setImage:image2];
}
Now, I know that I could try scaling the image down before I save it, and I plan to look into that next, but I don't understand why this doesn't work as it is. Perhaps I was mistaken in my impression that ARC automatically deallocates views and their subviews when they leave the top of the stack.
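For reference, this is roughly the kind of pre-save downscaling I have in mind (a sketch only; `scaleImage:toSize:` is a hypothetical helper of mine, not something already in the project, and the target size is arbitrary):

```objc
// Redraw the picked image into a smaller bitmap before writing it to disk,
// so every view that later loads it decodes a much smaller image.
- (UIImage *)scaleImage:(UIImage *)image toSize:(CGSize)newSize
{
    UIGraphicsBeginImageContextWithOptions(newSize, YES, 0.0); // 0.0 = screen scale
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}
```

I would call this on the picked image before `UIImageJPEGRepresentation`, e.g. with a screen-sized `CGSize` like 640 x 960.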
Can anyone shed some light on why I am piling up memory on my device like this? (Hopefully it's something simple that I'm completely overlooking.) Have I somehow managed to throw ARC out the window?
EDIT: How I load the image in my other views
- (void)loadBackground
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsPath = [paths objectAtIndex:0];
    NSString *filePath = [documentsPath stringByAppendingPathComponent:@"image.jpeg"];
    UIImage *image = [UIImage imageWithContentsOfFile:filePath];
    [backgroundImageView setImage:image];
}
How navigation between my view controllers is set up:

EDIT 2:
What my main declarations look like:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface PhotoPickerViewController : UIViewController <UIImagePickerControllerDelegate, UINavigationControllerDelegate>
{
    IBOutlet UIImageView *imageView;
    NSData *jpgData;
    NSString *filePath;
    UIImagePickerController *imagePicker;
    IBOutlet UIBarButtonItem *disableNextButton;
}
@end
In case it's relevant, here is how I invoke my image picker:
- (void)callCameraPicker
{
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera] == YES)
    {
        NSLog(@"Camera is available and ready");
        imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
        imagePicker.delegate = self;
        imagePicker.allowsEditing = NO;
        imagePicker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModePhoto;

        NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        for (AVCaptureDevice *device in devices)
        {
            if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)] && [[UIScreen mainScreen] scale] == 2.0)
            {
                imagePicker.cameraDevice = UIImagePickerControllerCameraDeviceFront;
            }
        }

        imagePicker.modalTransitionStyle = UIModalTransitionStyleCoverVertical;
        [self presentModalViewController:imagePicker animated:YES];
    }
    else
    {
        NSLog(@"Camera is not available");
        UIAlertView *cameraAlert = [[UIAlertView alloc] initWithTitle:@"Error"
                                                              message:@"Your device doesn't seem to have a camera!"
                                                             delegate:self
                                                    cancelButtonTitle:@"Dismiss"
                                                    otherButtonTitles:nil];
        [cameraAlert show];
    }
}
EDIT 3: I added logging to viewDidUnload and it is never actually called, so now I call loadBackground in viewWillAppear and set backgroundImageView to nil in viewDidDisappear. I expected this to help, but it makes no difference.
- (void)viewWillAppear:(BOOL)animated
{
    [self loadBackground];
}

- (void)viewDidDisappear:(BOOL)animated
{
    NSLog(@"ViewDidDisappear");
    backgroundImageView = nil;
}
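For completeness, a variant I also considered (a sketch of my own, not code currently in the project): clearing the image property as well as the ivar, in case it is the view hierarchy rather than my ivar that keeps the decoded bitmap alive.

```objc
- (void)viewDidDisappear:(BOOL)animated
{
    [super viewDidDisappear:animated];
    backgroundImageView.image = nil;   // release the decoded bitmap itself
    backgroundImageView = nil;         // then drop my reference to the view
}
```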