Memory issue when using a large UIImage array causes a crash (Swift)

In my application, I have an array of images that stores all the images taken with the camera, and I use a UICollectionView to display them. However, when the array reaches the 20th or so image, the app crashes. I believe this is a memory problem. How can I store images in an array in a memory-efficient way?

Michael Dauterman's answer suggests using thumbnails, but I was hoping there would be another solution. Perhaps storing the images as NSData, or in Core Data?

Camera.swift:

// What happens after the picture is chosen
func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [NSObject: AnyObject]) {
    // Cast the media type as a string
    let mediaType = info[UIImagePickerControllerMediaType] as! NSString
    self.dismissViewControllerAnimated(true, completion: nil)
    // If the media type actually is an image (JPEG)
    if mediaType.isEqualToString(kUTTypeImage as NSString as String) {
        let image = info[UIImagePickerControllerOriginalImage] as! UIImage
        // Our outlet for the image view
        appraisalPic.image = image
        // Picture taken, to be added to imageArray
        globalPic = image
        // If we aren't able to save, the error is passed to
        // image:didFinishSavingWithError:contextInfo: for handling
        if newMedia == true {
            UIImageWriteToSavedPhotosAlbum(image, self, "image:didFinishSavingWithError:contextInfo:", nil)
        }
    }
}

NewRecord.swift

var imageArray: [UIImage] = [UIImage]()

override func viewDidLoad() {
    super.viewDidLoad()
    // OUR IMAGE ARRAY WHICH HOLDS OUR PHOTOS, CRASHES AROUND 20th PHOTO ADDED
    imageArray.append(globalPic)
    // The rest of NewRecord.swift adds images from imageArray to a collection view
}

The approach that worked best for me was to keep the complete set of images in the photo library and access them through PHPhotoLibrary. The Photos framework comes with its own caching and memory management. The other solutions did not help in my case.

// In viewDidLoad: check whether the album exists; if not, create it
let fetchOptions = PHFetchOptions()
fetchOptions.predicate = NSPredicate(format: "title = %@", albumName)
let collection: PHFetchResult = PHAssetCollection.fetchAssetCollectionsWithType(.Album, subtype: .Any, options: fetchOptions)
if let first_Obj: AnyObject = collection.firstObject {
    // Found the album
    self.albumFound = true
    self.assetCollection = first_Obj as! PHAssetCollection
} else {
    // Album placeholder for the asset collection, used to reference the
    // collection in the completion handler
    var albumPlaceholder: PHObjectPlaceholder!
    // Create the album
    NSLog("\nFolder \"%@\" does not exist\nCreating now...", albumName)
    PHPhotoLibrary.sharedPhotoLibrary().performChanges({
        let request = PHAssetCollectionChangeRequest.creationRequestForAssetCollectionWithTitle(albumName)
        albumPlaceholder = request.placeholderForCreatedAssetCollection
    }, completionHandler: { (success: Bool, error: NSError!) in
        if success {
            println("Successfully created folder")
            self.albumFound = true
            if let collection = PHAssetCollection.fetchAssetCollectionsWithLocalIdentifiers([albumPlaceholder.localIdentifier], options: nil) {
                self.assetCollection = collection.firstObject as! PHAssetCollection
            }
        } else {
            println("Error creating folder")
            self.albumFound = false
        }
    })
}

// What happens after the picture is chosen
func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [NSObject: AnyObject]) {
    let mediaType = info[UIImagePickerControllerMediaType] as! NSString
    if mediaType.isEqualToString(kUTTypeImage as NSString as String) {
        let image = info[UIImagePickerControllerOriginalImage] as! UIImage
        appraisalPic.image = image
        globalPic = appraisalPic.image!
        if newMedia == true {
            UIImageWriteToSavedPhotosAlbum(image, self, "image:didFinishSavingWithError:contextInfo:", nil)
            self.dismissViewControllerAnimated(true, completion: nil)
            picTaken = true
            println(photosAsset)
        }
    }
}

I have run into low-memory issues myself in applications that need to work with many high-resolution UIImage objects.

The solution is to store thumbnails of your images (which take up much less memory) in your array and display those. If the user really needs to see the full-resolution image, you can let them tap the thumbnail and then load and display the full-size UIImage from the camera roll.

Here is the code that allows you to create thumbnails:

// image here is your original image
let size = CGSizeApplyAffineTransform(image.size, CGAffineTransformMakeScale(0.5, 0.5))
let hasAlpha = false
let scale: CGFloat = 0.0 // Automatically use scale factor of main screen
UIGraphicsBeginImageContextWithOptions(size, !hasAlpha, scale)
image.drawInRect(CGRect(origin: CGPointZero, size: size))
let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
imageArray.append(scaledImage)

More information about these methods can be found in this NSHipster article.

Swift 4:

// image here is your original image
let size = image.size.applying(CGAffineTransform(scaleX: 0.5, y: 0.5))
let hasAlpha = false
let scale: CGFloat = 0.0 // Automatically use scale factor of main screen
UIGraphicsBeginImageContextWithOptions(size, !hasAlpha, scale)
image.draw(in: CGRect(origin: .zero, size: size))
let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()

Best practice is to keep the image array short. Use the array only to cache the images in the current scroll range (plus a few that are about to be shown, for a smoother experience). Keep the rest in Core Data and load them dynamically while scrolling. Otherwise, the app will eventually crash even when using thumbnails.
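A rough sketch of that sliding-window idea, using NSCache (which evicts automatically under memory pressure). `PhotoCell` and `loadImageFromDisk(at:)` are hypothetical names standing in for your own cell class and Core Data / disk loading code:

```swift
import UIKit

let imageCache = NSCache<NSNumber, UIImage>()

func collectionView(_ collectionView: UICollectionView,
                    cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "PhotoCell",
                                                  for: indexPath) as! PhotoCell
    let key = NSNumber(value: indexPath.item)
    if let cached = imageCache.object(forKey: key) {
        // In the current scroll range: served from memory
        cell.imageView.image = cached
    } else {
        // Off-window images live on disk / in Core Data and are loaded on demand
        DispatchQueue.global(qos: .userInitiated).async {
            let image = loadImageFromDisk(at: indexPath.item)  // hypothetical helper
            DispatchQueue.main.async {
                // (In production, also verify the cell still shows this index path)
                if let image = image {
                    imageCache.setObject(image, forKey: key)
                    cell.imageView.image = image
                }
            }
        }
    }
    return cell
}
```

NSCache is a good fit here because the system trims it for you on memory warnings, so the "array" never grows unbounded.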


Let me begin with a simple point: you should not implement yourself what has already been built and tested by thousands of people. There are several mature libraries that take care of this problem by implementing a disk cache, a memory cache, and buffers. Basically everything you will ever need, and more.

Two libraries I can recommend are Haneke and SDWebImage.

Both of them are great, so which one you pick is really a matter of preference (I like Haneke better). They let you load images off the main thread, whether from the web, from your bundle, or from the file system. They also provide UIImageView extensions with one-line methods that load images easily and take care of caching as they do so.

Cache

For your specific problem, you can use the image cache directly, for example (from the SDWebImage documentation):

 [[SDImageCache sharedImageCache] storeImage:myImage forKey:myCacheKey]; 

Once the image is in the cache, you can retrieve it just as easily:

SDImageCache *imageCache = [[SDImageCache alloc] initWithNamespace:@"myNamespace"];
[imageCache queryDiskCacheForKey:myCacheKey done:^(UIImage *image) {
    // image is not nil if the image was found
}];

All processing and memory balancing is done by the library itself, so you don’t have to worry about anything. You can optionally combine it with resizing methods to store smaller images if they are huge, but it is up to you.

Hope this helps!


When you receive a memory warning in your view controller, you can remove the photos you are not currently showing from the array and save them to files, then load them again when they are needed. Or simply detect when cells go off screen with collectionView:didEndDisplayingCell:forItemAtIndexPath:.

Store them in an array like this:

 var cachedImages = [(section: Int, row: Int, imagePath: String)]() 

Using:

func saveImage(indexPath: NSIndexPath, image: UIImage) {
    let imageData = UIImagePNGRepresentation(image)
    let documents = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0]
    let imagePath = (documents as NSString).stringByAppendingPathComponent("\(indexPath.section)-\(indexPath.row)-cached.png")
    if imageData?.writeToFile(imagePath, atomically: true) == true {
        print("saved!")
        cachedImages.append((indexPath.section, indexPath.row, imagePath))
    } else {
        print("not saved!")
    }
}

And return them with:

func getImage(indexPath indexPath: NSIndexPath) -> UIImage? {
    let filteredCachedImages = cachedImages.filter { $0.section == indexPath.section && $0.row == indexPath.row }
    if let firstItem = filteredCachedImages.first {
        return UIImage(contentsOfFile: firstItem.imagePath)
    }
    return nil
}

Also, use something like this answer to avoid blocking the main thread.

I gave an example: find it here


Use the following code to reduce the size of the image when saving it:

var newImage: UIImage
let size = CGSizeMake(400, 300)
UIGraphicsBeginImageContext(size)
image.drawInRect(CGRectMake(0, 0, 400, 300))
newImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()

Instead of building an array of photos, I would suggest optimizing your code by keeping only an array of URLs (iOS < 8.1: AssetsLibrary) or localIdentifiers (iOS >= 8.1: Photos framework), and fetching the images through them only when needed, that is, when they are displayed.
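A sketch of the localIdentifier approach using modern Photos API names (assuming you keep a `localIdentifiers: [String]` array of your own; adapt the names to your deployment target):

```swift
import Photos
import UIKit

var localIdentifiers = [String]()  // stored instead of UIImage objects

// Fetch a thumbnail for one identifier only when its cell is displayed
func requestThumbnail(for identifier: String,
                      targetSize: CGSize,
                      completion: @escaping (UIImage?) -> Void) {
    let assets = PHAsset.fetchAssets(withLocalIdentifiers: [identifier], options: nil)
    guard let asset = assets.firstObject else {
        completion(nil)
        return
    }
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true  // allow iCloud downloads if needed
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: targetSize,
                                          contentMode: .aspectFill,
                                          options: options) { image, _ in
        completion(image)
    }
}
```

Because only lightweight identifier strings live in your array, memory use stays flat no matter how many photos the user takes.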

ARC sometimes does not release image memory as early as you might expect when images are stored in an array, which can cause memory to leak in many places.

You can use autoreleasepool to remove unnecessary links that ARC cannot release.
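For example, wrapping per-image work in an explicit pool (a generic sketch; `imagePaths` is an assumed array of file paths, not from the question's code):

```swift
import UIKit

let imagePaths: [String] = []  // assumed: paths to images on disk

// Process images in batches so autoreleased temporaries are drained
// after each iteration instead of piling up until the loop finishes
for path in imagePaths {
    autoreleasepool {
        if let image = UIImage(contentsOfFile: path) {
            // ... resize, generate a thumbnail, save it, etc. ...
            _ = image
        }
        // Temporary objects created in this pass are released here,
        // when the pool drains, rather than at the end of the whole loop
    }
}
```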

To add: if you captured an image with the camera, the amount of memory it occupies in the array is much larger than the image's file size (although I'm not sure why!).


You could store just the raw image data in the array, rather than full UIImage objects with all their metadata and overhead. I don't know whether you need the metadata, but you may be able to do without it. Another alternative is to write each image to a temporary file and retrieve it later.
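A minimal sketch of the raw-data idea, assuming JPEG compression is acceptable for your use case (the function names here are illustrative, not from the question's code):

```swift
import UIKit

var imageDataArray = [Data]()  // compressed bytes instead of UIImage objects

func store(_ image: UIImage) {
    // JPEG data is typically far smaller than the decoded
    // bitmap that a UIImage keeps in memory
    if let data = image.jpegData(compressionQuality: 0.8) {
        imageDataArray.append(data)
    }
}

func image(at index: Int) -> UIImage? {
    // Decoding happens only when the image is actually needed
    return UIImage(data: imageDataArray[index])
}
```

Note the trade-off: each display decodes the JPEG again, so this saves memory at the cost of some CPU time per appearance.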

