Application crashes when extracting all frames from long videos with AVAssetImageGenerator

I successfully extract all the frames from a video and it works great for short videos, but when I try to extract frames from a video longer than 60 seconds, the application crashes on the device.

I am extracting 30 frames per second from the video.

Below is the code I wrote:

```swift
var videoSec = Float64(0)

func startImageConversion() {
    let filePath = NSURL.fileURLWithPath(self.savedVideoURL)
    videoSec = self.getVideoTime(filePath)
    videoImagesArray = self.getImagesArrayFromVideo(filePath)
    print("Images Count \(videoImagesArray.count)")
}
```

// MARK: - Video editor functions

```swift
func getImagesArrayFromVideo(filePath: NSURL) -> NSMutableArray {
    let imageArray = NSMutableArray()
    print("Video Sec is ", videoSec)
    let vidSec = Float64(videoSec)
    let theOpts = [
        AVURLAssetPreferPreciseDurationAndTimingKey: true,
        AVURLAssetReferenceRestrictionsKey: 0 // AVAssetReferenceRestrictions.RestrictionForbidNone
    ]
    let asset = AVURLAsset(URL: filePath, options: theOpts)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.maximumSize = CGSize(width: Double(self.view.frame.size.width),
                                   height: Double(self.view.frame.size.height))
    generator.appliesPreferredTrackTransform = false
    generator.requestedTimeToleranceBefore = kCMTimeZero
    generator.requestedTimeToleranceAfter = kCMTimeZero

    let vidLength: CMTime = asset.duration
    let fps = vidLength.timescale
    print("Video Length :- \(vidLength) FPS is :- \(fps)")

    let mainValue = Float64(vidLength.value)
    let divide = vidSec * 30        // 30 frames for every second of video
    let byVal = mainValue / divide  // step between requested time values

    for i in Float64(0).stride(through: mainValue, by: byVal) {
        let image = self.generateVideoThumbs(filePath,
                                             second: i,
                                             thumbWidth: Double(self.view.frame.size.width),
                                             generator: generator,
                                             fps: fps)
        if image != nil {
            imageArray.addObject(image)
        }
    }
    print("value of Array is ", imageArray.count)
    return imageArray
}

private func getVideoTime(url: NSURL) -> Float64 {
    let videoTime = AVURLAsset(URL: url, options: nil)
    print("videoTime.preferredRate = \(videoTime.preferredRate)")
    return CMTimeGetSeconds(videoTime.duration)
}

private func generateVideoThumbs(url: NSURL, second: Float64, thumbWidth: Double,
                                 generator: AVAssetImageGenerator, fps: CMTimeScale) -> CGImage! {
    let thumbTime = CMTimeMake(Int64(second), fps)
    var actualTime: CMTime = CMTimeMake(0, 0)
    print("thumbTime - \(thumbTime)")
    do {
        let ref = try generator.copyCGImageAtTime(thumbTime, actualTime: &actualTime)
        print("actualTime - \(actualTime)")
        return ref
    } catch {
        print(error)
        return nil
    }
}
```

Everything works fine if the video is shorter than 60 seconds. It also works great on the simulator, but on the device the app shuts down without any warning or error. Any help would be appreciated, thanks.

ios iphone swift
3 answers
```swift
autoreleasepool {
    if image != nil {
        let newImage = UIImage(CGImage: image)
        imageArray.addObject(newImage)
    }
}
```

As the answer above says, this is a memory problem. Just wrap the body of your loop in an `autoreleasepool` as shown. I tested it and it worked fine.
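The snippet above is a fragment of the loop body, so here is a self-contained sketch of the pattern in modern Swift syntax. `Data` buffers stand in for decoded frames (the AVFoundation types are beside the point); the names `processFrames` and the 1 MB size are illustrative only:

```swift
import Foundation

// Stand-in for per-frame work: each iteration allocates a large temporary
// buffer, much as copyCGImageAtTime materializes a full frame on every call.
func processFrames(count: Int) -> [Int] {
    var sizes = [Int]()
    for _ in 0..<count {
        // The pool drains at the end of every iteration, so autoreleased
        // temporaries are freed per frame instead of piling up until the
        // whole loop finishes.
        autoreleasepool {
            let frame = Data(count: 1_000_000) // pretend: one decoded frame
            sizes.append(frame.count)          // keep only the small result
        }
    }
    return sizes
}
```

The key point is that only the small value you actually need survives each pass of the pool; the big per-frame temporaries do not.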


Core Graphics objects are not managed by autorelease pools in Objective-C, though Swift's ARC can handle CF types. Either way, you can still run into trouble with `CGImage` in some situations, and this may be one of them: you clearly have a problem with the size of your `imageArray`, and I'm fairly sure the system is killing your app on the device because it runs out of memory. The iOS Simulator uses your Mac's memory, so everything works there.
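A back-of-envelope estimate shows why. The numbers below are assumptions for illustration (a 750 x 1334-pixel thumbnail, roughly an iPhone-class screen, at 4 bytes per pixel; your actual frame size depends on `generator.maximumSize` and the screen scale):

```swift
// Assumed: 750 x 1334 px per thumbnail, 4 bytes/pixel (BGRA),
// 30 fps, 60-second video.
let bytesPerFrame = 750 * 1334 * 4           // about 4 MB per CGImage
let frameCount = 30 * 60                     // 1800 frames
let totalBytes = bytesPerFrame * frameCount
let totalGB = Double(totalBytes) / 1_073_741_824

print("roughly \(totalGB) GB kept alive by imageArray")
```

Under these assumptions the array pins several gigabytes of pixel data, far beyond what iOS will let a single app hold, which is consistent with the device killing the app while the simulator survives.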

First, I suggest you rewrite your `generateVideoThumbs(...)` function to return a `UIImage` (you already tried this, as I can see from the commented-out code; just bring that version back):

```swift
private func generateVideoThumbs(url: NSURL, second: Float64, thumbWidth: Double,
                                 generator: AVAssetImageGenerator, fps: CMTimeScale) -> UIImage! {
    let thumbTime = CMTimeMake(Int64(second), fps)
    var actualTime: CMTime = CMTimeMake(0, 0)
    print("thumbTime - \(thumbTime)")
    do {
        let ref = try generator.copyCGImageAtTime(thumbTime, actualTime: &actualTime)
        let resultImage = UIImage(CGImage: ref)
        print("actualTime - \(actualTime)")
        return resultImage
    } catch {
        print(error)
        return nil
    }
}
```

Secondly, I suggest you save the images to disk and keep only the file paths in `imageArray`. Something like this:

```swift
if image != nil {
    let data = UIImagePNGRepresentation(image)
    let filename = (NSTemporaryDirectory() as NSString).stringByAppendingPathComponent("\(i).png")
    data?.writeToFile(filename, atomically: true)
    imageArray.addObject(filename)
}
```

You can then load the images back from those paths and do whatever you planned with them. This avoids the memory pressure and lets you use the array however you like (copy it, pass it to another class, and so on).
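The write-to-disk idea can be sketched with pure Foundation in modern Swift syntax; raw `Data` stands in for the PNG bytes, and the function name `persistFrames` plus the `"N.png"` naming are illustrative assumptions:

```swift
import Foundation

// Write each "frame" to a temp file and keep only the short file URL
// in memory; the multi-megabyte pixel buffers never accumulate.
func persistFrames(_ frames: [Data]) throws -> [URL] {
    let dir = URL(fileURLWithPath: NSTemporaryDirectory())
    var paths = [URL]()
    for (i, frame) in frames.enumerated() {
        let file = dir.appendingPathComponent("\(i).png")
        try frame.write(to: file)  // frame data goes to disk immediately
        paths.append(file)         // only the URL stays in the array
    }
    return paths
}
```

Reading a frame back is then just `Data(contentsOf:)` (or `UIImage(contentsOfFile:)` on iOS) at the moment you actually need it.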

P.S. I wrote the code without compiling it, but I think the idea is clear.


This is a memory problem. `copyCGImageAtTime` is synchronous and blocks the calling thread, so calling it in a `for` loop for every frame causes the problem.

Use

```objc
- (void)generateCGImagesAsynchronouslyForTimes:(NSArray<NSValue *> *)requestedTimes
                             completionHandler:(AVAssetImageGeneratorCompletionHandler)handler;
```

instead of

```objc
- (nullable CGImageRef)copyCGImageAtTime:(CMTime)requestedTime
                              actualTime:(nullable CMTime *)actualTime
                                   error:(NSError * _Nullable * _Nullable)outError CF_RETURNS_RETAINED;
```
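In Swift 2 syntax matching the question's code, the asynchronous variant could be called roughly like this. This is an uncompiled sketch: the frame times, the 30 fps timescale, and the handler body are illustrative, and `vidSec`/`generator` are the question's own variables:

```swift
// Request all frame times up front; AVFoundation then delivers frames
// one at a time on a background queue instead of blocking a loop on
// each synchronous copy.
var times = [NSValue]()
for i in 0..<Int(vidSec * 30) {
    times.append(NSValue(CMTime: CMTimeMake(Int64(i), 30)))
}
generator.generateCGImagesAsynchronouslyForTimes(times) { requestedTime, image, actualTime, result, error in
    if let cgImage = image where result == .Succeeded {
        // Persist or downscale each frame here; do not accumulate
        // full-size CGImages in an in-memory array.
    }
}
```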

Try it and see.

Have a good day.

