iOS - Reverse Video File (.mov)

Requirement :

It may sound strange, but this is what I want to achieve: I want to create a movie file (.mov) that plays in reverse order, just as when a movie is rewound. I also want to maintain the same frame rate as the original video.

NOTE: I do not just want to play the video in reverse. I want to generate a new movie file that plays in reverse order.

My research :

I thought of the following steps to accomplish this:

  • Split the video into fragments at a specific frame rate using AVAssetExportSession .
  • Combine these fragments in reverse order into a single file using AVMutableComposition and AVAssetExportSession (see the sketch after this list).
  • Also merge the audio of each fragment into the new video file during the merge process.
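
For illustration, here is a minimal sketch of the composition step under those assumptions. The function name exportReversedByChunks, the one-second chunkDuration, and the export preset are hypothetical choices, not part of the question. Note that each chunk still plays forward internally, which is why real code needs chunks approaching single frames for a true reverse, and why this approach becomes so slow:

import AVFoundation

// Hypothetical sketch: re-insert fixed-size chunks of the source asset into a
// composition in reverse order, then export the result. Names are illustrative.
func exportReversedByChunks(from sourceURL: URL, to outputURL: URL,
                            completion: @escaping (Bool) -> Void) {
    let asset = AVAsset(url: sourceURL)
    let composition = AVMutableComposition()
    let chunkDuration = CMTime(seconds: 1, preferredTimescale: 600)

    // Collect the start time of every chunk in the source.
    var chunkStarts = [CMTime]()
    var cursor = CMTime.zero
    while cursor < asset.duration {
        chunkStarts.append(cursor)
        cursor = CMTimeAdd(cursor, chunkDuration)
    }

    // Insert the chunks back to front; each chunk itself still plays forward.
    var insertPoint = CMTime.zero
    for start in chunkStarts.reversed() {
        let remaining = CMTimeSubtract(asset.duration, start)
        let range = CMTimeRange(start: start,
                                duration: CMTimeMinimum(chunkDuration, remaining))
        try? composition.insertTimeRange(range, of: asset, at: insertPoint)
        insertPoint = CMTimeAdd(insertPoint, range.duration)
    }

    guard let session = AVAssetExportSession(asset: composition,
                                             presetName: AVAssetExportPresetHighestQuality) else {
        completion(false)
        return
    }
    session.outputURL = outputURL
    session.outputFileType = .mov
    session.exportAsynchronously {
        completion(session.status == .completed)
    }
}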

Using the steps above, I can get the resulting video file in reverse order, but this approach has the drawbacks below.

  • It takes a lot of time if the video is long.
  • It also consumes huge amounts of CPU time and memory to complete the process.

Does anyone have a more optimized way to achieve this? Any suggestion would be appreciated.

3 answers

Here is my solution; maybe it can help you: https://github.com/KayWong/VideoReverse


Swift 5, with thanks to Andy Hin, since I based this on http://www.andyhin.com/post/5/reverse-video-avfoundation :

// This belongs in a utility class (the log messages suggest it is called
// VideoWriter) and requires `import AVFoundation`.
class func reverseVideo(inURL: URL, outURL: URL, queue: DispatchQueue, _ completionBlock: ((Bool) -> Void)?) {
    let asset = AVAsset(url: inURL)
    guard let reader = try? AVAssetReader(asset: asset),
          let videoTrack = asset.tracks(withMediaType: .video).first else {
        assert(false)
        completionBlock?(false)
        return
    }

    let width = videoTrack.naturalSize.width
    let height = videoTrack.naturalSize.height

    // Decode every frame of the source video into memory.
    let readerSettings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
    ]
    let readerOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: readerSettings)
    reader.add(readerOutput)
    reader.startReading()

    var buffers = [CMSampleBuffer]()
    while let nextBuffer = readerOutput.copyNextSampleBuffer() {
        buffers.append(nextBuffer)
    }
    let status = reader.status
    reader.cancelReading()

    guard status == .completed, let firstBuffer = buffers.first else {
        assert(false)
        completionBlock?(false)
        return
    }
    let sessionStartTime = CMSampleBufferGetPresentationTimeStamp(firstBuffer)

    // Configure the writer, reusing the source track's format where possible.
    let writerSettings: [String: Any] = [
        AVVideoCodecKey : AVVideoCodecType.h264,
        AVVideoWidthKey : width,
        AVVideoHeightKey: height,
    ]
    let writerInput: AVAssetWriterInput
    if let formatDescription = videoTrack.formatDescriptions.last {
        writerInput = AVAssetWriterInput(mediaType: .video,
                                         outputSettings: writerSettings,
                                         sourceFormatHint: (formatDescription as! CMFormatDescription))
    } else {
        writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: writerSettings)
    }
    writerInput.transform = videoTrack.preferredTransform
    writerInput.expectsMediaDataInRealTime = false

    guard let writer = try? AVAssetWriter(url: outURL, fileType: .mp4),
          writer.canAdd(writerInput) else {
        assert(false)
        completionBlock?(false)
        return
    }

    let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput,
                                                                  sourcePixelBufferAttributes: nil)
    let group = DispatchGroup()
    group.enter()
    writer.add(writerInput)
    writer.startWriting()
    writer.startSession(atSourceTime: sessionStartTime)

    // Keep the original presentation timestamps, but pair each one with the
    // image buffer taken from the opposite end of the array.
    var currentSample = 0
    writerInput.requestMediaDataWhenReady(on: queue) {
        for i in currentSample..<buffers.count {
            currentSample = i
            if !writerInput.isReadyForMoreMediaData {
                return // the closure is called again when the writer is ready
            }
            let presentationTime = CMSampleBufferGetPresentationTimeStamp(buffers[i])
            guard let imageBuffer = CMSampleBufferGetImageBuffer(buffers[buffers.count - i - 1]) else {
                WLog("VideoWriter reverseVideo: warning, could not get imageBuffer from SampleBuffer...")
                continue
            }
            if !pixelBufferAdaptor.append(imageBuffer, withPresentationTime: presentationTime) {
                WLog("VideoWriter reverseVideo: warning, could not append imageBuffer...")
            }
        }
        // finish
        writerInput.markAsFinished()
        group.leave()
    }

    group.notify(queue: queue) {
        writer.finishWriting {
            if writer.status != .completed {
                WLog("VideoWriter reverseVideo: error - \(String(describing: writer.error))")
                completionBlock?(false)
            } else {
                completionBlock?(true)
            }
        }
    }
}
// WLog is the author's logging helper; substitute print(_:) if you do not have it.
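
A hypothetical call site, assuming the method lives in a class named VideoWriter (as its log messages suggest); the file URLs and queue label are placeholders:

import AVFoundation

// Assumed names: VideoWriter is the enclosing class implied by the log
// messages above; the URLs are illustrative temporary paths.
let inputURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("input.mov")
let outputURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("reversed.mp4")
let workQueue = DispatchQueue(label: "reverse-video-queue")

VideoWriter.reverseVideo(inURL: inputURL, outURL: outputURL, queue: workQueue) { success in
    print("Reverse finished, success: \(success)")
}

Note that the writer is created with fileType .mp4; if you need a QuickTime container as in the question, change it to .mov. Also note that this answer only reads the video track, so any audio is dropped.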

To achieve this task, you need to use the AVFoundation framework.

I have only done video editing on 30-second clips using AVAssetExportSession and AVMutableComposition (a small example follows).
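
For instance, a 30-second trim of that kind only needs a timeRange on the export session; the function name and URLs below are placeholders:

import AVFoundation

// Hypothetical example: export only the first 30 seconds of a clip.
func trimFirstThirtySeconds(of sourceURL: URL, to outputURL: URL) {
    let asset = AVAsset(url: sourceURL)
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetHighestQuality) else {
        return
    }
    session.outputURL = outputURL
    session.outputFileType = .mov
    session.timeRange = CMTimeRange(start: .zero,
                                    duration: CMTime(seconds: 30, preferredTimescale: 600))
    session.exportAsynchronously {
        print("Trim finished with status: \(session.status.rawValue)")
    }
}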

This is the link you should go through; it will help you fully:

http://www.subfurther.com/blog/category/avfoundation/

Also, it would be even better to refer to the WWDC conference PDFs on media editing.

The shared resources are at this link: https://developer.apple.com/videos/wwdc/2010/ and they include the session covering Editing Media with AVFoundation.

Regarding processor cycles and memory: exporting also consumes a lot of memory, so some overhead is unavoidable.

