I use AVMutableComposition and AVAssetExportSession to crop videos down. Seemingly at random (I cannot reproduce it deterministically), some user videos end up with several black frames at the beginning of the exported clip; the audio is unaffected. I am confident the source videos themselves are not the cause, since this happens across a variety of videos from different sources.
Any insight into why these videos are exported with black frames at the beginning would be very welcome. Thanks!
Some relevant code (sorry for the length):
// AVURLAssetPreferPreciseDurationAndTimingKey added in an attempt to solve the issue
let videoAsset = AVURLAsset(URL: url, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])

var mixComposition = AVMutableComposition()
let compositionVideoTrack = mixComposition.addMutableTrackWithMediaType(
    AVMediaTypeVideo,
    preferredTrackID: Int32(kCMPersistentTrackID_Invalid)
)

let clipVideoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
let videoSize = clipVideoTrack.naturalSize

// startTime and duration are NSTimeInterval values
let start = startTime == 0
    ? kCMTimeZero
    : CMTimeMakeWithSeconds(startTime, videoAsset.duration.timescale)
var dur = CMTimeMakeWithSeconds(duration, videoAsset.duration.timescale)
if dur.value >= videoAsset.duration.value {
    dur = videoAsset.duration
}

compositionVideoTrack.insertTimeRange(
    CMTimeRange(start: start, duration: dur),
    ofTrack: clipVideoTrack,
    atTime: kCMTimeZero,
    error: nil
)
compositionVideoTrack.preferredTransform = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0].preferredTransform

let compositionAudioTrack = mixComposition.addMutableTrackWithMediaType(
    AVMediaTypeAudio,
    preferredTrackID: Int32(kCMPersistentTrackID_Invalid)
)
let clipAudioTrack = videoAsset.tracksWithMediaType(AVMediaTypeAudio)[0] as! AVAssetTrack
compositionAudioTrack.insertTimeRange(
    CMTimeRange(start: start, duration: dur),
    ofTrack: clipAudioTrack,
    atTime: kCMTimeZero,
    error: nil
)

let parentLayer = CALayer()
parentLayer.backgroundColor = UIColor.blackColor().CGColor
let videoLayer = CALayer()
videoLayer.backgroundColor = UIColor.blackColor().CGColor

var parentFrame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
// Round the crop frame down to even dimensions
if parentFrame.width % 2 > 0 {
    parentFrame.size.width = parentFrame.size.width - 1
}
if parentFrame.size.height % 2 > 0 {
    parentFrame.size.height = parentFrame.size.height - 1
}
parentLayer.frame = parentFrame
videoLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
parentLayer.addSublayer(videoLayer)

let videoComp = AVMutableVideoComposition()
videoComp.renderSize = parentLayer.frame.size
videoComp.frameDuration = CMTimeMake(1, Int32(clipVideoTrack.nominalFrameRate))
videoComp.animationTool = AVVideoCompositionCoreAnimationTool(
    postProcessingAsVideoLayer: videoLayer,
    inLayer: parentLayer
)

let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: mixComposition.duration)
let videoTrack = mixComposition.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
layerInstruction.setTransform(
    CGAffineTransformMakeScale(
        parentLayer.frame.size.width / videoSize.width,
        parentLayer.frame.size.height / videoSize.height
    ),
    atTime: kCMTimeZero
)
instruction.layerInstructions = [layerInstruction]
videoComp.instructions = [instruction]

// Export
let exportSession = AVAssetExportSession(
    asset: mixComposition,
    presetName: AVAssetExportPresetHighestQuality
)
exportSession.videoComposition = videoComp
let renderFileName = "video.mp4"
let renderURL = NSURL(fileURLWithPath: NSTemporaryDirectory().stringByAppendingPathComponent(renderFileName))
exportSession.outputURL = renderURL
exportSession.outputFileType = AVFileTypeQuickTimeMovie
exportSession.exportAsynchronouslyWithCompletionHandler { ... }
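In case it's relevant: here is a minimal, self-contained sketch (plain Swift, no AVFoundation, hypothetical timescale values) of the rounding that happens when I build `start` from `videoAsset.duration.timescale`. Movie-level durations often use a coarse timescale like 600 while video tracks use a much finer one, so the cut point can shift by a fraction of a millisecond. I doubt that alone explains whole black frames, but I wanted to rule it out:

```swift
let startSeconds = 1.234

// Roughly the same idea as CMTimeMakeWithSeconds: express a seconds value
// as an integer number of ticks at a given timescale (nearest tick).
func mediaTicks(_ seconds: Double, timescale: Int64) -> Int64 {
    return Int64((seconds * Double(timescale)).rounded())
}

let coarse = mediaTicks(startSeconds, timescale: 600)     // hypothetical asset duration timescale
let fine   = mediaTicks(startSeconds, timescale: 30_000)  // hypothetical track natural timescale

print(Double(coarse) / 600.0)    // 1.2333... — shifted by ~0.7 ms
print(Double(fine) / 30_000.0)   // 1.234 — exact at this timescale
```

So a coarse timescale quantizes the requested start time, but only by well under one frame duration at typical frame rates.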