AVFoundation does not export orientation correctly

I am trying to overlay an image on a video and export the result. The composition and export work, but the exported video comes out rotated on its side.

Sorry for the bulk of code. I have seen answers suggesting applying the transform to compositionVideoTrack.preferredTransform, but that does nothing. Adding a transform to the AVMutableVideoCompositionInstruction does nothing either.

I suspect things start to go wrong here:

    // I feel like this loading here is the problem
    let videoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
    // because it makes our parentLayer and videoLayer sizes wrong
    let videoSize = videoTrack.naturalSize
    // this is returning 1920x1080, so it is rotating the video
    print("\(videoSize.width) , \(videoSize.height)")

As a result, the frame sizes are wrong for the rest of the method, so when we go on to create the overlay layer, its frame is wrong too:

    let aLayer = CALayer()
    aLayer.contents = UIImage(named: "OverlayTestImageOverlay")?.CGImage
    aLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height)
    aLayer.opacity = 1
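As a cross-check, the size the video will actually display at can be derived by running naturalSize through the track's preferredTransform instead of using naturalSize directly. This is only a sketch, using the same Swift 2 era API and variable names as the question (videoTrack is the track loaded above):

    // Apply the rotation metadata to the natural size: for a portrait
    // recording whose naturalSize reports 1920x1080, the transformed
    // rect comes out 1080x1920.
    let transformedRect = CGRectApplyAffineTransform(
        CGRect(origin: CGPointZero, size: videoTrack.naturalSize),
        videoTrack.preferredTransform)
    let displaySize = transformedRect.size // width/height already swapped if rotated

If displaySize differs from naturalSize, the asset carries rotation metadata, and the layer frames and renderSize should be based on displaySize.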

Here is my complete method.

    func combineImageVid() {
        let path = NSBundle.mainBundle().pathForResource("SampleMovie", ofType: "MOV")
        let fileURL = NSURL(fileURLWithPath: path!)

        let videoAsset = AVURLAsset(URL: fileURL)
        let mixComposition = AVMutableComposition()

        let compositionVideoTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
        var clipVideoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)

        do {
            try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), ofTrack: clipVideoTrack[0], atTime: kCMTimeZero)
        } catch _ {
            print("failed to insertTimeRange")
        }

        compositionVideoTrack.preferredTransform = videoAsset.preferredTransform

        // I feel like this loading here is the problem
        let videoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
        // because it makes our parentLayer and videoLayer sizes wrong
        let videoSize = videoTrack.naturalSize
        // this is returning 1920x1080, so it is rotating the video
        print("\(videoSize.width) , \(videoSize.height)")

        let aLayer = CALayer()
        aLayer.contents = UIImage(named: "OverlayTestImageOverlay")?.CGImage
        aLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height)
        aLayer.opacity = 1

        let parentLayer = CALayer()
        let videoLayer = CALayer()
        parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height)
        videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height)
        parentLayer.addSublayer(videoLayer)
        parentLayer.addSublayer(aLayer)

        let videoComp = AVMutableVideoComposition()
        videoComp.renderSize = videoSize
        videoComp.frameDuration = CMTimeMake(1, 30)
        videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)

        let instruction = AVMutableVideoCompositionInstruction()
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)

        let mixVideoTrack = mixComposition.tracksWithMediaType(AVMediaTypeVideo)[0]
        mixVideoTrack.preferredTransform = CGAffineTransformMakeRotation(CGFloat(M_PI * 90.0 / 180))

        let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: mixVideoTrack)
        instruction.layerInstructions = [layerInstruction]
        videoComp.instructions = [instruction]

        // create new file to receive data
        let dirPaths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
        let docsDir: AnyObject = dirPaths[0]
        let movieFilePath = docsDir.stringByAppendingPathComponent("result.mov")
        let movieDestinationUrl = NSURL(fileURLWithPath: movieFilePath)

        do {
            try NSFileManager.defaultManager().removeItemAtPath(movieFilePath)
        } catch _ {}

        // use AVAssetExportSession to export video
        let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
        assetExport?.videoComposition = videoComp
        assetExport!.outputFileType = AVFileTypeQuickTimeMovie
        assetExport!.outputURL = movieDestinationUrl

        assetExport!.exportAsynchronouslyWithCompletionHandler({
            switch assetExport!.status {
            case AVAssetExportSessionStatus.Failed:
                print("failed \(assetExport!.error)")
            case AVAssetExportSessionStatus.Cancelled:
                print("cancelled \(assetExport!.error)")
            default:
                print("Movie complete")
                // play video
                NSOperationQueue.mainQueue().addOperationWithBlock({ () -> Void in
                    print(movieDestinationUrl)
                })
            }
        })
    }

This is what I get: [screenshot of the sideways output]


I tried adding these two methods to rotate the video:

    class func videoCompositionInstructionForTrack(track: AVCompositionTrack, asset: AVAsset) -> AVMutableVideoCompositionLayerInstruction {
        let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
        let assetTrack = asset.tracksWithMediaType(AVMediaTypeVideo)[0]

        let transform = assetTrack.preferredTransform
        let assetInfo = orientationFromTransform(transform)

        var scaleToFitRatio = UIScreen.mainScreen().bounds.width / assetTrack.naturalSize.width
        if assetInfo.isPortrait {
            scaleToFitRatio = UIScreen.mainScreen().bounds.width / assetTrack.naturalSize.height
            let scaleFactor = CGAffineTransformMakeScale(scaleToFitRatio, scaleToFitRatio)
            instruction.setTransform(CGAffineTransformConcat(assetTrack.preferredTransform, scaleFactor), atTime: kCMTimeZero)
        } else {
            let scaleFactor = CGAffineTransformMakeScale(scaleToFitRatio, scaleToFitRatio)
            var concat = CGAffineTransformConcat(CGAffineTransformConcat(assetTrack.preferredTransform, scaleFactor), CGAffineTransformMakeTranslation(0, UIScreen.mainScreen().bounds.width / 2))
            if assetInfo.orientation == .Down {
                let fixUpsideDown = CGAffineTransformMakeRotation(CGFloat(M_PI))
                let windowBounds = UIScreen.mainScreen().bounds
                let yFix = assetTrack.naturalSize.height + windowBounds.height
                let centerFix = CGAffineTransformMakeTranslation(assetTrack.naturalSize.width, yFix)
                concat = CGAffineTransformConcat(CGAffineTransformConcat(fixUpsideDown, centerFix), scaleFactor)
            }
            instruction.setTransform(concat, atTime: kCMTimeZero)
        }

        return instruction
    }

    class func orientationFromTransform(transform: CGAffineTransform) -> (orientation: UIImageOrientation, isPortrait: Bool) {
        var assetOrientation = UIImageOrientation.Up
        var isPortrait = false
        if transform.a == 0 && transform.b == 1.0 && transform.c == -1.0 && transform.d == 0 {
            assetOrientation = .Right
            isPortrait = true
        } else if transform.a == 0 && transform.b == -1.0 && transform.c == 1.0 && transform.d == 0 {
            assetOrientation = .Left
            isPortrait = true
        } else if transform.a == 1.0 && transform.b == 0 && transform.c == 0 && transform.d == 1.0 {
            assetOrientation = .Up
        } else if transform.a == -1.0 && transform.b == 0 && transform.c == 0 && transform.d == -1.0 {
            assetOrientation = .Down
        }
        return (assetOrientation, isPortrait)
    }
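One caveat with orientationFromTransform: it compares the transform components with exact equality. That works for transforms read from camera-recorded tracks, which store exact 0 and 1 values, but a transform built with CGAffineTransformMakeRotation will fail the checks, because cos of pi/2 is not exactly zero in floating point. A tolerant variant could look like this (a sketch only, same Swift 2 era API as the question; the epsilon value is my assumption):

    // Hypothetical tolerant variant: classify orientation within a small
    // epsilon instead of requiring exact 0/1 components.
    func orientationFromTransformTolerant(transform: CGAffineTransform) -> (orientation: UIImageOrientation, isPortrait: Bool) {
        let eps: CGFloat = 0.0001
        func near(x: CGFloat, _ y: CGFloat) -> Bool { return abs(x - y) < eps }
        if near(transform.a, 0) && near(transform.b, 1) && near(transform.c, -1) && near(transform.d, 0) {
            return (.Right, true)   // rotated 90 degrees: portrait
        } else if near(transform.a, 0) && near(transform.b, -1) && near(transform.c, 1) && near(transform.d, 0) {
            return (.Left, true)    // rotated 270 degrees: portrait
        } else if near(transform.a, -1) && near(transform.b, 0) && near(transform.c, 0) && near(transform.d, -1) {
            return (.Down, false)   // rotated 180 degrees
        }
        return (.Up, false)         // identity or anything unrecognized
    }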

I updated the combineImageVid() method to use them here:

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)

    let mixVideoTrack = mixComposition.tracksWithMediaType(AVMediaTypeVideo)[0]

    //let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: mixVideoTrack)
    //layerInstruction.setTransform(videoAsset.preferredTransform, atTime: kCMTimeZero)
    let layerInstruction = videoCompositionInstructionForTrack(compositionVideoTrack, asset: videoAsset)

Which gives me this result:

[screenshot]

So I'm getting closer, but since the track seems to load with the wrong orientation in the first place, I feel the real fix belongs there. Also, I don't know where the huge black box is coming from. I thought it might be because my image layer takes its frame from the loaded video track:

 aLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height) 

However, changing that to some small width and height makes no difference. I then tried adding a crop to get rid of the black square, but that didn't work either :(


Following Allen's recommendation to use these two methods:

    class func videoCompositionInstructionForTrack(track: AVCompositionTrack, asset: AVAsset) -> AVMutableVideoCompositionLayerInstruction
    class func orientationFromTransform(transform: CGAffineTransform) -> (orientation: UIImageOrientation, isPortrait: Bool)

and updating my original method as follows:

    videoLayer.frame = CGRectMake(0, 0, videoSize.height, videoSize.width) // notice the switched width and height
    ...
    videoComp.renderSize = CGSizeMake(videoSize.height, videoSize.width) // this makes the final video portrait
    ...
    layerInstruction.setTransform(videoTrack.preferredTransform, atTime: kCMTimeZero) // this tells the composition to rotate the original video in the output

gets us very close. However, the remaining problem is the renderSize. If I change it to anything other than the landscape size, I get this:

[screenshot]
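One way to avoid hardcoding the swapped renderSize would be to reuse the orientationFromTransform helper from above and only swap dimensions when the asset is actually portrait. A sketch with the question's variable names, assuming the helper is callable from this scope:

    // Swap the render size only when the rotation metadata says portrait;
    // landscape assets keep their natural size.
    let assetInfo = orientationFromTransform(videoTrack.preferredTransform)
    videoComp.renderSize = assetInfo.isPortrait
        ? CGSizeMake(videoSize.height, videoSize.width)
        : videoSize

This way the same export path handles both portrait and landscape source videos without a hardcoded size.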

Tags: ios, swift, camera, orientation, avassetexportsession
2 answers

Here is Apple's documentation on video orientation:

https://developer.apple.com/library/ios/qa/qa1744/_index.html

If your original video was shot in iOS portrait mode, its naturalSize will still be landscape, but the mov file carries rotation metadata in the track's preferredTransform. To rotate your video in the output, you need to make these changes to your first code snippet:

    videoLayer.frame = CGRectMake(0, 0, videoSize.height, videoSize.width) // notice the switched width and height
    ...
    videoComp.renderSize = CGSizeMake(videoSize.height, videoSize.width) // this makes the final video portrait
    ...
    layerInstruction.setTransform(videoTrack.preferredTransform, atTime: kCMTimeZero) // this tells the composition to rotate the original video in the output

Yes, you are really close!


Maybe you should check the video track's preferredTransform to compute an accurate renderSize and transform:

    CGAffineTransform transform = assetVideoTrack.preferredTransform;
    CGFloat rotation = [self rotationWithTransform:transform];

    // if it has been rotated
    if (rotation != 0) {
        // and the rotation is not a full 360 degrees
        if (fabs(rotation - M_PI * 2) >= valueOfError) {
            CGFloat m = rotation / M_PI;
            CGAffineTransform t1 = CGAffineTransformIdentity; // initialized so unmatched angles fall through safely
            // rotation is 90 or 270 degrees
            if (fabs(m - 1/2.0) < valueOfError || fabs(m - 3/2.0) < valueOfError) {
                self.mutableVideoComposition.renderSize = CGSizeMake(assetVideoTrack.naturalSize.height, assetVideoTrack.naturalSize.width);
                t1 = CGAffineTransformMakeTranslation(assetVideoTrack.naturalSize.height, 0);
            }
            // rotation is 180 degrees
            if (fabs(m - 1.0) < valueOfError) {
                t1 = CGAffineTransformMakeTranslation(assetVideoTrack.naturalSize.width, assetVideoTrack.naturalSize.height);
            }
            CGAffineTransform t2 = CGAffineTransformRotate(t1, rotation);
            // CGAffineTransform transform = makeTransform(1.0, 1.0, 90, videoTrack.naturalSize.height, 0);
            [passThroughLayer setTransform:t2 atTime:kCMTimeZero];
        }
    }

    // extract the rotation angle (in radians) from the transform
    - (CGFloat)rotationWithTransform:(CGAffineTransform)t {
        return atan2f(t.b, t.a);
    }
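For comparison with the question's Swift code, the angle-extraction trick above (atan2 on the transform's b and a components) looks like this in Swift; this equivalent is my addition, not part of the answer:

    // atan2(b, a) recovers the rotation angle of an affine transform:
    // 0 for identity, pi/2 (90 degrees) for a typical portrait recording,
    // pi for an upside-down one.
    func rotationAngle(transform: CGAffineTransform) -> CGFloat {
        return atan2(transform.b, transform.a)
    }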
