exportAsynchronouslyWithCompletionHandler does not work with multiple video files (Code = -11820)

I record short video clips (about one second each, with the front and rear cameras, in various possible orientations) and then try to combine them using AVAssetExportSession. I build an AVMutableComposition with the audio and video tracks, and an AVMutableVideoComposition with the correct transforms.

The problem is that on iOS 5 the export fails if there are more than 4 clips, and on iOS 6 the limit appears to be 16 clips.

This seems really puzzling to me. Is AVAssetExportSession doing something strange, or does it have some kind of undocumented limit on the number of clips that can be passed to it? Here are some excerpts from my code:

    -(void)exportVideo
    {
        AVMutableComposition *composition = video.composition;
        AVMutableVideoComposition *videoComposition = video.videoComposition;
        NSString *presetName = AVAssetExportPresetMediumQuality;

        AVAssetExportSession *_assetExport =
            [[AVAssetExportSession alloc] initWithAsset:composition presetName:presetName];
        self.exportSession = _assetExport;

        videoComposition.renderSize = CGSizeMake(640, 480);
        _assetExport.videoComposition = videoComposition;

        NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"export.mov"];
        NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];

        // Delete any previously exported file at this path
        if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath])
            [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];

        _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
        _assetExport.outputURL = exportUrl;
        _assetExport.shouldOptimizeForNetworkUse = YES;

        [_assetExport exportAsynchronouslyWithCompletionHandler:^{
            switch (_assetExport.status) {
                case AVAssetExportSessionStatusCompleted:
                    NSLog(@"Completed exporting!");
                    break;
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Failed: %@", _assetExport.error.description);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Canceled: %@", _assetExport.error);
                    break;
                default:
                    break;
            }
        }];
    }
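
One side note on the completion handler: exportAsynchronouslyWithCompletionHandler: does not guarantee that the block runs on the main thread, so anything that touches UI should hop queues first. A minimal sketch of the failure branch with that hop added:

    [_assetExport exportAsynchronouslyWithCompletionHandler:^{
        // The handler may be invoked on a background queue; dispatch
        // to the main queue before touching any UI state.
        dispatch_async(dispatch_get_main_queue(), ^{
            if (_assetExport.status == AVAssetExportSessionStatusFailed) {
                NSLog(@"Failed: %@", _assetExport.error);
            }
        });
    }];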

And here is how the composition is put together:

    -(void)setVideoAndExport
    {
        video = nil;
        video = [[VideoComposition alloc] initVideoTracks];

        CMTime localTimeline = kCMTimeZero;

        // Build the composition out of all the video files
        for (NSURL *url in outputFileUrlArray) {
            AVAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
            [video setVideo:url at:localTimeline];
            localTimeline = CMTimeAdd(localTimeline, asset.duration); // Advance the timeline
        }

        [self exportVideo];
    }
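
A side note on the timeline math: by default AVURLAsset may report only an approximate duration, which can make the accumulated atTime: offsets drift across many clips. If exact increments matter, you can request precise timing when creating the asset (a sketch; this makes opening the asset slower):

    // Request accurate duration/timing up front, so asset.duration
    // is exact when advancing the timeline.
    NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                                        forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    AVAsset *asset = [[AVURLAsset alloc] initWithURL:url options:options];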

And here is the meat of the VideoComposition class:

    -(id)initVideoTracks
    {
        if ((self = [super init])) {
            composition = [[AVMutableComposition alloc] init];
            [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                     preferredTrackID:kCMPersistentTrackID_Invalid];
            mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
            instructions = [[NSMutableArray alloc] init];
            videoComposition = [AVMutableVideoComposition videoComposition];
        }
        return self;
    }

    -(void)setVideo:(NSURL *)url at:(CMTime)to
    {
        asset = [[AVURLAsset alloc] initWithURL:url options:nil];
        AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

        AVMutableCompositionTrack *compositionTrackVideo =
            [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                     preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionTrackVideo insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                       ofTrack:assetTrack
                                        atTime:to
                                         error:nil];

        AVMutableCompositionTrack *compositionTrackAudio =
            [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                     preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionTrackAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                       ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                        atTime:to
                                         error:nil];

        mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeAdd(to, asset.duration));

        AVMutableVideoCompositionLayerInstruction *layerInstruction =
            [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionTrackVideo];
        [layerInstruction setTransform:assetTrack.preferredTransform atTime:kCMTimeZero];
        [layerInstruction setOpacity:0.0 atTime:CMTimeAdd(to, asset.duration)];

        [instructions addObject:layerInstruction];

        mainInstruction.layerInstructions = instructions;
        videoComposition.instructions = [NSArray arrayWithObject:mainInstruction];
        videoComposition.frameDuration = CMTimeMake(1, 30);
    }
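
Note that every call to setVideo above adds a brand-new video track and audio track to the composition, so N clips produce roughly 2N tracks. A quick diagnostic sketch to confirm how the track count grows before exporting:

    // Diagnostic sketch: with the code above, each source clip
    // contributes one video and one audio track, so this grows as
    // 2 x (number of clips) -- which is what runs into the limit
    // described in the accepted answer below.
    NSLog(@"Composition now has %lu tracks",
          (unsigned long)[[composition tracks] count]);
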
2 answers

OK, I also contacted Apple about this issue, and they answered:

"This is a known condition. You are pushing the decoder limit set in AVFoundation."

They also asked me to file a bug report, because the error message that AVAssetExportSession gives in this case is undescriptive and misleading. So I filed a bug report with Apple complaining about the bad error message.

So these limits in AVAssetExportSession are confirmed: in iOS 5 the decoder limit is 4, and in iOS 6 it was raised to 16. The main problem here is that the error reported by AVAssetExportSession is poor: it only reports error -11820 "Cannot Complete Export" instead of actually telling us that we have hit the decoder limit.
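
If you genuinely need more clips than the limit allows, one possible workaround (just a sketch under assumptions, not something Apple confirmed) is to export in batches that stay under the limit and then run a second pass over the intermediate files. exportChunk: here is a hypothetical synchronous helper you would build around the export code from the question:

    // Hypothetical sketch: split the clip URLs into chunks below the
    // decoder limit, export each chunk to an intermediate movie, then
    // compose the intermediates in a final pass.
    static const NSUInteger kMaxClipsPerExport = 16; // 4 on iOS 5

    NSMutableArray *intermediates = [NSMutableArray array];
    for (NSUInteger i = 0; i < [outputFileUrlArray count]; i += kMaxClipsPerExport) {
        NSRange range = NSMakeRange(i, MIN(kMaxClipsPerExport, [outputFileUrlArray count] - i));
        NSArray *chunk = [outputFileUrlArray subarrayWithRange:range];
        NSURL *chunkURL = [self exportChunk:chunk]; // hypothetical helper, blocks until done
        [intermediates addObject:chunkURL];
    }
    // Final pass: each intermediate is a single flattened movie file,
    // so far fewer simultaneous decoders are needed.
    [self exportChunk:intermediates];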


I also ran into a similar problem. I managed to fix it by inserting whole assets into the composition instead of inserting individual tracks into newly created mutable tracks. (In your setVideo code, each clip adds its own video and audio track, so the track count grows with every clip.) So, in setVideo, instead of this line:

    [compositionTrackVideo insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                   ofTrack:assetTrack
                                    atTime:to
                                     error:nil];

try the following:

    [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                         ofAsset:asset
                          atTime:to
                           error:nil];
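
With that change, setVideo collapses to something like the sketch below. Caveats: this drops the per-clip layer instructions and preferredTransform handling from the question (so the orientation fixes would have to be reintroduced), and it checks the error that the question passed as nil:

    // Sketch: let the composition place the asset's tracks itself.
    // insertTimeRange:ofAsset:atTime:error: can append onto existing
    // compatible tracks instead of creating a new video/audio track
    // pair for every clip.
    -(void)setVideo:(NSURL *)url at:(CMTime)to
    {
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
        NSError *error = nil;
        if (![composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                  ofAsset:asset
                                   atTime:to
                                    error:&error]) {
            NSLog(@"Insert failed: %@", error);
        }
    }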
