I record short video clips (about one second each, using the front and rear cameras, in various possible orientations) and then try to merge them using AVAssetExportSession. I build a composition and a video composition with the correct transforms and with both audio and video tracks.
The problem is that on iOS 5 the export fails if there are more than 4 clips, and on iOS 6 the limit appears to be 16 clips.
This seems really puzzling to me. Is AVAssetExportSession doing something strange, or does it have some kind of undocumented limit on the number of clips that can be passed to it? Here are some excerpts from my code:
    -(void)exportVideo {
        AVMutableComposition *composition = video.composition;
        AVMutableVideoComposition *videoComposition = video.videoComposition;

        NSString *presetName = AVAssetExportPresetMediumQuality;
        AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:composition presetName:presetName];
        self.exportSession = _assetExport;

        videoComposition.renderSize = CGSizeMake(640, 480);
        _assetExport.videoComposition = videoComposition;

        NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"export.mov"];
        NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
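The excerpt stops just before the export is actually started; the rest of the method is the usual AVAssetExportSession pattern, sketched here from memory, so treat the details as approximate rather than verbatim:

        _assetExport.outputURL = exportUrl;
        _assetExport.outputFileType = AVFileTypeQuickTimeMovie;

        [_assetExport exportAsynchronouslyWithCompletionHandler:^{
            switch (_assetExport.status) {
                case AVAssetExportSessionStatusCompleted:
                    NSLog(@"Export finished: %@", exportPath);
                    break;
                case AVAssetExportSessionStatusFailed:
                    // This is where the failure surfaces once the clip count passes the limit
                    NSLog(@"Export failed: %@", _assetExport.error);
                    break;
                default:
                    break;
            }
        }];
    }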
And here is how the composition is built:
    -(void)setVideoAndExport {
        video = nil;
        video = [[VideoComposition alloc] initVideoTracks];
        CMTime localTimeline = kCMTimeZero;

        // Create the composition from all the video files
        for (NSURL *url in outputFileUrlArray) {
            AVAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
            [video setVideo:url at:localTimeline];
            localTimeline = CMTimeAdd(localTimeline, asset.duration); // Increment the timeline
        }

        [self exportVideo];
    }
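Note that setVideo:at: (shown below) adds a fresh video track and a fresh audio track for every clip, so the composition ends up with 2 × N tracks for N clips. A quick diagnostic I can drop in just before the export call to confirm the numbers (plain AVComposition introspection, nothing exotic):

        // Each clip contributes one video and one audio track, so expect 2 * N here
        NSLog(@"Clips: %lu, composition tracks: %lu",
              (unsigned long)outputFileUrlArray.count,
              (unsigned long)video.composition.tracks.count);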
And here is the meat of the VideoComposition class:
    -(id)initVideoTracks {
        if ((self = [super init])) {
            composition = [[AVMutableComposition alloc] init];
            [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
            mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
            instructions = [[NSMutableArray alloc] init];
            videoComposition = [AVMutableVideoComposition videoComposition];
        }
        return self;
    }

    -(void)setVideo:(NSURL *)url at:(CMTime)to {
        asset = [[AVURLAsset alloc] initWithURL:url options:nil];
        AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

        // A new video track and a new audio track are created for every clip
        AVMutableCompositionTrack *compositionTrackVideo = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionTrackVideo insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetTrack atTime:to error:nil];

        AVMutableCompositionTrack *compositionTrackAudio = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionTrackAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:to error:nil];

        mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeAdd(to, asset.duration));

        AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionTrackVideo];
        [layerInstruction setTransform:assetTrack.preferredTransform atTime:kCMTimeZero];
        [layerInstruction setOpacity:0.0 atTime:CMTimeAdd(to, asset.duration)];
        [instructions addObject:layerInstruction];

        mainInstruction.layerInstructions = instructions;
        videoComposition.instructions = [NSArray arrayWithObject:mainInstruction];
        videoComposition.frameDuration = CMTimeMake(1, 30);
    }
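One thing I am wondering is whether the per-clip tracks are what hits the limit: with 16 clips the composition already contains 32 tracks. A variant that reuses a single video/audio track pair would look roughly like this (an untested sketch; it assumes compositionTrackVideo and compositionTrackAudio become ivars created once in initVideoTracks):

    -(void)setVideo:(NSURL *)url at:(CMTime)to {
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
        AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

        // Append both media types into the same two composition tracks
        // instead of creating a new pair for every clip.
        [compositionTrackVideo insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                       ofTrack:videoTrack
                                        atTime:to
                                         error:nil];
        [compositionTrackAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                       ofTrack:audioTrack
                                        atTime:to
                                         error:nil];

        // Caveat: with one shared video track there is no longer a separate
        // layer instruction per clip, so the per-clip preferredTransform would
        // have to be applied via setTransform:atTime: on a single layer
        // instruction, changing over time.
    }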