Blank frame when merging video using AVMutableComposition

This question has been asked many times, but nothing has helped me. I am combining several videos using AVMutableComposition. After merging, blank frames appear where 30-40% of the source videos should be; the rest merge fine. I play the composition directly with AVPlayer as an AVPlayerItem. Code below:

    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    NSMutableArray *instructions = [NSMutableArray new];
    CGSize size = CGSizeZero;
    CMTime time = kCMTimeZero;

    for (AVURLAsset *asset in assets) {
        AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVAssetTrack *audioAssetTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;

        NSError *error;
        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration) ofTrack:assetTrack atTime:time error:&error];
        if (error) {
            NSLog(@"asset url :: %@", assetTrack.asset);
            NSLog(@"Error - %@", error.debugDescription);
        }

        [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration) ofTrack:audioAssetTrack atTime:time error:&error];
        if (error) {
            NSLog(@"Error - %@", error.debugDescription);
        }

        AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        videoCompositionInstruction.timeRange = CMTimeRangeMake(time, assetTrack.timeRange.duration);
        videoCompositionInstruction.layerInstructions = @[[AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack]];
        [instructions addObject:videoCompositionInstruction];

        time = CMTimeAdd(time, assetTrack.timeRange.duration);
        if (CGSizeEqualToSize(size, CGSizeZero)) {
            size = assetTrack.naturalSize;
        }
    }

    AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
    mutableVideoComposition.instructions = instructions;
    mutableVideoComposition.frameDuration = CMTimeMake(1, 30);
    mutableVideoComposition.renderSize = size;

    playerItem = [AVPlayerItem playerItemWithAsset:mutableComposition];
    playerItem.videoComposition = mutableVideoComposition;
ios objective-c video avfoundation avmutablecomposition
1 answer

As far as I know, an AVMutableVideoCompositionLayerInstruction cannot simply be created and appended the way your code does.

From your code, it looks like you want to preserve each asset's video-instruction information while merging the video assets, but instructions cannot be copied across directly.

If you want to do this, look at the docs for AVVideoCompositionLayerInstruction, in particular:

    getTransformRampForTime:startTransform:endTransform:timeRange:
    setTransformRampFromStartTransform:toEndTransform:timeRange:
    setTransform:atTime:
    getOpacityRampForTime:startOpacity:endOpacity:timeRange:
    setOpacityRampFromStartOpacity:toEndOpacity:timeRange:
    setOpacity:atTime:
    getCropRectangleRampForTime:startCropRectangle:endCropRectangle:timeRange:
    setCropRectangleRampFromStartCropRectangle:toEndCropRectangle:timeRange:
    setCropRectangle:atTime:

You would call the getFoo... methods on the source layer instruction, map the returned times onto the insert time or time range of the final track, apply the values with the matching setFoo... methods, and then add the resulting layer instruction to the final video composition's layerInstructions, as sketched below.
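Here is a minimal sketch of carrying one transform ramp over. It assumes you have access to the source AVVideoComposition's layer instructions from an earlier editing step; sourceLayerInstruction, mergedLayerInstruction, and insertTime are hypothetical names, where insertTime is the offset at which the source asset was inserted into the merged track:

    // Read a transform ramp from the source instruction (if one exists)...
    CGAffineTransform startTransform, endTransform;
    CMTimeRange rampRange;
    if ([sourceLayerInstruction getTransformRampForTime:kCMTimeZero
                                         startTransform:&startTransform
                                           endTransform:&endTransform
                                              timeRange:&rampRange]) {
        // ...shift its time range into the merged composition's timeline...
        CMTimeRange shiftedRange = CMTimeRangeMake(CMTimeAdd(rampRange.start, insertTime),
                                                   rampRange.duration);
        // ...and re-apply it on the merged layer instruction.
        [mergedLayerInstruction setTransformRampFromStartTransform:startTransform
                                                    toEndTransform:endTransform
                                                         timeRange:shiftedRange];
    }

The same pattern applies to the opacity and crop-rectangle ramps.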

Yes, it is a little complicated... and, most importantly, you cannot recover every video effect that was applied to the original asset this way.

So what is your goal, and what are your source assets?

If you just want to merge some mp4/mov files, simply loop over the tracks and insert them into an AVMutableCompositionTrack, with no videoComposition at all, for example as shown below. (I checked your code, and it works for that case.)
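A minimal sketch of that plain merge, assuming assets is an NSArray of AVURLAsset objects that each contain a video track (error handling omitted):

    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    CMTime cursor = kCMTimeZero;
    for (AVURLAsset *asset in assets) {
        AVAssetTrack *v = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
        AVAssetTrack *a = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;
        CMTimeRange range = CMTimeRangeMake(kCMTimeZero, asset.duration);
        [videoTrack insertTimeRange:range ofTrack:v atTime:cursor error:NULL];
        if (a) {
            [audioTrack insertTimeRange:range ofTrack:a atTime:cursor error:NULL];
        }
        cursor = CMTimeAdd(cursor, asset.duration);
    }

    // Play the composition directly; no videoComposition is needed.
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];

With no videoComposition on the player item, AVFoundation renders the composition tracks directly, so there are no instruction time ranges that can leave gaps; gaps or overlaps in instruction time ranges are a common cause of blank frames.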

If you want to combine AVAssets that carry video instructions, see the explanation and docs above. My preferred practice is to first save each of those AVAssets to a file using AVAssetExportSession, and then merge the resulting video files.
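A sketch of that pre-export step, where asset, videoComposition, and outputURL are hypothetical names for the edited asset, its instructions, and a writable destination file URL:

    // Export the asset with its instructions baked into a flat movie file.
    AVAssetExportSession *exporter = [AVAssetExportSession exportSessionWithAsset:asset
                                                                       presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.videoComposition = videoComposition; // renders the effects into the output
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        if (exporter.status == AVAssetExportSessionStatusCompleted) {
            // The exported file is now a plain asset; merge it with the
            // simple track-insertion approach above.
        }
    }];

Once exported, each file is a plain asset with the effects already rendered in, so the simple merge applies.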

P.S. There may also be a problem with your test files or source assets.

The code from my Vine-like project is:

    - (BOOL)generateComposition {
        [self cleanComposition];

        NSUInteger segmentsCount = self.segmentsCount;
        if (0 == segmentsCount) {
            return NO;
        }

        AVMutableComposition *composition = [AVMutableComposition composition];
        AVMutableVideoComposition *videoComposition = nil;
        AVMutableVideoCompositionInstruction *videoCompositionInstruction = nil;
        AVMutableVideoCompositionLayerInstruction *videoCompositionLayerInstruction = nil;
        AVMutableAudioMix *audioMix = nil;

        AVMutableCompositionTrack *videoTrack = nil;
        AVMutableCompositionTrack *audioTrack = nil;
        AVMutableCompositionTrack *musicTrack = nil;

        CMTime currentTime = kCMTimeZero;

        for (MVRecorderSegment *segment in self.segments) {
            AVURLAsset *asset = segment.asset;
            NSArray *videoAssetTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
            NSArray *audioAssetTracks = [asset tracksWithMediaType:AVMediaTypeAudio];

            CMTime maxBounds = kCMTimeInvalid;

            CMTime videoTime = currentTime;
            for (AVAssetTrack *videoAssetTrack in videoAssetTracks) {
                if (!videoTrack) {
                    videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
                    videoTrack.preferredTransform = CGAffineTransformIdentity;

                    videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
                    videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
                }

                /* Fix orientation */
                CGAffineTransform transform = videoAssetTrack.preferredTransform;
                if (AVCaptureDevicePositionFront == segment.cameraPosition) {
                    transform = CGAffineTransformMakeTranslation(self.config.videoSize, 0);
                    transform = CGAffineTransformScale(transform, -1.0, 1.0);
                } else if (AVCaptureDevicePositionBack == segment.cameraPosition) {
                }
                [videoCompositionLayerInstruction setTransform:transform atTime:videoTime];

                /* Append track */
                videoTime = [MVHelper appendAssetTrack:videoAssetTrack toCompositionTrack:videoTrack atTime:videoTime withBounds:maxBounds];
                maxBounds = videoTime;
            }

            if (self.sessionConfiguration.originalVoiceOn) {
                CMTime audioTime = currentTime;
                for (AVAssetTrack *audioAssetTrack in audioAssetTracks) {
                    if (!audioTrack) {
                        audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
                    }
                    audioTime = [MVHelper appendAssetTrack:audioAssetTrack toCompositionTrack:audioTrack atTime:audioTime withBounds:maxBounds];
                }
            }

            currentTime = composition.duration;
        }

        if (videoCompositionInstruction && videoCompositionLayerInstruction) {
            videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
            videoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction];

            videoComposition = [AVMutableVideoComposition videoComposition];
            videoComposition.renderSize = CGSizeMake(self.config.videoSize, self.config.videoSize);
            videoComposition.frameDuration = CMTimeMake(1, self.config.videoFrameRate);
            videoComposition.instructions = @[videoCompositionInstruction];
        }

        // Add the background music track (musicTrack)
        NSURL *musicFileURL = self.sessionConfiguration.musicFileURL;
        if (musicFileURL && musicFileURL.isFileExists) {
            AVAsset *musicAsset = [AVAsset assetWithURL:musicFileURL];
            AVAssetTrack *musicAssetTrack = [musicAsset tracksWithMediaType:AVMediaTypeAudio].firstObject;
            if (musicAssetTrack) {
                musicTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
                if (CMTIME_COMPARE_INLINE(musicAsset.duration, >=, composition.duration)) {
                    // If the background music is longer than the total video duration, insert it once
                    [musicTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, composition.duration) ofTrack:musicAssetTrack atTime:kCMTimeZero error:NULL];
                } else {
                    // Otherwise, loop the background music
                    CMTime musicTime = kCMTimeZero;
                    CMTime bounds = composition.duration;
                    while (true) {
                        musicTime = [MVHelper appendAssetTrack:musicAssetTrack toCompositionTrack:musicTrack atTime:musicTime withBounds:bounds];
                        if (CMTIME_COMPARE_INLINE(musicTime, >=, composition.duration)) {
                            break;
                        }
                    }
                }
            }
        }

        // Configure the audio mix
        if (musicTrack) {
            AVMutableAudioMixInputParameters *audioMixParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:musicTrack];

            /* Fade the background music in and out */
            AVAsset *musicAsset = musicTrack.asset;
            CMTime crossfadeDuration = CMTimeMake(15, 10); // 1.5 seconds at both ends
            CMTime halfDuration = CMTimeMultiplyByFloat64(musicAsset.duration, 0.5);
            crossfadeDuration = CMTimeMinimum(crossfadeDuration, halfDuration);
            CMTimeRange crossfadeRangeBegin = CMTimeRangeMake(kCMTimeZero, crossfadeDuration);
            CMTimeRange crossfadeRangeEnd = CMTimeRangeMake(CMTimeSubtract(musicAsset.duration, crossfadeDuration), crossfadeDuration);
            [audioMixParameters setVolumeRampFromStartVolume:0.0 toEndVolume:self.sessionConfiguration.musicVolume timeRange:crossfadeRangeBegin];
            [audioMixParameters setVolumeRampFromStartVolume:self.sessionConfiguration.musicVolume toEndVolume:0.0 timeRange:crossfadeRangeEnd];

            audioMix = [AVMutableAudioMix audioMix];
            [audioMix setInputParameters:@[audioMixParameters]];
        }

        _composition = composition;
        _videoComposition = videoComposition;
        _audioMix = audioMix;

        return YES;
    }

    - (AVPlayerItem *)playerItem {
        AVPlayerItem *playerItem = nil;
        if (self.composition) {
            playerItem = [AVPlayerItem playerItemWithAsset:self.composition];
            if (!self.videoComposition.animationTool) {
                playerItem.videoComposition = self.videoComposition;
            }
            playerItem.audioMix = self.audioMix;
        }
        return playerItem;
    }

    ///=============================================
    /// MVHelper
    ///=============================================

    + (CMTime)appendAssetTrack:(AVAssetTrack *)track toCompositionTrack:(AVMutableCompositionTrack *)compositionTrack atTime:(CMTime)atTime withBounds:(CMTime)bounds {
        CMTimeRange timeRange = track.timeRange;
        atTime = CMTimeAdd(atTime, timeRange.start);

        if (!track || !compositionTrack) {
            return atTime;
        }

        if (CMTIME_IS_VALID(bounds)) {
            CMTime currentBounds = CMTimeAdd(atTime, timeRange.duration);
            if (CMTIME_COMPARE_INLINE(currentBounds, >, bounds)) {
                timeRange = CMTimeRangeMake(timeRange.start, CMTimeSubtract(timeRange.duration, CMTimeSubtract(currentBounds, bounds)));
            }
        }
        if (CMTIME_COMPARE_INLINE(timeRange.duration, >, kCMTimeZero)) {
            NSError *error = nil;
            [compositionTrack insertTimeRange:timeRange ofTrack:track atTime:atTime error:&error];
            if (error) {
                MVLog(@"Failed to append %@ track: %@", compositionTrack.mediaType, error);
            }
            return CMTimeAdd(atTime, timeRange.duration);
        }
        return atTime;
    }
