iOS AVFoundation - show a time display over a video and export

I want to show an overlay with the current time over a video and export the video with that overlay included. I have looked at the AVFoundation framework (AVComposition, AVAsset, etc.), but I have no clear idea how to do this yet. There is a class called AVSynchronizedLayer that lets you animate things in sync with the video, but I don't want to animate; I want to burn the time display into every single frame of the video. Any tips?

2 answers

Something like this...

(NB: this was pulled out of a much larger project, so it may accidentally include some unnecessary parts.)

You will need to grab the CALayer of your clock/animation and assign it to the variable myClockLayer (it is used about a third of the way down, by the animation tool).

It also assumes that your incoming video has only two tracks: audio and video. If you have more, you will need to choose the track ID in "asTrackID:2" more carefully.
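For illustration, here is a minimal sketch of how myClockLayer might be built. This is not part of the original answer: the CATextLayer, the one-second granularity, and the discrete keyframe animation of its string property are all assumptions. Because the export renders against the video's timeline rather than wall-clock time, the time display is driven by an animation that starts at AVCoreAnimationBeginTimeAtZero:

#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>
#import <AVFoundation/AVFoundation.h>

// Hypothetical helper: a text layer whose string steps through
// "0:00", "0:01", ... in sync with the composition's timeline.
static CALayer *MakeClockLayer(CMTime duration)
{
    CATextLayer *clock = [CATextLayer layer];
    clock.frame = CGRectMake(10, 10, 120, 30);
    clock.fontSize = 24;
    clock.foregroundColor = [UIColor whiteColor].CGColor;

    // One keyframe value per second of video.
    NSInteger seconds = (NSInteger)CMTimeGetSeconds(duration);
    NSMutableArray *values = [NSMutableArray array];
    for (NSInteger s = 0; s <= seconds; s++) {
        [values addObject:[NSString stringWithFormat:@"%ld:%02ld",
                           (long)(s / 60), (long)(s % 60)]];
    }

    CAKeyframeAnimation *anim = [CAKeyframeAnimation animationWithKeyPath:@"string"];
    anim.values = values;
    anim.calculationMode = kCAAnimationDiscrete;
    anim.duration = CMTimeGetSeconds(duration);
    // Required when rendering via AVVideoCompositionCoreAnimationTool:
    // time zero means the start of the video, and the animation must
    // survive past its natural end.
    anim.beginTime = AVCoreAnimationBeginTimeAtZero;
    anim.removedOnCompletion = NO;
    [clock addAnimation:anim forKey:@"clock"];
    return clock;
}

myClockLayer in the snippet below could then be set to MakeClockLayer([url duration]).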

NSError *error = nil;

// Load the incoming clip and copy its video track into a new composition.
AVURLAsset *url = [AVURLAsset URLAssetWithURL:incomingVideo options:nil];
AVMutableComposition *saveComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [saveComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *clipVideoTrack = [[url tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [url duration]) ofTrack:clipVideoTrack atTime:kCMTimeZero error:&error];

// The video composition renders the clock layer as an extra track (ID 2).
AVMutableVideoComposition *videoComposition = [[AVMutableVideoComposition videoComposition] retain];
videoComposition.renderSize = CGSizeMake(320, 240);
videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithAdditionalLayer:myClockLayer asTrackID:2];

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
// Hard-coded to the first 60 seconds; use [saveComposition duration] to cover the whole clip.
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30));
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
videoComposition.instructions = [NSArray arrayWithObject:instruction];

// Export the composition with the overlay rendered in.
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:saveComposition presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
exporter.outputURL = url3; // your destination file URL
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^(void){}];
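The completion handler in the snippet above is left empty. In practice you would at least check the session's status before using the output file; a minimal sketch (exporter and url3 are the names from the snippet):

[exporter exportAsynchronouslyWithCompletionHandler:^{
    switch (exporter.status) {
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"Export finished: %@", url3);
            break;
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export failed: %@", exporter.error);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export cancelled");
            break;
        default:
            break;
    }
}];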

I think you can use AVCaptureVideoDataOutput to process each frame and AVAssetWriter to write out the processed frames. You can refer to this answer:

https://stackoverflow.com/questions/122986/...

Use the appendPixelBuffer:withPresentationTime: method of AVAssetWriterInputPixelBufferAdaptor to write the frames out.
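A rough sketch of that capture-and-write loop. It assumes an AVAssetWriter session has already been started, and writerInput and adaptor (an AVAssetWriterInput with its AVAssetWriterInputPixelBufferAdaptor, configured for BGRA output) are illustrative names, not anything from the answer:

// AVCaptureVideoDataOutputSampleBufferDelegate callback
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // ... draw the time display into pixelBuffer here ...

    if (writerInput.readyForMoreMediaData) {
        [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:pts];
    }
}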
And I highly recommend using OpenCV for the frame processing. This is a good tutorial:

http://aptogo.co.uk/2011/09/opencv-framework-for-ios/ .

The OpenCV library is very large.
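If you do go the OpenCV route, a common trick is to wrap the BGRA pixel buffer in a cv::Mat so the frame can be processed in place without copying. A sketch, assuming an Objective-C++ (.mm) file with the opencv2 headers available; the time string drawn here is just a placeholder:

#import <opencv2/opencv.hpp>

// Wrap the pixel buffer's memory in a cv::Mat (no copy).
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
cv::Mat frame((int)CVPixelBufferGetHeight(pixelBuffer),
              (int)CVPixelBufferGetWidth(pixelBuffer),
              CV_8UC4,
              CVPixelBufferGetBaseAddress(pixelBuffer),
              CVPixelBufferGetBytesPerRow(pixelBuffer));

// Draw the current time directly onto the frame (white text).
cv::putText(frame, "0:01", cv::Point(10, 30),
            cv::FONT_HERSHEY_SIMPLEX, 1.0, cv::Scalar(255, 255, 255, 255));

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);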

