Play video using AVPlayer

I get the frame buffers one by one from the video file using AVAssetReader, perform some operation on each frame, and then save the new frame to a temp file using AVAssetWriter. Now I have a temp file path where all of the new frames are being saved one at a time. Is there a way to play the video in real time while frames are constantly being added to the temporary file?
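
For context, here is a minimal sketch of the read-process-write loop being described (the method and parameter names are invented for illustration, and the reader/writer configuration and the per-frame operation are assumed to exist elsewhere):

- (void)reencodeFramesWithReader:(AVAssetReader *)reader
                    readerOutput:(AVAssetReaderTrackOutput *)readerOutput
                          writer:(AVAssetWriter *)writer
                     writerInput:(AVAssetWriterInput *)writerInput {
    [reader startReading];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];

    dispatch_queue_t queue = dispatch_queue_create("frame.processing", DISPATCH_QUEUE_SERIAL);
    [writerInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
        while ([writerInput isReadyForMoreMediaData]) {
            CMSampleBufferRef buffer = [readerOutput copyNextSampleBuffer];
            if (buffer == NULL) {
                // No more frames: finalize the temp file.
                [writerInput markAsFinished];
                [writer finishWritingWithCompletionHandler:^{ /* temp file complete */ }];
                break;
            }
            // ... perform the per-frame operation on `buffer` here ...
            [writerInput appendSampleBuffer:buffer];
            CFRelease(buffer);
        }
    }];
}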

Here is the code for playing the video from the temporary path (where frames are constantly being added):

- (void)loadAssetFromFile {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[(mMediaReader.mCameraRecorder) tempVideoFilePath]]
                                            options:nil];
    NSString *tracksKey = @"tracks";

    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:^{
        // Completion handler block.
        dispatch_async(dispatch_get_main_queue(), ^{
            NSError *error = nil;
            AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];

            if (status == AVKeyValueStatusLoaded) {
                self.mPlayerItem = [AVPlayerItem playerItemWithAsset:asset];
                [mPlayerItem addObserver:self forKeyPath:@"status" options:0 context:&ItemStatusContext];
                [[NSNotificationCenter defaultCenter] addObserver:self
                                                         selector:@selector(playerItemDidReachEnd:)
                                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                                           object:mPlayerItem];
                self.mPlayer = [AVPlayer playerWithPlayerItem:mPlayerItem];
                [mPlayerView setPlayer:mPlayer];
                [self play:nil];
            } else {
                // You should deal with the error appropriately.
                NSLog(@"The asset tracks were not loaded:\n%@", [error localizedDescription]);
            }
        });
    }];
}

- (IBAction)play:sender {
    [mPlayer play];
}

However, the code inside the completion handler block never runs.

2 answers

Splitting the video into multiple sub-videos worked for me.

Instead of saving the full video to a single temporary path, I split the video into several sub-videos and then replaced the current AVPlayerItem of the AVPlayer accordingly.

So now the functionality works just like streaming a video. :)
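
A rough sketch of the item swap, assuming a hypothetical nextSegmentURL helper that returns the file URL of the next finished sub-video (or nil if none is ready yet):

- (void)playerItemDidReachEnd:(NSNotification *)notification {
    NSURL *nextURL = [self nextSegmentURL];   // hypothetical helper
    if (nextURL == nil) {
        return;                               // no finished segment yet
    }

    // (Remember to remove the observer from the finished item.)
    AVPlayerItem *nextItem = [AVPlayerItem playerItemWithURL:nextURL];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playerItemDidReachEnd:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:nextItem];

    [self.mPlayer replaceCurrentItemWithPlayerItem:nextItem];
    [self.mPlayer play];
}

An AVQueuePlayer with one AVPlayerItem per sub-video is another way to get a similar hand-off between segments.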


You can also convert each CMSampleBuffer that AVAssetReader returns into a CGImage and then a UIImage, and display that in a UIImageView to show the frames as they come out of the original video file.

The AV Foundation Programming Guide provides sample code that shows how to do this.
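
For reference, a condensed sketch of that conversion, along the lines of the guide's imageFromSampleBuffer: example; it assumes the AVAssetReaderTrackOutput is configured to deliver kCVPixelFormatType_32BGRA pixel buffers:

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Draw the BGRA pixel data into a bitmap context and snapshot it as a CGImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}

The resulting UIImage can then be assigned to the image view on the main thread, e.g. imageView.image = [self imageFromSampleBuffer:buffer];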

