I have spent the whole day on this and gone through a lot of SO answers, Apple references, documentation, etc., but no luck.
I want a simple thing: I am playing a video using AVPlayer and I want to pause it and get the current frame as a UIImage. That's it.
My video is an m3u8 file located on the Internet; it plays normally in an AVPlayerLayer without any problems.
What I tried:
- AVAssetImageGenerator. It does not work: the copyCGImageAtTime:actualTime:error: method returns a NULL CGImageRef. According to the answer here, AVAssetImageGenerator does not work for streaming video. (Rough sketch of my call below.)
- Take a snapshot of the player view. I first tried renderInContext: on the AVPlayerLayer, but then I realized that it does not render such "special" layers. Then I found the method introduced in iOS 7, drawViewHierarchyInRect:afterScreenUpdates:, which is supposed to render special layers as well, but no luck: I still got a UI snapshot with an empty black area where the video is displayed. (Sketch below.)
- AVPlayerItemVideoOutput. I added a video output to my AVPlayerItem, but whenever I call hasNewPixelBufferForItemTime: it returns NO. I guess the problem is the streaming video again, and I am not alone with this problem. (Sketch below.)
- AVAssetReader. I thought about trying it, but decided not to waste my time after finding a related question here.
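For reference, here is roughly the AVAssetImageGenerator code I used (a minimal sketch reconstructed from memory; self.player and the method name are just how I refer to things in my controller, not exact code):

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Attempt 1: AVAssetImageGenerator on the current item's asset.
// self.player is my AVPlayer; for the m3u8 stream this always ends up returning nil.
- (UIImage *)frameWithImageGenerator
{
    AVAsset *asset = self.player.currentItem.asset;
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;
    generator.requestedTimeToleranceBefore = kCMTimeZero;
    generator.requestedTimeToleranceAfter  = kCMTimeZero;

    NSError *error = nil;
    CMTime actualTime = kCMTimeZero;
    CGImageRef cgImage = [generator copyCGImageAtTime:[self.player currentTime]
                                           actualTime:&actualTime
                                                error:&error];
    if (!cgImage) {
        NSLog(@"copyCGImageAtTime failed: %@", error);   // this is the branch I always hit
        return nil;
    }

    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return frame;
}
```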
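The snapshot attempt looked more or less like this (self.playerView is whatever view hosts the AVPlayerLayer; the name is a placeholder):

```objc
#import <UIKit/UIKit.h>

// Attempt 2: snapshot the view hierarchy that contains the AVPlayerLayer.
// The returned image contains the rest of the UI, but the video area is black.
- (UIImage *)snapshotOfPlayerView
{
    UIGraphicsBeginImageContextWithOptions(self.playerView.bounds.size, NO, 0.0);
    [self.playerView drawViewHierarchyInRect:self.playerView.bounds afterScreenUpdates:YES];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return snapshot;
}
```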
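And the AVPlayerItemVideoOutput attempt, again a sketch with placeholder property names (self.videoOutput is a strong property I added; I attach the output to an item that is already playing, which may or may not be part of the problem):

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// Attempt 3: AVPlayerItemVideoOutput.

// Setup, done once after the player item exists:
- (void)attachVideoOutput
{
    NSDictionary *attributes = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
    [self.player.currentItem addOutput:self.videoOutput];
}

// Later, when the player is paused and I want the current frame:
- (UIImage *)frameFromVideoOutput
{
    CMTime itemTime = [self.player.currentItem currentTime];
    if (![self.videoOutput hasNewPixelBufferForItemTime:itemTime]) {   // always NO in my case
        return nil;
    }

    CVPixelBufferRef pixelBuffer =
        [self.videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
    if (!pixelBuffer) {
        return nil;
    }

    // Convert the pixel buffer to a UIImage via Core Image.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGRect rect = CGRectMake(0, 0,
                             CVPixelBufferGetWidth(pixelBuffer),
                             CVPixelBufferGetHeight(pixelBuffer));
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:rect];
    CVPixelBufferRelease(pixelBuffer);
    if (!cgImage) {
        return nil;
    }

    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return frame;
}
```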
So, is there really no way to get a snapshot of what I can see on the screen right now? I can't believe it.