In principle, you can take a low-level approach: build the video stream yourself and then write it to a file. I'm not a specialist in video formats, codecs, etc., but the approach would be:
- Configure a CADisplayLink to receive a callback each time the screen is redrawn. Setting its frame interval to 2 is probably a good idea, dropping the target video frame rate to roughly 30 frames per second.
- On each callback, take a snapshot of the preview layer and of the overlay layer.
- Process the collected images: composite the two snapshots of each frame into a single image, then build a video stream from the sequence of combined frames. I assume iOS has built-in tools to do this in a more or less straightforward way (see the sketch after this list).
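A minimal Swift sketch of this idea, assuming AVFoundation's AVAssetWriter is the built-in tool in question: a display link ticks at ~30 fps, the preview and overlay layers are rendered into one image per tick, and each frame is appended through a pixel buffer adaptor. The class and property names (LayerRecorder, layersToCapture, etc.) are illustrative, not from any framework:

```swift
import UIKit
import AVFoundation

final class LayerRecorder {
    private var displayLink: CADisplayLink?
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private let renderSize: CGSize
    private let layersToCapture: [CALayer]   // e.g. [previewLayer, overlayLayer]
    private var startTime: CFTimeInterval?

    init(outputURL: URL, size: CGSize, layers: [CALayer]) throws {
        renderSize = size
        layersToCapture = layers
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: size.width,
            AVVideoHeightKey: size.height
        ]
        input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB,
                kCVPixelBufferWidthKey as String: size.width,
                kCVPixelBufferHeightKey as String: size.height
            ])
        writer.add(input)
    }

    func start() {
        writer.startWriting()
        writer.startSession(atSourceTime: .zero)
        let link = CADisplayLink(target: self, selector: #selector(tick))
        // Modern equivalent of setting frameInterval = 2 on a 60 Hz screen.
        link.preferredFramesPerSecond = 30
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    @objc private func tick(_ link: CADisplayLink) {
        guard input.isReadyForMoreMediaData else { return }
        if startTime == nil { startTime = link.timestamp }

        // Render the preview layer and the overlay layer into one combined image.
        let renderer = UIGraphicsImageRenderer(size: renderSize)
        let image = renderer.image { ctx in
            for layer in layersToCapture {
                layer.render(in: ctx.cgContext)
            }
        }

        // Copy the combined image into a pixel buffer and append it to the writer.
        guard let pool = adaptor.pixelBufferPool else { return }
        var pixelBuffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(nil, pool, &pixelBuffer)
        guard let buffer = pixelBuffer, let cgImage = image.cgImage else { return }

        CVPixelBufferLockBaseAddress(buffer, [])
        let context = CGContext(
            data: CVPixelBufferGetBaseAddress(buffer),
            width: Int(renderSize.width), height: Int(renderSize.height),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
        context?.draw(cgImage, in: CGRect(origin: .zero, size: renderSize))
        CVPixelBufferUnlockBaseAddress(buffer, [])

        let elapsed = link.timestamp - (startTime ?? link.timestamp)
        let time = CMTime(seconds: elapsed, preferredTimescale: 600)
        adaptor.append(buffer, withPresentationTime: time)
    }

    func stop(completion: @escaping () -> Void) {
        displayLink?.invalidate()
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```

Note that rendering an AVCaptureVideoPreviewLayer with render(in:) often yields an empty frame, which is one more reason to prefer the raw-stream approach below when quality matters.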
Of course, resolution and quality are limited by the layers' parameters. If you need the unprocessed video stream from the camera, you should capture that stream and then draw your overlay data directly into the captured video frames.
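A sketch of that second approach, assuming an AVCaptureVideoDataOutput whose sample buffer delegate composites a Core Image overlay over each incoming frame (the FrameCompositor name and overlayImage property are illustrative):

```swift
import AVFoundation
import CoreImage

final class FrameCompositor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let ciContext = CIContext()
    var overlayImage: CIImage?   // your overlay, pre-rendered as a CIImage

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // Composite the overlay over the raw camera frame, then render it back
        // into the same pixel buffer.
        let frame = CIImage(cvPixelBuffer: pixelBuffer)
        if let overlay = overlayImage {
            let composited = overlay.composited(over: frame)
            ciContext.render(composited, to: pixelBuffer)
        }
        // From here, hand the modified buffer to an AVAssetWriter as above.
    }
}
```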