RTP iPhone camera - How to read an AVAssetWriter file while writing it?

I am trying to stream the iPhone camera over RTSP / RTP to a Wowza server.

The Apple API does not allow direct access to the H.264 encoded frames, but it does let you write them to a .mov container file.

However, I cannot access the file's content until AVAssetWriter finishes writing, which prevents me from broadcasting the recording in real time.

I tried using a named pipe to read the file's contents in real time, but had no success there - AVAssetWriter will not write to a pre-existing file.

Does anyone know how to do this?

Thanks!

Edit: Starting with iOS 8, there is a public API (VideoToolbox) for the hardware encoder and decoder.
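For later readers: with the iOS 8+ VideoToolbox API mentioned in the edit, you can create a hardware H.264 compression session and receive encoded sample buffers in a callback, ready for RTP packetization, with no file involved. A minimal sketch (the dimensions and the packetization step are placeholders, not part of the original question):

```swift
import VideoToolbox

// Hypothetical sketch: create a hardware H.264 encoder session.
// Each callback invocation delivers an encoded CMSampleBuffer that
// can be packetized for RTP instead of being written to a .mov file.
var session: VTCompressionSession?
let status = VTCompressionSessionCreate(
    allocator: nil,
    width: 1280,              // placeholder dimensions
    height: 720,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: nil,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: { _, _, status, _, sampleBuffer in
        guard status == noErr, let sampleBuffer = sampleBuffer else { return }
        // Hand the encoded frame to your RTP packetizer here.
    },
    refcon: nil,
    compressionSessionOut: &session)
```

You then feed camera pixel buffers to the session with `VTCompressionSessionEncodeFrame`. This sidesteps the AVAssetWriter file problem entirely on iOS 8 and later.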

2 answers

The only solution I have found that works so far: record without sound, and the file is then written to the location you specify.
Otherwise, it is apparently written to a temporary location you cannot access.

Apple's sample for video capture is AVCam.
You will need to remove the audio channels.
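In AVCam terms, the workaround above amounts to simply not adding the audio device input to the capture session. A minimal sketch (names follow the modern Swift API, not the Objective-C code in the original AVCam sample):

```swift
import AVFoundation

// Hypothetical sketch: a video-only capture session. Per the answer
// above, omitting the audio input makes the recording land at the
// file URL you specify rather than an inaccessible temporary path.
let session = AVCaptureSession()
session.sessionPreset = .high

if let camera = AVCaptureDevice.default(for: .video),
   let videoInput = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(videoInput) {
    session.addInput(videoInput)
}
// Deliberately NOT adding AVCaptureDevice.default(for: .audio):
// no audio track means no sound in the recording.
```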

If anyone has a better way, please post it here.


You can use AVCaptureVideoDataOutput to process / stream each frame while the camera is running, and feed the same frames to an AVAssetWriter to record the video file at the same time (appending each frame from the video output queue).
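A sketch of that split, assuming a fixed 1280x720 output and a hypothetical `sendToStreamingPipeline` function standing in for whatever encoder/packetizer feeds the server:

```swift
import AVFoundation

// Hypothetical sketch: one sample-buffer delegate both records
// frames (via AVAssetWriter) and hands them to a streaming path.
final class FrameTap: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let writer: AVAssetWriter
    private let writerInput: AVAssetWriterInput
    private var sessionStarted = false

    init(outputURL: URL) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,   // placeholder dimensions
            AVVideoHeightKey: 720
        ])
        writerInput.expectsMediaDataInRealTime = true
        writer.add(writerInput)
        writer.startWriting()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        if !sessionStarted {
            writer.startSession(atSourceTime: pts)
            sessionStarted = true
        }
        // 1. Record: append the frame to the .mov file.
        if writerInput.isReadyForMoreMediaData {
            writerInput.append(sampleBuffer)
        }
        // 2. Stream: hand the same frame to your own encoder/packetizer.
        // sendToStreamingPipeline(sampleBuffer)  // hypothetical
    }
}
```

Set an instance of this class as the delegate of your AVCaptureVideoDataOutput (on a dedicated serial dispatch queue), so recording and streaming share one capture pipeline.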

See also Simultaneous AVCaptureVideoDataOutput and AVCaptureMovieFileOutput and Can I use AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time?

