I know how to get a frame from the iOS SDK ([How to capture video frames from a camera as images using AV Foundation](http://developer.apple.com/library/ios/#qa/qa1702/_index.html)). The frame arrives as a pixel buffer, and I can convert it to JPEG.
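For reference, the conversion I am doing now looks roughly like this (a minimal sketch along the lines of QA1702; it assumes the video data output is configured for kCVPixelFormatType_32BGRA, and the function name is just my own):

```
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Minimal sketch: turn one CMSampleBufferRef (BGRA) into JPEG data.
static NSData *JPEGDataFromSampleBuffer(CMSampleBufferRef sampleBuffer)
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress  = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width       = CVPixelBufferGetWidth(imageBuffer);
    size_t height      = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the raw BGRA pixels in a CGImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Compress to JPEG.
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return UIImageJPEGRepresentation(image, 0.7);
}
```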
The way I want to transfer the video is as follows:
On iOS device A:
- Get the pixel buffer (or JPEG) from the capture callback:
  -(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
- Encode it to H.264 with an existing encoder, i.e. ffmpeg (libx264); a rough sketch of what I mean is below this list
- Encapsulate the video in an MPEG-TS stream
- Start an HTTP server and wait for requests
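The encoding step is the part I am least sure about. This is roughly what I imagine the libx264 side would look like, assuming x264 is cross-compiled for iOS and the capture output delivers NV12 (kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) frames; the function names are mine and I have not verified this on a device:

```
#import <Foundation/Foundation.h>
#import <CoreVideo/CoreVideo.h>
#include "x264.h"

// Rough sketch: open a libx264 encoder and push one NV12 camera frame into it.
static x264_t *OpenEncoder(int width, int height, int fps)
{
    x264_param_t param;
    x264_param_default_preset(&param, "ultrafast", "zerolatency");
    param.i_csp            = X264_CSP_NV12;   // matches 420YpCbCr8BiPlanar from the camera
    param.i_width          = width;
    param.i_height         = height;
    param.i_fps_num        = fps;
    param.i_fps_den        = 1;
    param.b_repeat_headers = 1;               // emit SPS/PPS before each keyframe
    param.b_annexb         = 1;               // Annex-B start codes, what a TS muxer expects
    x264_param_apply_profile(&param, "baseline");
    return x264_encoder_open(&param);
}

// Returns the Annex-B NAL bytes for one frame, or nil if the encoder buffered it.
static NSData *EncodeFrame(x264_t *encoder, CVPixelBufferRef pixelBuffer, int64_t pts)
{
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    x264_picture_t pic_in, pic_out;
    x264_picture_init(&pic_in);
    pic_in.img.i_csp       = X264_CSP_NV12;
    pic_in.img.i_plane     = 2;
    pic_in.img.plane[0]    = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0); // Y plane
    pic_in.img.i_stride[0] = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    pic_in.img.plane[1]    = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1); // interleaved CbCr
    pic_in.img.i_stride[1] = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);
    pic_in.i_pts           = pts;

    x264_nal_t *nals = NULL;
    int nalCount = 0;
    int size = x264_encoder_encode(encoder, &nals, &nalCount, &pic_in, &pic_out);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    if (size <= 0 || nalCount == 0)
        return nil;
    // x264 lays out the NAL payloads for one frame contiguously, so the first
    // payload pointer plus the total size covers all of them.
    return [NSData dataWithBytes:nals[0].p_payload length:size];
}
```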
On the other iOS device (B):
- Make an HTTP request to A (using HTTP simply instead of RTP/RTSP), as sketched below
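On the B side I was thinking of nothing fancier than this (a minimal sketch; the class name and the URL of A's server are placeholders):

```
#import <Foundation/Foundation.h>

// Minimal sketch for device B: pull the TS stream over HTTP and hand each
// chunk to whatever demuxer/decoder ends up on this side.
@interface TSStreamClient : NSObject <NSURLConnectionDataDelegate>
@end

@implementation TSStreamClient

- (void)start
{
    // Placeholder URL for device A's HTTP server.
    NSURL *url = [NSURL URLWithString:@"http://192.168.1.10:8080/live.ts"];
    NSURLRequest *request = [NSURLRequest requestWithURL:url];
    // Delegate callbacks deliver the body incrementally, which is what we
    // want for a stream that never ends.
    [NSURLConnection connectionWithRequest:request delegate:self];
}

- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data
{
    // TODO: feed `data` (MPEG-TS packets) into the demuxer/decoder.
    NSLog(@"got %lu bytes of TS data", (unsigned long)data.length);
}

- (void)connection:(NSURLConnection *)connection didFailWithError:(NSError *)error
{
    NSLog(@"stream failed: %@", error);
}

@end
```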
So my question is: do I need to use ffmpeg to get the H.264 stream, or can I get it from the iOS API? If I use ffmpeg to encode H.264 (libx264), how do I do this? Is there any code sample or recommendation?
I have read the question "What is the best way to directly stream the iPhone camera to a media server?". It is a pretty good discussion, but I want to know the details.