How to efficiently transfer real-time video between two iOS devices (e.g. FaceTime, Skype, Fring, Tango)

I know how to get a frame with the iOS SDK (see Apple's Technical Q&A QA1702, "How to capture video frames from the camera as images using AV Foundation": http://developer.apple.com/library/ios/#qa/qa1702/_index.html). That gives me a raw pixel buffer, which I can convert to JPEG.

The way I want to transfer the video is as follows:

On iOS device A:

  • Get a pixel buffer (or JPEG) from the capture callback

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

  1. Encode the frames to H.264 with an existing encoder such as ffmpeg (libx264)

  2. Encapsulate the video in an MPEG-TS stream

  3. Start an HTTP server and wait for requests
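The frames for step 1 arrive in the delegate callback above. A minimal sketch of turning a sample buffer into JPEG data, following Apple's QA1702 sample (it assumes the output is configured for 32BGRA; the JPEG quality value is arbitrary):

```objc
// Assumes videoSettings requests kCVPixelFormatType_32BGRA.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void  *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width       = CVPixelBufferGetWidth(imageBuffer);
    size_t height      = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the raw BGRA pixels in a CGImage, then a UIImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace,
        kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);

    NSData *jpegData = UIImageJPEGRepresentation(image, 0.7);
    // Hand jpegData (or the raw pixel buffer) to the encoder / network layer.
}
```

Note that sending a JPEG per frame (effectively M-JPEG) costs far more bandwidth than H.264, which is why steps 1-3 above feed the raw frames to a real encoder instead.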

On the other iOS device, B:

  • Make an HTTP request to A (using plain HTTP instead of RTP/RTSP)

So my question is: do I need to use ffmpeg to get an H.264 stream, or can I get one from the iOS API? If I do use ffmpeg to encode H.264 (libx264), how should I do that? Is there any sample code or recommendation?
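(For reference: at the time, AVFoundation's hardware H.264 encoder was only reachable through AVAssetWriter, which writes to a file rather than a live stream. Since iOS 8, VideoToolbox exposes the encoder directly, so ffmpeg is not strictly required if you can target iOS 8+. A minimal sketch, with placeholder frame size and no error handling:)

```objc
#import <VideoToolbox/VideoToolbox.h>

// Called once per encoded frame. sampleBuffer holds H.264 NAL units in
// AVCC (length-prefixed) format -- convert to Annex B before muxing into TS.
static void encodeCallback(void *refcon, void *frameRefCon, OSStatus status,
                           VTEncodeInfoFlags flags, CMSampleBufferRef sampleBuffer)
{
    if (status != noErr || sampleBuffer == NULL) return;
    // Extract the NAL units here and pass them to the TS muxer / HTTP server.
}

VTCompressionSessionRef session = NULL;
VTCompressionSessionCreate(kCFAllocatorDefault, 640, 480,   // placeholder size
                           kCMVideoCodecType_H264,
                           NULL, NULL, NULL,
                           encodeCallback, NULL, &session);
VTSessionSetProperty(session, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
VTCompressionSessionPrepareToEncodeFrames(session);

// Then, inside captureOutput:didOutputSampleBuffer:fromConnection::
// VTCompressionSessionEncodeFrame(session,
//     CMSampleBufferGetImageBuffer(sampleBuffer),
//     CMSampleBufferGetPresentationTimeStamp(sampleBuffer),
//     kCMTimeInvalid, NULL, NULL, NULL);
```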

I have already read the question "What is the best way to directly stream the iPhone camera to a media server?" It is a pretty good discussion, but I want to know the details.

2 answers

The ffmpeg license is not compatible with iOS apps distributed through the App Store.

If you want to stream real-time video at any useful frame rate, you will not want to use HTTP or TCP: retransmissions and head-of-line blocking add latency, so a UDP-based transport (such as RTP) is the usual choice.
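As a minimal illustration of the UDP path (plain BSD sockets, which work unchanged on iOS; the peer address and port here are placeholders), each 188-byte TS packet or RTP packet would go out as its own datagram:

```objc
#include <string.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>

int fd = socket(AF_INET, SOCK_DGRAM, 0);

struct sockaddr_in peer;
memset(&peer, 0, sizeof(peer));
peer.sin_family = AF_INET;
peer.sin_port   = htons(5004);                       // placeholder port
inet_pton(AF_INET, "192.168.1.42", &peer.sin_addr);  // placeholder peer address

// Send each 188-byte TS packet (or one RTP packet) as a single datagram;
// the receiver must tolerate loss and reordering.
sendto(fd, tsPacket, 188, 0, (struct sockaddr *)&peer, sizeof(peer));
```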


Although this does not directly answer your question about which video format to use, I would suggest exploring third-party frameworks such as TokBox (OpenTok) or QuickBlox. There is a good tutorial on building a video chat app using Parse and OpenTok here:

http://www.iphonegamezone.net/ios-tutorial-create-iphone-video-chat-app-using-parse-and-opentok-tokbox/

