Hello everyone!
I know there are a lot of questions about FFmpeg on iOS, but no answer fits my case :( Something strange happens every time I try to link FFmpeg into my project, so please help me!
My task is to write a video chat application for iOS that uses the RTMP protocol to publish and read video streams to/from a custom Flash Media Server.
I decided to use the free open-source library rtmplib to stream FLV video over RTMP, as it is the only library I found that fits.
I ran into a lot of problems when I started studying it, but later I understood how it is supposed to work.
Now I can read a live FLV video stream (from a URL) and send it back to the channel from my application.
My problem now is sending video FROM the camera. As I understand it, the sequence of basic operations should be as follows:

1. Using AVFoundation, through the chain Device → AVCaptureSession → AVCaptureVideoDataOutput → AVAssetWriter, I write the camera output to a file (I can describe this flow in more detail if needed, but it doesn't matter in the context of the question). This gives me hardware-accelerated, real-time H.264 encoding of the camera video, but the result is in a MOV container. (This step is completed.)
2. With each recorded sample, I read this temporary file and obtain a stream of video bytes (H.264, in the QuickTime container). (This step is completed.)
3. I need to convert the video data from the QuickTime container format to the FLV container format, packet by packet, and all of it in real time.
4. Once I have video packets wrapped in the FLV container format, I can send them over RTMP using rtmplib (see the sketch after this list).
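To make step 4 concrete, here is roughly what I mean by sending FLV data through rtmplib, assuming it is librtmp from the rtmpdump project (the URL is a placeholder and error handling is minimal; this is a sketch, not my exact code):

```c
#include <librtmp/rtmp.h>

/* Push a buffer of FLV tags to the media server over RTMP. */
int publish_flv(const char *url, const char *flv_tags, int size)
{
    RTMP *r = RTMP_Alloc();
    RTMP_Init(r);

    if (!RTMP_SetupURL(r, (char *)url))
        goto fail;
    RTMP_EnableWrite(r);  /* we are publishing, not playing */

    if (!RTMP_Connect(r, NULL) || !RTMP_ConnectStream(r, 0))
        goto fail;

    /* RTMP_Write() consumes FLV tag data, so it can be fed the
     * repacked packets produced in step 3. */
    if (RTMP_Write(r, flv_tags, size) <= 0)
        goto fail;

    RTMP_Close(r);
    RTMP_Free(r);
    return 0;

fail:
    RTMP_Close(r);
    RTMP_Free(r);
    return -1;
}
```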
Now the hardest part for me is step 3.
I think I need to use the ffmpeg libraries for this conversion (libavformat). I even found source code showing how to demux H.264 data packets from a MOV file (looking through libavformat, I found that these packets can even be extracted from a byte stream, which suits me better). Once that is done, I will need to wrap the packets in FLV (with ffmpeg, or by adding the FLV headers to the H.264 packets by hand; that part is easy, if I understand it correctly).
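To show what I have in mind for step 3, here is a sketch of the repacking with libavformat, based on my reading of the API of the current releases (file paths are placeholders, error handling is trimmed, and I have not been able to verify any of it on iOS yet): the H.264 payload is copied from the MOV demuxer to the FLV muxer without re-encoding.

```c
#include <libavformat/avformat.h>
#include <libavutil/mathematics.h>

/* Repack H.264 packets from a MOV/QuickTime file into an FLV file,
 * stream-copying the payload (no decode/encode). */
int remux_mov_to_flv(const char *in_path, const char *out_path)
{
    AVFormatContext *in_ctx = NULL, *out_ctx = NULL;
    AVPacket pkt;
    unsigned i;

    av_register_all();

    if (avformat_open_input(&in_ctx, in_path, NULL, NULL) < 0)
        return -1;
    if (avformat_find_stream_info(in_ctx, NULL) < 0)
        return -1;

    avformat_alloc_output_context2(&out_ctx, NULL, "flv", out_path);
    if (!out_ctx)
        return -1;

    /* One output stream per input stream, codec parameters copied,
     * so the packets pass through untouched. */
    for (i = 0; i < in_ctx->nb_streams; i++) {
        AVStream *out_st = avformat_new_stream(out_ctx, NULL);
        if (!out_st)
            return -1;
        avcodec_copy_context(out_st->codec, in_ctx->streams[i]->codec);
        out_st->codec->codec_tag = 0;  /* let the FLV muxer pick its tag */
    }

    if (!(out_ctx->oformat->flags & AVFMT_NOFILE) &&
        avio_open(&out_ctx->pb, out_path, AVIO_FLAG_WRITE) < 0)
        return -1;

    avformat_write_header(out_ctx, NULL);

    /* Shovel packets across, rescaling timestamps between the two
     * containers' time bases. (A real implementation must also handle
     * AV_NOPTS_VALUE here.) */
    while (av_read_frame(in_ctx, &pkt) >= 0) {
        AVStream *in_st  = in_ctx->streams[pkt.stream_index];
        AVStream *out_st = out_ctx->streams[pkt.stream_index];
        pkt.pts = av_rescale_q(pkt.pts, in_st->time_base, out_st->time_base);
        pkt.dts = av_rescale_q(pkt.dts, in_st->time_base, out_st->time_base);
        pkt.duration = av_rescale_q(pkt.duration, in_st->time_base,
                                    out_st->time_base);
        av_interleaved_write_frame(out_ctx, &pkt);
        av_free_packet(&pkt);
    }

    av_write_trailer(out_ctx);
    avformat_close_input(&in_ctx);
    if (!(out_ctx->oformat->flags & AVFMT_NOFILE))
        avio_close(out_ctx->pb);
    avformat_free_context(out_ctx);
    return 0;
}
```

For the byte-stream case I would replace the file input with a custom AVIOContext, but the packet loop stays the same.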
FFmpeg has excellent documentation and is a very powerful library, and I don't think using it will be a problem. BUT the problem is that I cannot get it to work in an iOS project.
I have spent 3 days reading the documentation, Stack Overflow, and Google results for "How to build FFmpeg for iOS", and I think my project manager is going to fire me if I spend another week trying to compile this library :))
I tried many build scripts and configure options, but whenever I build FFmpeg I get libavformat, libavcodec, etc. compiled for the x86 architecture (even when I specify armv6 in the build script). (I use "lipo -info libavcodec.a" to display the architectures.)
So I cannot build these sources myself, and I decided to find a prebuilt FFmpeg compiled for the armv7, armv6, and i386 architectures.
I downloaded iOS Comm Lib from MidnightCoders on GitHub, which contains an example of FFmpeg usage: it ships prebuilt .a files for avcodec, avformat, and the other FFmpeg libraries.
I checked their architectures:
iMac-2:MediaLibiOS root
And I realized they were right for me! When I added these libraries and headers to the Xcode project, it compiles fine (I don't even get warnings like "the library was compiled for a different architecture"), and I can use the structures from the headers. But when I try to call a C function from libavformat (av_register_all()), the linker gives me the error: "Symbol(s) not found for architecture armv7: av_register_all".
I thought that maybe the symbols are missing from the lib, so I tried to list them:
root
Now I am stuck here: I don't understand why Xcode cannot see these symbols, and I cannot move forward.
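The only idea I have left to rule out is C++ name mangling: as far as I know, the FFmpeg headers have no extern "C" guards of their own, so if they are included from an Objective-C++ (.mm) file, the linker looks for C++-mangled names and reports "Symbol(s) not found". A wrapper header like this (the file name is my own) should rule that out, though I am not sure this is my case:

```c
/* ffmpeg_shim.h -- force C linkage for the FFmpeg API when the
 * including file is compiled as C++/Objective-C++. */
#ifdef __cplusplus
extern "C" {
#endif

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

#ifdef __cplusplus
}
#endif
```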
Please correct me if I am wrong in my understanding of the flow for publishing an RTMP stream from iOS, and help me build and link FFmpeg for iOS.
I have the iOS 5.1 SDK and Xcode 4.2.