Convert video using ffmpeg for Android and iOS mobile devices

I am creating a React Native application for Android and iOS; the backend API is written in NodeJS.

Users can upload videos from their phones. After a video is uploaded, the user and their friends can watch it, so the video must be saved in a format that plays on both Android and iOS.

My question is about converting the videos uploaded by users. I developed a similar application a couple of years ago, where I used the node-fluent-ffmpeg package, which provides a nice API for interacting with FFmpeg.

In the previous project (which was a web application), I converted each uploaded video into two files, one .mp4 and one .webm. If the user uploaded an mp4, I skipped the mp4 step; likewise if they uploaded a .webm.

It was pretty slow. Now I face the same requirement years later, and after some research I think converting every video in that project was the wrong approach.

I read that I can use FFmpeg just to change the video's container format, which is much faster than re-encoding from scratch.

The video conversion code I used last time looked like this:

 var convertVideo = function (source, format, output, success, failure, progress) {
     var converter = ffmpeg(source);
     var audioCodec = "libvorbis";
     if (format.indexOf("mp4") !== -1) {
         audioCodec = "aac";
     }
     converter
         .format(format)
         .withVideoBitrate(1024)
         .withAudioCodec(audioCodec)
         .on('end', success)
         .on('progress', progress)
         .on('error', failure);
     converter.save(output);
 };

Using:

Convert to mp4:

 convertVideo("PATH_TO_VIDEO", "mp4", "foo.mp4", () => {console.log("success");}); 

Convert to webm:

 convertVideo("PATH_TO_VIDEO", "webm", "foo.webm", () => {console.log("success");}); 

Can anyone point out the code smell here regarding the performance of this operation? Does this code do much more than it needs to in order to achieve cross-platform compatibility between iOS and Android?

It is perhaps worth mentioning that support for older OS versions is not a major concern in this project.

+7
android ios ffmpeg video
3 answers

What is the difference between codec and container / format?

You need to understand the difference between a codec (e.g. H.264, VP9) and a container format (e.g. MP4, WebM). The container simply stores the encoded video and audio streams. As a rule, you can change the container without re-encoding the streams ( ffmpeg -i input -c copy output ), but for historical reasons some containers do not accept some codecs, or some players may not handle a given codec inside a given container (for example, only recent software can read VP9 video in an MP4 container). Check out this overview of container formats to find out which codecs are supported.
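To make the container-change idea concrete, here is a minimal sketch (the helper name remuxArgs is mine, not part of fluent-ffmpeg or FFmpeg) that builds the FFmpeg argument list for a pure remux:

```javascript
// Build the ffmpeg argument list for a container change ("remux"):
// the compressed audio/video streams are copied bit-for-bit into the
// new container, so no re-encoding happens and it runs in seconds.
function remuxArgs(input, output) {
  return [
    "-i", input,  // source file
    "-c", "copy", // copy all streams instead of re-encoding
    output        // container is inferred from the output extension
  ];
}

// Example: move H.264/AAC streams from a .mov into an .mp4 container.
// This only works if the target container accepts the source codecs.
console.log("ffmpeg " + remuxArgs("input.mov", "output.mp4").join(" "));
```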

What are the restrictions imposed by various mobile OS?

To target both the iOS and Android platforms, you need to check whether a given video file uses a codec and container supported by both.

This can of course change over time, but as a rule the common denominator is:

  • H.264 video
  • MP4 container
  • Main profile
  • Maximum resolution of 1920 × 1080. The resolution may be higher if the device supports it, but there is rarely a need to exceed Full HD on mobile devices, since viewers cannot tell the difference between that and 4K/UHD unless the screen is a large tablet.
  • Frame rate no higher than a certain maximum (e.g. 60 fps)
  • Chroma subsampling of 4:2:0
  • AAC-LC audio

The exact restrictions obviously depend on the device and on the installed OS version, and not all of them are mentioned in the iOS/Android documentation. You should definitely do some testing and, if unsure, transcode the video.
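That check could be sketched like this. The field names here loosely follow ffprobe-style metadata and are illustrative; the exact fields depend on how you probe the file:

```javascript
// Check probed stream metadata against the common iOS/Android baseline
// listed above. Returns true if the file should be re-encoded.
// The `meta` shape is an assumption, not a fixed ffprobe schema.
function needsTranscode(meta) {
  const okVideo =
    meta.videoCodec === "h264" &&
    ["Constrained Baseline", "Baseline", "Main"].includes(meta.profile) &&
    meta.width <= 1920 && meta.height <= 1080 &&
    meta.fps <= 60 &&
    meta.pixFmt === "yuv420p"; // 4:2:0 chroma subsampling
  const okAudio = meta.audioCodec === "aac";
  const okContainer = meta.container === "mp4";
  return !(okVideo && okAudio && okContainer);
}
```

If this returns false, a container change (stream copy) is all you need; otherwise, transcode.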

So which codec / format should be encoded with?

Apple has invested heavily in the MPEG ecosystem and has traditionally had better support for H.264 and H.265 (HEVC); it does not support VP8 and VP9 in WebM. So if you have VP8/VP9 video and want it to play cross-platform, transcode it to H.264.

How do I make the actual encoding?

Make sure you use a high enough bitrate so that you do not add further artifacts to your already lossily compressed video. You should not just set a one-pass target bitrate, as you are doing now. Instead, perform two-pass encoding to improve quality and encoding efficiency (although it takes more time). You can also use constant-quality mode if you do not need a specific file size (for example, CRF for libx264 ). Read more in the FFmpeg H.264 encoding guide.
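As a sketch of the raw FFmpeg arguments for the two strategies (the helper names are mine; the flags are standard libx264 options):

```javascript
// Constant quality (CRF): one pass, quality-targeted, final file size
// unknown in advance. Lower CRF = higher quality; 23 is libx264's default.
function crfArgs(input, output, crf) {
  return ["-i", input, "-c:v", "libx264", "-crf", String(crf),
          "-c:a", "aac", output];
}

// Two-pass: pass 1 analyzes the video (output discarded), pass 2 encodes
// to an average bitrate in kbit/s. Slower, but hits a target file size.
// Note: "/dev/null" is the Unix null device; use "NUL" on Windows.
function twoPassArgs(input, output, kbps) {
  const common = ["-i", input, "-c:v", "libx264", "-b:v", kbps + "k"];
  return [
    [...common, "-pass", "1", "-an", "-f", "null", "/dev/null"],
    [...common, "-pass", "2", "-c:a", "aac", output]
  ];
}
```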

What about the future?

Please note that almost all major players in the technology industry, except Apple, have joined the Alliance for Open Media. They are developing a VP9 successor called AV1, which will be supported by all major browser vendors (Chrome, Firefox, Edge) and by Android.

H.265/HEVC also seems like a good choice, but encoding with x265 , for example, is currently still very slow compared to x264 , the most popular open-source H.264 encoder.

+3

WebM and MP4 are just containers; they say nothing about the codecs used. Usually WebM will contain VP8 or VP9, and MP4 in most cases H.264. You can change containers with a stream copy (ffmpeg's -vcodec copy option), but you cannot change the codec without re-encoding the entire stream. When you stream-copy, you also cannot resize the video or change the bitrate. Copying is exactly what it sounds like: you copy the underlying frames as-is and wrap them in another container.

I would question why you want to encode WebM at all. Both iOS and Android will play (most) MP4 videos without problems. If you want to enforce a specific format, you can probe the incoming video and check whether it complies with certain standards (for example, baseline H.264 video and AAC audio, no more than 1080p). If the input stream does not comply with this standard, re-encode it, specifying the codec, bitrate and size. If it does comply, just run a stream copy with -vcodec copy -acodec copy. In your particular library that would look like .audioCodec('copy').videoCodec('copy').
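That per-stream decision could be sketched as follows. The 'copy' values mirror fluent-ffmpeg's videoCodec/audioCodec methods, but the decision logic and probe field names are my own illustration:

```javascript
// Decide per stream whether to copy or re-encode, based on a simple
// probe summary. A compliant stream is copied untouched; a
// non-compliant one is re-encoded to the mobile baseline.
function codecChoice(probe) {
  const videoOk = probe.videoCodec === "h264" && probe.height <= 1080;
  const audioOk = probe.audioCodec === "aac";
  return {
    videoCodec: videoOk ? "copy" : "libx264",
    audioCodec: audioOk ? "copy" : "aac"
  };
}

// With fluent-ffmpeg this would then be applied roughly as:
//   const c = codecChoice(probe);
//   ffmpeg(src).videoCodec(c.videoCodec).audioCodec(c.audioCodec).save(dst);
```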

+1

It's easy: just go the same way as the large vendors like YouTube, which currently means MP4 / H.264 / AAC.

Since you are targeting Android, or better "any" OS or device, aim for maximum compatibility. You should also take care of the detailed codec settings: the hardware decoders built into devices are usually quite picky about what they are willing to decode.

YouTube specifies its video codec settings like this:

For H.264:

  • Progressive scan (no interlacing)
  • High profile
  • 2 consecutive B-frames
  • Closed GOP, with a GOP length of half the frame rate
  • CABAC
  • Variable bitrate; no hard bitrate limit is required
  • Chroma subsampling: 4:2:0

The only thing I would add: you do have to take care of the bitrate. If you go too high (for example, more than 15 Mbit/s for Full HD video), hardware decoders can run into problems.

For AAC:

  • Channels: stereo or stereo + 5.1
  • Sample rate: 96 kHz or 48 kHz

Container: MP4

  • No edit lists (or videos may not be processed correctly)
  • moov atom at the beginning of the file (fast start)
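Most of the settings above map directly to FFmpeg flags. A hedged sketch (the helper name is mine; values are a starting point, not a definitive recipe):

```javascript
// Assemble ffmpeg arguments approximating the YouTube-style settings
// listed above. CABAC is the default for libx264's High profile, and
// libx264 produces closed GOPs unless open GOP is explicitly enabled.
function youtubeStyleArgs(input, output, fps) {
  return [
    "-i", input,
    "-c:v", "libx264",
    "-profile:v", "high",              // High profile
    "-bf", "2",                        // 2 consecutive B-frames
    "-g", String(Math.round(fps / 2)), // GOP length of half the frame rate
    "-pix_fmt", "yuv420p",             // 4:2:0 chroma subsampling
    "-c:a", "aac", "-ac", "2",         // stereo AAC audio
    "-ar", "48000",                    // 48 kHz sample rate
    "-movflags", "+faststart",         // moov atom at the front of the file
    output
  ];
}
```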

In addition, H.264 is definitely the codec the most work has gone into, so it encodes much faster than VP8/VP9 and co. So in my opinion there is no good reason (besides "ethical" ones) not to go for H.264 only; and since x264 is simply an open-source encoder for H.264, its output gives you the same compatibility.

Conclusion:

Can anyone point out the code smell here regarding the performance of this operation? Does this code do much more than it needs to in order to achieve compatibility between iOS and Android?

Unfortunately, no; it all depends on what kind of videos you get as input. Your code definitely does not do too much, but rather too little if you want to save computing time. You could check whether the input video already matches all of the parameters above before sending it for encoding, but frankly, in my experience you will get the most compatible results if you simply encode all your inputs with the settings above.

+1
