How to reduce the size of video captured by default camera using FFMPEG in Android?

I am trying to reduce the size of the video captured by the default camera (it generates high-resolution video) in Android. Does FFMPEG have the ability to encode video at a given resolution? I tried searching Google, but all the examples use FFMPEG's command-line mode.

My questions:

  • Can we use ffmpeg command line in Android?
  • If not, how will we achieve this?
  • Is it possible to record video directly using ffmpeg in Android?
  • Are there any other solutions for this?
1 answer

It is possible to compile ffmpeg for Android, and also to run ffmpeg from the command line there. There is no need to delve into native code and JNI calls unless you need something more advanced than what the command line provides.
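Since the command line is enough for the asker's goal (a smaller, lower-resolution file), the invocation can be assembled in Java before being passed to `exec()`. A minimal sketch; the 640x480 target, the `-b:v 1000k` bitrate cap, and the paths are illustrative assumptions, not values from the answer:

```java
public class ResizeCommand {
    // Assembles an ffmpeg invocation that re-encodes a clip at a lower
    // resolution: "-s WxH" sets the output frame size and "-b:v" caps the
    // video bitrate so the re-encoded file actually shrinks. The 640x480
    // target and the paths used below are illustrative.
    static String build(String ffmpegPath, String input, String output) {
        return ffmpegPath + " -i " + input + " -s 640x480 -b:v 1000k " + output;
    }

    public static void main(String[] args) {
        System.out.println(build("/data/data/yourpackagename/files/ffmpeg",
                "/mnt/sdcard/in.mp4", "/mnt/sdcard/out.mp4"));
    }
}
```

Lowering both the resolution and the bitrate is what shrinks the file; changing only one of the two usually gives a much smaller saving.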

For reference, this is the shell script I run to compile ffmpeg (it runs under Ubuntu, which makes things a lot easier than Windows):

#!/bin/bash
ANDROID_API=android-3
export ANDROID_NDK=${HOME}/android-ndk
export ANDROID_SDK=${HOME}/android-sdk
SYSROOT=$ANDROID_NDK/platforms/$ANDROID_API/arch-arm
ANDROID_BIN=$ANDROID_NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/*-x86/bin/
CROSS_COMPILE=${ANDROID_BIN}/arm-linux-androideabi-
export PATH=$PATH:$ANDROID_SDK/tools:$ANDROID_SDK/platform-tools
export ARM_ROOT=${HOME}/android-ndk
export ARM_INC=$ARM_ROOT/platforms/android-5/arch-arm/usr/include
export ARM_LIB=$ARM_ROOT/platforms/android-5/arch-arm/usr/lib
export LIB_INC=${HOME}/include
export LIB_LIB=${HOME}/lib
CFLAGS=" -I$ARM_INC -fPIC -DANDROID -fpic -mthumb-interwork -ffunction-sections -funwind-tables -fstack-protector -fno-short-enums -D__ARM_ARCH_5__ -D__ARM_ARCH_5T__ -D__ARM_ARCH_5E__ -D__ARM_ARCH_5TE__ -Wno-psabi -march=armv5te -mtune=xscale -msoft-float -mthumb -Os -fomit-frame-pointer -fno-strict-aliasing -finline-limit=64 -DANDROID -Wa,--noexecstack -MMD -MP "
LDFLAGS=" -nostdlib -Bdynamic -Wl,--no-undefined -Wl,-z,noexecstack -Wl,-z,nocopyreloc -Wl,-soname,/system/lib/libz.so -Wl,-rpath-link=$ARM_LIB,-dynamic-linker=/system/bin/linker -L$ARM_LIB -nostdlib $ARM_LIB/crtbegin_dynamic.o $ARM_LIB/crtend_android.o -lc -lm -ldl -lgcc "
FLAGS="--target-os=linux --enable-cross-compile --cross-prefix=$CROSS_COMPILE --arch=arm --prefix=$HOME --disable-shared --enable-static --extra-libs=-static --extra-cflags=--static --enable-small --disable-asm --disable-yasm --disable-amd3dnow --disable-amd3dnowext --disable-mmx --disable-mmx2 --disable-sse --disable-ssse3 --disable-indevs"
export CFLAGS=$EXTRA_CFLAGS
export LDFLAGS=$EXTRA_LDFLAGS

./configure $FLAGS --extra-cflags="$CFLAGS" --extra-ldflags="$LDFLAGS" \
    --cc="${CROSS_COMPILE}gcc --sysroot=${SYSROOT}" \
    --cxx="${CROSS_COMPILE}g++ --sysroot=${SYSROOT}" \
    --nm="${CROSS_COMPILE}nm" \
    --ar="${CROSS_COMPILE}ar"

make clean
make -j4 || exit 1
make install || exit 1

As for running ffmpeg: first copy the ffmpeg binary into the application's files directory and chmod 755 it using getRuntime().exec(), then run ffmpeg with a line like the following:

 Process p = Runtime.getRuntime().exec("/data/data/yourpackagename/files/ffmpeg -i in.mp4 out.mp4");
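The copy-and-chmod step can be sketched in plain Java. On Android the input stream would typically come from context.getAssets().open("ffmpeg") and the destination from context.getFilesDir(); those calls, and the class and method names here, are assumptions for illustration:

```java
import java.io.*;
import java.nio.file.*;

public class FfmpegInstaller {
    // Copies a bundled binary to a writable directory and marks it
    // executable -- the equivalent of "chmod 755" on the device.
    static File install(InputStream src, File destDir, String name) throws IOException {
        File dest = new File(destDir, name);
        try (OutputStream out = new FileOutputStream(dest)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = src.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        // File.setExecutable works on Android API 9+; exec'ing
        // "/system/bin/chmod 755 <path>" is the older alternative.
        dest.setExecutable(true, false);
        return dest;
    }

    public static void main(String[] args) throws IOException {
        // Demo with a fake payload; on Android, pass the asset stream instead.
        File dir = Files.createTempDirectory("ffmpeg-demo").toFile();
        File bin = install(new ByteArrayInputStream("fake".getBytes()), dir, "ffmpeg");
        System.out.println(bin.getName() + " executable=" + bin.canExecute());
    }
}
```

Once installed, the returned path is exactly what gets prefixed to the exec() command string above. It is worth calling waitFor() on the returned Process (and draining its output streams) so the transcode finishes before you read the output file.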

Getting the camera input into ffmpeg in a format it can understand is the hard bit, and one I'm still trying to figure out. I have a related Stack Overflow question: Decode H264 hardware encoded channel using ffmpeg in real time

