I am trying to stream live H.264 video from a Logitech C920 webcam on an Odroid device (a robot), through ffserver running on a separate server (CentOS 7.1), to the user's browser, without re-encoding the H.264 video.
Getting real-time video into the browser is a problem in itself, so for now I'm just trying to get the Logitech C920 on the Odroid to send its own H.264 real-time stream through ffserver to users without transcoding along the way. Obviously I want to avoid re-encoding, since it would eat too much CPU and kill the real-time nature of the video. Later I may need to change the container to .flv or RTP so that it plays in the browser in real time. I use the Logitech C920 because it can do H.264 encoding in hardware. (I tested this by saving the stream directly to a file, and it works, apart from the well-known jerkiness caused by a Linux kernel bug: http://sourceforge.net/p/linux-uvc/mailman/message/33164469/ , but that's a different story entirely.)
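For reference, this is roughly how I verified that the camera delivers hardware-encoded H.264 (a sketch; the device path, `v4l2-ctl` availability, and output filename may differ on your system):

```
# List the pixel formats the camera exposes (needs the v4l-utils package);
# on a C920, H.264 should appear alongside YUYV and MJPEG.
v4l2-ctl --device=/dev/video0 --list-formats

# Capture the camera's own H.264 stream and mux it without re-encoding:
# -input_format h264 asks the v4l2 driver for the hardware-encoded stream,
# and -c:v copy keeps ffmpeg from touching it.
ffmpeg -f v4l2 -input_format h264 -video_size 1920x1080 -i /dev/video0 -c:v copy test.mp4
```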
The problem is that no matter how I set up ffmpeg and ffserver, as soon as ffserver is in the picture the stream gets re-encoded (even from h264 (native) to h264 (libx264)), which pegs the CPU on the Odroid at 100% and introduces a huge delay in the video stream.
The following are the ffmpeg and ffserver settings.
The ffmpeg command on the Odroid device sending the H.264 stream to ffserver:
$ ffmpeg -s 1920x1080 -f v4l2 -vcodec h264 -i /dev/video0 -copyinkf -vcodec copy http://xxxyyyy.com:8090/feed1.ffm
ffmpeg version N-72744-g653bf3c Copyright (c) 2000-2015 the FFmpeg developers
  built with gcc 4.8 (Ubuntu/Linaro 4.8.2-19ubuntu1)
  configuration: --prefix=/home/odroid/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/odroid/ffmpeg_build/include --extra-ldflags=-L/home/odroid/ffmpeg_build/lib --bindir=/home/odroid/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-nonfree
  libavutil      54. 27.100 / 54. 27.100
  libavcodec     56. 41.100 / 56. 41.100
  libavformat    56. 36.100 / 56. 36.100
  libavdevice    56.  4.100 / 56.  4.100
  libavfilter     5. 16.101 /  5. 16.101
  libswscale      3.  1.101 /  3.  1.101
  libswresample   1.  2.100 /  1.  2.100
  libpostproc    53.  3.100 / 53.  3.100
Input
And /etc/ffserver.conf on the server running ffserver:
HTTPPort 8090
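The rest of the file defines a feed and a stream roughly along these lines (a sketch; the feed/stream names, limits, and sizes here are illustrative, not my exact values):

```
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 200
MaxClients 100
MaxBandwidth 40000

<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 20M
</Feed>

<Stream test.mp4>
Feed feed1.ffm
Format mp4
VideoCodec libx264
VideoSize 1920x1080
NoAudio
</Stream>
```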
As you can see in the ffmpeg output above, transcoding happens on the Odroid device, maxing out its CPU:
Stream mapping: Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
I already tried setting the VideoCodec value in the ffserver configuration to libx264 explicitly, tried the -re option in ffmpeg, tried different ffmpeg command syntaxes, etc. Nothing helps. The re-encoding always happens, so I cannot get ffmpeg and ffserver to simply pass the video stream through as-is.
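For completeness, these are the kinds of command variants I tried (paraphrased from memory; the feed URL is the same one as above):

```
# With -re to read input at the native frame rate:
ffmpeg -re -f v4l2 -vcodec h264 -s 1920x1080 -i /dev/video0 -vcodec copy http://xxxyyyy.com:8090/feed1.ffm

# With the newer -c:v copy syntax and an explicit v4l2 input format:
ffmpeg -f v4l2 -input_format h264 -video_size 1920x1080 -i /dev/video0 -c:v copy http://xxxyyyy.com:8090/feed1.ffm
```

Every variant still produced the same h264 (native) -> h264 (libx264) stream mapping once the output was an ffserver feed.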
Both ffmpeg builds (on the Odroid and on the server) were compiled yesterday (2015-06-09) from source, so they are the latest (and identical) version.
Any ideas?
EDIT: To summarize, the problem is this: I cannot find a way to make ffserver broadcast the h264 (native) stream coming from the Logitech C920 webcam without re-encoding it.