I am developing a real-time streaming application with two streaming parts: one captures a live source through a capture card and transmits it in real time, and the other streams a local video file.
For streaming the live video, I use Emgu CV to capture the frames as bitmaps. I keep a list of bitmaps, save each captured bitmap to that list from a single thread, and also display the frames in an image window. The list buffers one second of video: if the frame rate is 30, it holds 30 frames. Once the list is full, I start another thread to encode that one-second chunk of video.
For encoding, I use the FFmpeg wrapper NReco. I write the video frames to FFmpeg and run FFmpeg to encode them. After stopping the task, I can get the encoded data as a byte array.
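For reference, the bare lifecycle of the NReco task looks like this (a trimmed-down sketch of my code below; the hard-coded 1280x720 and 30 fps values are placeholders for my actual frameWidth, frameHeight, and FrameRate fields):

    using System.IO;
    using NReco.VideoConverter;

    var ffmpegConverter = new FFMpegConverter();
    var ms = new MemoryStream();

    // Raw BGR24 frames in, H.264 stream out.
    var task = ffmpegConverter.ConvertLiveMedia(
        Format.raw_video, ms, Format.h264,
        new ConvertSettings()
        {
            CustomInputArgs = " -pix_fmt bgr24 -video_size 1280x720 -framerate 30 ",
            CustomOutputArgs = " -preset ultrafast -tune zerolatency "
        });
    task.Start();
    // ... task.Write(rawFrameBytes, 0, rawFrameBytes.Length) for each frame ...
    task.Stop();                   // let FFmpeg flush and exit
    byte[] encoded = ms.ToArray(); // the byte array I then send over UDP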
Then I send this data using the UDP protocol over the local network.
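The sending side is roughly the following sketch; the endpoint (192.168.1.10:1234) and the 1400-byte payload size are placeholder examples, not values from my real code. I split the chunk into datagrams because a whole encoded chunk is far larger than a single UDP datagram should be:

    using System;
    using System.Net;
    using System.Net.Sockets;

    static void SendChunk(byte[] encoded)
    {
        var endpoint = new IPEndPoint(IPAddress.Parse("192.168.1.10"), 1234);
        using (var udp = new UdpClient())
        {
            const int payload = 1400; // stay under the typical Ethernet MTU
            for (int offset = 0; offset < encoded.Length; offset += payload)
            {
                int count = Math.Min(payload, encoded.Length - offset);
                byte[] datagram = new byte[count];
                Buffer.BlockCopy(encoded, offset, datagram, 0, count);
                udp.Send(datagram, datagram.Length, endpoint);
            }
        }
    }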
It works, but I can't achieve a smooth stream. When I receive the stream in VLC, there is a delay of several milliseconds between packets, and I have also noticed dropped frames.
    using System.Collections.Generic;
    using System.Drawing;
    using System.Drawing.Imaging;
    using System.IO;
    using System.Linq;
    using System.Runtime.InteropServices;
    using System.Threading;
    using Emgu.CV;
    using Emgu.CV.Structure;
    using NReco.VideoConverter;

    private Capture _capture = null;
    Image<Bgr, Byte> frame;

    // Here I capture the frames and buffer them in a list.
    private void ProcessFrame(object sender, EventArgs arg)
    {
        frame = _capture.QueryFrame();
        frameBmp = frame.ToBitmap();
        twoSecondVideoBitmapFramesForEncode.Add(frameBmp);

        // Once one second of frames is buffered, encode it on another thread.
        if (twoSecondVideoBitmapFramesForEncode.Count == (int)FrameRate)
        {
            isInitiate = false;
            thread = new Thread(new ThreadStart(encodeTwoSecondVideo));
            thread.IsBackground = true;
            thread.Start();
        }
    }

    public void encodeTwoSecondVideo()
    {
        // Copy the buffered frames so the capture thread can keep filling the list.
        List<Bitmap> copyOfTwoSecondVideo = twoSecondVideoBitmapFramesForEncode.ToList();
        twoSecondVideoBitmapFramesForEncode.Clear();

        int g = (int)FrameRate * 2; // GOP size
        string outPutFrameSize = frameWidth.ToString() + "x" + frameHeight.ToString();

        ms = new MemoryStream();

        // Create the FFmpeg live-encoding task; these are the parameters I use for H.264.
        ffMpegTask = ffmpegConverter.ConvertLiveMedia(
            Format.raw_video,
            ms,
            Format.h264,
            new ConvertSettings()
            {
                // Raw Windows bitmap input: BGR24 pixels at the capture size and rate.
                CustomInputArgs = " -pix_fmt bgr24 -video_size " + frameWidth + "x" + frameHeight +
                                  " -framerate " + FrameRate + " ",
                CustomOutputArgs = " -threads 7 -preset ultrafast -profile:v baseline -level 3.0" +
                                  " -tune zerolatency -qp 0 -pix_fmt yuv420p -g " + g +
                                  " -keyint_min " + g + " -flags -global_header -sc_threshold 40" +
                                  " -qscale:v 1 -crf 25 -b:v 10000k -bufsize 20000k -s " + outPutFrameSize +
                                  " -r " + FrameRate + " -pass 1 -coder 1 -movflags frag_keyframe" +
                                  " -movflags +faststart -c:a libfdk_aac -b:a 128k "
            });
        ffMpegTask.Start();

        // Take the buffered chunk of video and write each frame to FFmpeg as raw BGR24 bytes.
        foreach (var item in copyOfTwoSecondVideo)
        {
            id++;
            Thread.Sleep((int)(1000.5 / FrameRate)); // pace writes to roughly one frame interval

            BitmapData bd = item.LockBits(
                new Rectangle(0, 0, item.Width, item.Height),
                ImageLockMode.ReadOnly,
                PixelFormat.Format24bppRgb);
            byte[] buf = new byte[bd.Stride * item.Height];
            Marshal.Copy(bd.Scan0, buf, 0, buf.Length);
            ffMpegTask.Write(buf, 0, buf.Length);
            item.UnlockBits(bd);
        }
    }
This is the process I use to create the real-time stream, but playback is not smooth. I also tried using a queue of frame lists so the capture side can keep filling while the encoder works, because I suspected the latency comes from the encoding thread: encoding and sending a chunk is very fast, and when it finishes, the next list of bitmaps is not yet completely filled. So the encoding thread stops and waits until the next chunk of video is ready.
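To make the queue idea concrete, here is a minimal sketch of the producer/consumer hand-off I was trying to build, using .NET's BlockingCollection (EncodeAndSend is a hypothetical placeholder for my existing encode-and-UDP code, and the capacity of 4 is an arbitrary example):

    using System.Collections.Concurrent;
    using System.Collections.Generic;
    using System.Drawing;
    using System.Threading.Tasks;

    // chunkQueue replaces the single shared list: the capture thread adds a
    // finished chunk, and the encoder blocks until the next one is available.
    BlockingCollection<List<Bitmap>> chunkQueue =
        new BlockingCollection<List<Bitmap>>(boundedCapacity: 4);

    // Encoder loop, started once for the lifetime of the stream:
    Task.Run(() =>
    {
        // GetConsumingEnumerable blocks while the next chunk is still filling,
        // so the encoder waits instead of exiting.
        foreach (List<Bitmap> chunk in chunkQueue.GetConsumingEnumerable())
        {
            EncodeAndSend(chunk); // hypothetical name for my encode + UDP code
            foreach (Bitmap bmp in chunk)
                bmp.Dispose();    // release GDI+ bitmap handles
        }
    });

    // Capture side, once one second of frames has accumulated:
    //     chunkQueue.Add(completedChunk);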
If someone can help me figure this out, I would be very grateful. If the way I am doing this is wrong, please correct me. Thanks!