Live Streaming Video in C#

I am developing a real-time streaming application. It has two streaming parts: I use a capture card to capture a live source, which must be transmitted in real time, and I also need to stream local video files.

For streaming the live video, I use Emgu CV to capture the video as bitmap images. I create a list of bitmaps and save the captured frames to this list from a separate thread, and I also show these frames in an image window. The list holds one second of video: if the frame rate is 30, it stores 30 frames. Once the list is full, I start another thread to encode that chunk of video.

For encoding, I use the FFmpeg wrapper called NReco VideoConverter. I write the video frames to ffmpeg and run ffmpeg to encode them. After stopping this task, I can get the encoded data as a byte array.

Then I send this data using the UDP protocol over the local network.
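A minimal sketch of that sending step. The `UdpStreamSender` class and its `Split` helper are my own names, not from the question, and 1400 bytes is just an assumed payload size that fits under a typical Ethernet MTU; a 2-second H.264 chunk is far larger than one UDP datagram, so it has to be split before sending.

```csharp
using System;
using System.Collections.Generic;
using System.Net.Sockets;

// Hypothetical helper for sending an encoded chunk over UDP.
static class UdpStreamSender
{
    // Split the encoded buffer into datagrams that fit a typical MTU.
    public static List<byte[]> Split(byte[] data, int maxPayload = 1400)
    {
        var packets = new List<byte[]>();
        for (int offset = 0; offset < data.Length; offset += maxPayload)
        {
            int len = Math.Min(maxPayload, data.Length - offset);
            var packet = new byte[len];
            Buffer.BlockCopy(data, offset, packet, 0, len);
            packets.Add(packet);
        }
        return packets;
    }

    // Send each datagram to the receiver (e.g. VLC listening on udp://@:port).
    public static void Send(UdpClient client, string host, int port, byte[] encoded)
    {
        foreach (var packet in Split(encoded))
            client.Send(packet, packet.Length, host, port);
    }
}
```

Note that plain UDP gives no delivery or ordering guarantees, which is one possible source of the frame loss described below.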

It works, but I can't achieve a smooth stream. When I receive the stream in VLC player, there is a delay of several milliseconds between packets, and I also noticed that frames are being dropped.

private Capture _capture = null;
Image<Bgr, Byte> frame;

// Here I capture the frames and store them in a list
private void ProcessFrame(object sender, EventArgs arg)
{
    frame = _capture.QueryFrame();
    frameBmp = frame.ToBitmap();
    twoSecondVideoBitmapFramesForEncode.Add(frameBmp);

    if (twoSecondVideoBitmapFramesForEncode.Count == (int)FrameRate)
    {
        isInitiate = false;
        thread = new Thread(new ThreadStart(encodeTwoSecondVideo));
        thread.IsBackground = true;
        thread.Start();
    }
}

public void encodeTwoSecondVideo()
{
    List<Bitmap> copyOfTwoSecondVideo = twoSecondVideoBitmapFramesForEncode.ToList();
    twoSecondVideoBitmapFramesForEncode.Clear();

    int g = (int)FrameRate * 2;

    // Create the ffmpeg task. These are the parameters I use for H.264 encoding
    string outPutFrameSize = frameWidth.ToString() + "x" + frameHeight.ToString();
    ms = new MemoryStream();

    // Create the live encoding task and set the main parameters for the encode
    ffMpegTask = ffmpegConverter.ConvertLiveMedia(
        Format.raw_video,
        ms,
        Format.h264,
        new ConvertSettings()
        {
            // raw BGR24 input matches the Windows bitmap pixel format
            CustomInputArgs = " -pix_fmt bgr24 -video_size " + frameWidth + "x" + frameHeight
                + " -framerate " + FrameRate + " ",
            CustomOutputArgs = " -threads 7 -preset ultrafast -profile:v baseline -level 3.0"
                + " -tune zerolatency -qp 0 -pix_fmt yuv420p -g " + g + " -keyint_min " + g
                + " -flags -global_header -sc_threshold 40 -qscale:v 1 -crf 25 -b:v 10000k"
                + " -bufsize 20000k -s " + outPutFrameSize + " -r " + FrameRate
                + " -pass 1 -coder 1 -movflags frag_keyframe -movflags +faststart"
                + " -c:a libfdk_aac -b:a 128k "
        });
    ffMpegTask.Start();

    // Take the 2-second chunk of bitmaps from the list and write them to ffmpeg
    foreach (var item in copyOfTwoSecondVideo)
    {
        id++;
        Thread.Sleep((int)(1000.5 / FrameRate));
        BitmapData bd = item.LockBits(
            new Rectangle(0, 0, item.Width, item.Height),
            ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb);
        byte[] buf = new byte[bd.Stride * item.Height];
        Marshal.Copy(bd.Scan0, buf, 0, buf.Length);
        ffMpegTask.Write(buf, 0, buf.Length);
        item.UnlockBits(bd);
    }
}

This is the process I use to create the real-time stream, but the stream is not smooth. I then tried using a queue instead of a list to reduce the delay. I thought the latency occurred because the encoding thread encodes and sends a 2-second chunk very quickly, but by the time it finishes, the list of bitmaps is not yet completely filled, so the encoding thread stalls until the next 2-second chunk is ready.

If someone can help me figure this out, I would be very grateful. If the way I am doing it is wrong, please correct me. Thanks!

+8
c# ffmpeg video
2 answers

It's hard to say much about the code, since your snippets do not show the whole process.

First of all, you can eliminate the frame buffer (the list of bitmaps) entirely. Just create one live encoding process (creating a new process every 2 seconds is a very bad idea) and push bitmaps to the VideoConverter with the Write method as they arrive. Since you get frames from the capture device in real time, you also do not need any manual delays (Thread.Sleep((int)(1000.5 / FrameRate))). As a result, you should get smooth video on the VLC side (some latency, usually around 200-500 ms, is inevitable due to encoding and network transmission).
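A sketch of this suggestion, reusing the ConvertLiveMedia/Write API already shown in the question. The LiveEncoder class name and the trimmed-down ffmpeg arguments are my own simplifications, and the field names are assumptions; it is not runnable without the NReco.VideoConverter package and an ffmpeg binary.

```csharp
using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;
using NReco.VideoConverter;

class LiveEncoder
{
    ConvertLiveMediaTask _task;

    // Start ONE encoding task for the whole session, not one per 2-second chunk.
    public void Start(System.IO.Stream output, int width, int height, int fps)
    {
        var ffmpeg = new FFMpegConverter();
        _task = ffmpeg.ConvertLiveMedia(
            Format.raw_video, output, Format.h264,
            new ConvertSettings
            {
                CustomInputArgs = " -pix_fmt bgr24 -video_size " + width + "x" + height
                    + " -framerate " + fps + " ",
                CustomOutputArgs = " -preset ultrafast -tune zerolatency -pix_fmt yuv420p "
            });
        _task.Start();
    }

    // Called from the capture callback for every frame; no Thread.Sleep needed,
    // because the capture device already paces the frames in real time.
    public void PushFrame(Bitmap frame)
    {
        var bd = frame.LockBits(new Rectangle(0, 0, frame.Width, frame.Height),
                                ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb);
        try
        {
            var buf = new byte[bd.Stride * frame.Height];
            Marshal.Copy(bd.Scan0, buf, 0, buf.Length);
            _task.Write(buf, 0, buf.Length);
        }
        finally { frame.UnlockBits(bd); }
    }

    public void Stop() => _task.Stop();
}
```

The key change is that `Start` runs once and `PushFrame` is invoked directly from the capture event, so no intermediate list or per-chunk thread is needed.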

If you get frames from the capture device in fits and starts, you can try FFmpeg's "-re" option.

+1

I have now changed my code. When the frame buffer is full, I start a thread that encodes the video chunk. Inside this thread, I encode the video frames and save the encoded data in a thread-safe queue. After this queue fills up to a point, I start a timer. The timer fires every 200 milliseconds and sends the encoded data.
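A minimal sketch of this queue-plus-timer design using only the base class library. The `PacedUdpSender` name and the one-chunk-per-tick policy are my assumptions; the 200 ms interval comes from the description above.

```csharp
using System;
using System.Collections.Concurrent;
using System.Timers;

// Hypothetical paced sender: encoder threads enqueue chunks,
// a timer drains them at a steady rate.
class PacedUdpSender
{
    readonly ConcurrentQueue<byte[]> _queue = new ConcurrentQueue<byte[]>();
    readonly Action<byte[]> _send;
    readonly Timer _timer;

    public PacedUdpSender(Action<byte[]> send, double intervalMs = 200)
    {
        _send = send;
        _timer = new Timer(intervalMs);
        _timer.Elapsed += (s, e) => DrainOnce();
    }

    public void Enqueue(byte[] encoded) => _queue.Enqueue(encoded);
    public void Start() => _timer.Start();

    // Sends at most one queued chunk per tick so output stays evenly paced.
    public bool DrainOnce()
    {
        if (_queue.TryDequeue(out var chunk))
        {
            _send(chunk);
            return true;
        }
        return false; // queue underrun: the player's buffer may starve
    }
}
```

In the real application, the `send` delegate would wrap something like `udpClient.Send(chunk, chunk.Length, host, port)`; a `DrainOnce` returning false is exactly the underrun condition described for the 1080p case below.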

This works very well, and I get a smooth stream on the receiving side. I tested it with 720p video. But when I try to stream 1080p video, it works well at first; after a while, the stream starts to stutter. I noticed this happens when my streaming application does not send data quickly enough, so the player's buffer goes empty for a few milliseconds. I think this is because Emgu CV does not capture frames in real time: it captures very quickly for low-resolution video, but with 1080p HD video the capture slows down, even at the same frame rate. A frame is captured each time the "Application.Idle += ProcessFrame;" event fires.

I have a capture card that can capture live video when I have an HDMI source, but I do not know how to capture from a local video file using the capture card; that is why I used OpenCV (Emgu CV). I also removed the extra thread, as you suggested.

0
