Bad frame detection in OpenCV 2.4.9

I know the title is a bit vague, but I'm not sure how else to describe it.

CentOS with ffmpeg + OpenCV 2.4.9. I am working on a simple motion detection system that uses the stream from an IP camera (H.264).

From time to time, the stream hiccups and throws a "bad frame" (see pic-bad.png link below). The problem is that these frames are significantly different from previous frames and cause the "motion" event to fire even if no actual movement has occurred.

Illustrations are below.

Good shot (motion capture):

Good frame

Bad frame (no movement, only a broken frame):

Bad frame

A bad frame arrives at random. I suppose I could build a bad-frame detector by looping through the pixels downward from a certain position to check whether they are all identical, but I wonder whether there is a more effective, "by the book" approach to detecting these kinds of bad frames so I can simply skip them.
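For reference, the column-scan idea described above could be sketched like this in plain C++ on a raw grayscale buffer (the buffer layout, `startRow`, and the function name are assumptions for illustration, not part of the original program):

```cpp
#include <cstdint>
#include <vector>

// Sketch of the "scan down from a certain position" idea: starting at
// startRow, check whether every pixel in column x of a row-major grayscale
// buffer has the same value. A run of identical pixels down to the bottom
// of the image suggests a truncated/broken frame.
bool ColumnIsUniform(const std::vector<std::uint8_t>& gray,
                     int width, int height, int x, int startRow) {
    std::uint8_t first = gray[startRow * width + x];
    for (int y = startRow + 1; y < height; ++y) {
        if (gray[y * width + x] != first) return false;
    }
    return true;
}
```

In practice you would sample a few columns rather than one, since a legitimately flat region (a wall, the sky) can also produce a uniform column.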

Thanks!

UPDATE:

The frames are captured by the C++ motion detection program through cvQueryFrame(camera), so I do not interact with ffmpeg directly; OpenCV handles it on the back end. I am using the latest ffmpeg compiled from git source, and all the other libraries (h264, etc.) were also downloaded and compiled yesterday. The data comes from an RTSP stream (ffserver). I tested several cameras (Dahua models, 1-3 MP), and the bad frames appear consistently on all of them, although not all the time, only once in a while (for example, once every 10 minutes).

+7
c++ opencv ffmpeg motion
3 answers

My first idea is to measure the dissimilarity between a known-good frame and the frame being checked by counting the pixels that do not match. Dividing that count by the frame area gives a percentage of dissimilarity. If it is above 0.5, we can say the checked frame is not valid, because it differs too much from the good example.

This assumption only holds if the camera is static (it does not move) and moving objects never come too close to it. The exact distance depends on the focal length, but with wide-angle lenses, for example, objects should not appear closer than about 30 cm; otherwise an object can "jump" into the frame out of nowhere and cover more than 50% of the frame area, which would trip the check.

Here is an OpenCV function that does what I described. You can raise the dissimilarity threshold if you expect motion between consecutive frames to be larger. Note that the first parameter should be a known-good frame.

    bool IsBadFrame(const cv::Mat &goodFrame, const cv::Mat &nextFrame) {
        // assert(goodFrame.size() == nextFrame.size())
        cv::Mat g, g2;
        cv::cvtColor(goodFrame, g, CV_BGR2GRAY);
        cv::cvtColor(nextFrame, g2, CV_BGR2GRAY);
        // Mask of pixels whose grayscale values differ between the two frames.
        cv::Mat diff = g2 != g;
        // Fraction of differing pixels: 0 = identical, 1 = completely different.
        float dissimilarity = (float)cv::countNonZero(diff) /
                              (goodFrame.size().height * goodFrame.size().width);
        return dissimilarity > 0.5f;
    }
+5

You do not mention whether you use the command line or the ffmpeg libraries, but in the latter case you can check the bad-frame flag (I forget its exact name) and simply ignore those frames.

+1

Remove waitKey(50) or change it to waitKey(1). I think OpenCV does not spawn a separate thread to perform the capture, so when there is a pause it confuses the buffer management routines, causing bad frames. Maybe?

I have Dahua cameras and have observed that bad frames appear when the latency is higher, and they disappear completely with waitKey(1). The pause does not have to come from waitKey: processing routines that take a long time cause the same pauses and lead to bad frames.

This suggests the pause between successive frame captures should be minimal. A solution would be to use two threads that perform capture and processing separately.

0
