A few frames are lost if I use av_read_frame in FFmpeg

I have an HEVC sequence with 3500 frames, and I am writing a small decoder that reads it frame by frame and dumps each frame to YUV. In my main(), a for loop calls decode() 3500 times (assume for now that main() knows how many frames there are).

So every call to decode() must return exactly one complete decoded frame. This is what decode() looks like:

bool decode(AVFormatContext *pFormatCtx, AVCodecContext *pCodecCtx)
{
    int gotaFrame = 0;

    // packet, pFrame and videoStreamIndex are globals set up elsewhere
    while (gotaFrame == 0) {

        printf("1\t");

        if (av_read_frame(pFormatCtx, &packet) == 0) {
            if (packet.stream_index == videoStreamIndex) {

                // try decoding
                avcodec_decode_video2(pCodecCtx, pFrame, &gotaFrame, &packet);

                if (gotaFrame) {  // decode success

                    printf("2\t");

                    // dump to yuv ... not shown here

                    // cleanup; pFrame itself is reused on the next call,
                    // so only its data is released here
                    av_frame_unref(pFrame);
                    av_free_packet(&packet);

                    return true;
                }
            }
            av_free_packet(&packet);   // release packets that produced no frame
        } else {
            return false;              // av_read_frame failed (end of stream)
        }
    }
    return false;   // not reached
}
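
The globals it uses (pFormatCtx, pCodecCtx, pFrame, packet, videoStreamIndex) are set up in main() beforehand, roughly like this (a simplified sketch using the same deprecated API, not my exact code):

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

AVFormatContext *pFormatCtx = NULL;
AVCodecContext  *pCodecCtx  = NULL;
AVFrame         *pFrame     = NULL;
AVPacket         packet;
int              videoStreamIndex = -1;

static int setup(const char *filename)
{
    av_register_all();

    if (avformat_open_input(&pFormatCtx, filename, NULL, NULL) < 0)
        return -1;
    if (avformat_find_stream_info(pFormatCtx, NULL) < 0)
        return -1;

    // pick the first video stream
    for (unsigned i = 0; i < pFormatCtx->nb_streams; i++) {
        if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
            videoStreamIndex = (int)i;
            break;
        }
    }
    if (videoStreamIndex < 0)
        return -1;

    pCodecCtx = pFormatCtx->streams[videoStreamIndex]->codec;
    AVCodec *codec = avcodec_find_decoder(pCodecCtx->codec_id);
    if (!codec || avcodec_open2(pCodecCtx, codec, NULL) < 0)
        return -1;

    pFrame = av_frame_alloc();
    return pFrame ? 0 : -1;
}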

The output looks like this: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 2 1 2 ... So it reads quite a few packets before it decodes anything? The first frame is an I-frame, so shouldn't it be decodable right away?

With this code I lose a few frames (the long run of 1s). Can someone help me here? Is there something wrong in my code?

2 answers

This is normal behaviour and is usually called decoder delay. The decoder is not a strict "one packet in, one frame out" box: with B-frames the decoding order differs from the display order, so avcodec_decode_video2 has to buffer several packets before it can output the first picture, and until then it reports got_frame == 0.

So where do the "lost" frames go? av_read_frame has already handed you every packet, but avcodec_decode_video2 is still holding the last few decoded frames internally, and your loop never asks for them: at end of stream you have to keep calling the decoder to flush them out.
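
A rough sketch of that flush step, using the same deprecated API as the question (dump_yuv() is just a placeholder for whatever you do with the frame):

// after av_read_frame() reports end of file, drain the frames that
// are still buffered inside the decoder
AVPacket flushPkt;
av_init_packet(&flushPkt);
flushPkt.data = NULL;   // an empty packet tells the decoder to flush
flushPkt.size = 0;

int gotFrame = 1;
while (gotFrame) {
    if (avcodec_decode_video2(pCodecCtx, pFrame, &gotFrame, &flushPkt) < 0)
        break;
    if (gotFrame)
        dump_yuv(pFrame);   // placeholder for your YUV dump
}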


Also note that because of if(packet.stream_index==videoStreamIndex) you silently drop every packet that av_read_frame returns for other streams (audio, subtitles, data); those are not "i-frames", and each of them also prints a 1 with no 2.

And even for the very first I-frame the decoder may not return a picture immediately (i.e. avcodec_decode_video2 sets gotaFrame == 0) because of the decoder delay described above.
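
If you want to see which of the two cases each lone 1 corresponds to, you can label them inside the read loop, for example like this (a sketch that reuses the variable names from the question):

if (av_read_frame(pFormatCtx, &packet) == 0) {
    if (packet.stream_index != videoStreamIndex) {
        // a packet from another stream (audio, subtitles, ...):
        // it never reaches the video decoder
        printf("skip\t");
    } else {
        int got = 0;
        avcodec_decode_video2(pCodecCtx, pFrame, &got, &packet);
        // got == 0: packet consumed, decoder still buffering (decoder delay)
        // got != 0: a full picture came out
        printf(got ? "frame\t" : "buffered\t");
    }
    av_free_packet(&packet);
}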

