H.264 real-time streaming: timestamps in NAL units?

I am trying to build a system that broadcasts video and audio in real time from Android phones. Video and audio are recorded on the Android side with MediaRecorder and then pushed directly to a server written in Python. Clients access the live stream from their browsers, so I implemented the streaming part of the system in Flash. Right now both the video and the audio reach the client side, but the problem is that they are not synchronized. I am sure this is caused by incorrect timestamp values on the Flash side (I currently increment the timestamp by a fixed 60 ms per video frame, but this value should be variable).

The audio is encoded as AMR on the Android phone, so I know for certain that every AMR frame is 20 ms. That is not the case for the H.264-encoded video, though. To synchronize the two streams, I need to know exactly how many milliseconds each H.264 frame lasts, so that I can set the timestamps when delivering the content through Flash. My question is: is this information available in the H.264 NAL units? I tried to find the answer in the H.264 standard, but the amount of information there is simply overwhelming.
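To make the problem concrete, here is a minimal sketch (in Python, since the server is written in Python) of the timestamp arithmetic I am after. It assumes the video frame rate is known, e.g. the value passed to MediaRecorder.setVideoFrameRate(); only the 20 ms AMR frame duration is fixed by a spec.

```python
# Sketch of per-frame presentation timestamps, assuming a constant,
# known video frame rate. Only the AMR value is guaranteed by the spec.

AMR_FRAME_MS = 20.0  # every AMR frame is exactly 20 ms

def video_ts_ms(frame_index, fps):
    """Presentation timestamp (ms) of the n-th video frame at a fixed fps."""
    return frame_index * 1000.0 / fps

def audio_ts_ms(frame_index):
    """Presentation timestamp (ms) of the n-th AMR frame."""
    return frame_index * AMR_FRAME_MS

# Example: at 15 fps each video frame lasts ~66.7 ms, not a flat 60 ms.
for n in range(4):
    print(n, video_ts_ms(n, fps=15), audio_ts_ms(n))
```

The catch is that on many devices the configured frame rate is only a hint, which is why I would prefer to read per-frame timing out of the bitstream itself.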

Can someone point me in the right direction? Thanks.

1 answer

Timestamps are not in NAL units; they are usually part of RTP. RTP/RTCP also supports media synchronization.
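To make that concrete, here is a sketch of the RTP fixed header from RFC 3550; the 32-bit timestamp field is where the media clock goes. Per the payload specs, H.264 uses a 90 kHz clock (RFC 6184) and AMR an 8 kHz clock (RFC 4867). The frame rate and the dynamic payload type 96 below are illustrative assumptions.

```python
import struct

def rtp_header(seq, timestamp, ssrc, payload_type, marker=False):
    """Build the 12-byte RTP fixed header (RFC 3550):
    version=2, no padding, no extension, no CSRC entries."""
    byte0 = 2 << 6                                  # V=2, P=0, X=0, CC=0
    byte1 = (int(marker) << 7) | (payload_type & 0x7F)
    return struct.pack('!BBHII', byte0, byte1,
                       seq & 0xFFFF, timestamp & 0xFFFFFFFF, ssrc)

# Timestamp increments per frame on each media clock:
VIDEO_TS_STEP = 90000 // 15   # 90 kHz clock, assuming 15 fps -> 6000/frame
AUDIO_TS_STEP = 160           # 8 kHz clock, 20 ms AMR frame -> 160/frame

hdr = rtp_header(seq=0, timestamp=0, ssrc=0x12345678,
                 payload_type=96, marker=True)      # 96: dynamic PT, assumed
```

RTCP sender reports then map each stream's RTP clock to a common wall clock, which is what lets the receiver line the audio and video back up.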

The RTP payload format for H.264 (RFC 6184) may also interest you.
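For example, in that payload format's simplest mode (single NAL unit packets), the RTP payload is just one NAL unit with its Annex B start code stripped; every packet of a frame carries the same RTP timestamp, and the marker bit is set on the frame's last packet. A rough sketch of splitting an Annex B byte stream, assuming 3- or 4-byte start codes:

```python
import re

def split_annex_b(stream):
    """Yield raw NAL units from an Annex B byte stream (sketch).

    Splits on 00 00 01 start codes; the leading zero of a 4-byte
    start code is trimmed off the end of the preceding unit."""
    matches = list(re.finditer(b'\x00\x00\x01', stream))
    for i, m in enumerate(matches):
        end = matches[i + 1].start() if i + 1 < len(matches) else len(stream)
        nal = stream[m.end():end]
        if i + 1 < len(matches):
            # any trailing zeros here are the next start code's prefix
            nal = nal.rstrip(b'\x00')
        yield nal

# Each yielded NAL unit would become one RTP payload; all packets of
# the same frame share one timestamp, marker bit set on the last one.
```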

If you are not using RTP, are you just sending raw blocks of data over the network?

