How does obs-studio synchronize audio and video?

choes

New Member
Hi, I'm trying to develop a streamer SDK for Android and iOS.

I'm a novice in A/V, so it's difficult for me to understand A/V synchronization. The following is my understanding from the source code (latest version):

1. The video_thread and audio_thread wait for the closest starting point before they encode and insert packets into interleaved_packets (taking the rtmp_output module as an example).
2. The A/V packets in interleaved_packets are sorted by dts_usec, and then the first packet is added to the rtmp_stream.
3. That A/V packet is multiplexed according to the FLV specification before it is sent to an RTMP server.
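Step 2 above can be sketched roughly as a sorted insert keyed on dts_usec. This is only an illustration of the idea, not OBS's real API; the struct layout and function name here are made up, with only dts_usec taken from the OBS source.

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative packet: only the fields needed for interleaving. */
struct packet {
	int64_t dts_usec; /* decode timestamp in microseconds */
	int is_video;     /* 1 = video, 0 = audio */
};

/* Insert pkt into arr (kept sorted ascending by dts_usec) and
 * return the new element count. Hypothetical helper, not OBS code. */
static size_t insert_sorted(struct packet *arr, size_t count,
			    struct packet pkt)
{
	size_t i = count;
	while (i > 0 && arr[i - 1].dts_usec > pkt.dts_usec) {
		arr[i] = arr[i - 1];
		i--;
	}
	arr[i] = pkt;
	return count + 1;
}
```

After inserting, the packet at index 0 is always the one with the smallest dts_usec, which is the one handed to the stream next.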

How is A/V synchronization ensured in this process? What is the key point of A/V synchronization?
Please give me some tips or reference materials, thanks in advance.
 

Lain

Forum Admin
Lain
Forum Moderator
Developer
I'm not entirely sure what you're asking. How do you ensure sync? Have accurate timestamps on your raw audio/video data, and then start encoding at the same starting timestamp. Your raw audio data can be trimmed to match the timestamp of the video frame you want to start at.
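The trimming described above can be sketched as a small timestamp calculation. This is a hypothetical helper under assumed names (audio_frames_to_trim is not an OBS function); it just converts the gap between the audio buffer's start and the chosen first video frame into a number of audio frames to drop.

```c
#include <stdint.h>

/* Return how many audio frames (samples per channel) to drop from
 * the front of a buffer starting at audio_start_ns so the remaining
 * audio lines up with a video frame starting at video_start_ns.
 * Illustrative sketch only. */
static uint64_t audio_frames_to_trim(uint64_t audio_start_ns,
				     uint64_t video_start_ns,
				     uint32_t sample_rate)
{
	if (video_start_ns <= audio_start_ns)
		return 0; /* audio already starts at or after the video */

	uint64_t gap_ns = video_start_ns - audio_start_ns;

	/* convert the nanosecond gap into a whole number of frames */
	return gap_ns * sample_rate / 1000000000ULL;
}
```

For example, with 48 kHz audio that begins 0.5 s before the first video frame, this yields 24000 frames to trim, so both encoders effectively start at the same timestamp.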
 

choes

New Member
Lain said:
> I'm not entirely sure what you're asking. How do you ensure sync? Have accurate timestamps on your raw audio/video data, and then start encoding at the same starting timestamp. Your raw audio data can be trimmed to match the timestamp of the video frame you want to start at.
Hi Jim, thanks for your response, and sorry for my unclear wording. I'm still at the starting point of my streamer SDK, and I need to understand A/V sync thoroughly before continuing.

Let me ask a specific question: what does an RTMP player need in order to play A/V streams? How does it ensure A/V sync?
 

Lain

Forum Admin
Forum Moderator
Developer
How does the client/viewer side ensure sync? They do it with the timestamps on the video/audio packets.

Syncing is all about the timestamps. The timing information on your audio and video -- whether encoded or not, whether coming from the streamer or being played back by the user -- is key to ensuring sync.

The interleaving stuff you saw doesn't have anything to do with sync; it's more because muxers/demuxers will break if the audio/video packets are out of order in terms of timing information.
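On the player side, the timestamp-driven sync Lain describes usually comes down to comparing each video frame's PTS against a master clock (commonly the audio clock) and deciding whether to show, hold, or drop the frame. The sketch below is a generic illustration of that decision, not code from OBS or any particular player; all names and the threshold are assumptions.

```c
#include <stdint.h>

enum frame_action { SHOW_NOW, WAIT, DROP };

/* Decide what to do with a decoded video frame given the audio
 * clock. Illustrative only; real players (e.g. ffplay) add
 * smoothing and adaptive thresholds on top of this idea. */
static enum frame_action sync_decide(int64_t video_pts_usec,
				     int64_t audio_clock_usec,
				     int64_t threshold_usec)
{
	int64_t diff = video_pts_usec - audio_clock_usec;

	if (diff > threshold_usec)
		return WAIT; /* frame is early: hold it a bit longer */
	if (diff < -threshold_usec)
		return DROP; /* frame is late: drop it to catch up */
	return SHOW_NOW;     /* close enough: present it now */
}
```

The key observation is that this decision needs nothing beyond the timestamps carried on the packets, which is why accurate timestamps on the streamer side are what actually guarantee sync for the viewer.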
 

choes

New Member
Lain said:
> How does the client/viewer side ensure sync? They do it with the timestamps on the video/audio packets.
>
> Syncing is all about the timestamps. The timing information on your audio and video -- whether encoded or not, whether coming from the streamer or being played back by the user -- is key to ensuring sync.
>
> The interleaving stuff you saw doesn't have anything to do with sync; it's more because muxers/demuxers will break if the audio/video packets are out of order in terms of timing information.
Jim, could you say more about timestamps? I searched Google and found some explanations of timestamps; most of them mention PTS and DTS. How does obs-studio handle these two timestamps?
 