# How does obs-studio synchronize audio and video?



## choes (Nov 21, 2016)

Hi, I'm trying to develop a streamer SDK for Android and iOS.

I'm a novice when it comes to A/V, so it's difficult for me to understand A/V synchronization. The following is my understanding from the source code (the latest version):

1. the _video_thread_ and _audio_thread_ wait for the closest starting point before they encode and insert the packets into _interleaved_packets_ (let's take the _rtmp_output_ module as an example).
2. the A/V packets in _interleaved_packets_ are sorted by _dts_usec_, then the first packet is added to the _rtmp_stream_.
3. each A/V packet is muxed according to the FLV specification before it's sent to an RTMP server.
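If it helps to make step 2 concrete, the sorted insertion can be sketched roughly like this (a simplified illustration only, not the actual obs-studio code; `struct packet` and the function name here are stand-ins for the real structures):

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Simplified stand-in for an encoded A/V packet. */
struct packet {
    int64_t dts_usec;   /* decode timestamp in microseconds */
    int     is_video;   /* 1 = video, 0 = audio */
};

/* Insert `pkt` into `list` (current length *count) so the array
 * stays sorted by dts_usec, mimicking interleaved_packets. */
static void insert_interleaved(struct packet *list, size_t *count,
                               const struct packet *pkt)
{
    size_t idx = *count;

    /* walk back past any packets with a later dts */
    while (idx > 0 && list[idx - 1].dts_usec > pkt->dts_usec)
        idx--;

    /* shift the tail up one slot and drop the packet in place */
    memmove(&list[idx + 1], &list[idx],
            (*count - idx) * sizeof(*list));
    list[idx] = *pkt;
    (*count)++;
}
```

Once the list is ordered this way, "send the first packet" always means "send the packet with the smallest dts", regardless of whether it is audio or video.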

How is A/V synchronization ensured in this process? What is the key to A/V synchronization?
Please give me some tips or reference materials, thanks in advance.


----------



## Jim (Nov 21, 2016)

I'm not entirely sure what you're asking.  How do you ensure sync?  Have accurate timestamps on your raw audio/video data, and then start encoding at the same starting timestamp.  Your raw audio data can be trimmed to match the timestamp of the video frame you want to start at.
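To illustrate the trimming Jim describes (a minimal sketch under assumed units; the function name is made up and microsecond timestamps are assumed, this is not an OBS API):

```c
#include <stdint.h>
#include <stddef.h>

/* Given raw audio starting at `audio_ts_usec`, compute how many
 * samples (per channel) to drop so the remaining audio lines up
 * with the first video frame at `video_start_usec`. */
static size_t samples_to_trim(int64_t audio_ts_usec,
                              int64_t video_start_usec,
                              uint32_t sample_rate)
{
    if (audio_ts_usec >= video_start_usec)
        return 0; /* audio already starts at or after the video */

    int64_t diff_usec = video_start_usec - audio_ts_usec;
    return (size_t)(diff_usec * sample_rate / 1000000);
}
```

For example, at 48 kHz, audio that starts 10 ms before the first video frame would have 480 samples trimmed, so that both streams begin encoding at the same timestamp.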


----------



## choes (Nov 21, 2016)

Jim said:


> I'm not entirely sure what you're asking.  How do you ensure sync?  Have accurate timestamps on your raw audio/video data, and then start encoding at the same starting timestamp.  Your raw audio data can be trimmed to match the timestamp of the video frame you want to start at.


Hi Jim, thanks for your response, and sorry for my unclear wording. I'm still at the starting point of my streamer SDK, and I need to understand A/V sync thoroughly before continuing.

Let me ask a specific question: what does an RTMP player need in order to play A/V streams? How does it ensure A/V sync?


----------



## Jim (Nov 21, 2016)

How does the client/viewer side ensure sync?  They do it with the timestamps on the video/audio packets.

Syncing is all about the timestamps.  The timing information on your audio and video -- whether encoded or not, whether coming from the streamer or being played back by the user -- is key to ensuring sync.
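As a rough illustration of what a player does with those timestamps (a simplified sketch, not any particular player's code; names and the tolerance parameter are assumptions): audio typically plays continuously as the master clock, and for each video frame the player compares the frame's PTS against the audio clock to decide whether to show it, hold it, or drop it.

```c
#include <stdint.h>

enum sync_action { SHOW_FRAME, WAIT_FRAME, DROP_FRAME };

/* Compare the next video frame's PTS with the current audio clock
 * (both in microseconds). `tolerance_usec` is how far off we allow
 * before acting, e.g. half a frame duration. */
static enum sync_action sync_decision(int64_t video_pts_usec,
                                      int64_t audio_clock_usec,
                                      int64_t tolerance_usec)
{
    int64_t diff = video_pts_usec - audio_clock_usec;

    if (diff > tolerance_usec)
        return WAIT_FRAME;  /* frame is early: hold it back */
    if (diff < -tolerance_usec)
        return DROP_FRAME;  /* frame is late: drop it to catch up */
    return SHOW_FRAME;      /* close enough: present it now */
}
```

The whole scheme only works if the timestamps written by the streamer were accurate in the first place, which is Jim's point above.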

The interleaving stuff you saw doesn't have anything to do with sync; it's more because muxers/demuxers will break if the audio/video packets are out of order in terms of timing information.


----------



## choes (Nov 22, 2016)

Jim said:


> How does the client/viewer side ensure sync?  They do it with the timestamps on the video/audio packets.
> 
> Syncing is all about the timestamps.  The timing information on your audio and video -- whether encoded or not, whether coming from the streamer or being played back by the user -- is key to ensuring sync.
> 
> The interleaving stuff you saw doesn't have anything to do with sync; it's more because muxers/demuxers will break if the audio/video packets are out of order in terms of timing information.


Jim, could you talk more about timestamps? I searched Google and found some explanations of timestamps; most of them mention PTS and DTS. How does obs-studio process these two timestamps?


----------

