Question / Help Gradual Audio/Video source desync (NOT an Elgato product)

Boildown

Active Member
If it doesn't happen in Xsplit using the same hardware and OS setup, we can eliminate those as causes. This restricts it to something in OBS, perhaps a bug in x264 itself, or OBS's implementation of x264.

Next I would try disabling most of your cores. The biggest difference now between your encode PC and mine is the huge number of threads x264 will try to open by default. Limit it to 4 or 8 cores, and disable the second physical CPU. An easier test may be to set a lower thread limit, like threads=12 in the Custom x264 commands, but if that doesn't work, I'd actually try disabling cores (in the BIOS presumably), and see if the problem remains.
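For a sense of scale, x264's "auto" thread count is (as I understand its default heuristic) roughly 1.5x the logical core count, capped at an internal maximum, so a dual-Xeon box spawns far more encoder threads than a typical quad-core streaming PC. A minimal sketch, assuming that 1.5x heuristic and a cap of 128 (both are my assumptions about x264's defaults, not verified against its source):

```python
# Rough sketch of x264's "threads=auto" heuristic (assumed here to be
# ~1.5x logical CPUs, capped at 128) to show why a dual-Xeon machine
# behaves so differently from a typical quad-core streaming PC.

def x264_auto_threads(logical_cpus, thread_max=128):
    """Approximate x264's default frame-thread count (assumed ~1.5x CPUs)."""
    return min(logical_cpus * 3 // 2, thread_max)

for cpus in (4, 8, 16, 32):  # 32 = e.g. dual 8-core Xeons with Hyper-Threading
    print(f"{cpus:2d} logical CPUs -> ~{x264_auto_threads(cpus)} encoder threads")
```

If that heuristic holds, a 32-thread dual-Xeon defaults to something like 48 encoder threads, which is why forcing `threads=12` in the custom x264 commands is a meaningful test.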

What happens if you don't use x264 encoding? Try NVENC or QuickSync encoding (well, probably just NVENC in your case) and see if the problem remains.
 

FerretBomb

Active Member
OBS does use a different timing method than XSplit (per @Jim it's much more accurate than whatever XS uses).

But yeah, something about how OBS handles video captures is... weird. Or problematic.
I was just doing differential delay testing. Anything running through OBS (as a Video Capture Device, or via the native Datapath Vision plugin) has significantly more delay than any card run through AmaRec 2.20c and then captured via a DWM Game Capture. The native DPV plugin is close, but it's still a few frames behind. Haven't tried the AmarecLive (virtual webcam) plugin yet, but I'd expect it to be further behind the native capture.
 

Boildown

Active Member
Delay as in just the video, not the audio? Or both are delayed? If both are delayed the same amount it doesn't really matter.
 

EDGAR_SEC

Member
Disabled the BCLK OC for the 5-hour and 3-hour Battlefront streams last night to see if that was the issue. Same results: the sync is fine for the first hour, then the audio creeps to ~250ms behind by around the 2.5hr mark. Oddly enough, it seems to top out there.
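For scale, some back-of-envelope arithmetic on the numbers above (my own rough calculation, not a diagnosis): ~250ms accumulated over ~2.5 hours is about 28 parts per million, which is in the ballpark of ordinary crystal-oscillator tolerance, though the fact that the drift tops out argues against a pure clock-drift explanation:

```python
# Back-of-envelope: how fast is the desync growing, and what clock
# error would it correspond to if it were pure clock drift?
# Numbers taken from the post above (~250ms by the ~2.5hr mark).

desync_s = 0.250          # ~250 ms of audio lag...
elapsed_s = 2.5 * 3600    # ...accumulated over ~2.5 hours

rate_ms_per_hour = desync_s * 1000 / (elapsed_s / 3600)
drift_ppm = desync_s / elapsed_s * 1e6

print(f"~{rate_ms_per_hour:.0f} ms/hour, ~{drift_ppm:.1f} ppm")
```

A steady ~28 ppm error would keep growing forever, so a drift that plateaus at 2.5 hours points more toward buffering behavior than clock mismatch.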

With this video buffer option currently set to 0ms or 1ms, should I even attempt putting it at 200ms, or would that just cause more problems? Besides that, the only steps left are testing with NVENC and then testing with GameShow (unless I can get some sort of comparable load using OBSMP).

EDIT: I doubt disabling cores would do much, since Widgitybear runs 16 cores / 32 threads (dual Xeons) with the same Datapath card in OBS and doesn't seem to have these desync issues.
 

EDGAR_SEC

Member
So I got this crazy idea to try enabling "Select Input Device as Desktop Audio" since I run a mixer and feed everything into the Stream PC using Line In on either a USB soundcard or on the mobo itself. How would I go about selecting which input to use though since nothing pops up in the GUI?
 

FerretBomb

Active Member
I don't think that's going to help... it essentially just gives you another Mic input. So if your mixer is currently coming through the Mic channel, it'll make no difference.

More telling is that the desync doesn't happen with NVENC (assuming you were still running the video through the streaming machine). Pins it down as an x264 problem, which gels with the CPU utilization problems you've been seeing too. Just brainstorming, do you have any custom CODEC packs installed? K-Lite, CCCP, anything like that? I'm reaching, but I seem to remember people having problems with one of those a couple years back.
Would need a Dev to confirm if OBS uses a self-contained version of x264, or if it hooks into what's present on the host system.
 

EDGAR_SEC

Member
I don't think that's going to help... it essentially just gives you another Mic input. So if your mixer is currently coming through the Mic channel, it'll make no difference.

Gotcha. I've found the documentation on this skimpy since they moved it from the GUI to INI-only access.

More telling is that the desync doesn't happen with NVENC (assuming you were still running the video through the streaming machine). Pins it down as an x264 problem, which gels with the CPU utilization problems you've been seeing too.

Yup, I was running the video into the stream PC via the same capture card, then using the GTX 980 on the stream PC to encode with NVENC. Let's just say the video quality was definitely subpar compared to x264, haha.

Last night, after some extensive experimentation, I finally figured out how to use both CPUs when doing an x264 encode via HandBrake or OBS. Last night's stream was done using x264 with both CPUs under load (verified in Task Manager), so I don't think it's a CPU bottleneck anymore. Tonight I'm going to fine-tune the method and experiment to see exactly how many threads I should limit OBS to creating at specific resolutions. Hopefully no one will ever have this utilization problem again.
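As a rough illustration of why thread count interacts with resolution: with x264's frame-based threading, each thread encodes a different frame and (to my understanding) can only reference the rows of the previous frame that are already finished, so very high thread counts shrink the usable vertical motion-search range. The rows-per-thread figure below is my own simplification, not x264's exact internals:

```python
# Rough illustration (my own simplified model, not x264's exact
# internals) of why very high frame-thread counts can hurt quality:
# each thread gets roughly height/threads rows of finished reference
# data to work with, which bounds vertical motion search.

def rows_per_thread(height, threads):
    """Approximate rows of reference-frame headroom per frame thread."""
    return height // threads

for threads in (4, 12, 48):
    rows = rows_per_thread(1080, threads)
    print(f"1080p with {threads:2d} threads -> ~{rows} rows of vertical headroom each")
```

Under this simplification, 48 threads at 1080p leaves each thread only ~22 rows to search in, which is one plausible reason to cap threads well below the auto default on a dual-Xeon.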

Just brainstorming, do you have any custom CODEC packs installed? K-Lite, CCCP, anything like that? I'm reaching, but I seem to remember people having problems with one of those a couple years back.
Would need a Dev to confirm if OBS uses a self-contained version of x264, or if it hooks into what's present on the host system.

This might actually be the case. When I set this stream PC up, I used Ninite to download all the wanted programs and I usually select VLC, K-Lite, Classic, etc under the codecs section. Should I be uninstalling those? Should I wait for a dev response?

Thanks again for the great suggestions!
 

FerretBomb

Active Member
Didn't even realize it'd been moved to the INI; I thought it'd been removed. Makes sense, as a lot of new users were clicking it for some reason and then couldn't figure out why they couldn't set up system audio correctly.

Yep, NVENC is a band-aid, not a mainline method. Still going to have to remember that as a future troubleshooting tool.
Ah? How did you work around the issue? Still curious as it's the first incident I've seen where OBS didn't just grab both chips and go.

It's worth a shot. Easy enough to grab and reinstall them if it turns out not to be the cause, and probably what I'd do in the interim. I'm using the most recent VLC and K-Lite Mega 11.2.0, so it's mostly spitballing at this point whether something in a config got tweaked in a way OBS doesn't like. I'm also pretty sure that CCCP and K-Lite don't get along well, if you grabbed both of them.
 

EDGAR_SEC

Member
So I tried upping the scene buffer time to 1000ms from the default 100ms, and after tonight's 3.5hr stream it appears the offset only expanded to 50-80ms.

Is it possible this is a fix? What does global scene buffering actually do, or help with? The stream looks great at the Slower preset; I just need to solve this final audio issue and I'll be good to go. :)
 

Boildown

Active Member
Googling for Scene Buffering Time doesn't turn up much info (the default is 700ms, by the way, not 100ms). If it helps, I'd do it though. Hell, just keep increasing it further and see what happens.

I don't think the devs stop by the classic OBS forums much anymore. You'll probably have to PM or use IRC to get info straight from the source.
 

EDGAR_SEC

Member
Yup, documentation on it is pretty sparse, which is why I asked here. I always do my own research before posting on a message board. ;)

I'll try jacking it up to maybe 2500ms for tonight's stream to see if there's any improvement unless some dev happens to swing by here and can tell me in detail what this setting does.
 

EDGAR_SEC

Member
So apparently "Use Video Buffering" (on the capture device's options page) doesn't seem to do anything, whether I set the delay to 1ms or 5000ms (even immediately when I start the stream). Is this option bugged in OBS?
 