Question / Help Inconsistent Frame Capture - Not the typical issues.

Hey all. I've had a long-standing issue here and I'm at my wit's end trying to research a solution so I hope someone can help.

So my setup is as follows:

Main rig: i7-6700K @ 4.6GHz, 16GB DDR4, 1080Ti, controls scenes and sources but does NOT do the streaming, sends to the streaming rig via OBS-NDI, two displays (primary is 1440p 144Hz, secondary is 1080p 60Hz)
Streaming rig: Ryzen 1600, 8GB DDR4, R9 Fury, takes the NDI output from the main rig and encodes with x264 to Mixer; the Fury just handles encoding for the replay buffer
Stream config: Mixer FTL, 900p@60fps, 6000Kbps, fast preset. Server never goes into red CPU usage and never has encoding overloads.
Log file from main rig of most recent stream where I had this issue bad: https://obsproject.com/logs/-AUXM26kPFw0GR8b

This problem is not with NDI as it happens even if I stream directly off the main system. The issue is that when capturing PC games, I get very inconsistent frame rates on some of my sources, and in some other games the entire capture frame rate of OBS tanks well below 60. For example, I use an HD camcorder as my webcam, captured via a USB capture device. The camera outputs 60fps, and when I'm playing, say, a console game being captured via my HD60 Pro, the frame rate of all my sources is a rock solid 60. When I start capturing PC games, however (using Game Capture or Screen Capture), the frame rate of my camera source always drops below 60 and varies (it's still above 30 but well under 60), even though OBS says the canvas is running at 60 and it's using less than 10% CPU. This can happen with everything from a big AAA release to a pixel art indie game.

Weirder still, with certain games (for example, Anthem, and also the newly released Pathologic 2, which is the game from the stream the log file references), the entire frame rate of OBS will tank and vary wildly, from as little as 30 to the mid 50s, though sometimes it holds 60 like it should. Anthem is a demanding game, yes, but Pathologic isn't, and it's a Unity game, an engine I regularly stream without issue.

What causes OBS to choke like this seems basically random and I can't figure out why. I can't seem to capture any PC games at all without things like my camera source's frame rate dropping. This happens even when my CPU isn't close to being fully utilized, and it doesn't happen to other streamers I know. I don't know what the cause could be.

That said, I've thought of one potential culprit. Recently, while trying to troubleshoot a weird issue with Wallpaper Engine (and yes, disabling it doesn't help with this issue; in fact, the issue existed before I started using it), I came across this in their FAQ. Essentially, it says that due to a still-unacknowledged NVIDIA bug in how it handles the hardware acceleration changes in Windows 10, random poor performance in GPU-accelerated applications can occur if you're running one monitor at 144Hz and another at 60Hz, as I do. Since the games run on the 144Hz monitor and OBS runs on the 60Hz monitor, it's possible this is the cause of the issue. I'm prepared to buy a cheap 24" 144Hz monitor as my secondary display to solve this (even though it would be a waste since I don't play games on that display), but I don't want to make that investment until I know it's going to work.

Has anyone else come across this weird problem and if so, do you know if there's reason to believe this is the problem and potential solution? I've been pulling my hair out on this for literally months without a concrete answer but I'm hoping someone from here has some more experience with this.

Thanks!
 

carlmmii

Active Member
Can you post a log from your main rig, preferably with a local recording attempt? (yes I know you're using NDI, but the relevant logs don't show without either a stream or recording)

It sounds like you're either encountering the monitor refresh rate issue as mentioned, or you're just straight up encountering the usual rendering lag from letting your game run free and not allowing OBS enough GPU resources to render the scene. The log files should explain which.

Edit: didn't catch the log file mid-post...

Logs show you're encountering a significant amount of rendering lag, which means OBS isn't being allowed the GPU resources necessary to render the frames. Usually this is a result of letting the game run without a capped framerate (either with a max framerate set, or vsync, or even just being so taxing that neither limit is even reached).
 
Last edited:

Deleted member 121471

I fixed every single one of my jitter and stuttering issues by getting a second 144Hz monitor, setting BOTH monitors to 120Hz, and capping my game's FPS.

However, reading through your log, you have a few additional issues.

1) Keep a single capture source per scene collection as adding more introduces potential performance hits;

2) Rendering lag suggests you are overloading your graphics card. Limiting the FPS will fix this.

I'm not familiar enough with NDI/capture cards so someone else more experienced can help you with advice.

For troubleshooting purposes, try creating an entirely new scene collection with only game capture and camcorder added as sources, limit your FPS to 120 and test again.
 
Logs show you're encountering a significant amount of rendering lag, which means OBS isn't being allowed the GPU resources necessary to render the frames. Usually this is a result of letting the game run without a capped framerate (either with a max framerate set, or vsync, or even just being so taxing that neither limit is even reached).

That log I posted was from my main rig, during my stream from last night. I wasn't locally recording but I can do another one if you want.

So I'm curious: how much GPU does OBS actually need on its own? In my setup, OBS on the main rig is doing no encoding whatsoever. I know the software itself still needs the GPU, but is it that significant, especially on a 1080Ti?

What you're saying about rendering lag makes sense, but here's the weird thing: during last night's stream of Pathologic 2, I started having those severe frame drops, so I actually went into my display settings and set my primary monitor to 60Hz (since NVIDIA doesn't let you cap games that don't have their own FPS cap without a third-party tool). It literally made no difference. Now, Pathologic 2 is a poorly optimized game, so it's possible it was overdriving my GPU anyway. But this is something I can easily test.

I'll poke at some stuff and report back.
 
I fixed every single one of my jitter and stuttering issues by getting a second 144Hz monitor, setting BOTH monitors to 120Hz, and capping my game's FPS.

However, reading through your log, you have a few additional issues.

1) Keep a single capture source per scene collection as adding more introduces potential performance hits;

2) Rendering lag suggests you are overloading your graphics card. Limiting the FPS will fix this.

I'm not familiar enough with NDI/capture cards so someone else more experienced can help you with advice.

For troubleshooting purposes, try creating an entirely new scene collection with only game capture and camcorder added as sources, limit your FPS to 120 and test again.

All great suggestions, thank you! I will test some things out and report back. NDI is definitely not the issue here, as the problem remains even when I'm not using it. You make a good point about the multiple sources and overdriving my GPU, though I do wonder how much GPU OBS actually needs when no encoding is being done.

I'll report back!
 
So, I was working from home today and managed to sneak in some testing.

I'm honestly just straight up confused now.

For starters, here's the log from that test session: https://obsproject.com/logs/O9pcKDUsiQpsspha

I wasn't streaming per se, but I had my environment set up the same as normal, except my streaming rig wasn't pushing out to Mixer. That should have no impact on my main rig. I tried the following:

-Capping main rig FPS to 60 and running Pathologic 2 with Game Capture. GPU never went above 80% usage.
-Running Pathologic 2 with monitor at 144Hz and observing GPU load. GPU usage was higher and probably did cap out sometimes but it usually hung around 90% according to HWInfo.

Everything seemed...fine. The OBS canvas frame rate didn't fluctuate and my preview screen seemed microstutter free, including my camera. Which never happens. I didn't even reboot between my original post and now, basically nothing's changed.

The only thing I can think of is that because I was working from home, I had additional stuff open on the second monitor (mostly Chrome tabs) which, while minimized, may still have been driving enough video driver "traffic" to the second monitor to keep it from stuttering due to the variable frame rate glitch.

Curious if you have any other insights but I'm going to keep testing this. I have a stream tomorrow afternoon so if the same issue recurs, I'll make sure to grab a log of that. I feel the solution to this might be easier than I think.

Thanks.
 

Deleted member 121471

I can only offer you my experience with dual monitor setup, for the micro stuttering/jitter issue.

When I had 144Hz + 60Hz monitors, while there were ways to minimize frame drops, it still oscillated between 60 and 120 FPS. There's nothing you can do to perfectly fix this; Windows 10 simply doesn't handle mixed refresh rates properly on a single GPU. I think your processor has an iGPU though, so connecting your 60Hz monitor to it would be the cheapest solution to this issue.

After buying a second 144Hz monitor, this behaviour stopped entirely in-game, but I still had jitter on my streams and recordings; they didn't "look" perfectly smooth.

Usually, setting the monitor refresh rate to slightly below its max rated frequency helps with some issues like brightness, banding and so on, depending on the model, but it made me notice something: if all the elements captured and used run at rates that are evenly divisible by 30, all jitter disappears and everything is perfectly smooth, which I assume is because they are all perfectly synchronized.

Capping the monitor and game to 144Hz/FPS doesn't divide evenly into everything else you've configured, so I assume that's part of where these issues come from.
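To put rough numbers on that idea (this is just my own back-of-the-envelope check with made-up rate values, not anything OBS actually computes internally): a 144Hz monitor feeding a 60 FPS canvas works out to 2.4 refreshes per canvas frame, so the frame boundaries drift, while 120Hz monitor / 60 FPS canvas / 60 FPS camera divide evenly and stay lined up.

```python
# Back-of-the-envelope check of the "evenly divisible" idea above.
# Purely illustrative -- not anything OBS computes internally.

def refreshes_per_canvas_frame(source_hz: float, canvas_fps: float = 60.0) -> float:
    """How many source refreshes elapse per OBS canvas frame."""
    return source_hz / canvas_fps

def all_aligned(rates, base: float = 30.0) -> bool:
    """True if every rate is an integer multiple of the base rate (e.g. 30)."""
    return all(r % base == 0 for r in rates)

# 144Hz monitor feeding a 60 FPS canvas: 2.4 refreshes per canvas frame,
# so captures land at uneven points and look like jitter.
print(refreshes_per_canvas_frame(144))   # 2.4
print(all_aligned([144, 60, 60]))        # False

# 120Hz monitors, 60 FPS canvas, 60 FPS camera: everything lines up.
print(refreshes_per_canvas_frame(120))   # 2.0
print(all_aligned([120, 60, 60]))        # True
```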
 
Hey all.

Apologies for taking so long to respond, but work has been insane lately, and between that and trying to maintain my streaming schedule, I haven't had much time to test things.

So, this problem is now officially driving me nuts. I was ready to chalk it up to the "two monitors with different refresh rates" thing. However, @Volfield, I decided to try your suggestion of enabling the iGPU on my 6700K and connecting my secondary monitor to that. It's helped but only a bit.

The two games I'm using right now to easily reproduce this problem are The Division 2 and Siege of Centauri from Stardock. For no apparent reason, my entire canvas frame rate in OBS tanks when playing either of these. I tried to showcase Siege of Centauri last weekend and whenever I was in the game, OBS would drop to about 22FPS and in The Division 2, it would only mostly hold at 60 if I capped the game at 60 and even then, it would still often drop significantly. Since putting my second monitor on my iGPU, Division 2 drops to between 52 and 57FPS depending on the situation and Siege of Centauri drops to about 45ish. My second monitor is now on a totally different GPU and I confirmed when using Division 2's benchmark mode that neither my CPU, nor my primary GPU were at maximum load. And this happens when I'm not even streaming or sending data to my NDI receiver. Just having OBS open and capturing is enough to cause this.

From what I'm reading, it seems a lot of people are having issues like this as of OBS 23. They seem to be getting the usual response of either silence or "it must be your PC." If this truly is the varying refresh rate issue, I'll get a cheap 144Hz monitor for my secondary display but I'm really having a hard time swallowing that idea when the second monitor doesn't even involve my main GPU any more.

Here's a log file from testing with Siege of Centauri: https://obsproject.com/logs/MCpY6XHfhqehYA3O
Here's one from testing with Division 2's benchmark: https://obsproject.com/logs/2Rdsve4sQD9KrG4C

Now, in the case of Division 2, it does appear my CPU is getting pinned (Division 2 is known to eat a lot of CPU), but the thing I don't understand is that OBS now says it's using 20% CPU while I'm playing something, even though no heavy encoding is being done on my main rig. That usage used to be a lot lower. Even when a game isn't running, it's taking 15-23%, which makes no sense to me.

Any of you fine folks have another thought? I haven't blown away and redone my OBS config from scratch yet, but I really don't see how that's going to help. Thanks again, all!
 

TheChill

New Member
I was streaming theHunter: CotW the other night as a test (I test every release of OBS to see if the microstutter I suffer with it is resolved). I wasn't intending to do a session, so it was a quick test.

This was the setup:

Acer Predator X34 @1440, 60Hz - primary
Asus @1080, 60Hz - secondary running OBS
Asus ROG Swift @1440, 60Hz - just on

I'd made an effort with CRU to have all panels running close to the same Hz.

This is on an i7 6700K rig, 32 gig, SSDs, blah, blah, and 2x1080 (not SLI, with primary on card #1 and other panels on card #2).

With a simple scene, of either a Game or a Display Capture, my FPS would periodically tank. It'd last a few seconds before recovering, then do it again seemingly at random over and over again.

It never happened with previous releases. I resolved it by running both non-primary panels at double the Hz (120Hz).

After that I decided to DDU and clean install a late 300-series NVIDIA driver and test again at 60Hz across the board. The frame rates didn't tank, but I didn't test for very long.

Anyway, none of the testing I've ever done (1, 2, 3 screen setups, with 1 or 2 GPUs) has ever solved the god damn microstutter I only ever get while capturing (never actually seen in-game, and something a great many contributors to this forum have experienced without resolution), but that whole periodic FPS tanking was new, and sounds similar to what you're suffering.

So, long and short of it, try switching out your drivers. Might work, might not, but it's the only thing I did between getting tanked FPS and regular FPS.
 
I'll definitely try to DDU my drivers and see what happens.

This definitely seems to be a very common problem. There's a lot I don't like about XSplit, and I can't use it right now anyway because of a long-standing NDI audio drift bug they don't seem interested in dealing with, but this random awful performance in OBS isn't good enough for me either. Hopefully it can be figured out.
 
Hey folks.

So, I've solved the problem, but you're not going to believe what it was. I did try DDU-ing my drivers, which didn't help. So I decided to just nuke my entire OBS configuration and restart with just a game capture source. It worked perfectly; everything that used to tank my frame rate didn't anymore. So I started adding sources back in, and then I found it.

I use a Cloner Alliance Flint LXT to capture a Canon camcorder that I use instead of a webcam. I previously used a Flint LX to capture consoles and upgraded to an Elgato HD60 Pro because the Flint LX was super unreliable and had lousy image quality. As soon as I added the LXT back in, my frame rate in certain games went to crap again, just as before.

It turns out I recently got a second HD60 Pro temporarily, to pass on to someone else who hasn't come to get it yet (yeah, seriously). Elgato also added the ability to use multiple of their PCI-E cards in v3.7 of their drivers, so I dropped the second one in, and the problem is gone. I tried three games that were tanking my frame rate (Division 2, Pathologic 2 and Siege of Centauri) and they all work perfectly. Division 2 and Siege of Centauri have a temporary drop to about 53FPS when a new level starts, then they recover to 60 and stay there. On top of that, the output from my camcorder is orders of magnitude better than with the Flint (better colours, less blurry, and I had to do zero adjustment to my chroma key filter for my green screen).

Given that the Flint adapters just use the UVC framework and don't require drivers, I'm guessing this is a problem with the devices themselves. Or maybe OBS just has issues with UVC devices as a whole. Either way, using two Elgato devices has solved my problem. I don't get to keep this second HD60 Pro but I do need to buy the HD60 S (USB version) eventually as I'll be using it with my NDI box as a capture device for charity events. So, that should work just as well.

This certainly isn't the normal version of this problem that a lot of people run into, but I hope it helps someone. Needless to say, avoid Cloner Alliance products for streaming use, or at least with OBS.

Thanks very much to everyone who helped me here! I'm going to leave my iGPU enabled and keep my second monitor connected to it to help with the GPU allocation issue regardless, and that alone has been a big help.

Cheers!
 