Héraès
Member
Personally I don't have NVIDIA Broadcast. I think the in-game FPS drops may be related to how OBS handles scenes and sources: https://obsproject.com/forum/thread...ess-cpu-and-gpu-resources.147969/#post-548311
In short:
- the more sources you add to a scene (even simple ones like a text source), the more GPU usage it adds.
- nesting a scene adds about 5% GPU usage even if the nested scene is empty.
So there's definitely a "problem" with how OBS manages sources and does its compositing.
On top of this:
- compositing is handled by the main GPU (if you have two in the same system). The main GPU is whichever one the operating system picks as "the one driving the main screen". So if you have a second GPU with a second screen plugged into it, you can temporarily declare that second screen the "default" one in Windows (a NirSoft tool helps here), launch OBS, then swap the screens back: OBS then runs fully on the second GPU (compositing + encoding, not just encoding). This solves the problem of the main GPU losing 20-30% to OBS. I personally used this technique because my GTX 960 could never stream 1080p while playing Apex Legends.
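The swap could be scripted as a small Windows batch file. This is only a sketch of the idea, assuming NirSoft's MultiMonitorTool and its /SetPrimary switch; the \\.\DISPLAY numbers and the OBS install path are placeholders you'd need to check on your own machine (MultiMonitorTool.exe /stext monitors.txt lists your monitor IDs):

```shell
:: Sketch of the primary-display swap workaround (assumed paths/IDs).

:: 1. Make the screen attached to the second GPU the primary display.
MultiMonitorTool.exe /SetPrimary "\\.\DISPLAY2"

:: 2. Launch OBS; it binds compositing to the GPU driving the current primary display.
start "" "C:\Program Files\obs-studio\bin\64bit\obs64.exe"

:: 3. Give OBS time to initialize, then restore the original primary display.
timeout /t 10
MultiMonitorTool.exe /SetPrimary "\\.\DISPLAY1"
```

The order matters: OBS picks its compositing GPU at launch, so the screens have to be swapped back only after OBS is up.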
But still, launching OBS and streaming will cause a few FPS drops, because the two GPUs have to communicate through system RAM (and there's CPU usage too). And I'm still facing constant FPS drops in big games even with this technique (though it would be worse without it).
The whole problem is complex, but I'm 100% sure things would improve greatly by fixing the "nested scenes" problem plus the insane per-source consumption.