Question / Help  Dropped frames and FPS drops

Paolo07700

Member
Hello!
I'm coming to you for help with a problem I have with demanding games.
I stream at 720p 60 fps.
The problem is that when I want to stream Assassin's Creed Unity or Rainbow Six, for example, OBS can't keep the stream at 60 fps, and I lose frames under "Frames missed due to rendering lag".

I have an i7-7700K with 16 GB of RAM and a GTX 1080 8 GB.
But I encode the stream on my CPU's integrated Intel graphics or on my GTX 750 Ti to avoid overloading the CPU,
because the i7 can't handle a demanding game plus a 720p 60 fps stream on its own.

This is not an encoder overload, because it's the Intel integrated graphics or my 750 Ti that encodes the stream. When the encoder is overloaded, it's the bottom line, "Skipped frames due to encoding lag", that climbs, not "Frames missed due to rendering lag".
So it's not an encoder overload.
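To double-check which counter is actually climbing, here is a minimal sketch that scans an OBS log file for the lag summaries. The exact log wording is an assumption (it can differ between OBS versions), so the patterns below are deliberately loose:

```python
# Hypothetical helper: scan an OBS log for rendering-lag vs encoding-lag mentions.
# The exact log wording is an assumption and may differ between OBS versions.
import re
import sys

def scan_obs_log(path):
    # Loose, case-insensitive patterns so small wording changes still match.
    patterns = (("rendering lag", re.compile(r"rendering lag.*?(\d+)", re.IGNORECASE)),
                ("encoding lag", re.compile(r"encoding lag.*?(\d+)", re.IGNORECASE)))
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            for label, pat in patterns:
                m = pat.search(line)
                if m:
                    print(f"{label}: {m.group(1)} frames -> {line.strip()}")

if __name__ == "__main__":
    scan_obs_log(sys.argv[1])  # e.g. python scan_log.py "2018-10-04 17-35-27.txt"
```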

Even when I'm in the game menu I don't lose many frames, but when I actually play it's horrible.
Moreover, besides the unstable FPS, you can see the frame render time increasing too!

Why is this happening? How can I fix it?

Thanks, everyone!
 

Attachments

  • 20180914_170423 (1).jpg (966.2 KB)

Paolo07700

Member
Update: I removed my GTX 750 Ti and plugged my two other monitors into my GTX 1080, because when I launch OBS Studio with both cards, GPU 1 is my 1080 and GPU 2 is my 750 Ti (where my two other monitors are connected), and OBS sometimes uses both of them: around 30% of the 1080 (sometimes more) and sometimes 50% of the 750 Ti (sometimes under 15%). I don't understand what is happening here.
When I play a game, the values sometimes return to normal... I really don't understand.
I streamed Far Cry 3 with only my GTX 1080 and all three monitors on it, and the 1080 still couldn't hold 60 fps at times in game, so OBS dropped some frames as well... I don't know why. Is there a problem with my GTX 1080?
And why does OBS use my graphics cards so heavily when I plug in the GTX 750 Ti for the two other monitors? I didn't check whether I get fewer FPS in games with OBS running when the 750 Ti is installed.
I wanted OBS to use only the 750 Ti for the capture/render.
 

Attachments

  • 2018-10-04 17-35-27.txt (29.9 KB)
  • CaptureOBS750ti.PNG (4.4 KB)
  • CaptureOBS750ti2.PNG (5 KB)
  • CaptureOBS750ti3.PNG (4.4 KB)

koala

Active Member
You have a huge and complex scene setup, so yes, OBS really does need that much of your graphics card. Handling all these sources takes quite a bit of computing power on the GPU. To lower OBS's GPU demand, shrink your scene setup: make it simpler and remove sources. If you don't need certain scenes and sources for your current stream project, move them into a different scene collection. Only the scene collection that is currently loaded uses up resources.
 

Paolo07700

Member
You have a huge and complex scene setup, so yes, OBS really does need that much of your graphics card. Handling all these sources takes quite a bit of computing power on the GPU. To lower OBS's GPU demand, shrink your scene setup: make it simpler and remove sources. If you don't need certain scenes and sources for your current stream project, move them into a different scene collection. Only the scene collection that is currently loaded uses up resources.
Thanks, but I don't understand: when I want to use my GTX 750 Ti for my two other monitors, OBS uses a lot of both GPUs!
As you can see, when OBS is launched with the 750 Ti installed, it uses a lot of my GTX 1080 AND sometimes my 750 Ti, switching between the two GPUs. I don't understand. I want to use my GTX 750 Ti for my two other monitors and for OBS (the capture/render), and leave my GTX 1080 free for the game. Is that possible or not?
I've removed the 750 Ti for now as a test, and OBS no longer uses much of my GPU, still 10-20%, but not nearly as much as when both cards were connected. I still can't explain why OBS uses more GPU, and both cards, when the 750 Ti is installed...
 

Harold

Active Member
You're on a 16-lane CPU, so you're forcing both video cards into x8 mode, and OBS needs upwards of x4-x6 worth of PCI-e bandwidth on a video card to function. Your game needs another x4-x6 worth for itself.
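As a rough sanity check on those numbers, here is a back-of-the-envelope calculation of the theoretical PCI-e 3.0 link bandwidth at different lane widths; the per-lane figure is the standard ~0.985 GB/s after 128b/130b encoding, and driver overhead is ignored:

```python
# Theoretical PCIe 3.0 bandwidth at different lane widths (overhead ignored).
PCIE3_GB_PER_S_PER_LANE = 0.985  # ~985 MB/s per lane after 128b/130b encoding

for lanes in (16, 8, 4):
    print(f"x{lanes}: ~{lanes * PCIE3_GB_PER_S_PER_LANE:.1f} GB/s")
# x16: ~15.8 GB/s, x8: ~7.9 GB/s, x4: ~3.9 GB/s.
# With two cards installed, the 7700K's 16 CPU lanes split x8/x8, so the game
# and OBS each work with half the link a single card would otherwise get.
```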
 

Paolo07700

Member
You're on a 16-lane CPU, so you're forcing both video cards into x8 mode, and OBS needs upwards of x4-x6 worth of PCI-e bandwidth on a video card to function. Your game needs another x4-x6 worth for itself.
So can I use my 750 Ti for my two other monitors and configure OBS to use it, or is that impossible? Will OBS use my 1080 in every case?
 

koala

Active Member
Did you physically remove the 750 Ti? If you have two graphics cards installed, they share the maximum available PCI-Express bandwidth of x16; usually they both run at x8 speed. A single GPU card runs at x16 speed. At x8 they need twice as long to transfer data through the PCI-e bus, and perhaps that is what raises the GPU usage.

If you run OBS on one GPU and capture a game on a different GPU, OBS has to transfer the video data from the game's GPU through the PCI-e bus to the OBS GPU. That puts unnecessary load on both GPUs as well as on the PCI-e bus.

It's best to have only one discrete GPU in the system and run everything on it. Keep the 1080 and physically remove the 750 Ti. Only when the 1080 is the sole card can it use the full x16 PCI-e speed. The integrated Intel GPU doesn't count here; it has its own connection to the CPU.
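For a feel of what that cross-GPU copy costs, here is a small estimate of the raw data OBS would have to push over the bus to move captured frames from one card to the other. It assumes uncompressed BGRA frames (4 bytes per pixel) and ignores driver overhead, so it is only an order-of-magnitude figure:

```python
# Rough cost of copying captured frames between GPUs over PCI-e (uncompressed BGRA assumed).
def copy_rate_gb_per_s(width, height, fps, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * fps / 1e9

for w, h, fps in ((1920, 1080, 60), (2560, 1440, 60)):
    print(f"{w}x{h}@{fps}: ~{copy_rate_gb_per_s(w, h, fps):.2f} GB/s one way")
# 1920x1080@60: ~0.50 GB/s, 2560x1440@60: ~0.88 GB/s.
# That traffic competes with the game's own PCI-e usage on an already halved x8 link,
# which is why capturing across GPUs shows up as extra load on both cards.
```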
 

Paolo07700

Member
Did you physically remove the 750 Ti? If you have two graphics cards installed, they share the maximum available PCI-Express bandwidth of x16; usually they both run at x8 speed. A single GPU card runs at x16 speed. At x8 they need twice as long to transfer data through the PCI-e bus, and perhaps that is what raises the GPU usage.

If you run OBS on one GPU and capture a game on a different GPU, OBS has to transfer the video data from the game's GPU through the PCI-e bus to the OBS GPU. That puts unnecessary load on both GPUs as well as on the PCI-e bus.

It's best to have only one discrete GPU in the system and run everything on it. Keep the 1080 and physically remove the 750 Ti. Only when the 1080 is the sole card can it use the full x16 PCI-e speed. The integrated Intel GPU doesn't count here; it has its own connection to the CPU.
I did physically remove it (my board has 3 x PCI-Express 3.0 x16 slots), and as I said, once it was out OBS no longer used nearly as much of the GPU.
 

Paolo07700

Member
I turned my stream down to 30 fps, because with my GTX 1080 8 GB and my i7-7700K I can't hold 60 fps in some games like Far Cry 5. My main monitor is 144 Hz, and sometimes OBS can't even hold 30 fps; I think that happens when the game runs below 60 fps, but my monitor is 144 Hz, so I don't understand. I'll therefore try turning on vertical synchronization to avoid overloading the GPU, as explained in some other topics.
But if the game doesn't reach 144 fps (my monitor's refresh rate), there will still be an overload. Far Cry 5, for example, never climbs above 100 fps (except when looking at the sky), and that's what I don't understand: OBS misses captured frames and can't hold its FPS when the game drops below 60 fps (I guess). Does this happen to you, and how do you fix it? Does vertical synchronization address this even though my monitor is 144 Hz?
 

BK-Morpheus

Active Member
To cap your in-game GPU usage, it seems you need to cap below 144 fps (i.e., below what VSync alone would give you on a 144 Hz monitor), because the GPU is already maxed out even before hitting 100+ fps. So set an in-game fps limit of 60, for example. If the game has no such option, you can use RivaTuner Statistics Server to do so, or enable VSync and set the monitor to 60 Hz.
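To see why a lower cap helps, here is a quick frame-time budget calculation. The GPU time OBS needs per frame is a made-up placeholder, not a measured value; the point is just that a 60 fps cap leaves much more per-frame headroom than 144 fps:

```python
# Frame-time budget at different caps (OBS per-frame cost is a placeholder assumption).
OBS_RENDER_MS = 3.0  # assumed GPU time OBS needs each frame to composite the scene

for fps_cap in (144, 100, 60):
    budget_ms = 1000.0 / fps_cap
    headroom_ms = budget_ms - OBS_RENDER_MS
    print(f"{fps_cap} fps cap: {budget_ms:.1f} ms/frame, "
          f"~{headroom_ms:.1f} ms left for the game after OBS")
# At 144 fps the game gets ~3.9 ms per frame; at 60 fps it gets ~13.7 ms,
# which is why capping the game frees the GPU headroom OBS needs to render on time.
```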
 

Paolo07700

Member
To cap your in-game GPU usage, it seems you need to cap below 144 fps (i.e., below what VSync alone would give you on a 144 Hz monitor), because the GPU is already maxed out even before hitting 100+ fps. So set an in-game fps limit of 60, for example. If the game has no such option, you can use RivaTuner Statistics Server to do so, or enable VSync and set the monitor to 60 Hz.
Hey!
Thanks for your answer. I can set my monitor to 75 Hz, the lowest option available in the NVIDIA settings.
But I can see that my graphics card is not overloaded in Far Cry 5: it sits around 80% without vertical synchronization at 75 Hz.
When I enable it, I get the same FPS, no more than 75 fps, because of the refresh rate and the limit being active.
But my graphics card is then only used at about 60%.
I think my CPU is bottlenecking my graphics card.
On Call of Duty BO4, though, my graphics card is overloaded, but so is my CPU. I haven't tried streaming that game yet.
 

Paolo07700

Member
To cap your in-game GPU usage, it seems you need to cap below 144 fps (i.e., below what VSync alone would give you on a 144 Hz monitor), because the GPU is already maxed out even before hitting 100+ fps. So set an in-game fps limit of 60, for example. If the game has no such option, you can use RivaTuner Statistics Server to do so, or enable VSync and set the monitor to 60 Hz.
But damn it, why can't my PC run games at 144 fps while streaming?
I mean, on R6 Siege my graphics card goes up to 100% but I don't lose any frames, or very few.
Same with Far Cry 5 and other games...
But on Assassin's Creed Unity I lose a lot of frames at only 60 fps...
It's really weird! Can't my PC run 144 fps with an i7-7700K and a 1080?
 