Question / Help FPS dropping below 60 on Ryzen 5 1600x at 720p

CommZod

New Member
Hey Guys,
I've had an issue lately where I'll be streaming just fine, then out of nowhere, without the CPU usage changing, OBS will start reporting its recording FPS down in the 40s-50s, even as low as the 30s. It's not reporting any dropped frames, and my overall CPU usage sticks around 60-70%. I've tweaked some settings here and there but am lost. Rather than spit out all my settings, here's the log. Any help is appreciated.

Also, I'm on fiber with 150 down / 75 up, so I should be good there.

https://gist.github.com/27ff21dc73d4a9c369b2fb23497a7dad
https://gist.github.com/anonymous/cc9adbf74c6cb53e238726a7af3558fe

You can see what I'm talking about starting at the 22:20 mark, before that things were going fine.
https://www.twitch.tv/videos/230535302
 
In your 'Live Broadcast' scene:
You should remove the monitor capture source from this scene, as it relies on your system RAM to grab the frame, which is far less efficient (and far slower) than the direct hooking method used by game capture.
Game capture grabs the frame directly from the graphics card for OBS to encode, before sending it back to the graphics card for final rendering of the frame.
Monitor capture has the graphics card write the frame to system RAM before OBS grabs it to encode, then sends it back to the video card to render for final output.
If you need Monitor Capture, you should set up a scene for it by itself, without Window or Game Capture sources inside that scene.

If you don't already, limiting your frame rate in-game will help a lot in getting the best quality out of your hardware for recording/streaming. I recommend using MSI Afterburner or EVGA Precision to cap your fps at, or 1-3 frames above, your monitor's refresh rate; that will generally also resolve screen tearing. Try to avoid V-Sync, as it can induce input lag.

You should also be able to use Lanczos filtering under Video in OBS settings; this helps reduce the scaling 'fuzziness' that happens when you downscale from 1080p.

You could also lower the performance hit of your webcam by reducing its output frame rate to 30fps.

If you go to the site linked below and download the TwitchTest tool (created by R1ch), it will help you find the optimal Twitch ingest server for your location. You want the server with the highest throughput (Quality) that is closest to you (lowest RTT). The tool will also measure your actual streaming bandwidth, which can be (and almost always is) lower than what a speed test site displays in its results:
https://r1ch.net/projects/twitchtest
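The selection rule described above can be sketched in a few lines. This is a hypothetical illustration of the heuristic, not part of TwitchTest itself; the server names and numbers are made up:

```python
# Hypothetical sketch of the pick-a-server heuristic a tool like
# TwitchTest automates: given measured throughput (Kbps) and round-trip
# time (ms) per candidate ingest server, prefer the highest throughput,
# breaking ties by lowest RTT. All names and figures are invented.

def pick_ingest(results):
    """results: dict of server name -> (throughput_kbps, rtt_ms)."""
    # Sort key: throughput descending (negated), then RTT ascending.
    return min(results, key=lambda s: (-results[s][0], results[s][1]))

measurements = {
    "us-west":    (5800, 45),
    "us-east":    (6000, 70),
    "us-central": (6000, 55),
}

# us-central ties us-east on throughput but has the lower RTT.
print(pick_ingest(measurements))  # us-central
```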

Hope this helps!
 

BK-Morpheus

Active Member
And just as a tip: You usually should not need 7x game_capture sources in the same scene. Just one for all game_capture compatible programs (it will autodetect the game, if you don't force it to a specific .exe file).
 
Indeed a good tip from BK-Morpheus, a universal 'compatible 3D .exe' game capture setup is generally sufficient for most setups and much easier to manage.
Most of the time, if you can't get game capture to work with a game, you will have to use window capture mode (set it to the specific game .exe file) and run the game in Borderless Fullscreen/Windowed mode for OBS to see it (very few games you come across will have this issue).
The ability to designate specific .exe game captures is still there if you want to apply filters to a specific game title (Sharpen, Contrast, Colour, etc.)
 
Your settings look fine, other than a low bitrate for 720p60fps in terms of quality output (6,000 Kbps will provide great quality).

The first and foremost thing is to limit your fps, which you say you have done. The only other thing one can really recommend now is slightly lowering the in-game graphics quality to reduce the load on your GPU.

In order to optimize the visual quality of the game, find a nice 3-5 minute path to run back and forth through that induces heavy GPU load. The path should try to include:
An area where there is lots of lighting.
An area with lots of shadows.
An area with lots of ground clutter/debris/grass/trees/smoke, etc
A high vantage point so you can see the effect of the render draw distance option OR an area where you are at ground level so you can optimize the draw distance for texture LOD pop-in.
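To judge whether a settings change keeps you above the fps floor discussed below, it helps to summarize a benchmark run numerically. This is a minimal sketch (not part of OBS or Afterburner) assuming you have per-frame render times in milliseconds, as frametime-logging tools such as MSI Afterburner can record:

```python
# Summarize a benchmark run from per-frame render times (milliseconds):
# report the average fps over the run and the fps of the single worst
# frame, so dips below your target floor are easy to spot.

def summarize(frametimes_ms):
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    worst_fps = 1000.0 / max(frametimes_ms)  # slowest frame of the run
    return avg_fps, worst_fps

# Example: mostly 14ms frames (~71fps) with one 25ms spike (40fps).
avg, worst = summarize([14.0] * 9 + [25.0])
print(f"avg {avg:.0f}fps, worst {worst:.0f}fps")  # avg 66fps, worst 40fps
```

If the worst frame dips well below your average, the spikes (not the average) are what OBS will show as rendering lag.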

Start with the most intensive graphics options. In general order (not always the same for each game, since engines differ, though the top of the list is usually similar), they are:
Anti-Aliasing: MSAA/SSAA
Ambient Occlusion (Can be very heavy on FPS in some game engines, can be a few places down the list in other game engines)
Lighting Quality (Can be below Shadow Quality/Draw Distance in some game engines)
Shadow Quality/Draw Distance
Texture/LOD Draw Distance (Can be heavy on FPS in some game engines, can be a few places higher on the list)
Texture Quality
Post Processing Effect Quality: FXAA/TXAA/SMAA

You just need to change one at a time. As long as you can maintain ~65-70fps as a minimum, you should be able to clear up any rendering lag when recording/streaming with the x264 encoder; for a GPU encoder such as NVENC (H.264), aim for ~70-75fps as a minimum to allow for the extra GPU load.

I hope this helps you out!
 

CommZod

New Member
So I've tried a few more things and figured it out. The game I was having the most trouble with is "Batman: Arkham Knight". Again, it runs fine for me; I locked it at 60fps in-game but could easily get more, and I even tried turning everything down to 720p at 30fps for the stream. It was lagging even then, with GPU usage at 98%. I went back into the graphics settings and started messing with the "Nvidia GameWorks Settings" and found that "Enhanced Light Shafts" makes things take a nosedive. It's not as bad in the open, but smaller locations with a lot of light sources just take too much GPU for OBS to do its job. Thanks for all the suggestions along the way, guys!
 

CommZod

New Member
This was actually in the graphics settings in the game itself. There are about 4 Nvidia-specific features.
 
I don't own and have never played Batman: Arkham Knight, so I had to look up expected fps and VRAM usage (I should have done this earlier, sorry).
Enhanced Light Shafts should take about 3-5 fps off on average.

In-game, what sort of VRAM usage are you hitting normally? I completely forgot about the next part until I looked at your log file before posting this, hence the question now...
The 4GB GTX 970 technically has 3,584MB of 'fast access' GDDR5 VRAM; the last 512MB of the 4GB goes through a shared L2 cache, which causes significantly slower bandwidth upon access.
That said, Nvidia has optimized utilization of that last 512MB of VRAM to a degree where you will likely never really see a performance drop, apart from minor hitching every so often from what I'd guess is texture swapping/compression when your VRAM usage exceeds ~3.5GB.

If you are hitting ~3GB of VRAM usage in-game (taking into account that Windows DWM uses ~300MB of VRAM), then that may be the actual issue rather than the setting itself, as OBS may end up having to use some of the slower 512MB segment of your GTX 970's VRAM.
I'm not entirely sure how OBS handles VRAM, though (one for someone with far more knowledge of OBS's inner workings to answer). I've only ever had an R9 290X 4GB (GDDR5) when using OBS for record/stream tests while running games simultaneously, and I've never used more than 3GB of VRAM, as I have a 1080p monitor and don't play games that push VRAM usage higher.
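The back-of-the-envelope budget being described works out like this. The figures are just the rough approximations from this thread (not measured values), so treat the result as illustrative:

```python
# Rough VRAM budget for the 4GB GTX 970 scenario discussed above.
# All numbers are the thread's approximations, not measurements.

FAST_SEGMENT_MB = 3584   # 'fast access' portion of the card's 4096MB
DWM_MB = 300             # rough Windows desktop compositor usage
GAME_MB = 3000           # example in-game usage (~3GB)

headroom = FAST_SEGMENT_MB - DWM_MB - GAME_MB
print(f"~{headroom}MB of fast VRAM left for OBS before anything "
      f"spills into the slow 512MB segment")  # ~284MB
```

With only a couple hundred MB of fast VRAM to spare, a heavy setting like Enhanced Light Shafts could plausibly push usage over the boundary.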

Maybe test the setting again, making sure the game isn't pushing more than ~2.5GB of VRAM, to see if the issue occurs again?
 