Paultimate
New Member
So I have read the FAQs, tips, and guides, and have recorded on and off for 5+ years without ever really asking this. I also want to preface this by saying I dislike using GeForce Experience and would much rather use OBS.
The issue is that with some games, it's not reasonable to expect the GPU to stay below 100% usage at all times. Even if I push everything below nominal and get 120 fps in one area, that doesn't mean I will get even 50 fps in another area (I have it capped at 60). Dropping to 30 fps, or retooling settings for every area of a dynamic game, is unreasonable.
I recently tried GeForce Experience with the same settings (25,000 kbps, NVENC, same framerate, same resolution, same game settings, same quality). OBS chokes while GeForce Experience captures smooth 60 fps gameplay. The only issue is that GeForce Experience is poor with audio and other settings that are actually useful. Unfortunately, OBS struggles at the main thing it is designed to do: capture frames consistently.
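(For anyone who wants to reproduce the comparison outside of both recorders, this is a rough sketch of what I mean by "same settings": a standalone 25 Mbps NVENC encode driven through ffmpeg. The gdigrab capture path and the exact flags are my own assumptions and are not the capture route OBS uses, so treat it only as a sanity check that the encoder session itself can keep up.)

```python
# Rough sketch: run a standalone 25 Mbps NVENC encode outside of OBS /
# GeForce Experience while the game is running, to see if the encoder
# session itself keeps up. Assumes an ffmpeg build with NVENC support
# is on the PATH; gdigrab is a CPU-side desktop grab, not OBS's game capture.
import subprocess

cmd = [
    "ffmpeg",
    "-f", "gdigrab",        # Windows desktop capture
    "-framerate", "60",     # match the 60 fps cap
    "-i", "desktop",
    "-c:v", "h264_nvenc",   # same NVENC hardware encoder
    "-b:v", "25M",          # same 25,000 kbps target bitrate
    "-maxrate", "25M",
    "-t", "60",             # record 60 seconds
    "nvenc_test.mp4",
]
subprocess.run(cmd, check=True)
```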
My question is: why? What is the difference here? I have tried countless settings in OBS, including raising its process priority, putting it on different CPU cores than the game, and trying every variation of quality settings, etc. Why is GeForce Experience not dropping frames under the same conditions? If I can figure it out, I can use OBS and move away from GeForce Experience entirely.
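For reference, this is the kind of priority/affinity tweak I mean; a minimal sketch using Python's psutil, where the "obs64.exe" process name and the core list are just examples from my setup. It made no difference to the dropped frames.

```python
# Minimal sketch of the priority/affinity tweaks described above, using
# psutil on Windows. "obs64.exe" and the core list are examples for my
# setup -- adjust as needed. May require running as administrator.
import psutil

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "obs64.exe":
        proc.nice(psutil.HIGH_PRIORITY_CLASS)   # bump OBS process priority
        proc.cpu_affinity([4, 5, 6, 7])         # keep OBS off the game's cores
        print(f"Adjusted PID {proc.pid}")
```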
Below, I used the same settings for GeForce Experience and OBS and the same game settings (tested within 30 seconds of each other, multiple times, after restarts, etc.), recorded how each one uses the encode hardware on the GPU, and labeled them. I've also uploaded the walk captured by each encode below. Why is OBS choking? I want to use OBS, but I can't.
(Sorry for the music; please mute.)
GeForce Example:
OBS Example (same scene, same recording settings, same game settings):
Edit: From further testing, it seems like OBS chokes when the 3D engine is saturated (this is a GTX 1070). But that is very strange, as the encoding doesn't take place on the 3D rendering engine; it takes place on the dedicated encoding engine (see image "vs"). Is there an obscure OBS setting I've missed that would fix this?
LOG: https://obsproject.com/logs/mccVC-BUlm3715zN
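This is roughly how I watched the two engines separately while recording: a small sketch using the NVML Python bindings (pynvml), polling the 3D/graphics utilization and the dedicated encoder utilization side by side. GPU index 0 and the one-second poll interval are assumptions for my single-GPU setup.

```python
# Small sketch: poll 3D-engine vs NVENC-engine utilization once per second
# while recording, to see whether the encoder itself is the bottleneck.
# Uses the NVML Python bindings (pynvml); GPU index 0 is assumed.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)            # .gpu = 3D/graphics load %
        enc, _period = pynvml.nvmlDeviceGetEncoderUtilization(handle)  # NVENC load %
        print(f"3D engine: {util.gpu:3d}%   encoder: {enc:3d}%")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```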