NVENC Performance Improvements (Release Candidate)

Status
Not open for further replies.

KuraiShidosha

New Member
Steam overlay LAGS OUT hard when OBS is hooked into a game. The friends list and chat UI just completely freezes down to 1 fps or slower. Happens as soon as game capture grabs a game.
 
One thing to consider with those advanced options is remote devices. Local recording is nice, but when used in a live environment those advanced settings might cause playback problems for mobile or web-browser viewers. I'll stick to the test builds, with all due respect.
 

DIRTY CES

Member
As an experiment, try lowering the frame rate to 30 and enabling two-pass encoding.

Thanks for the reply, LiaNdrY.
I don't think there is an option for 2-pass encoding in this beta. Am I wrong? And what would that accomplish? I'm not looking to stream at 30 fps. What is the logic, so I can be educated? Also, did you look at my log?
This is a beta, and I'm saying that my i7 7700k looks better at veryfast than my 2080 Ti does. My log is posted, and no one has said whether it shows something wrong. I was asked for it but never got a reply. I figured the point of this forum is to catch potential bugs.
Also, on a side note, my CPU sits at 5-10% when OBS is idle, with no stream and no recording. Is that normal? I looked at Task Manager and it looks like it's all OBS.
Thanks in advance for any help. No hostility here, just wanted to get to the point.
 

jellysandwich

New Member
@jellysandwich be sure to make a pull request when the jim-nvenc branch is merged into master

I made the custom build for testing/fun - a valid pull request is much more complex. Per that other pull request, Jim seems to want a command line argument that would have to be parsed into the options (I don't currently have enough C knowledge to do so).

jp9000: I will be willing to allow the additional NVENC settings, however, I definitely have to opt for an options string rather than have additional advanced parameters as properties. https://github.com/obsproject/obs-studio/pull/1271

Additionally, the settings are completely broken for the old NVENC FFmpeg implementation. Simply put, the build is just for testing the new NVENC and seeing whether the additional options are even worth putting in.
 

jellysandwich

New Member
- Refs;
- Presets slow, medium, fast, lossless, losslesshp;
- Rate Control: RC_MODE_VBR_HQ and others;
- Surfaces;

- Refs; -> not sure what this is?
- Presets slow, medium, fast, lossless, losslesshp; -> as previously mentioned, slow = max quality
- Rate Control: RC_MODE_VBR_HQ and others; -> not sure, will need to look into it
- Surfaces; -> couldn't find anything about this. I Ctrl+F'd the NVENC file for the word "surface" and didn't find anything; they may have removed it
 

LiaNdrY

Member
- Refs; -> not sure what this is?
- Presets slow, medium, fast, lossless, losslesshp; -> as previously mentioned, slow = max quality
- Rate Control: RC_MODE_VBR_HQ and others; -> not sure, will need to look into it
- Surfaces; -> couldn't find anything about this. I Ctrl+F'd the NVENC file for the word "surface" and didn't find anything; they may have removed it
#Refs -> https://ffmpeg.org/doxygen/3.3/group__ENCODE__FUNC.html#gacaac02ec8abfa332afb174f1f2922850 (The first thing I could find; maybe this is no longer in the new version.)

#Rate Control - Cut from nvEncodeAPI.h
C:
/**
 * Rate Control Modes
 */
typedef enum _NV_ENC_PARAMS_RC_MODE
{
    NV_ENC_PARAMS_RC_CONSTQP                = 0x0,       /**< Constant QP mode */
    NV_ENC_PARAMS_RC_VBR                    = 0x1,       /**< Variable bitrate mode */
    NV_ENC_PARAMS_RC_CBR                    = 0x2,       /**< Constant bitrate mode */
    NV_ENC_PARAMS_RC_CBR_LOWDELAY_HQ        = 0x8,       /**< low-delay CBR, high quality */
    NV_ENC_PARAMS_RC_CBR_HQ                 = 0x10,      /**< CBR, high quality (slower) */
    NV_ENC_PARAMS_RC_VBR_HQ                 = 0x20       /**< VBR, high quality (slower) */
} NV_ENC_PARAMS_RC_MODE;

#Surfaces - The parameter is found in FFmpeg, in the h264_nvenc encoder options:
Code:
  -surfaces          <int>        E..V.... Number of concurrent surfaces (from 0 to INT_MAX) (default 32, max 64)
Also found a mention here: https://ffmpeg.org/doxygen/3.3/structNvencContext.html#af326b5fbdaf8da2edd93a0e67c370ec8
 

ScaryRage

Member
Since people have been testing it long enough now, can someone please tell me whether it's good to activate Look-ahead and Psycho Visual Tuning?
 

Osiris

Forum Moderator
As far as I'm aware there are no OS restrictions on any of the rate control modes. Some older video cards don't support lossless, though; my GTX 770 (Kepler) didn't support it.
 

KLO

New Member
I have a PC with an i9-9900k and two RTX 2080 Ti cards in SLI, playing at 4K resolution. FPS in game is 120-140.
As soon as I switch to the game (Destiny 2), FPS in OBS drops to 30-40 and OBS says the encoder is overloaded and to try a lower quality preset. But no matter which preset I use with the new NVENC encoder, the issue persists. I also tried encoding on GPU 0 and GPU 1. Output resolution is 1600x900@60.
What should I try, or what more info should I collect, to fix this issue?
 

saxtrain.ttv

New Member
I have a PC with an i9-9900k and two RTX 2080 Ti cards in SLI, playing at 4K resolution. FPS in game is 120-140.
As soon as I switch to the game (Destiny 2), FPS in OBS drops to 30-40 and OBS says the encoder is overloaded and to try a lower quality preset. But no matter which preset I use with the new NVENC encoder, the issue persists. I also tried encoding on GPU 0 and GPU 1. Output resolution is 1600x900@60.
What should I try, or what more info should I collect, to fix this issue?
I also have a 9900k, with a single RTX 2080 Ti, and have had no problems with this beta on Destiny 2. So the root cause would seem to be related to multiple GPUs, perhaps?
 

KLO

New Member
After some more tests I figured out that the FPS drop is not related to SLI (I tried disabling it). It's related to how loaded the GPU currently is: when looking into the sky in the game, OBS is able to maintain 60 FPS. More than that, it's related to the 4K resolution, since OBS has to downscale the whole image to the output resolution. When I set all resolutions (game, screen, output, and canvas) to 1920x1080, it doesn't drop FPS anymore, since it doesn't have to downscale anything.
So it seems that it's downscaling the image on the same GPU where the game is running.
 

Overflow

Member
After some more tests I figured out that the FPS drop is not related to SLI (I tried disabling it). It's related to how loaded the GPU currently is: when looking into the sky in the game, OBS is able to maintain 60 FPS. More than that, it's related to the 4K resolution, since OBS has to downscale the whole image to the output resolution. When I set all resolutions (game, screen, output, and canvas) to 1920x1080, it doesn't drop FPS anymore, since it doesn't have to downscale anything.
So it seems that it's downscaling the image on the same GPU where the game is running.
What is the GPU usage % when you are playing in 4K and using OBS? If you are over 80-90%, you can drop frames in OBS.
 

LiaNdrY

Member
As far as I'm aware there are no OS restrictions on any of the rate control modes. Some older video cards don't support lossless, though; my GTX 770 (Kepler) didn't support it.
The screenshots show that only 4 modes are available, and among them there is no VBR_HQ or CBR_HQ.
 

Attachments

  • Screenshot_1.jpg (23.5 KB · Views: 102)
  • Screenshot_2.jpg (22.3 KB · Views: 97)

Osiris

Forum Moderator
VBR_HQ is used when 2-pass is enabled (through Max Quality preset), otherwise it's just VBR. Same goes for CBR, except that CBR_LOWDELAY_HQ is used when 2-pass is enabled.
 

LiaNdrY

Member
VBR_HQ is used when 2-pass is enabled (through Max Quality preset), otherwise it's just VBR. Same goes for CBR, except that CBR_LOWDELAY_HQ is used when 2-pass is enabled.
Got it. Maybe I missed something, but what's wrong with Win7? Why do they want to disable it in the next build?
 