Question / Help RTX new NVENC - is this working properly for anybody?

NotNow

New Member
I ask because I wasn't able to stream using the new NVENC when I last tried, and I can't find any recent posts from anyone who can successfully run the CUDA-hungry new NVENC.

If you have it working without dropping frames, what are your settings?

I use a single PC setup with i7 8700k and RTX 2060.
New NVENC settings: Quality preset, 6000 Kbps bitrate (20 Mbps upload speed), Psycho Visual Tuning and Look-ahead turned off, 720p 60fps.
100fps capped in game.
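For context, a quick way to sanity-check a bitrate against a resolution/fps target is the bits-per-pixel figure streamers often quote (roughly 0.1 bpp is commonly cited as the low end for H.264); a minimal sketch using the numbers above:

```python
def bits_per_pixel(bitrate_kbps, width, height, fps):
    """Average encoded bits available per pixel per frame."""
    return bitrate_kbps * 1000 / (width * height * fps)

# 6000 Kbps at 720p 60fps, as in the settings above
print(round(bits_per_pixel(6000, 1280, 720, 60), 3))  # 0.109
```

So 6000 Kbps at 720p60 sits right around that lower threshold, which is why busy scenes can still look blocky.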

Old NVENC works absolutely fine for me.

Thanks!
 

NG.

New Member
I had issues with the old NVENC and had to switch to the new one.
It seems the game is prioritised over OBS, so every time the game maxes out your GPU, the recording FPS suffers badly.
If your in-game FPS is consistently above 60, you can use VSync to cap it and the recording should work fine.
For performance-hungry games I couldn't find a way to use the new NVENC.
Let me know if you can figure something out.
 

NotNow

New Member
I had issues with the old NVENC and had to switch to the new one.
Seems the game is prioritised over OBS, so every time the game maxes out your GPU, the recording FPS suffers badly.
If your in-game FPS is consistently above 60, you can use VSync to cap it and the recording should work fine.
For performance-hungry games I couldn't find a way to use the new NVENC.
Let me know if you can figure something out.

I live stream and cap my in-game FPS at 100 on a 1080p monitor. With the old NVENC I have no problems streaming at 720p 60fps.
I'm overclocking my 2060 in the hope it might help, but I've read about people dropping frames with a 2080, so I'm not confident I'll be using the new NVENC any time soon.

Can I ask what GPU you are using?
 

NotNow

New Member
OK, I managed to stream at 720p 60fps last night with zero problems using the new NVENC.

I think this success might be down to having overclocked my GPU?

I was streaming Sea of Thieves. I used the in-game high settings (Legendary), not ultra (Mythical), and kept frames capped at 100fps.
NVENC settings: Quality preset, Psycho Visual Tuning and Look-ahead disabled.

I wouldn't want to push these settings any further; there's no way I'd retain the in-game FPS. My question now is whether there's any improvement over using the old NVENC with higher in-game quality settings?

More testing!
 

NotNow

New Member
Another successful stream. I noticed I wasn't using the Lanczos downscale filter, so I added that and enabled Psycho Visual Tuning. Looked great, no dropped frames! Might try 900p next...
 

NotNow

New Member
You may know what a log file is, but we need it for troubleshooting and you didn't provide it. That's why the link to the sticky thread was given to you.

Yeah, I couldn't provide a log file from weeks back when I last used the new NVENC. All you'd have seen was encoder overload anyway; nothing you could troubleshoot.

Last night I streamed a 60fps game and it went fine. I switched to a game capped at 100fps and started dropping frames in SLOBS. Turns out I had forgotten to choose an overclocked profile for my RTX 2060. So it's clear now that, for my single-PC setup at least, the 2060 FE isn't beefy enough for the new NVENC at higher frame rates. I have to overclock it to be in with a chance.

Before switching back to a 2 pc setup, I might grab one of these RTX Super cards, maybe a 2070, and see if I can get by.

Edit: just realised my 144Hz monitor had reverted to 60Hz in Windows settings. Maybe that had a bearing on things... but I've definitely had better results since the overclock.
 

NG.

New Member
Thanks for the insights. I'm using a 1080 and didn't manually overclock.
I guess you didn't track your GPU load while streaming? That would be too much multitasking for my brain :)
I think that with performance-hungry survival games even a 2080 Ti would max out at 1080p 60fps at times, and each load spike causes frame drops in the recording/stream. I could manage to use NVENC, but I'd have to reduce graphics settings in-game to stay above 60fps in all circumstances.
And of course activate VSync.
That's too much hassle for me. ARK loves spiking the GPU. I would have to play at an average of 50% GPU usage to avoid 95%+ spikes, and for boss fights I'd have to adjust the settings again... and even then there's no guarantee there won't be 100% load spikes at times.
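On the question of tracking GPU load: `nvidia-smi` can log overall and encoder utilization once per second, e.g. `nvidia-smi --query-gpu=utilization.gpu,utilization.encoder --format=csv,noheader,nounits -l 1 > gpu.log` (field names and availability depend on driver version; see `nvidia-smi --help-query-gpu`). A small hypothetical helper to pull the peaks out of such a log after a session, so you don't have to watch it live:

```python
import csv
import io

def peak_utilization(log_text):
    """Return (max GPU %, max encoder %) from nvidia-smi CSV log lines.

    Assumes output produced by:
      nvidia-smi --query-gpu=utilization.gpu,utilization.encoder \
                 --format=csv,noheader,nounits -l 1
    """
    gpu_peak = enc_peak = 0
    for row in csv.reader(io.StringIO(log_text)):
        if len(row) < 2:  # skip malformed/blank lines
            continue
        gpu_peak = max(gpu_peak, int(row[0].strip()))
        enc_peak = max(enc_peak, int(row[1].strip()))
    return gpu_peak, enc_peak

sample = "62, 35\n99, 40\n97, 38\n"
print(peak_utilization(sample))  # (99, 40)
```

If the GPU peak sits at 99-100% during gameplay, that matches the frame-drop behaviour described above.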

CPU encoding isn't great either. At CPU loads above 80% my Intel system gets sluggish, with small numbers of dropped frames and occasional lag, but at least it mostly works as it should.

It would be interesting to know why Intel systems can't multitask efficiently at 100% load, whether that's down to a small cache, poor drivers, or something else. I expected better multitasking performance with more cores, but it's pretty awful.
While rendering videos I even get errors, and of course terrible load times when using my browser... Oo
I'm running an i7-7800X @ 4.2GHz with 32GB RAM.
 