NVENC Performance Improvements (Release Candidate)

Status
Not open for further replies.

Overflow

Member
You can reduce the monitor's refresh rate to 100-120 Hz. You need to understand that no matter how powerful your video card is, if you want to use it for streaming, its load in the game should be 85-90% at most. How you achieve that is up to you: limit the frame rate, reduce the graphics settings in the game, or do both.
Ok, thanks.
 

intense

New Member
I am trying to install the websockets plugin in the beta build, but it's not working for me. Is it not implemented? I really enjoy this beta build A LOT, but I need websockets :(
 

davewilliams000

New Member
I was having an issue recording Assetto Corsa Competizione today, which is a graphically intensive game. On checking, the recorded videos would have full audio but only a few frames of video; i.e. it would record one frame, freeze on it for 30 seconds or so, and then record another.

A few times it recorded footage normally for a bit and then fell back into that behaviour. I also noticed that clicking stop recording didn't seem to work; I had to click it a few times before it stopped.

I was originally running beta 6, then checked this forum and downloaded beta 9, with the same result. I'm going to go back to v22 stable to re-test that.

PC specs:
i7-4790K, 32 GB DDR3, RTX 2080, 2x Samsung SSDs. Running at 3440x1440.

I recorded regular Assetto Corsa the other day and it was fine. Not sure if Windows updates could have broken it... I'll try to do more testing.

I tried it again using v22, and with QuickSync too, and had the same result, so it's nothing to do with OBS...

Think I've solved my issue... I had accidentally enabled Windows 10's 'Game Mode', so OBS must have been classed as a background task and not given enough CPU time!
 

Mike321

Member
Don't do that; that adds more input delay than using v-sync.

No it doesn't. Say, for example, he's getting 200 FPS in a certain game and caps it at 180; that's not going to give as much input lag as v-sync.

Edit: here's a good example: https://linustechtips.com/main/applications/core/interface/imageproxy/imageproxy.php?img=http://www.blurbusters.com/wp-content/uploads/2014/01/lag-csgo.png&key=e2ce8260dbfad9de9eda545b65bee699d6a216e006e629ea33ec997897c7f9d9

Also, even the OBS developers recommend capping your FPS a bit lower to take the strain off your GPU/CPU and allow OBS to stream/record smoothly.
 

ntoff

New Member
I am trying to install the websockets plugin in the beta build, but it's not working for me. Is it not implemented? I really enjoy this beta build A LOT, but I need websockets :(
Evidently you need to build the websockets plugin against the test build's source code. I don't know why that's necessary; perhaps a Qt version mismatch? Anyway, if you recompile obs-websocket and link it against the nvenc branch source code, it works.
 
So I have been testing and live streaming with this for a few days. I have a 1060 3GB and a dedicated PC for recording.

My setup: the gaming PC is connected to a 4K splitter (for cloned input into the Razer Ripsaw), which outputs to the dedicated PC (two GTX 760s in SLI) running OBS (test 9). I set the bitrate to just 4800, 720p output at 60 FPS (fractional: 60000/1000), and use a LUT to help with color loss and errors in high-motion video. I also changed the max pixel rate on the splitter to 110 MHz instead of the default 600 MHz in Windows because, with high-speed motion capture and DisplayPort-to-HDMI conversion, some frames are actually lost when capturing from the PC. The video is streamed live to Twitch over a wireless connection (Netgear A7000). I set the keyframe interval to 1 instead of 2.
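As a side note, the keyframe interval setting is specified in seconds, so the actual spacing in frames depends on the output FPS; a trivial sketch of the arithmetic (nothing OBS-specific here):

```shell
# Keyframe spacing in frames = output FPS x keyframe interval in seconds
keyint_frames() {
  fps=$1
  seconds=$2
  echo $(( fps * seconds ))
}
keyint_frames 60 1   # 1 s interval at 60 FPS -> 60 frames
keyint_frames 60 2   # 2 s interval at 60 FPS -> 120 frames
```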

Feedback: I don't know what Psycho Visual Tuning does, but the quality difference is amazing. The clip below uses only the Quality setting, and I decided to try a faster-paced game like Overwatch. I didn't measure things like frame drops in browsers because there are too many variables involved. I did have the pleasure of confirming less packet loss on a mobile device when the stream was viewed in the UK (it originates in Kansas), though I didn't pry into what type of device the viewer used.

I also didn't experience any weird behavior with SLI, even though video scaling can be a headache. I definitely saw a massive drop in CPU usage with this AMD FX-8150; before, the CPU would spike to at least 6% higher usage with NVENC. This is an older CPU, so I'm impressed by that. I haven't done any local recording yet, only the live streaming use case.

https://clips.twitch.tv/ArborealGlutenFreePassionfruitTinyFace
 

LiaNdrY

Member
So I have been testing and live streaming with this for a few days. I have a 1060 3GB and a dedicated PC for recording.

My setup: the gaming PC is connected to a 4K splitter (for cloned input into the Razer Ripsaw), which outputs to the dedicated PC (two GTX 760s in SLI) running OBS (test 9). I set the bitrate to just 4800, 720p output at 60 FPS (fractional: 60000/1000), and use a LUT to help with color loss and errors in high-motion video. I also changed the max pixel rate on the splitter to 110 MHz instead of the default 600 MHz in Windows because, with high-speed motion capture and DisplayPort-to-HDMI conversion, some frames are actually lost when capturing from the PC. The video is streamed live to Twitch over a wireless connection (Netgear A7000). I set the keyframe interval to 1 instead of 2.

Feedback: I don't know what Psycho Visual Tuning does, but the quality difference is amazing. The clip below uses only the Quality setting, and I decided to try a faster-paced game like Overwatch. I didn't measure things like frame drops in browsers because there are too many variables involved. I did have the pleasure of confirming less packet loss on a mobile device when the stream was viewed in the UK (it originates in Kansas), though I didn't pry into what type of device the viewer used.

I also didn't experience any weird behavior with SLI, even though video scaling can be a headache. I definitely saw a massive drop in CPU usage with this AMD FX-8150; before, the CPU would spike to at least 6% higher usage with NVENC. This is an older CPU, so I'm impressed by that. I haven't done any local recording yet, only the live streaming use case.

https://clips.twitch.tv/ArborealGlutenFreePassionfruitTinyFace
You can compare the output quality with and without Psycho Visual Tuning by doing the following in ffmpeg:
Code:
ffmpeg.exe -i "Input.mp4" -c:v h264_nvenc -preset hq -profile:v high -b:v 4800k -rc:v cbr -bf 2 -sws_flags lanczos -s 1280x720 -2pass 1 -y "Output.mp4"

ffmpeg.exe -i "Input.mp4" -c:v h264_nvenc -temporal-aq 1 -preset hq -profile:v high -b:v 4800k -rc:v cbr -bf 2 -sws_flags lanczos -s 1280x720 -2pass 1 -y "Output_mod.mp4"

Where -temporal-aq 1 acts as "Psycho Visual Tuning".

But if the developers gave us more freedom of action, it could look like this:
Code:
ffmpeg.exe -i "Input.mp4" -c:v h264_nvenc -preset slow -profile:v high -b:v 4800k -rc:v vbr_hq -bf 2 -rc-lookahead 32 -surfaces 64 -spatial-aq 1 -aq-strength 10 -refs 4 -sws_flags lanczos -s 1280x720 -2pass 1 -y "Output_mod+.mp4"
 

Fam3mon5ster

New Member
So I tried it last night. I have a 2080 Ti, and it looked blurrier in motion than when I stream with my i7-7700K at veryfast with the bitrate at 8000. I was very sad. I recorded and streamed at the same time, and I also stopped the recording and had the same outcome. I'm lost here. Oh, I forgot: I'm streaming Blackout on Xbox. My upload is not an issue (900+ Mbps).

Are both your encoders set to the same? For instance, your streaming encoder should be set to NVENC (new), and the recording tab should be set to "use same encoder as stream", I believe, or manually set to NVENC.

Ok, but I have a 144 Hz monitor, so even with v-sync on I hit 100% usage on an RTX 2070 at Full HD if the games are maxed out...
You can reduce the monitor's refresh rate to 100-120 Hz. You need to understand that no matter how powerful your video card is, if you want to use it for streaming, its load in the game should be 85-90% at most. How you achieve that is up to you: limit the frame rate, reduce the graphics settings in the game, or do both.

There is no reason for him to reduce his refresh rate from 144 Hz to 100-120 Hz if he's going to cap his **frames**. It's better to 1. leave the monitor at max Hz, 2. turn v-sync on, and 3. cap **frames/FPS** via, say, RivaTuner, for the best possible outcome when using G-Sync. Capping your **frames** will reduce GPU utilization, but capping your Hz is a poor, pointless way to limit FPS. When capping FPS you can go as low as 30 FPS on that screen before G-Sync auto-switches back to v-sync. What you DON'T want to do is let G-Sync or v-sync itself do the capping for you on a G-Sync monitor.

For example, you can go and play a game right now with G-Sync on and v-sync turned off in the menu, notice your FPS is capped right at 144, and say G-Sync is doing its job. True, but what most don't realize is that you've just introduced input lag into your game, so that nice monitor is not fully doing what it was supposed to do. Tests from big sites like Blur Busters show that running G-Sync alone makes it act just like a v-sync-only monitor, with the input lag v-sync brings.

You're technically supposed to turn on both G-Sync and v-sync for G-Sync to work properly; the most popular way is to force v-sync on via the NVIDIA Control Panel. What that translates to: while your FPS lands in the 30-144 Hz range, G-Sync is doing its thing. Once your FPS gets close to 143 or 144, G-Sync begins talking to v-sync for the hand-off, and your game will still be tear-free WITH NO FPS limit, continuing to run high numbers depending on your build, but with some input lag since you're in 144+ territory. If you had v-sync off, it will still stop at 144 Hz, try to 'hand off' to nothing, and look like you capped your FPS, yet you just inherited all the input lag v-sync would have given you anyway, since it's trying to do the switch. At this point you should see some tearing and notice the input lag; when I accidentally run my setup like this, the tearing usually happens in the bottom half.

This is why people cap their FPS below the G-Sync range to prevent that hand-off; some cap at 143 for 144 Hz panels, and the same applies to 100 Hz, 240 Hz, etc. Your best bet is to cap 3 FPS below your max Hz, so that if the RivaTuner FPS cap hiccups and lets a few extra frames through, you won't feel the switch as much.

anyways

Personally, I would limit your frame rate first, then modify graphics options and adjust accordingly. If you're getting 100% usage at 144 Hz, try limiting your frames/FPS down to, say, 100-120 FPS and watch your usage from there. Remember, your Hz is not the same as your FPS; capping your FPS first lets you keep your graphics settings at max. The time to change graphics settings is when you realize that even capping FPS doesn't help; then one of the settings is working your GPU up. My first go-to option is always shadows or shadow resolution.
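If it helps, the cap-below-refresh rule boils down to one subtraction; a sketch (the 3 FPS margin is just the rule of thumb described here, not a hard spec):

```shell
# Rule of thumb: cap FPS a few frames below the panel's max refresh
# so the G-Sync -> v-sync hand-off never kicks in
gsync_fps_cap() {
  max_hz=$1
  margin=${2:-3}          # default margin of 3 FPS, per the rule of thumb
  echo $(( max_hz - margin ))
}
gsync_fps_cap 144   # 144 Hz panel -> cap at 141
gsync_fps_cap 240   # 240 Hz panel -> cap at 237
```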
 
You can compare the output quality with and without Psycho Visual Tuning by doing the following in ffmpeg:
Code:
ffmpeg.exe -i "Input.mp4" -c:v h264_nvenc -preset hq -profile:v high -b:v 4800k -rc:v cbr -bf 2 -sws_flags lanczos -s 1280x720 -2pass 1 -y "Output.mp4"

ffmpeg.exe -i "Input.mp4" -c:v h264_nvenc -temporal-aq 1 -preset hq -profile:v high -b:v 4800k -rc:v cbr -bf 2 -sws_flags lanczos -s 1280x720 -2pass 1 -y "Output_mod.mp4"

Where -temporal-aq 1 acts as "Psycho Visual Tuning".

But if the developers gave us more freedom of action, it could look like this:
Code:
ffmpeg.exe -i "Input.mp4" -c:v h264_nvenc -preset slow -profile:v high -b:v 4800k -rc:v vbr_hq -bf 2 -rc-lookahead 32 -surfaces 64 -spatial-aq 1 -aq-strength 10 -refs 4 -sws_flags lanczos -s 1280x720 -2pass 1 -y "Output_mod+.mp4"

Will do. I'll take a look at this. The PM is locking it down, bolt and key.
 

Overflow

Member
Are both your encoders set to the same? For instance, your streaming encoder should be set to NVENC (new), and the recording tab should be set to "use same encoder as stream", I believe, or manually set to NVENC.




There is no reason for him to reduce his refresh rate from 144 Hz to 100-120 Hz if he's going to cap his **frames**. It's better to 1. leave the monitor at max Hz, 2. turn v-sync on, and 3. cap **frames/FPS** via, say, RivaTuner, for the best possible outcome when using G-Sync. Capping your **frames** will reduce GPU utilization, but capping your Hz is a poor, pointless way to limit FPS. When capping FPS you can go as low as 30 FPS on that screen before G-Sync auto-switches back to v-sync. What you DON'T want to do is let G-Sync or v-sync itself do the capping for you on a G-Sync monitor.

For example, you can go and play a game right now with G-Sync on and v-sync turned off in the menu, notice your FPS is capped right at 144, and say G-Sync is doing its job. True, but what most don't realize is that you've just introduced input lag into your game, so that nice monitor is not fully doing what it was supposed to do. Tests from big sites like Blur Busters show that running G-Sync alone makes it act just like a v-sync-only monitor, with the input lag v-sync brings.

You're technically supposed to turn on both G-Sync and v-sync for G-Sync to work properly; the most popular way is to force v-sync on via the NVIDIA Control Panel. What that translates to: while your FPS lands in the 30-144 Hz range, G-Sync is doing its thing. Once your FPS gets close to 143 or 144, G-Sync begins talking to v-sync for the hand-off, and your game will still be tear-free WITH NO FPS limit, continuing to run high numbers depending on your build, but with some input lag since you're in 144+ territory. If you had v-sync off, it will still stop at 144 Hz, try to 'hand off' to nothing, and look like you capped your FPS, yet you just inherited all the input lag v-sync would have given you anyway, since it's trying to do the switch. At this point you should see some tearing and notice the input lag; when I accidentally run my setup like this, the tearing usually happens in the bottom half.

This is why people cap their FPS below the G-Sync range to prevent that hand-off; some cap at 143 for 144 Hz panels, and the same applies to 100 Hz, 240 Hz, etc. Your best bet is to cap 3 FPS below your max Hz, so that if the RivaTuner FPS cap hiccups and lets a few extra frames through, you won't feel the switch as much.

anyways

Personally, I would limit your frame rate first, then modify graphics options and adjust accordingly. If you're getting 100% usage at 144 Hz, try limiting your frames/FPS down to, say, 100-120 FPS and watch your usage from there. Remember, your Hz is not the same as your FPS; capping your FPS first lets you keep your graphics settings at max. The time to change graphics settings is when you realize that even capping FPS doesn't help; then one of the settings is working your GPU up. My first go-to option is always shadows or shadow resolution.
I have an MSI Optix MAG24C monitor. I have G-Sync enabled, v-sync enabled in the NVIDIA Control Panel, and I always disable v-sync and the framerate limit in all my games. But if the GPU reaches 100% usage, I'll go into the game options and limit the FPS to 120; if the problem is still there, I'll go down to 100 FPS, etc., or maybe drop some graphics options. This is the best way to avoid 100% GPU usage and still not have input lag, is that correct?
 

DIRTY CES

Member
Are both your encoders set to the same? For instance, your streaming encoder should be set to NVENC (new), and the recording tab should be set to "use same encoder as stream", I believe, or manually set to NVENC.

I set both the same: I selected NVENC (new) for streaming and NVENC (new) for recording. When I was streaming I saw it getting blurry in motion, so I stopped the recording and left the stream running. It still looked the same.

stream and record at 1080p
max quality on both
Psycho Visual Tuning on both
keyframe 2
bitrate 8000 on stream and 40000 on recording.
I just can't believe my i7-7700K at veryfast looks better. This sucks.

Any help is greatly appreciated
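One way to sanity-check those bitrates is a rough bits-per-pixel figure; this is only a rule-of-thumb sketch (the often-quoted ~0.1 BPP floor for acceptable quality is an assumption, not an OBS number):

```shell
# Rough bits per pixel: bitrate (kbps) * 1000 / (width * height * fps)
bpp() {
  awk -v br="$1" -v w="$2" -v h="$3" -v f="$4" \
    'BEGIN { printf "%.3f\n", br * 1000 / (w * h * f) }'
}
bpp 8000  1920 1080 60   # stream bitrate at 1080p60
bpp 40000 1920 1080 60   # recording bitrate at 1080p60
```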
 

Fingers

New Member
I fixed my stream issue. Game mode had gotten enabled again. With it off I have no issues with the current version.
 

DIRTY CES

Member
You need to post a log then

This will be my first time getting a log. When I stream and I'm done, where can I find it?

EDITED
Oh wait a sec, I just googled it. If it saved that stream's log, I'll post that one. Is that okay?

Also, I just read that Max Quality and Quality are the same except Max has two-pass encoding. Should I change both stream and record to Quality? Thanks in advance.
 

KILLOGERC

New Member
Why is the picture so blocky on YouTube? On Twitch everything is perfect!
[Attachment: 111.jpg, Twitch vs. YouTube comparison]
 
Hey all. Nice improvements to NVENC, especially for those who game and stream on the same GPU. I have seen plenty of bad streams turn into pretty good quality with a 1070.

I have been running NDI out to a 1080/Ryzen 7 streamer box, usually running x264 medium. I've been trying the new NVENC for a week or so now. What are the absolute max quality settings we can use for Twitch? I'd like to see how far we can push the previous-gen GPUs. I haven't tried RTX for streaming yet, though.
 