@Bonezz you seem to forget that the new NVENC works a bit differently than the old NVENC, so you can't really compare the two.
Comparing OBS to ShadowPlay is even stranger: ShadowPlay doesn't do scene compositing, so it naturally has lower GPU usage.
That is what I am saying! Old or new, NVENC makes no difference. If anything, the new NVENC is way more sluggish. I tested with a clean OBS install without beta 9. The new NVENC is supposed to be comparable to the software fast preset, but it is nowhere near fast; it is barely on par with veryfast.
I made a couple more tests with a clean OBS install (my mic still doesn't work in OBS for some reason).
Here is NVENC on OBS 22.0.2:
Default Preset
1080p60
Two Pass On
9000 kbps
High
Auto
NVENC(old)recording2
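For anyone who wants to approximate a similar recording outside OBS, roughly equivalent settings can be sketched with ffmpeg's h264_nvenc encoder. This is only an approximation: the input filename is hypothetical, OBS's internal preset mapping is not identical to ffmpeg's, and two-pass option names vary between ffmpeg versions.

```shell
# Rough approximation of the settings above: 1080p60, CBR 9000 kbps, High profile.
# Two-pass flags are version-dependent ("-2pass 1" on older ffmpeg builds,
# "-multipass" on newer ones), so they are omitted here.
ffmpeg -i gameplay.mkv \
  -c:v h264_nvenc -rc cbr -b:v 9000k \
  -profile:v high -r 60 -vf scale=1920:1080 \
  nvenc_old_recording.mp4
```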
And also a test with x264:
VeryFast Preset
High
Same bitrate, resolution, and FPS
x264VeryFastRecording1
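The x264 comparison run can be sketched the same way (again an approximation, with a hypothetical input filename; OBS's x264 integration passes the same preset names through to libx264):

```shell
# Same source, bitrate, resolution, and fps, but software x264 at veryfast.
ffmpeg -i gameplay.mkv \
  -c:v libx264 -preset veryfast -b:v 9000k -maxrate 9000k -bufsize 18000k \
  -profile:v high -r 60 -vf scale=1920:1080 \
  x264_veryfast_recording.mp4
```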
Both look similar; veryfast looks a bit better at some angles and a bit worse at others. As you can see, both look almost the same as with the new NVENC in beta 9.
It seems beta 9 does not give any benefit on the RTX 2080 I am using.
And one more thing: look at the usage. See anything weird? I do. Look at the GPU memory clock: while using the NVENC encoder it drops to 7500 MHz when it enters the P0 state, whereas in the main menu it is 7700 MHz, as it should be.
Nothing like that happens while using the x264 encoder.
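For what it's worth, the clock figures above can be turned into a percentage. This is just a sketch of the arithmetic using the 7700/7500 MHz numbers reported in the post; it only measures the memory-clock component, not total encode overhead:

```python
# Memory clock reported in the main menu vs. while NVENC is encoding
# (figures taken from the observation above).
idle_clock_mhz = 7700
encoding_clock_mhz = 7500

drop_pct = (idle_clock_mhz - encoding_clock_mhz) / idle_clock_mhz * 100
print(f"Memory clock drop: {drop_pct:.1f}%")  # about 2.6%
```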
Considering that the purpose of the OBS changes and the new Nvidia codec is to bring "better performance and quality", can you even explain to me what the purpose is when I literally lose 10% of my GPU power? "Better performance" means literally nothing. In fact, it's even worse.
So we can see quality improvements even with the stable version of OBS, because it's all in the Turing chip? And the beta only has a minor impact on the GPU?

The new NVENC build is not even supposed to increase quality; the quality increase is in the Turing chip, and I'm definitely seeing that quality increase in my tests with Anthem.
But the stable version of OBS doesn't have Psycho Visual Tuning and Look-ahead, so aren't these two options there for quality improvements? Or only for performance impact?

Correct: quality-wise, the new NVENC shouldn't be any different from the old NVENC encoder at the same settings (on the same GPU).
That's why I mentioned "at the same settings"; the settings you mention aren't Turing-specific, though.
I also had problems with 4K60p on this build. When I commented on it, Osiris told me that the max bitrate is locked to ~40 Mbps; that's why you are getting low quality.

Greetings everyone,
I just got an RTX 2080 and started testing OBS. Performance was great on the current build and quality was OK (the usual NVENC). So I decided to test the new RTX beta 9 build: I copied the files from the zip into the OBS folder and launched it. It runs OK. But there is a "but", and that is why I decided to create an account and post here: the quality. Oh, it is terrible! Worse than software ultrafast! No matter which quality preset (I tested Quality and Max Quality), psycho visual tuning on or off, look-ahead on or off. I can stream 4K60 right now with the current OBS build at 51,000 kbps, no problem. But the beta build? Wow. I expected it to be at least somewhat better than the current build. Here are my specs (also: no dropped frames, no lagged frames, no dropped network frames, frame time around 0.5-1 ms):
Windows 10 Pro, newest build, fresh install (installed two days ago)
Newest Nvidia drivers
9600K CPU clocked at 4.7 GHz (watercooled)
16 GB DDR4 dual-channel Corsair CL14 2400 MHz RAM
256 GB NVMe, 500 GB SSD, 256 GB SSD
MSI RTX 2080 DUKE clocked at ~1900 MHz core (stock) and 15 GHz memory clock
GPU usage around 65% while streaming
If you need something else let me know!
Maybe there is some kind of glitch.
Thank you!
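As a rough way to compare the scenarios mentioned in this thread (4K60 at 51,000 kbps on the current build, the ~40 Mbps cap reported for this beta, and the 1080p60/9000 kbps tests), bits-per-pixel is a handy back-of-the-envelope metric. This is only a sketch of the arithmetic, not an encoder benchmark; real quality also depends on the encoder and content:

```python
def bits_per_pixel(bitrate_kbps, width, height, fps):
    """Average bits available per pixel per frame at a given bitrate."""
    return bitrate_kbps * 1000 / (width * height * fps)

# Figures taken from the posts in this thread.
print(f"4K60 @ 51000 kbps:   {bits_per_pixel(51000, 3840, 2160, 60):.3f} bpp")
print(f"4K60 @ 40000 kbps:   {bits_per_pixel(40000, 3840, 2160, 60):.3f} bpp")
print(f"1080p60 @ 9000 kbps: {bits_per_pixel(9000, 1920, 1080, 60):.3f} bpp")
```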
Update:
Tried Max Performance (with the same settings as above, i.e. 1080p60, 9000 kbps, psycho visual tuning and look-ahead on), and what do you think? Quality is the same as with the Max Quality or Quality preset! Even worse, usage is higher and it introduces some microstutter in game!
So those two options could improve quality even on Pascal, because they aren't a Turing thing, right?