So, I have an NVIDIA Titan Black graphics card, which is why I use the NVENC encoder. I went off of this video guide for quality settings, and it gave me pretty satisfactory results: https://www.youtube.com/watch?v=h08Sqg5B9As
However, I wanted to push the quality a bit further than what was suggested in the video, and I want to find out which setting has the biggest impact on quality when using the NVENC encoder.
Here's a checklist of the settings I'm using (if you watch the video, they're essentially the same); a rough ffmpeg equivalent for offline testing follows the list:
* Output mode: Advanced
* Encoder: NVENC
* Enforce Streaming Service encoder settings: Checked
* Rate Control: CBR (Constant Bit Rate) [Should I try Lossless?]
* Bitrate: 6000
* Keyframe Interval: 2
* Preset: Default
* Profile: High
* Level: 4.2
* Use Two Pass Encoding: Unchecked
* GPU: 0
* B-frames: 2 (I tried 4, but it yielded equal or slightly worse results)
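For reference, this is roughly how I reproduce those settings outside OBS so I can test on a local recording. Just a sketch: it assumes a 60 fps source named clip.mp4 and an ffmpeg build with NVENC support, and the flags are ffmpeg's h264_nvenc options, not OBS's labels.

```python
# Offline test encode with roughly my OBS NVENC settings, via ffmpeg's
# h264_nvenc encoder. "clip.mp4" and the 60 fps assumption are placeholders.
import subprocess

cmd = [
    "ffmpeg", "-y", "-i", "clip.mp4",
    "-c:v", "h264_nvenc",
    "-preset", "default",   # Preset: Default
    "-rc", "cbr",           # Rate Control: CBR
    "-b:v", "6000k",        # Bitrate: 6000
    "-g", "120",            # Keyframe interval: 2 s, assuming 60 fps
    "-bf", "2",             # B-frames: 2
    "-profile:v", "high",   # Profile: High
    "-level", "4.2",        # Level: 4.2
    "-2pass", "0",          # Two-pass encoding: unchecked
    "-gpu", "0",            # GPU: 0
    "-c:a", "copy",
    "out_nvenc.mp4",
]
subprocess.run(cmd, check=True)
```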
---------------------
B-frames is the only setting I've actually tested so far. Some people say more B-frames = better quality, others say fewer B-frames = better quality, and I'm not sure which is right. Going from 2 to 4 gave me equal to slightly worse results, so I'm leaning toward fewer B-frames = better quality? A sketch of how I'd measure this more objectively is below.
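Rather than eyeballing it, something like this could score the two B-frame settings against the source. Hedged sketch: clip.mp4 is a placeholder recording, it assumes ffmpeg built with NVENC, and the ssim filter's "All:" score is higher when the encode is closer to the original.

```python
# Encode the same clip with bf=2 and bf=4, then compare each encode
# against the original using ffmpeg's ssim filter.
import subprocess

def encode(bframes: int) -> str:
    out = f"test_bf{bframes}.mp4"
    subprocess.run([
        "ffmpeg", "-y", "-i", "clip.mp4",
        "-c:v", "h264_nvenc", "-rc", "cbr", "-b:v", "6000k",
        "-bf", str(bframes), "-an", out,
    ], check=True)
    return out

def ssim(encoded: str) -> str:
    # The ssim filter prints its scores on stderr; grab the summary line.
    result = subprocess.run([
        "ffmpeg", "-i", encoded, "-i", "clip.mp4",
        "-lavfi", "ssim", "-f", "null", "-",
    ], capture_output=True, text=True)
    return next(line for line in result.stderr.splitlines() if "SSIM" in line)

for bf in (2, 4):
    print(f"bf={bf}:", ssim(encode(bf)))
```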
I haven't experimented with pushing the bitrate higher yet. Back when I was using XSplit, a slower CPU preset had the biggest effect on quality, but it also drastically increased my CPU core temps and caused unsteady frame rates at times.
I guess what I'm trying to find out is whether there's a CPU-preset equivalent when using the NVENC encoder, because with XSplit on the medium CPU preset I could pump out amazing quality at a lower bitrate (I was using 5000 at the time). Something like the sketch below is how I'd picture testing that.
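From what I can tell, the closest thing NVENC has to x264's CPU presets is its own preset list (ffmpeg's h264_nvenc exposes hp/fast/default/medium/slow/hq on builds from around this time), except the speed/quality trade-off happens inside the GPU's dedicated encoder block instead of on the CPU cores. A sketch of how I'd sweep them at a fixed bitrate; the preset names and clip.mp4 are assumptions on my part:

```python
# Sweep h264_nvenc presets at a fixed bitrate and score each encode
# against the source with SSIM, fastest to slowest.
import subprocess

PRESETS = ["hp", "fast", "default", "medium", "slow", "hq"]

def ssim_for(preset: str) -> str:
    out = f"test_{preset}.mp4"
    subprocess.run([
        "ffmpeg", "-y", "-i", "clip.mp4",
        "-c:v", "h264_nvenc", "-preset", preset,
        "-rc", "cbr", "-b:v", "6000k", "-an", out,
    ], check=True)
    result = subprocess.run([
        "ffmpeg", "-i", out, "-i", "clip.mp4",
        "-lavfi", "ssim", "-f", "null", "-",
    ], capture_output=True, text=True)
    return next(l for l in result.stderr.splitlines() if "SSIM" in l)

for p in PRESETS:
    print(p, "->", ssim_for(p))
```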
Which of these settings would affect quality the most? I was thinking maybe raising the Level, but from what I've read, the level only tells decoders the maximum bitrate and decoding complexity to expect, so I don't think raising it has any direct bearing on quality (if I'm reading the level table right, Level 4.2 with the High profile already allows up to ~62 Mbps, so a 6000 kbps stream is nowhere near the cap anyway): https://en.wikipedia.org/wiki/H.264/MPEG-4_AVC#Levels
----------------------------------
My Computer Specs:
Motherboard: ASUS Rampage V Black Edition
Processor: Overclocked Intel Core i7-5960X Extreme Edition (8 cores, 4.2–4.7 GHz)
Memory: 32GB - Corsair Dominator Platinum 3100MHz DDR4
Graphics Card: EVGA NVIDIA GeForce GTX Titan Black Superclocked - 6GB GDDR5
Power Supply: Corsair AX1500i Modular Power Supply (1500 Watts)
Liquid Cooling: Custom pump and pipes
Optical Drive: 14X Blu-ray / RW Combo
Solid State Drive: Samsung 840 EVO 1TB SSD
Hard Disk Drive: Western Digital Black 4TB - 7200RPM HDD
Card Reader: Internal 5.25" Bay 6-Slot Card Reader