Question / Help Why does NVENC look better than x264 at lower bitrates?

Panpan

Member
Hello everyone!

I've always preferred the dual-PC streaming approach to get the most quality out of my stream, especially since most of my viewers don't have internet connections that can handle more than a 3000 kbps stream.

I recently watched a friend's stream and it looked fairly clear for 2500 kbps! He was using NVENC in Simple Mode (I know, right?!) and was streaming at 720p/30fps.

After testing x264 (slow preset) vs. NVENC (HQ, 2-pass, 4 B-frames), they looked practically the same at 2500 kbps. In some areas it even looked like NVENC was pulling ahead. Can anyone explain this?

I've tested it on: Overwatch, CS:GO, Blade and Soul (can't tell the difference here, way too pixelated).

x264 encoding is done on a Ryzen 1600 @ 3.9 GHz w/ 3200 MHz RAM.
NVENC encoding is done on a 1070 Ti.

I'm happy to include logs, but OBS doesn't seem to be generating any errors.

EDIT: At 3500 kbps the results are still the same, but once I push to 4500 kbps it seems like x264 pulls ahead :s
Perhaps my CPU isn't actually encoding at the preset I selected?
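
In case anyone wants to reproduce this comparison less subjectively, here's a rough sketch of how I'd score both encoders offline (just an illustration; it assumes an ffmpeg build with libx264, h264_nvenc and libvmaf, and a lossless recording named capture.mkv as the source):

Code:
# Rough A/B sketch: encode one lossless capture with x264 (slow) and NVENC
# at the same bitrate, then score each result against the source with VMAF.
# Assumes an ffmpeg build with libx264, h264_nvenc and libvmaf; the file
# name "capture.mkv" and the 2500 kbps target are placeholders.
import subprocess

SOURCE = "capture.mkv"   # hypothetical lossless recording of the game
BITRATE = "2500k"

ENCODERS = {
    "x264_slow": ["-c:v", "libx264", "-preset", "slow",
                  "-b:v", BITRATE, "-maxrate", BITRATE, "-bufsize", "5000k"],
    "nvenc": ["-c:v", "h264_nvenc", "-rc", "cbr",
              "-b:v", BITRATE, "-bf", "4"],
}

def encode(name, args):
    out = f"{name}.mp4"
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *args, "-an", out], check=True)
    return out

def vmaf(distorted):
    # libvmaf writes its summary score into the ffmpeg log; grab that line.
    result = subprocess.run(
        ["ffmpeg", "-i", distorted, "-i", SOURCE,
         "-lavfi", "libvmaf", "-f", "null", "-"],
        capture_output=True, text=True)
    lines = [l for l in result.stderr.splitlines() if "VMAF score" in l]
    return lines[-1].strip() if lines else "no VMAF score found"

for name, args in ENCODERS.items():
    print(name, "->", vmaf(encode(name, args)))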
 

koala

Active Member
Yep, my research didn't show anything different. NVENC is very good in comparison to x264; it's almost equal in compression quality. And if your game loses fps because of the high CPU usage of x264, NVENC is definitely better, because it uses almost no CPU.
I can't understand why many folks here repeat over and over that NVENC is so bad. That's simply not true.
 
Are you streaming at 720p30fps as well?
The biggest quality factor at low bitrates is your output resolution and frame rate target relative to your in-game resolution.
Another major factor is how much detail there is in the scene you are capturing: lots of lighting, smoke, shadows, foliage, etc. all have a very large impact, because you need more bitrate to handle how much the pixels change from frame to frame.
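
A quick back-of-the-envelope way to see this (my own rough heuristic, nothing official) is to look at bits per pixel, i.e. bitrate divided by width × height × fps:

Code:
# Rough "bits per pixel" heuristic: how many bits the encoder gets per pixel
# per second of video. Lower numbers mean more detail gets thrown away, which
# is why busy scenes (foliage, smoke, particles) fall apart first.
# The resolutions below are just examples; there is no official threshold.
def bits_per_pixel(bitrate_kbps, width, height, fps):
    return bitrate_kbps * 1000 / (width * height * fps)

for label, (w, h, fps) in {
    "720p30":  (1280, 720, 30),
    "720p60":  (1280, 720, 60),
    "1080p60": (1920, 1080, 60),
}.items():
    print(f"{label}: {bits_per_pixel(2500, w, h, fps):.3f} bits/pixel at 2500 kbps")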
 

Panpan

Member
Yep, my research didn't show anything different. NVENC is very good in comparison to x264; it's almost equal in compression quality. And if your game loses fps because of the high CPU usage of x264, NVENC is definitely better, because it uses almost no CPU.
I can't understand why many folks here repeat over and over that NVENC is so bad. That's simply not true.
This is some really good information! I just didn't think it would be comparable to the slow preset of x264. Although I have a Ryzen 1600, some games take a much harder hit from any preset below "faster" compared to NVENC. Thanks for the link to your thread, I'll make sure to give it a thorough read c:

EDIT: I forgot to mention that in previous tests I was using an older-generation Nvidia card. I remember reading somewhere that NVENC quality has improved with each new generation of cards, which might be part of why I thought x264 was clearly better than NVENC back then.
EDIT 2: One more question - which B-frame setting were you using during your tests? I'm currently using 4, as I heard B-frames "can" increase quality, but I haven't really seen a difference in either quality or performance. Did you also use 2-pass?

Are you streaming at 720p30fps as well?
The biggest quality factor at low bitrates is your output resolution and frame rate target relative to your in-game resolution.
Another major factor is how much detail there is in the scene you are capturing: lots of lighting, smoke, shadows, foliage, etc. all have a very large impact, because you need more bitrate to handle how much the pixels change from frame to frame.
I see, that would make sense. My in-game resolution is 1080p. Yep! I've tried 720p/60fps and also 720p/30fps. At the moment, since I'm trying to keep the bitrate as low as possible, I'm streaming at 720p/30. CS:GO seems to look alright even in areas with smokes, which is awesome! However, games like Blade and Soul, with insane amounts of foliage, text, shadows and particles everywhere, really REALLY kill the quality at low bitrates. I wish there was a way around this :/ Thank you for the reply though c:
 

alpinlol

Active Member
Lately I have been using NVENC more and more for streaming. After testing things, I came to the conclusion that 1080p60 at 5500 kbps works really well for me, even in games like Fortnite, D3 and a couple of other games that are easy to stream anyway.

GTX 1060 6GB with literally default settings for NVENC except for the bitrate.
 

BK-Morpheus

Active Member
I tested the same (NVENC on a GTX 1070) and x264 (from slow up to veryfast) always wins for me when it comes to quality.
 
The only way around poor quality at low bitrates is to either increase the bitrate, or use a slower preset / fine-tune custom x264 parameters.

In regards to quality, I think it really depends on the game and how much is going on in each frame the encoder has to handle.
Software encoding (CPU) has all functions of the h.264 codec available and can be fine-tuned.
Hardware encoding (GPU) is limited to a subset of the h.264 codec's functions and is fixed or very limited in tuning.
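
To illustrate the kind of tuning room software encoding gives you (the values below are only examples, not recommendations), this is roughly how custom x264 parameters could be passed to an offline ffmpeg encode; if I remember right, OBS has a custom encoder options field for the same key=value pairs under advanced output:

Code:
# Sketch of passing custom x264 parameters through ffmpeg's -x264-params.
# The values are only examples of the knobs software encoding exposes
# (reference frames, B-frames, lookahead, subpixel motion estimation),
# not recommended settings.
import subprocess

x264_params = ":".join([
    "ref=5",            # more reference frames
    "bframes=4",        # more consecutive B-frames
    "rc-lookahead=40",  # longer rate-control lookahead
    "subme=8",          # finer subpixel motion estimation
])

subprocess.run([
    "ffmpeg", "-y", "-i", "capture.mkv",   # hypothetical source clip
    "-c:v", "libx264", "-preset", "slow",
    "-b:v", "2500k", "-x264-params", x264_params,
    "-an", "x264_tuned.mp4",
], check=True)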

At lower bitrates, NVENC on Nvidia's 1000-series cards will be superior to AMD's encoder, due to further refinements.
This is in comparison to VCE 3.4 on AMD's 500-series cards; I'm not sure about VCE 4.0 on Vega chipsets, though (lack of information from what I can find).

At higher bitrates one should not see much difference between any of the encoders (x264, Quick Sync, NVENC, AMD VCE). Twitch unfortunately doesn't like people going over 6000 kbps.

Some links for you guys to peruse:
https://en.wikipedia.org/wiki/X264
http://dev.beandog.org/x264_preset_reference.html
https://en.wikipedia.org/wiki/Nvidia_NVENC
https://en.wikipedia.org/wiki/Video_Coding_Engine
 

Panpan

Member
I tested the same (NVENC on a GTX 1070) and x264 (from slow up to veryfast) always wins for me when it comes to quality.
I see... I hope it's just that I'm not noticing it, or that it's down to the games I'm streaming. If veryfast is better than NVENC for you, then I have a feeling my CPU isn't handling the slow preset properly. Are you also testing this at 2500 kbps?

The only way around poor quality at low bitrates is to either increase the bitrate, or use a slower preset / fine-tune custom x264 parameters.

In regards to quality, I think it really depends on the game and how much is going on in each frame the encoder has to handle.
Software encoding (CPU) has all functions of the h.264 codec available and can be fine-tuned.
Hardware encoding (GPU) is limited to a subset of the h.264 codec's functions and is fixed or very limited in tuning.

At lower bitrates, NVENC on Nvidia's 1000-series cards will be superior to AMD's encoder, due to further refinements.
This is in comparison to VCE 3.4 on AMD's 500-series cards; I'm not sure about VCE 4.0 on Vega chipsets, though (lack of information from what I can find).

At higher bitrates one should not see much difference between any of the encoders (x264, Quick Sync, NVENC, AMD VCE). Twitch unfortunately doesn't like people going over 6000 kbps.

Some links for you guys to peruse:
https://en.wikipedia.org/wiki/X264
http://dev.beandog.org/x264_preset_reference.html
https://en.wikipedia.org/wiki/Nvidia_NVENC
https://en.wikipedia.org/wiki/Video_Coding_Engine

Thanks for all the links! That would make sense. I haven't tested that many games, but of the variety I have tried, NVENC seems to look better in all of them. Considering I'm already using the slow preset for x264, I'm not too willing to chase any more quality with custom parameters in case I break something :P
To be fair, I did look up quite a few custom 'presets' (mainly lots of custom parameters) in the past and didn't really see much of a difference.
 

koala

Active Member
EDIT 2: One more question - which B-frame setting were you using during your tests? I'm currently using 4, as I heard B-frames "can" increase quality, but I haven't really seen a difference in either quality or performance. Did you also use 2-pass?
I was using the OBS defaults at the time (the tests are a year old). I assume they are the same as today: 2 B-frames and two-pass encoding enabled. As far as I remember, I did not see any difference when changing either of these values.

By the way, the "slow" preset for x264 is extremely CPU-demanding. My slightly overclocked i7-6700K was only able to encode 1280x720 at 30 and 60 fps, and 1920x1080 at 30 fps, without frame loss; 1920x1080 at 60 fps always dropped frames. And if you have other software running, the encoder usually can't get enough CPU. Usually you go with the x264 veryfast preset, which is about the same quality as NVENC at the same bitrate. The quality difference between slow and veryfast does exist, but it isn't very large; your viewers will probably not notice it.
 
The difference between CPU presets becomes very noticeable in areas of high motion and dense pixel movement (foliage, smoke, particle effects, etc.). Even at veryfast, x264 will outperform GPU encoding at low bitrates, thanks to h.264 features that the GPU encoders lack or don't have enabled. That gap only shrinks as the bitrate increases or the x264 preset gets slower.

You can use a bitrate calculator to try to find an optimal output bitrate:
http://www.silverjuke.net/public/misc/bitrate-calculator.html

Formula to calculate bandwidth:
Total Horizontal Pixels × Total Vertical Pixels = Resolution (Pixels/Frame)
Resolution (Pixels/Frame) × Refresh Rate (Frames/Second) = Result (Pixels/Second)
Result (Pixels/Second) × Color Depth Factor (Clock/Pixel) × Coefficient (Bits/Clock, here 10) = Bandwidth Per Channel
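
Written out as a tiny script (same arithmetic as above; the 10 bits/clock coefficient and the 1080p60 example are just taken at face value, counting active pixels only):

Code:
# The same arithmetic as the formula above, written out. It counts active
# pixels only (no blanking intervals); the 10 bits/clock coefficient is
# taken from the post as-is.
def bandwidth_per_channel(h_pixels, v_pixels, refresh_hz,
                          color_depth_factor=1, bits_per_clock=10):
    pixels_per_frame = h_pixels * v_pixels              # Resolution
    pixels_per_second = pixels_per_frame * refresh_hz   # Result
    return pixels_per_second * color_depth_factor * bits_per_clock

# Example: 1920x1080 at 60 Hz with one clock per pixel
bits = bandwidth_per_channel(1920, 1080, 60)
print(f"{bits / 1e9:.2f} Gbit/s per channel")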
Below is a link if you want to determine just how much data you are sending out:
https://toolstud.io/data/bandwidth.php
 