Question / Help AMD encoder giving bad quality?

I use 50,000 kbps CBR with the AMD H.264 encoder to record my games, but when I upload to YouTube, the quality goes to shit.
But I don't think it's YouTube's compression, as a friend of mine uses the same settings and gets significantly higher quality.
My friend has an NVIDIA card, and thus the NVIDIA H.264 encoder; that's the only difference between my gameplay recordings and his.
Is it just the AMD encoder?
 

DEDRICK

Member
For the best recording quality, use CRF or CQP. Start at 23 and work your way down; 20 should cover a lot of sources (and is probably overkill). Lower values mean more bitrate is available and larger file sizes, with an imperceptible difference in quality past a certain point.

CRF (Constant Rate Factor) is a variable bitrate mode that adapts to the source and strives to maintain the same image quality at all times.

The more bitrate you can feed YouTube, the better for complex scenes. CRF can push your bitrate anywhere between 8 and 80 Mbps to cover what it needs to maintain image quality.
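To get a feel for what that bitrate range means on disk, here is a small Python sketch. The 8-80 Mbps range comes from the post above; the ten-minute duration and the helper name are just assumptions for illustration:

```python
# Rough file-size range for a CRF recording whose bitrate floats
# between 8 and 80 Mbps, as described above for complex scenes.

def filesize_gb(bitrate_mbps: float, seconds: float) -> float:
    """Bitrate (megabits/s) times duration, converted to gigabytes."""
    return bitrate_mbps * seconds / 8 / 1000  # megabits -> megabytes -> GB

duration = 10 * 60  # a hypothetical 10-minute recording
low, high = filesize_gb(8, duration), filesize_gb(80, duration)
print(f"{low:.1f} GB to {high:.1f} GB")  # 0.6 GB to 6.0 GB
```

So the same CRF recording can land anywhere in a tenfold size range depending on how much motion the source has.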
 

DEDRICK

Member
CRF and CQP are pretty much the same thing; ICQ is Quicksync's name for it, though Quicksync also offers CQP.

CRF = ICQ
CQP lets you control the rate factor for the three frame types: I (intra/keyframe), B (bi-directional), and P (predictive).
 

DEDRICK

Member
Either the same or stepped upwards. You want I to be the lowest, B higher, and P even higher, e.g. 23/26/29, but that is getting into advanced encoding, where you are trying to save bitrate in the right places. It doesn't serve much purpose for a local recording.
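The stepping described above can be sketched in Python. The step size of 3 and the helper name are illustrative assumptions, not OBS settings:

```python
# Hypothetical illustration of stepped quantizers: a lower QP means
# more bitrate, so I frames get the best quality in this scheme.

def stepped_qp(base: int, step: int = 3) -> dict:
    """Return per-frame-type QP values, stepped upward from `base`."""
    return {"I": base, "B": base + step, "P": base + 2 * step}

print(stepped_qp(23))  # {'I': 23, 'B': 26, 'P': 29}
```

With `step=0` you get the "same value for all three" case that suffices for a local recording.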
 

Suslik V

Active Member
Slow down your in-game moves, use a gamepad instead of the mouse whenever possible (for smooth camera movement), and try a lower FPS (30, for example). And quality will increase...
 
Slow down your in-game moves, use a gamepad instead of the mouse whenever possible (for smooth camera movement), and try a lower FPS (30, for example). And quality will increase...
Good god just stop please lol

Either the same or stepped upwards. You want I to be the lowest, B higher, and P even higher, e.g. 23/26/29, but that is getting into advanced encoding, where you are trying to save bitrate in the right places. It doesn't serve much purpose for a local recording.
So splitting the frame types just saves bitrate where possible and doesn't affect quality? So putting, say, 20 for I and P in basic mode would be fine?
 

DEDRICK

Member
I frames require the most bitrate: they are the full-image frames, and these intra/keyframes hold the video stream together.
If you lose a keyframe, the video stream falls apart until the next keyframe arrives. You have probably seen this happen on digital TV, where the picture becomes a blocky mess missing a lot of data.

B frames are bi-directional prediction frames: they can reference both earlier and later frames and encode only what changes, so they typically need the least bitrate.

P frames are predicted from earlier frames, encoding only what has changed since their reference; they need far less bitrate than I frames.

You can just use the same value for all three in a local recording; you would split them if you are doing a release with bitrate requirements.
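As a toy illustration of how these frame types interleave inside a group of pictures (GOP), here is a Python sketch. The GOP length and B-frame count are arbitrary assumptions, not encoder defaults:

```python
def gop_pattern(gop_len: int, bframes: int) -> str:
    """One keyframe, then P frames each preceded by `bframes` B frames."""
    frames = ["I"]
    while len(frames) < gop_len:
        run = ["B"] * bframes + ["P"]
        frames.extend(run[: gop_len - len(frames)])
    return "".join(frames)

print(gop_pattern(13, 2))  # IBBPBBPBBPBBP
```

Losing the leading I frame here would leave every following B and P frame with nothing valid to reference until the next GOP starts, which is the blocky-TV effect described above.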
 
Comparison of x264, NVENC, Quicksync, VCE - maybe you missed it? Once you are talking about high bitrates (50,000 kbit/s), there is almost no difference. YouTube re-encodes videos.
Yeah, I've tried using x264 before (and in trying it just now I found the CRF option); the only issue is that I get terrible performance in pretty much every game when using x264.

I just tested it again using x264 CRF on the veryfast preset, and my game was extremely choppy and laggy once I started recording. It's very strange, as in the thread you linked, the person running the tests used an i5-3570K and an i7-6700K for the x264 and H.264 tests.

But I'm running a 6700K @ 4.5 GHz and it can't handle recording with x264 veryfast?
 

koala

Active Member
Your CPU must be able to run the game and encode video at the same time if you use the software x264 encoder. In that thread, I ran my tests by replaying a losslessly recorded video, not by capturing a game, so the full CPU power was available to the encoder. If you want to record a game, CPU power is split between the game and the encoder. The linked thread is an encoder quality comparison, not an OBS benchmark or a demo of how OBS may perform with certain settings.
 
Your CPU must be able to run the game and encode video at the same time if you use the software x264 encoder. In that thread, I ran my tests by replaying a losslessly recorded video, not by capturing a game, so the full CPU power was available to the encoder. If you want to record a game, CPU power is split between the game and the encoder. The linked thread is an encoder quality comparison, not an OBS benchmark or a demo of how OBS may perform with certain settings.
OK, so maybe I wasn't clear in the original thread, as I think you're misunderstanding me. It's not that H.264 or x264 specifically gives me bad quality while a different encoder does better. My friend and I are using the exact same settings and should get the same quality (the only difference being that I use the AMD encoder and he uses NVENC), but my videos are seriously grainy and poor quality when they shouldn't be.
 

koala

Active Member
The comparison in that thread is for streaming at a fixed bitrate. For that scenario, the AMD encoder is definitely worse than NVENC or x264. You can ramp up the bitrate, but the output always looks more grainy and blurred with AMD than with NVENC/x264. Even Quicksync is better than the AMD encoder.

I didn't run tests with quality-based encoding. In that scenario, bitrate varies automatically according to the motion in the video: high motion takes more bitrate, low motion less. You configure no explicit bitrate; instead, with CRF/CQP/ICQ you define how much quality is removed from the original video before it is saved. Even here, AMD seems to perform worse than NVENC/Quicksync/x264. If encoding with x264 at CRF=23 looks good, encoding with AMD at CQP=23 may look worse, although the quality-removal parameter means roughly the same thing. Try a lower value for AMD, for example CQP=15. The resulting video may be much larger, but perhaps the quality will be better. I fear, though, that it isn't possible to get a video that is indistinguishable from the original game; with NVENC/Quicksync/x264 this is possible.

One last remark: the demo videos linked in the thread above are not hosted on YouTube. They are the original videos output by OBS, so you can compare them directly. YouTube, on the other hand, re-encodes every uploaded video, as was said earlier in this thread. That re-encoding makes the videos look worse than the originals, so a comparison is very difficult: you would be comparing YouTube's re-encode, not the original OBS output.
 
Yeah, I've been using CQP=15 for a while, and the quality looks good locally, just always bad on YouTube. So I guess this is more of a question for a YouTube forum. Anyway, is Quicksync available to everyone, or only to people with certain hardware? I'd like to try it.
 

koala

Active Member
For Quicksync, you need an Intel CPU with an iGPU (integrated GPU), and you need a motherboard that supports the iGPU. Then make sure the iGPU is enabled in the system BIOS.
With Windows 10, the iGPU drivers are then retrieved automatically from Windows Update and installed, regardless of whether a monitor is connected to the iGPU. If all this went well, you will see an "Intel(R) HD Graphics" entry in Windows Device Manager under the graphics adapter category. If that entry is present, you should find "Quicksync H.264" in the encoder dropdown of OBS as well.
 