Question / Help Best x264 Performance vs Quality

xSonic521x

Member
I was wondering which preset is the best in terms of quality vs. performance. Kinda like, if you went from Very Fast to Faster, you'd get a noticeable quality increase without much of a performance hit, but going from Fast to Medium is a fairly big hit without much of a quality difference, so it's not worth it. I may have answered my own question, but I'd still like some input. I'm asking about the differences in quality AND performance from going from one preset to another, not trying to squeeze every single pixel on screen even if I can. The best example I can give would be like buying a $500 video card with great quality and amazing performance versus a $1000 video card that performs maybe 15% better; the second one isn't worth the "bang for your buck".
 
Last edited:

D2ultima

Member
Depends on your CPU, upload, and game. I personally stream at medium or slow from my laptop, but my total bitrate (audio + video) never goes above 1600. I almost never go faster than "medium".
 

xSonic521x

Member
Well, I have a 4770K, and granted, I could stream at the medium preset with a 2000 bitrate, but the performance hit doesn't seem worth the very small visual gain, so I'm trying to find the sweet spot.
 
Last edited:

Jack0r

The Helping Squad
I would think that veryfast has the best CPU-usage-to-quality ratio, which is why it's the default in pretty much every piece of software. And I can say that going from veryfast to faster can easily double the CPU load at normal resolution/fps settings with heavy-to-encode content.
Going from veryfast to slow will increase your quality by about 3% at low bitrates, and the higher you go with your bitrate, the smaller the difference gets. That's the calculated SSIM difference, which is not the same as the perceived visual difference the human eye will see. To my eye, though, the difference between veryfast and slow at low bitrates might be 15%, with a CPU usage increase of probably 200% :) And as mentioned, that will shrink if you can increase the bitrate.
At the same time, going from a 1000kbit to a 2000kbit bitrate will increase your quality by up to 10% in calculated difference, which is probably more like 30-50% better perceived quality, all while staying on the veryfast preset, so with no usage increase.

In the end, if you can afford a slower preset, of course, use it; it does increase the quality. But in general it's not worth it, at least not in my experience.
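If you want to check those SSIM numbers on your own footage, here's a rough sketch using ffmpeg's ssim filter (assuming you have ffmpeg installed and a short lossless recording; "gameplay.mkv" and the output names are just placeholders):

```
# Encode the same clip at the same bitrate, changing only the preset
ffmpeg -i gameplay.mkv -c:v libx264 -preset veryfast -b:v 2000k -an veryfast.mp4
ffmpeg -i gameplay.mkv -c:v libx264 -preset slow -b:v 2000k -an slow.mp4

# Compare each encode against the source; higher SSIM means closer to the original
ffmpeg -i veryfast.mp4 -i gameplay.mkv -lavfi ssim -f null -
ffmpeg -i slow.mp4 -i gameplay.mkv -lavfi ssim -f null -
```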
 

xSonic521x

Member
Thank you very much for your input. =) 2000 is a solid bitrate and buffer for streaming though, right?
 
Last edited:

D2ultima

Member
In terms of the quality-to-performance-drain ratio, veryfast is probably indeed the best... but the difference can be a bit more than 3%. In my experience, "veryfast" to "faster" has the largest visual difference of any single step, but for a real visual improvement, you should move two or more steps. I can confirm that going from veryfast to medium at 1500-2000 bitrate will indeed make a noticeable difference, but again, it's up to how much performance you can spare or are willing to use.

I'm a perfectionist for quality who is cursed with low bitrate, so I have learned how to get the best out of what I've got. But if you aren't comfortable with, say, 80%+ CPU usage on most games and/or you don't know what you're doing, I say don't go below fast for 1600-2400 Kbps, and don't go below faster for 2400-3000 Kbps.

Please note, the drain differs depending on the game, the fps, the resolution, and the bitrate. More bitrate, higher resolution, higher fps, the "main" or "high" encoding profile, etc. all have an impact, as does what's being encoded. So if you're unsure, you can use faster or veryfast, or you can put in a longggg time of learning so you know which game does what and how to set your settings per each game.
 
Visual quality is highly subjective, so what looks bad to someone else might look fine to you. Sometimes, you won't even notice certain artifacts until someone points them out.

I myself can see severe jerkiness on the ground with the veryfast preset; it's very apparent when I run in a steady, straight line. The faster preset doesn't have this problem, thanks to better subpixel refinement, and the ground looks perfectly stable when I move. I'm not able to discern any improvements in the fast and medium presets, though, so I consider them practically useless for live encoding.

YouTube also uses faster for their VODs, and, considering the costs of Internet bandwidth, it's in their best interest to encode their content efficiently. Note, however, that dual-core CPUs can't encode 1080p with faster in real time yet; the best they can do is veryfast. For my i3-3240, I was able to squeeze in subme 3 with veryfast, clocking in at 31 FPS (vs. 24 with faster and 38 with veryfast alone). The resulting quality was as good as faster's, too! That's the absolute sweet spot for me.
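In case anyone wants to try that override, here's a minimal sketch of how I'd pass it through ffmpeg's libx264 wrapper (filenames are placeholders; in OBS you can put the same key=value pair in the custom x264 settings field):

```
# veryfast implies subme=2; raising it to 3 buys back much of faster's
# motion quality at a fraction of the speed cost
ffmpeg -i gameplay.mkv -c:v libx264 -preset veryfast \
       -x264-params subme=3 -b:v 2000k -an veryfast_subme3.mp4
```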

Also note that the thread count affects the accuracy of the VBV, so you don't want to use more threads than you have to. You'll see this as a sudden burst of macroblocking that lasts for a few frames. Just as you wouldn't consider only the average frame rate for a game, you shouldn't consider only the average quality for videos - you have to account for the minimums. I wouldn't go up to 4 threads for fast if I could do faster with just 3.
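Capping the thread count works the same way; it's just another key=value pair (again a sketch with placeholder filenames; x264's automatic choice is roughly 1.5x your logical core count, which is more than you need):

```
# Fewer threads means tighter VBV compliance, at some speed cost
ffmpeg -i gameplay.mkv -c:v libx264 -preset faster \
       -x264-params threads=3 -b:v 2000k -an faster_3threads.mp4
```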
 
Last edited:

D2ultima

Member
Yes... except you're using a weak CPU to do 1080p encoding and trying to stay at 30fps or higher. He is using a comparably MUCH stronger CPU to encode 720p or maybe a little less. What is best for you is not best for him. At higher resolutions and such, it's a lot more efficient to use a faster encoding preset than otherwise. I'm glad you were able to tune your sweet spot, but getting there requires a far larger investment of research time to assimilate all of that and to internalize the relations between quality, bitrate, the content being encoded, and the preset. In general, it makes little sense.

Also, I only see the bursts of macroblocking (large transparent white blocks, I assume?) when there are repeated, slightly varying objects of the same colour (like a field of grass in DayZ) and you're passing through them. But a faster preset makes that worse. I've never run into a situation where a faster encoding preset improved my quality when streaming a game.
 

xSonic521x

Member
This seems to be a very confusing subject. I did, however, ask several people who were streaming on Twitch at the time what their settings were, and some were even willing to try other settings. As far as I can tell, unless you're watching it fullscreen, it's going to be fairly hard to notice the difference from Very Fast to Faster, not that it isn't there. At least, that was my experience. I don't know; like I said, it seems very confusing.
 
Last edited:

D2ultima

Member
It depends on bitrate/bandwidth. If you have a 3000 bitrate at 720p 60fps, faster, veryfast, and maybe even "fast" will look rather similar unless you do indeed fullscreen it, and the difference is far smaller if they're streaming at 30fps. The encoding profile (high or main) also makes a difference. AND, importantly, the game being streamed makes a difference in quality too: Battlefield 4's slow, steady movement is going to look nicer than Titanfall's ultra-rapid movement and wallrunning at the same settings.

If you're OK with "watchable" or something, then just throw 2500 bitrate on at 720p and you should be fine for almost all games. If you're willing to work your CPU within an inch of its life (which is exactly what I do, because that's why I bought an i7... and because my bitrate is low, so compression helps me more), then go for it. If not, then just go with something fairly generic.
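For reference, that generic setup boils down to something like this (a hypothetical ffmpeg invocation; OBS exposes the same bitrate and buffer values in its own encoding settings, and the input file, URL, and stream key are placeholders):

```
# 720p30, 2500 kbps with a matching buffer, veryfast preset;
# gameplay.mkv stands in for whatever capture source you're using
ffmpeg -i gameplay.mkv -vf scale=-2:720 -r 30 \
       -c:v libx264 -preset veryfast -b:v 2500k \
       -maxrate 2500k -bufsize 2500k \
       -c:a aac -b:a 128k \
       -f flv rtmp://live.twitch.tv/app/STREAM_KEY
```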

If you need more information than that, you can find someone who will teach you all the variables (though I doubt such a huge consolidation of information would come free from one person), or you can research it and/or trial-and-error it yourself like I've done since 2009 across a variety of programs, such as but not limited to: xFire streamer, Xsplit, FFSplit, FMLE, OBS, Livestream procaster, etc., using various settings for hours on end.
 