Question / Help: Guide to finding the best downscale for streaming high-resolution gaming.

I've typed this up for a few friends lately, and as 1440p gaming becomes more popular I thought it might make sense to put it somewhere public. This forum seems as good a place as any I could think of, though perhaps I should make it a resource/guide, since it's not applicable to Windows only... We'll see where it goes over time.

This is basically a copy-paste from a chat room, so I tried to keep it brief and simple and avoid jargon. If more details or corrections are needed, I will reply or edit as required. I just wanted to archive this info somewhere so that others don't have to guess or work it out the hard way. As always, YMMV; do what you want, do what works for you. This just explains the reasoning and method behind choosing 864p downscaled streams for 1440p games. Perhaps you'd prefer a different scale or a higher bitrate or whatever; I'm not the boss of you. I'm just explaining some of this because it's not well documented online, since historically most people have played at 1080p or less. I hope this helps :)


Yeah, if you are partnered, it's easy: you stream at your native resolution and at as high a bitrate as your upload can handle, and then Twitch's servers transcode that into different resolutions and bitrates for the viewers. For the rest of us, we have to stream at around 6000 kbps maximum, because Twitch will not transcode, so our viewers have to be able to download at more than 6 Mbps to watch.
But high resolution = more pixels = more bits of data
So the stream either has to use a high bitrate for the high resolution, or we must compress it more
And the more we compress, the worse the quality of the picture
But if we reduce the resolution, then we also lose quality
So it is a balancing act :)

This is why it is generally recommended that if you play at 1080p, it's best to stream at 720p. For 1080p to fit into 6000kbps there is a lot of compression, and the compression makes the picture look bad, especially with lots of movement (such as in FPS games, where we are very 'twitchy' in our movement and aim).
If we downscale from 1080p to 720p there is also some quality loss, but the compression required to fit 720p into 6000kbps is not so high, so the overall result is better.
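
To put rough numbers on that, here is a quick sketch of my own (not part of the original maths): 'bits per pixel per frame' is a crude indicator of how hard the encoder has to squeeze at a fixed bitrate.

Code:
# Rough bits-per-pixel-per-frame at 6000 kbps and 60 FPS.
# More bits available per pixel = less aggressive compression needed.
bitrate = 6_000_000  # 6000 kbps in bits per second
for width, height in [(1920, 1080), (1280, 720)]:
    bpp = bitrate / (width * height * 60)
    print(f"{width}x{height}: {bpp:.3f} bits per pixel per frame")
# 1920x1080: 0.048
# 1280x720: 0.109

At 720p the encoder has more than twice the bits to spend on each pixel, which is exactly why it holds up better during fast movement.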

Now, 720p is 2/3 of 1080p. If we downscale 1440p only mildly (3/4 takes us to exactly 1080p), we have the same compression problem as before... but if we scale from 1440p all the way to 720p, that is 1/2 the resolution, and the quality loss from that downscale is very bad, so now we have a new problem.

So we have to find a new place to balance it: we can downscale a little further, but there is more downscale loss, or we can compress more, and there is more compression loss. We will always have lower quality than gamers playing at 1080p, because we are always trying to fit more game pixels into the same stream bitrate...
This is why you will sometimes have problems watching 1440p or ultrawide streams: some people simply stream at a higher bitrate to avoid this problem, but that creates other problems for viewers, because the stream is a faster download and there are lots of places between you and the streamer where that can become a bottleneck. So even if your own download speed is fast enough, it can still affect you.

So, to work it out, we can start with this list https://pacoup.com/2011/06/12/list-of-true-169-resolutions/ which shows resolutions at our aspect ratio and, importantly, highlights in green those which are divisible by 8. It is important to use one of those because of the way H.264 encoding works: it breaks your picture down into squares, blocks of pixels, to figure out how to compress them. If we don't use one of the resolutions divisible by 8 (divisible by 16 is best, in fact; another reason why 720p streaming is better than 1080p), then our encoder has to add some 'empty' pixels onto the edges of our image, which means it has to do extra work and we lose performance for nothing.
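
If you want to see that padding cost for yourself, here's a minimal sketch of my own (an illustration of the idea, not any particular encoder's internals):

Code:
# Round each dimension up to the next multiple of the block size and
# count the 'empty' pixels the encoder would have to pad and process.
def padded(dim, block=16):
    return -(-dim // block) * block  # ceiling division, scaled back up

for width, height in [(1920, 1080), (1280, 720), (1536, 864)]:
    pw, ph = padded(width), padded(height)
    waste = pw * ph - width * height
    print(f"{width}x{height} -> {pw}x{ph}, {waste} padded pixels per frame")
# 1920x1080 -> 1920x1088, 15360 padded pixels per frame
# 1280x720 -> 1280x720, 0 padded pixels per frame
# 1536x864 -> 1536x864, 0 padded pixels per frame

Note how 1080 is not divisible by 16, so every single frame carries 15360 junk pixels, while 720p and 864p line up perfectly.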

So, if we just consider the resolutions between 720p and 1080p, we have 792p, 864p, 936p and 1008p. At 792p the downscale is still too much and looks ugly; at 1008p the compression is too much and doesn't look good either. So we should choose one of the two in the middle. At 864p we have a downscale which looks OK: every 5x5 square of pixels becomes a 3x3 square. This gives us room for less compression, so a moving picture looks its best. We can go up to 936p and downscale less, but that is a very uneven ratio (65/100, i.e. 13/20), and now we have to compress our image more, which can look bad, especially with fast movement.
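
As a sketch of how that shortlist falls out (my own quick script, applying the divisible-by-8 rule from the list linked above):

Code:
# List 16:9 heights from 720 to 1080 where both dimensions divide by 8,
# and show each one's downscale ratio from 1440p.
from fractions import Fraction

for height in range(720, 1081, 8):
    if height * 16 % 9:            # width must be a whole number of pixels
        continue
    width = height * 16 // 9
    if width % 8:
        continue
    print(f"{width}x{height}: {Fraction(height, 1440)} of 1440p")
# 1280x720: 1/2, 1408x792: 11/20, 1536x864: 3/5,
# 1664x936: 13/20, 1792x1008: 7/10, 1920x1080: 3/4

The clean 3/5 ratio is what gives 864p its tidy 5x5-to-3x3 pixel mapping, while 936p's 13/20 has no such neat fit.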

Sooooooo that is why 864p is the best streaming resolution for 1440p gaming. (Most of the time! You do you!)


You can use these same concepts to help you choose a downscale from any resolution. Just play the balancing game -
Too much downscaling looks bad, because you are throwing away picture.
Less downscaling means more pixels, which means more data to compress, and more compression looks bad, especially with lots of movement.
Increasing the bitrate gives you room to play with, but can cause problems for viewers.
Sticking to resolutions divisible by 16 is best, but not a must.
 

Blank Hero

New Member
Yo, any updates for 2019?
But yeah, I see where you're going. 864p seems like the sweet spot, but that's for 16:9 aspect ratio streams.
Before I delve any further, what's your take on downscaling?
So the canvas is 1080p but the output is set to 864p.
Would it be better to also set the canvas to 864p and downscale the sources to fit 864p,
making both the canvas and output the same? Which, as far as I've searched, is a good thing: keeping these two settings as close to identical as possible.
And if recording in x264 or NVENC (old), upscale to 1080p?
Or do you advise recording at 864p or even 720p, and upscaling to 1080p in a video editing program?
There'll surely be a loss in quality, but the question is, will it be noticeable? Assuming ideal render settings. I use Premiere, btw.

Next question: I'm planning to stream in a 19.5:9 or 16:3 aspect ratio.
So that'd be 2340 x 1080.
(I'm probably gonna do the math after posting this,) but what's the best downscaled resolution in your opinion?
Would 1872 x 864 still be ideal? That's the same aspect ratio at a height of 864, using the aspect ratio calculator found here >>> skip.gg/par

Thanks!
 

Blank Hero

New Member
Was doing the math, and another question came up.
Why does it have to be divisible by 8, and better by 16?
Is the 8 derived from the 16 in 16:9?
And the 16 as well?

So if I'm doing a different aspect ratio, say 13:6,
instead of dividing by 16, do I divide by 6.5 and better by 13?

I'm probably gonna need to calculate the pixels, or whatever I'm thinking.
The same way you arrived at a 5x5 square becoming 3x3.
 

Blank Hero

New Member
How in the world did you compute "65/100"?
Also, how did you compute the 5x5 and 3x3 from the respective resolutions?
 

Blank Hero

New Member
I figured out why it's 5x5 and 3x3.
So the X is divided by 512
and the Y is divided by 288

so for 1536 x 864
1536/512=3
864/288=3

and for 2560 x 1440
2560/512=5
1440/288=5

So now the question is: why are we dividing by 512 x 288?
hmmmmm... ???
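
Quick sanity check with Python (just my scratch math): 512 and 288 turn out to be the greatest common divisors of the two widths and the two heights.

Code:
from math import gcd

print(gcd(2560, 1536))  # 512
print(gcd(1440, 864))   # 288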
 

koala

Active Member
Upscaling a video doesn't make sense, because it bloats the video with no additional quality. It's even the reverse: transmitting a bloated video needs more bandwidth, which takes bandwidth away from real detail, so it removes quality. Upscaling should take place at the video player, which scales the video locally to whatever resolution suits the watcher best.
Upscaling only makes sense if you compose a video from different sources and one of the sources is too small to fit the others. In that case, you avoid downscaling all the other sources by upscaling the one source that is too small.

About your resolution math: please make yourself familiar with how encoders and decoders/players work if you want to step away from the de facto standard resolutions of 1440p, 1080p, 720p, 480p and 360p. Look at it from the player's side too, not only the encoding side.
If you need to find the right resolution for you, there are many constraints to consider. Usually, only a very limited number of viable resolutions, or encoding settings in general, remain as possible settings for actual use.

The most important constraint on the player side is the resolution and aspect ratio of the player your viewers will use to consume your videos. Today, all but a tiny minority will use a 16:9 player, and the vast majority will use a display with a resolution of 1920x1080.

Try to produce a video (stream) that fits these two constraints best: a 16:9 aspect ratio, and a resolution that views most conveniently when scaled to 1920x1080 on the player side. That doesn't mean you have to produce 1920x1080 video; it means that if your video is rescaled by the player to 1920x1080, it should look its best, with no component (branding or game UI, for example, or a facecam) being too big or too small.
In addition, assume some of your viewers will watch not fullscreen but in a windowed player embedded in the portal's website. Check how your videos look when scaled down to a windowed player and make sure the most important things are still readable at that size. The last scaling consideration is how your video looks when embedded in some third-party website, for example a forum or news site. That is a tiny player, really only suited to previews, where only really big lettering is still readable; usually only your title is big enough for this.

And now think again about what your output resolution should look like, together with all the other usual constraints: native game resolution, the encoding power of your PC, your bandwidth, the bandwidth of your viewers, and whether the streaming service transcodes or not.

By the way, resolutions divisible by 8 are recommended because the H.264 algorithm divides each frame into square blocks (16x16 macroblocks, internally subdivided into 8x8 and smaller blocks) and handles compression mostly within those squares. Not having the horizontal and vertical resolution divisible by 8 or 16 makes H.264 encoding inferior, at least at the edges of the frame.
 
Sorry for the belated reply; I am not around much lately. koala covered it pretty well, so I'll just try to 'plain English' it a little and also discuss other aspect ratios... The thing here (as always with technical topics) is not to just copy-paste settings from a webpage, but to gain some level of understanding of the topic, so that you can apply the logic to your own scenario. Hopefully this will help.

864 seems like the sweet spot, but that's for 16:9 aspect ratio streams.

That's correct.

Would it be better to also set the canvas to 864p and downscale the sources to fit 864p,
making both the canvas and output the same? Which, as far as I've searched, is a good thing: keeping these two settings as close to identical as possible.
And if recording in x264 or NVENC (old), upscale to 1080p?
Or do you advise recording at 864p or even 720p, and upscaling to 1080p in a video editing program?
There'll surely be a loss in quality, but the question is, will it be noticeable? Assuming ideal render settings. I use Premiere, btw.

The reason to keep them the same is that any scaling you do places a performance load on your equipment, and you want to keep that to a minimum. koala was correct above: upscaling should be left to the player. If you want to record and stream at once, then obviously it's not appropriate to do both at the same resolution, so leave the canvas set to the monitor/game resolution and downscale only the output for the stream. This way you record it just as it is on screen, from the canvas, for later editing in high quality, while the downscaled output keeps the stream's bandwidth manageable.

Why does it have to be divisible by 8, and better by 16?

Again, koala was correct: the encoder cuts the image up into squares and does its work in those little chunks (called 'blocks', and yes, this is why we sometimes see 'blocky' artifacts in streams). If you have dimensions not divisible by 8 or 16, the encoder has to deal with the remainder, and the way it does this is by padding the image with junk pixels at the edges to fill up those blocks completely. Your encoder does real work encoding that junk, and afterwards the extra pixels are cut off again to give your desired resolution. It's wasted performance.

so if i'm doing a different aspect ratio, say 13:6
instead of dividing by 16, I divide by 6.5 and better 13?

No; the block size comes from the encoder, not from the aspect ratio, so keep it at 8 or 16 regardless.

That being said, you mentioned some really weird aspect ratios (well, 16:3 is not so weird; that's three 16:9 monitors. Are you doing flight sims or something?). So, what resolution should you stream at for something other than 16:9? Take the concept from the OP and apply it to your situation. What you need to do is find the optimum balance between:
Bandwidth - needs to be low enough to be accessible to most viewers (about 6000 kbps), but less bandwidth is less data, so less quality
Downscaling - quality is lost because you're sending fewer pixels than the original image, but you reduce bandwidth
Encoding - quality is lost in the compression, but you reduce bandwidth

Let's apply this 'balancing act' to your extreme example of 16:3, aka three 16:9 screens. You now have 3x the screen space to broadcast. No problem, just use 3x the bandwidth: you have no quality loss, and... nobody will be able to watch your 18000 kbps stream. D'oh. OK, obviously we have to compress it down to something more manageable, but 3x the compression is going to make for one very blocky, blurry, ugly stream. So how do we avoid that? Downscale it some. OK, what do we downscale to? Well, we should scale by clean integer ratios, because we can't have a fraction of a pixel; that's not a thing ;) But we don't want to simply downscale to 1/3 of the native resolution, because now we're talking about a LOT of lost pixels. So we need to balance some degree of downscaling against some degree of compression. How much? Let's break out the calculator, so we can put some actual numbers to this.

Let's use a 'normal' resolution and aspect ratio to get a kind of baseline here. Let's start with 1080p gaming. That's 1920x1080 pixels. Since we want to talk about this in terms of vertical resolution (like '1080p'), we can express the pixel count as:

((16/9)*1080)*1080
Each of those pixels is 24 bits: 8 bits each of red, green and blue. This happens 60 times every second at 60 FPS. That's our native, uncompressed, unscaled bitrate:
(((16/9)*1080)*1080)*24*60 = 2985984000 bits per second
So what is our compression ratio, done by the encoder, if we don't downscale this at all? Well let's assume a 6000kbps stream:
((((16/9)*1080)*1080)*24*60)/6000000 = 497.664
Pretty amazing that we can have one 500th of the data and it's still a recognisable image!
Anyway, what if that was downscaled to 720p instead?
((((16/9)*720)*720)*24*60)/6000000 = 221.184
That's less than HALF the compression! No wonder it looks better. Of course, we did lose a lot in the downscale. We went from 1080 to 720, and the loss isn't just 720/1080, because we're dealing with two dimensions here - height AND width - square pixels, not lines, so it's 720^2/1080^2. In simpler terms, it's not 2/3 of the pixels, it's 2/3 of 2/3 = 4/9 of the pixels. Downscaling is entirely lossy, always... we're just deleting pixels... but encoder compression varies depending on how much we compress: the higher the compression ratio, the more lossy it gets.
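
If you'd rather not retype those sums, here's the same arithmetic wrapped up as a tiny helper (just a sketch of the calculations above):

Code:
# Raw uncompressed RGB bitrate divided by the stream bitrate,
# i.e. the compression ratio the encoder has to achieve.
def compression_ratio(height, fps=60, stream_bps=6_000_000, aspect=16/9):
    raw_bps = (aspect * height) * height * 24 * fps  # 24 bits per pixel
    return raw_bps / stream_bps

print(compression_ratio(1080))  # 497.664
print(compression_ratio(720))   # 221.184

Changing only the height (or the aspect and fps arguments) reproduces every ratio quoted below.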

That being said, let's look at 1440p:
((((16/9)*1440)*1440)*24*60)/6000000 = 884.736
That's a whopping compression ratio. At this point, the encoder becomes quite inefficient: blurring, blocking, colour banding and so on all become quite ugly. As above, downscaling is a nasty way to reduce the bandwidth requirement, because we square the loss, but it has its upsides: at least it's relatively consistent. It will behave the same way whether we are moving fast, walking through fog, whatever. We can try downscaling to 720p, and if you do, you'll quickly find that it doesn't look too nice. There won't be many compression artefacts like blocking or blur, but the downscale to half the resolution - which, because downscaling is squared, is actually a quarter of the image - is a bit too much. So, as per the OP, we step it up a little and try 864p:
((((16/9)*864)*864)*24*60)/6000000 = 318.50496
Not too bad. 936p works out to 373.80096, which is also nice, but as discussed in the OP it's a very uneven ratio to downscale by (936/1440 = 0.65 = 13/20, the '65/100' from the OP), which tends not to look good (because, again, fractions of pixels aren't a thing), and 864 works much better (864/1440 = 0.6 = 3/5).

So let's take it to your theoretical 16:3 1440p image
((((16/3)*1440)*1440)*24*60)/6000000 = 2654.208
Obviously, that's not going to work well! You're going to need to downscale it, and quite a bit. But how do you know how much is right? Well, beauty is in the eye of the beholder... but we do have some science to save us. There are metrics such as PSNR and SSIM which can be used to compare the 'before and after' of the compression and give us objective data points to measure the quality of the stream. I did this when arriving at the 864p suggestion above, and I'll leave it to others to test other resolutions, but there is one more factor to consider here: framerate.
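
For the curious, PSNR at its core is just a log-scaled mean squared error between matching frames. A minimal sketch, assuming numpy and two same-sized RGB frames as arrays (real tools run this per frame over the whole video, and SSIM is more involved):

Code:
import numpy as np

def psnr(reference, distorted, peak=255.0):
    # Mean squared error between the frames, then log-scaled:
    # a higher PSNR (in dB) means the two frames are closer.
    diff = reference.astype(np.float64) - distorted.astype(np.float64)
    mse = np.mean(diff ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)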

What you're talking about there is equivalent to about one and a third 4K streams! That's a TON of pixels, dude. I wish I had a PC capable of gaming at that resolution. Streaming that res is going to have to come with some sacrifices. We've discussed sacrificing raw resolution by downscaling, but you can also sacrifice some framerate and lower the bandwidth requirements significantly. If you're playing at that res, I very much doubt you're doing anything that involves super high framerates anyway, so I'd strongly consider streaming at 30 or 40 FPS instead. Just to demonstrate: if you downscaled that to a height of 864 (as per the OP) and halved the FPS, you'd be looking at
((((16/3)*864)*864)*24*30)/6000000 = 477.75744
A huge drop from the above, and now you're in the neighbourhood of the compression ratio of a native 1080p stream.
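
Since raw bits scale linearly with framerate, FPS is a simple lever; a quick sketch of my own for this 16:3, 864-high case (4608x864):

Code:
# Raw bits scale linearly with framerate, so halving the FPS
# halves the compression ratio the encoder has to achieve.
for fps in (60, 40, 30):
    raw_bps = 4608 * 864 * 24 * fps  # 16:3 at a height of 864
    print(f"{fps} FPS: compression ratio {raw_bps / 6_000_000}")
# 60 FPS: 955.51488
# 40 FPS: 637.00992
# 30 FPS: 477.75744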

All of this being said, you're now at roughly 1/3 of the native pixel count and half the framerate, and still compressing heavily. And chances are people aren't viewing it on a 16:3 monitor, so all that loss, for what? The viewers will probably be downscaling it even further just to fit it on their screens. koala said above, and once again I agree, that you should really consider the viewer's machine when choosing the stream resolution. For your specific 16:3 example, I would SERIOUSLY consider cropping the edges off that stream. If you do, you might be able to stream 60 FPS practically.

As always with these kinds of things, this is not a 'do this, it works better' document. Anyone who says that about computers is probably wrong as often as they are right, because every computer is different ;) Instead, this is intended to explain what's going on and to help you understand the whole thing, so that you can take the logic and apply it to your own specific circumstances. If you really want to know, and I mean objectively KNOW, not 'muh feels', then you're going to need to simply test it. Take a recording of the game and of the stream, push them through a tool, get SSIM and PSNR measurements, and let the numbers do the talking. Generally speaking, though, you'll be able to try a few test streams and just see it. I hope this helped give you some ideas of settings for the tests :)
 

NarryG

New Member
Correct me if I'm wrong, but OBS downscales long before the video hits the encoder, and OBS gives you lots of different downscaling options which do a very good job of scaling between resolutions that aren't directly divisible. What's your point with macroblocks? Yeah, if you're partnered it'll transcode and then you'll hit potential issues with the blocks, but Twitch's transcode is mediocre at best anyway, so no matter what you do the transcode is going to have quality issues.
 
OBS gives you lots of different downscaling options which do a very good job of scaling between resolutions that aren't directly divisible.

None of which will ever be as good as downscaling to a directly divisible resolution.

What's your point with macroblocks?

Encoding efficiency. Refer to the seventh paragraph of the OP, and the extra detail in koala's reply.

If you have an alternative approach to mine, you should make a new thread so it isn't lost in the replies here. I stumbled across this but I don't really come here often any more, so I may not reply further in the future.
 