Question / Help Bit for Bit Quality Settings

NicholasLAranda

New Member
I'm sure this has been asked in many ways before, but I don't think in the way I'm about to.

"What's the best quality settings for 1080p 60fps?"

IN THE SENSE: how many pixels are on the screen? Is 1 pixel 1 bit? Do I multiply that by 60 frames to get my bitrate? Is bitrate per minute or per second???

I just want the technically matching, bit-for-bit quality: nothing more, nothing less than the math.
 

BluePeer

Member
You can calculate the theoretical uncompressed bitrate, yes.
The bitrate you actually need depends on the constellation of pixel changes, i.e. how much the image changes from frame to frame.
Depending on the input fed into the 1080p60 encode, it can range from maybe 15% up to 80%?
 

koala

Active Member
1080p@60fps means resolution 1920x1080.
So the number of pixels is 1920*1080 = 2073600 pixels.

One raw pixel has 8 bits for red, 8 bits for green and 8 bits for blue, thus 24 bits or 3 bytes.
In bytes, one raw frame is 2073600 pixels * 3 bytes = 6220800 bytes per frame (one frame is 1 rendered image).

60 frames per second means 60 full raw frames, thus 6220800 bytes per frame * 60 frames per second = 373248000 bytes per second.

If you stream this with bitrate 6000 (the bitrate recommended by Twitch for 1080p 60 fps), you are sending at 6000 kbps, that is 6000 kilobits per second, which is 6000 * 1000 bits per second = 6000000 bits per second.

8 bits are 1 byte, so 6000000 bits per second = 6000000 / 8 bytes per second = 750000 bytes per second.

Now you can compare the ratio between raw data and compressed data. Raw data is 373248000 bytes per second. Compressed data is 750000 bytes per second. This is a ratio of 750000 / 373248000 ≈ 0.002. This means video encoding compresses the data by a factor of about 0.002 (or roughly 1/500, if you want it descriptive).

Per pixel: with 24 bits per raw pixel, this is 24 * 0.002 = 0.048 bits per compressed pixel.

tl;dr
A 1080p 60fps stream with bitrate 6000 carries about 0.048 bits per pixel.
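
If you want to replay that arithmetic, here is a minimal Python sketch of the same calculation (the 6000 kbps figure is just Twitch's recommendation; everything else is the math above):
Code:
# Raw vs. compressed data rate for 1080p @ 60 fps (same numbers as above)
width, height, fps = 1920, 1080, 60
pixels_per_frame = width * height                      # 2,073,600 pixels
bytes_per_frame = pixels_per_frame * 3                 # 24-bit RGB -> 6,220,800 bytes
raw_bytes_per_second = bytes_per_frame * fps           # 373,248,000 bytes/s

stream_kbps = 6000                                     # Twitch's recommendation for 1080p60
compressed_bytes_per_second = stream_kbps * 1000 / 8   # 750,000 bytes/s

ratio = compressed_bytes_per_second / raw_bytes_per_second  # ~0.002, i.e. roughly 1/500
bits_per_pixel = 24 * ratio                                 # ~0.048 bits per compressed pixel

print(f"compression ratio: {ratio:.4f} (about 1/{1/ratio:.0f})")
print(f"bits per compressed pixel: {bits_per_pixel:.3f}")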


You wanted math, nothing more, nothing less; you got math, nothing more, nothing less. But I doubt you can deduce anything useful from it with regard to your streams or recordings.

In the end you need to balance the trinity of resolution, bitrate/quality and resource consumption to something that gives the best-looking video for your given hardware and your given raw material. Usually you can only get 2 out of the 3. That's different for every one of us here in the forum. If you just want a setting suited to your machine, fire up OBS, run Tools->Auto Configuration Wizard and forget about all the math.
 

NicholasLAranda

New Member
Reading your post now. Will edit momentarily.

Thank you!
 

NicholasLAranda

New Member
I used Google Calculator to double check some things:
Code:
3*(1920x1080)*60 = 373,248,000 Bytes = 373,248 KiloBytes per second: 3 bytes of color per pixel, times the pixels in one frame, times 60 of those colored frames per second
{bytes per pixel} * ({horizontal resolution} x {vertical resolution}) * {frames per second}
If you copy & paste that expression { 3*(1920x1080)*60 } into Google Calculator, it'll give you those results.

I then converted those 373,248 KiloBytes per second into 2,985,984 KiloBits per second.

So would 2,985,984 KiloBits per second be the number to use?

Also, you mentioned compression: does that happen on my end or the server's end? Or in the driver, hardware, codec, program, router, switch, etc.?

Here is my result from running Auto Config: 10,000 KiloBits per second. That works out to about 298.6 raw KiloBits for every compressed KiloBit.
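
Spelled out as a small Python sketch (this assumes decimal units, 1 kilobyte = 1000 bytes, which is what I used above):
Code:
# Uncompressed 1080p60 data rate, converted to kilobits per second
raw_bytes_per_second = 3 * (1920 * 1080) * 60             # 373,248,000 bytes/s
raw_kilobytes_per_second = raw_bytes_per_second / 1000    # 373,248 KB/s
raw_kilobits_per_second = raw_kilobytes_per_second * 8    # 2,985,984 kbit/s

auto_config_kbps = 10_000                                  # what the Auto Config Wizard suggested
print(raw_kilobits_per_second / auto_config_kbps)          # ~298.6 raw kilobits per compressed kilobit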
 

carlmmii

Active Member
It's actually 2,916,000 KiloBits/sec with the proper 1024 conversion, but that's semantics.

The main concept that needs to be understood is the difference in usage between the uncompressed (raw) data and the compressed (encoded) data.

2,916,000 kbps = 2.78 Gbps. That is roughly 3 times what a gigabit internet connection can carry. Sending uncompressed video data over the internet in real time is completely infeasible today, and is never done.
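
If it helps to see the unit bookkeeping, here is a rough Python sketch of that conversion (just the arithmetic, nothing OBS-specific):
Code:
# 1024-based ("kibi") conversion of the raw 1080p60 data rate
raw_bytes_per_second = 3 * 1920 * 1080 * 60         # 373,248,000 bytes/s
raw_bits_per_second = raw_bytes_per_second * 8      # 2,985,984,000 bits/s

raw_kib_per_second = raw_bits_per_second / 1024     # 2,916,000 Kibit/s
raw_gib_per_second = raw_bits_per_second / 1024**3  # ~2.78 Gibit/s

gigabit_line = 1e9                                   # 1 Gbit/s line rate
print(raw_bits_per_second / gigabit_line)            # ~3x a gigabit connection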

Every video source that you view on the internet is compressed. That is the only way to send video data over the limited bandwidth that you have available from your ISP.

The only time uncompressed video bandwidth is worth mentioning is when dealing with things like connections to monitors (i.e. how much bandwidth is required over a DisplayPort cable, as that determines which DP revision is needed for a certain resolution and framerate), or if you're talking about the transfer of uncompressed frames within your computer (such as between the GPU and CPU, which occurs over the PCIe bus).

With that out of the way, all of the video data that you deal with in OBS is uncompressed up until the point at which it is encoded. This is the point where bitrate is important to consider, since it is one of the factors that determines the quality of the compressed image. The more bits the encoder is allowed to use, the better the output is going to be.

The auto configurator apparently recommended 10,000 kbps (which is a compressed bitrate); however, this is still quite a bit higher than Twitch's recommended maximum bitrate of 6000 kbps.

The comparison between compressed and uncompressed bitrate is more of an academic question that looks at the efficiency of the encoder for a given quality difference, and it is definitely well out of scope for a standard OBS discussion... it's interesting, but from a "pure maths" point of view rather than an applied one.
 

koala

Active Member
It seems you're missing some basic understanding of video encoding, data compression, bandwidth terminology and internet technology.

What's your point? What do you want to achieve? You want to produce some kind of video or stream, so tell us what kind of product you're trying to create. Try to approach it from a high-level point of view, without detailed math. Tell us what you want to achieve, and we will try to tell you how that can be done.

Some basic information about how the stream creation pipeline works:
  1. a sequence of frames is produced, for example by a game, a camera or a capture card
  2. these frames are loaded into the streaming client, for example OBS Studio. It's called capture (game capture, window capture, etc.)
  3. the streaming client composites one or multiple such sources to a video stream. Additional sources can be overlays, or a facecam in addition to a game capture.
  4. the video stream is rescaled by the streaming client to the final resolution the streaming service requires
  5. the rescaled video stream is encoded (compressed) by the streaming client on the local computer
  6. the encoded video stream is sent by the streaming client to the streaming service over the internet
  7. the streaming service distributes the encoded video stream to all the currently connected viewers of the stream
Your first question should never be about bits and pixels. Much more important are all the constraints you have to handle. They restrict things so much that you usually don't need to compute anything; instead you choose between only a few remaining options. Constraints include the computing power of your machine, your network bandwidth, the kind of footage you want to capture, the technical parameters of the streaming service, the size and aspect ratio of your viewers' display devices, and your viewers' network bandwidth.
To help keep all this in mind, there are just a few rules of thumb to adhere to, nothing more. For example, the encoding guidelines by Twitch: https://stream.twitch.tv/encoding/ or YouTube: https://support.google.com/youtube/answer/2853702
 

NicholasLAranda

New Member
I guess I just want to figure out how to calculate a good ratio for my setup.
I have a 1080p monitor that can go up to 165 Hz, but my nVidia GTX 760 TI can only run it at 145 Hz.
I already know how to find out my streaming quality settings, but I want high-quality recordings with no pixelation.
My CPU is an Intel Core i5-9600KF.

I am going to be recording Counter-Strike: Global Offensive.
I run the game at 2x the refresh rate of my monitor, so in case it drops a frame, instead of waiting for the next frame it can just grab that copy of frame X. I also found from testing that to have a smooth GUI experience with the FPS counters (show_fps & netgraph), you add 1 more to your max FPS, because if your game is maxed out on frames, for a reason I haven't figured out yet, the GUI will show it as 1 less.

I used to want to be funny and ran at a lower resolution to get even more frames and capped it at 420, but the GUI would always show 419 FPS, and that's how I found out that little quirk lol. So 291 is my in-game FPS cap.
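
In other words, my cap is just this bit of arithmetic (the +1 is only for the display quirk I mentioned, nothing official):
Code:
# In-game fps cap: twice the refresh rate my GPU can actually drive, plus 1
# so the fps counter (show_fps & netgraph) reads the intended value
refresh_hz = 145
fps_cap = 2 * refresh_hz + 1
print(fps_cap)   # 291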

I am using a Cooler Master LC240E to cool my CPU.
My motherboard is an MSI z390-A Pro.
My RAM is 4x 8GB of HyperX Fury HX426C16FB3K2/16 sticks.
My disk drive is a 500GB Samsung 970 EVO M.2 NVMe SSD MZ-V7E500BW.
All inside of a Phanteks PH-EC416PTG_BK Eclipse P400 case ATX.

How do we benchmark my setup?
 

koala

Active Member
Again: use Tools->Auto Configuration Wizard. If you insist on configuring things manually, use simple output mode. For streaming, set the bitrate recommended by your streaming provider; for recording, set the desired quality (usually "Indistinguishable quality") and encoder (you want to use nvenc for recording).
Make sure you play the game at an fps that is a multiple of the recording fps, i.e. if you record at 30 fps, play your game at 30, 60, 90, 120 or 150 fps, and if you record at 60 fps, play your game at 60 or 120 fps, to avoid stutter from an uneven number of frames being removed during the conversion.
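
If you want a quick way to sanity-check a combination, a little sketch like this works (purely illustrative, not an OBS setting):
Code:
# Rule of thumb: play at an integer multiple of the recording fps, so OBS
# drops frames evenly instead of unevenly (which shows up as stutter).
def is_smooth_combo(game_fps: int, record_fps: int) -> bool:
    return game_fps % record_fps == 0

print(is_smooth_combo(120, 60))   # True  - every 2nd frame is kept
print(is_smooth_combo(145, 60))   # False - frames are removed unevenly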
And make sure you don't overload your GPU or CPU by playing your game at too high an fps. If the game consumes all the resources, nothing is left for OBS and the recording becomes laggy. This might mean you can't use the nice 165 Hz monitor to its fullest extent. Screen recording and encoding is a resource-intensive process.
 