Question / Help: How does OBS use bitrate?

Bensam123

Member
So, I currently have OBS set up with a 3200 kbps bitrate at 720p@30fps. I use a quality of 10 and the veryfast preset.

I noticed that OBS eats all of that bandwidth in high action scenes. I don't drop any frames, but at what point does offering more bitrate stop actually improving quality? Does OBS always use as much bandwidth as you give it in high action no matter what, or does it have its own internal baseline for 720p@30?

Could I run at 60fps and maintain the same quality? Obviously I could try a bunch of different settings and see what works, but I'm interested in how OBS actually interacts with the bandwidth I give it.
 

Warchamp7

Forum Admin
This is based more on your quality setting. With it set to 10, the encoder will use whatever it can to try and hold that level of quality, and your bitrate is the limiting factor there. In high motion it tries to keep every frame at that high quality, and without the bandwidth to sustain that, it can only hold it for so long.

Setting quality to 8 is generally a good balance between nice quality and still being able to maintain it in high motion.
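
As a rough mental model of why that helps (just a toy sketch, not x264's actual rate control): the quality setting decides how many bits each frame asks for, and the bitrate acts as a ceiling that the busiest frames hit first.

Code:
# Toy model only, not x264's real rate control: the quality setting decides how
# many bits a frame "wants", and the bitrate cap clamps what it actually gets.

MAX_KBPS = 3200                              # the bitrate box in OBS
FPS = 30
CAP = MAX_KBPS * 1000 / FPS                  # rough per-frame bit budget

def frame_bits(complexity, quality):
    wanted = complexity * quality * 1400     # made-up scaling, for illustration
    return min(wanted, CAP), wanted          # the cap is a hard ceiling

scenes = [("menu screen", 0.5), ("walking around", 2.5), ("big firefight", 9.0)]

for quality in (10, 8):
    print(f"quality {quality} (per-frame cap ~{CAP:.0f} bits):")
    for name, complexity in scenes:
        got, wanted = frame_bits(complexity, quality)
        capped = "  <-- hits the cap" if wanted > got else ""
        print(f"  {name:15s} wants {wanted:7.0f}, gets {got:7.0f}{capped}")

In that toy model, dropping quality from 10 to 8 is what lets the busiest frames fit back under the cap, which is roughly why 8 holds up better in high motion.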
 

Mr_KyleG

New Member
To add to Warchamp.

OBS uses x264 for encoding, and x264 allows for a variable bitrate. So what you're seeing is higher usage because of the faster scenes. For example, if you are playing a slower-paced game, it will use less of the bitrate you entered because it doesn't need it, and when you're playing FPS games you generally get higher usage because of the fast pace.
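
If you want to see that for yourself, here's a rough sketch (assuming ffprobe, which ships with ffmpeg, is installed, and "recording.mp4" is just a placeholder for a local recording of your session; exact ffprobe fields can vary a bit between versions) that prints how many kbps each second of a recording actually used:

Code:
# Sketch: measure the real per-second bitrate of a local recording to see the
# variable bitrate in action. Assumes ffprobe (ships with ffmpeg) is on PATH
# and "recording.mp4" is a placeholder for your own local recording.
import subprocess
from collections import defaultdict

out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "packet=pts_time,size", "-of", "csv=p=0",
     "recording.mp4"],
    capture_output=True, text=True, check=True,
).stdout

bits_per_second = defaultdict(int)
for line in out.splitlines():
    parts = line.split(",")
    if len(parts) < 2 or "N/A" in parts[:2]:
        continue  # skip blank lines or packets without usable fields
    pts_time, size = parts[0], parts[1]
    bits_per_second[int(float(pts_time))] += int(size) * 8

for second in sorted(bits_per_second):
    print(f"{second:5d}s  {bits_per_second[second] / 1000:7.0f} kbps")

Record a quiet menu plus a firefight back to back and you should see the per-second numbers sit well below your cap in the slow parts and climb toward it in the busy ones.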

Hope this makes sense =D
 

Bensam123

Member
I understand that x264 uses a variable bitrate and that my settings may be pegging my bandwidth. I streamed with similar settings in XSplit and it didn't seem to eat as much bandwidth. I'm more curious how much of my bitrate is 'overprovisioning', where the encoder is being wasteful with my bandwidth and isn't actually improving quality despite the usage.

I'll have to give a quality setting of 8 a try... I know I can just try out a bunch of random settings, but I was curious whether there is a baseline. It doesn't seem to take much action to max it out at 3200 kbps pretty much all the time (watching a bandwidth monitor), but it doesn't drop frames, the stream doesn't get laggy, and there seems to be no real downside. It doesn't seem like I'm actually running into a situation where my quality settings are straining my bandwidth.

For instance, when I streamed with XSplit and my quality settings were too high for my bandwidth, I'd end up with dropped frames or a laggy stream.
 

Bensam123

Member
So I've been reading around the forums and through posts, and the advice always seems to change. For instance, one guy had a bitrate of 2500 and someone said he could stream fluidly at 1080p?!?! There is no real definitive source or standard for quality. I get that this depends heavily on how much action is going on on the screen, but we could assume a baseline for FPS games as far as actions per inch or something (there has to be a technical term for this). Right now, though, I'm mostly interested in how the quality setting affects bandwidth usage in addition to the encoder preset.

There seems to be a trade-off between the quality setting and bitrate at a given resolution. It would be really nice if someone could plot the relationship between the two at a few resolutions to illustrate that trade-off. I don't know enough about the presets yet to do this myself, but I think it would help people figure out how to stream properly. Quality, bitrate, fps, resolution, and encoder preset all seem to interact with each other in different ways. Quality and bitrate seem to have a much bigger relationship to resolution than to fps (for some reason). I'm still unsure how the encoder preset weighs these factors to increase the quality of the stream (or does it just reduce the amount of bandwidth your stream uses?).
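
If anyone wants to take a stab at it, this is roughly the experiment I mean (just a sketch, assuming ffmpeg/ffprobe are installed; "gameplay_clip.mp4" is a placeholder for a short local recording, and the CRF numbers are x264's own quality scale, not OBS's 0-10 slider): sweep the quality value at a fixed resolution and see what bitrate each setting actually ends up needing.

Code:
# Sketch of the experiment: re-encode the same clip at a few x264 CRF values
# (x264's own quality scale, not OBS's 0-10 slider) at 720p30 and report the
# bitrate each one ends up using. Assumes ffmpeg/ffprobe are installed;
# "gameplay_clip.mp4" is a placeholder for a short local recording.
import os
import subprocess

CLIP = "gameplay_clip.mp4"

def duration_seconds(path):
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries", "format=duration",
         "-of", "csv=p=0", path],
        capture_output=True, text=True, check=True).stdout
    return float(out.strip())

for crf in (18, 20, 23, 26, 29):
    out_file = f"test_crf{crf}.mp4"
    subprocess.run(
        ["ffmpeg", "-y", "-v", "error", "-i", CLIP,
         "-vf", "scale=-2:720", "-r", "30",
         "-c:v", "libx264", "-preset", "veryfast", "-crf", str(crf),
         "-an", out_file],
        check=True)
    kbps = os.path.getsize(out_file) * 8 / duration_seconds(out_file) / 1000
    print(f"CRF {crf}: ~{kbps:.0f} kbps at 720p30")

Repeat that at a couple of resolutions and presets and you would basically have the plot I'm talking about.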

Some of these options are explained, but the relationships between them really aren't. All I can do is keep fiddling with settings to try to make things look better.
 