The only people qualified to truly speak with authority about the codec are the people who understand the codec, understand the codec's code, and are willing to actually put their faces directly into the code of the codec itself (which, let me tell you, is quite unpleasant, because their code is ungodly devilish to read, though extremely optimal). The best people to talk to on the subject are the x264 devs themselves, and the people who originally designed the codec's ISO specifications. I would warn anyone else that even though I've spent time in the x264 code, what I say now is based only upon my limited experience with it. I will be the first to say I do -not- understand every single inner working of this codec. It's too complex and I don't have the time, so I am not qualified to say definitively how the encoder works, and I can guarantee you right now, neither are the guys whose blog you read at that other program which shall not be named (and let's be honest, how many C# developers do you know with a mathematics degree? "They asked if I had a degree in theoretical physics. I told them I had a theoretical degree in physics, and they said welcome aboard"). But I can tell you what I have learned from my experience writing this program.
So ideally a bigger buffer is better, but it also causes problems; as to exactly how or why, I still don't fully understand it all.
Throw away your preconceived notion of "kilobits per second" for vbv-maxrate. Though you enter a per-second value, it is not the maximum you will send out in every single second of the stream. It's the "average" rate for the whole stream, governed by the buffer size, not a "max allowed in any given second of the stream". Again, vbv-maxrate means "average max per second for the stream itself", not the actual max in each second. It is tied to the buffer size: the higher the buffer size, the more data can be sent out at any given moment.
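To make that concrete, here is a rough leaky-bucket sketch of the VBV model as I understand it; the numbers and names are mine, not x264's actual code:

```cpp
// Rough leaky-bucket sketch of the VBV model as I understand it.
// Numbers and names are hypothetical; this is NOT x264's implementation.
#include <cstdio>

int main()
{
    const double maxrate = 1000.0; // vbv-maxrate, kbits per second
    const double bufsize = 3000.0; // vbv-bufsize, kbits
    const double fps     = 30.0;

    double fullness = bufsize;     // assume the buffer starts full

    // Hypothetical frame sizes in kbits: one big keyframe, then small frames.
    const double frames[] = { 2500.0, 10.0, 10.0, 10.0, 10.0, 10.0 };

    for (double frameBits : frames)
    {
        // A single frame can be far bigger than maxrate/fps (~33 kbits
        // here) as long as it fits in the current buffer fullness --
        // which is why a larger bufsize allows larger bursts.
        if (frameBits > fullness)
            printf("%.0f kbit frame would underflow the buffer\n", frameBits);
        else
            fullness -= frameBits;

        // The buffer refills at maxrate, capped at bufsize.
        fullness += maxrate / fps;
        if (fullness > bufsize)
            fullness = bufsize;

        printf("buffer fullness: %.1f kbits\n", fullness);
    }
}
```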
For example, if you set your stream to a 1000 kb/s maxrate and stream for 5 minutes, then regardless of your buffer size, dividing total bits sent by 300 seconds will almost always land very close to 1000 kb/s. If you measure each second individually, however, you'll see that a higher buffer size produces larger spikes: the buffer is allowed to accumulate more data before it has to be sent out, which allows larger frames, and thus larger fluctuations in transmission. In the end that means poor QoS handling of the packets for both the streamer and the client, because the packet sizes keep swinging wildly. I've measured this many times myself.
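If you want to measure it yourself, a per-second tally along these lines is enough; this is my own hypothetical sketch, not code from the project:

```cpp
// Hypothetical per-second tally -- my own sketch, not project code.
#include <cstdint>
#include <cstdio>
#include <vector>

struct RateMeter
{
    std::vector<uint64_t> perSecond; // bits sent in each one-second bucket
    uint64_t totalBits = 0;

    void OnSend(double streamTimeSec, uint64_t bits)
    {
        size_t bucket = (size_t)streamTimeSec;
        if (bucket >= perSecond.size())
            perSecond.resize(bucket + 1, 0);
        perSecond[bucket] += bits;
        totalBits += bits;
    }

    void Report() const
    {
        if (perSecond.empty())
            return;

        uint64_t peak = 0;
        for (uint64_t bits : perSecond)
            if (bits > peak)
                peak = bits;

        // Over a 300-second stream at 1000 kb/s maxrate, the average
        // lands near 1,000,000 bits/sec; the peak second climbs as
        // vbv-bufsize grows.
        double avg = (double)totalBits / (double)perSecond.size();
        printf("average: %.0f bits/sec, peak second: %llu bits\n",
               avg, (unsigned long long)peak);
    }
};
```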
If you compile the project and make the preview use CreateBandwidthAnalyzer() for the OBS::network variable instead of CreateNullNetwork(), it shows you both the max for the stream and the max total sent out in any given second. The more you increase the buffer size, the larger that "max sent out in a given second" value grows (though you do need to make sure your stream is sufficiently complex).
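From memory, the swap amounts to something like this (treat the exact spot and spelling as approximate):

```cpp
// In the preview setup, roughly:
//network = CreateNullNetwork();     // default: preview discards output
network = CreateBandwidthAnalyzer(); // measure what actually goes out
```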
Things like "minimize network impact" can help with the QoS problem caused by large buffer sizes by splitting the packets up and sending them out in properly timed intervals, but that results in more TCP acknowledgements which can increase the chance of frame drops, requires more client-side buffering time, and thus is also somewhat problematic in its own regard. Twitch itself may or may not decide to send the entire frames as a whole instead of splitting them up either, so the client could suffer some serious QoS issues as well.
The buffer is also client side, it's not on the streamer's end, which is also something to consider
This is incorrect. It's both; what is sent is received. The streamer is the one who creates the buffer in the first place. If it were not on the streamer's end, it would not be an option for the streamer's encoder.