You don't seem to be aware of some of the relationships between overclocking, x264 presets, CPU demand for x264, and x264 quality.
I take the x264 figures from here.
- Overclocking gives up to 10-15% more CPU power.
- The faster preset gives approx. 75% of the file size of veryfast (which means something like 25% better quality).
- The faster preset needs 58% more CPU power than the veryfast preset.
- The medium preset gives approx. 80% of the file size of veryfast (yes, that's worse than faster according to this).
- The medium preset needs 150% more CPU power than the veryfast preset (that's 2.5 times).
- The slow preset gives approx. 76% of the file size of veryfast (yes, that's still worse than faster according to this).
- The slow preset needs 268% more CPU power than the veryfast preset (that's 3.68 times).
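To put those figures in perspective, here's a quick back-of-the-envelope calculation using the numbers quoted above (they come from one benchmark and will vary with the footage), showing how much extra CPU you pay per percent of file size saved relative to veryfast:

```python
# Figures quoted above: file size relative to veryfast, extra CPU vs. veryfast.
# One benchmark's numbers; treat them as rough, footage-dependent estimates.
presets = {
    # name: (file size vs. veryfast, extra CPU vs. veryfast)
    "faster": (0.75, 0.58),   # 75% size, +58% CPU
    "medium": (0.80, 1.50),   # 80% size, +150% CPU (2.5x total)
    "slow":   (0.76, 2.68),   # 76% size, +268% CPU (3.68x total)
}

for name, (size, extra_cpu) in presets.items():
    size_saving = (1.0 - size) * 100        # percent smaller than veryfast
    cpu_multiplier = 1.0 + extra_cpu        # total CPU load vs. veryfast
    cost = extra_cpu * 100 / size_saving    # extra CPU % per % of size saved
    print(f"{name:7s} saves {size_saving:4.1f}% size at "
          f"{cpu_multiplier:.2f}x CPU ({cost:.0f}% extra CPU per % saved)")
```

The point the numbers make: faster is the last preset where the CPU cost is remotely proportional to the size saving; medium and slow multiply the CPU load without saving more.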
- Overclocking doesn't solve any issue. It increases some benchmark figures, but changes nothing visible except increased instability.
- With any preset beyond faster you just make your CPU work more, but don't really increase the quality. The referenced report might look somewhat different if it were performed on different footage, and other footage might even show an actual quality increase for presets beyond faster. But even then, the size decrease that translates into a quality increase would be in the range of 1-5%.
- A quality difference of 1-5% is not just barely visible; it isn't visible at all, given that about 70-80% of the video detail is removed anyway when you stream.
If you want high-quality streams, focus on a smoothly running machine and a smoothly running stream without a single lost or lagged frame. Lost or lagged frames hurt visual perception much more than the last bit of compression performance helps.
With "lagged frame", I mean how constant the frame rate is.
- A 60 fps stream where the GPU is producing one frame every 16.6 ms and this frame is encoded, transmitted and displayed to the viewer every 16.6 ms, appears absolutely smooth to the viewer.
- A 60 fps stream, where the GPU is producing 80, 100, 200, 300 or 400 fps and OBS always has to take one of the intermediate frames that are approx. 16.6 ms apart, but not exactly 16.6 ms, doesn't appear absolutely smooth to the viewer. This is because the actual frame rate of the GPU is not divisible by 60. Such a video will always appear somewhat stuttery. Not really bad, but you can tell the difference between such a video and a really smooth one.
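The cadence mismatch is easy to see numerically. A small sketch, assuming a GPU rendering at 100 fps (just an example rate) while the stream wants a frame every 1/60 s: the "age" of the newest available frame at each 60 fps tick keeps changing, and that variation is the stutter.

```python
# Sketch: how stale the newest GPU frame is at each 60 fps capture tick
# when the GPU renders at a rate not divisible by 60 (here: 100 fps).
gpu_fps = 100.0                        # assumed example rate
target_fps = 60.0

frame_interval = 1000.0 / gpu_fps      # GPU finishes a frame every 10 ms
grid_interval = 1000.0 / target_fps    # OBS wants a frame every ~16.67 ms

for i in range(6):
    ideal = i * grid_interval                            # when OBS needs a frame
    latest = (ideal // frame_interval) * frame_interval  # newest finished frame
    print(f"tick {i}: ideal {ideal:6.2f} ms, "
          f"frame from {latest:6.2f} ms, age {ideal - latest:5.2f} ms")
```

The age swings between 0 and almost 7 ms from tick to tick instead of staying constant; at exactly 60 (or 120, 180, ...) fps it would be the same for every tick.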
- A 60 fps stream, where the GPU is overloaded and producing fewer than 60 fps, always has some kind of stutter.
Constant fps alone isn't enough, either; it's also important that frames are shown at exactly the point in time they were produced. If a game produces 5 frames with gaps of 15 ms, 16 ms, 20 ms and 15 ms (16.5 ms on average, roughly 60 fps), the corresponding video isn't smooth if the video player displays the 5 frames at a fixed 16.5 ms interval. Instead, it is smooth if the video player shows the frames with the same delays between them as they were produced, i.e. 15 ms, 16 ms, 20 ms, 15 ms. To achieve this, modern container formats carry a time stamp for every frame. The issue is that the GPU doesn't record when a frame was produced. A frame may have been produced 4 ms before OBS scans it, so the time stamp OBS puts in the video is 4 ms off. That would be fine if every frame were 4 ms off, but under high load this delay varies, so the resulting time stamps stutter.
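The example gaps above can be worked through directly (a sketch using those assumed gap values): a player that ignores per-frame timestamps and shows the frames at the fixed average interval shifts every middle frame away from its true production time.

```python
# Sketch: true production times vs. fixed-rate playback for the gaps above.
gaps_ms = [15, 16, 20, 15]             # delays between the 5 frames as produced
avg_ms = sum(gaps_ms) / len(gaps_ms)   # 16.5 ms average

produced = [0.0]
for g in gaps_ms:
    produced.append(produced[-1] + g)  # true times: 0, 15, 31, 51, 66 ms

fixed = [i * avg_ms for i in range(len(produced))]  # fixed-interval playback

for p, f in zip(produced, fixed):
    print(f"produced at {p:5.1f} ms, fixed-rate playback at {f:5.1f} ms "
          f"(off by {f - p:+5.1f} ms)")
```

First and last frame line up, but the middle frames land up to 2 ms off; that's exactly the kind of timing error the per-frame timestamps exist to prevent.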
To avoid such situations, make sure you don't load your system fully, so there is leeway for every process to do its work exactly when it's supposed to. Don't fully load the encoder, don't fully load the CPU, don't fully load the GPU. So: don't use the most CPU-demanding preset you think your CPU can handle!