Jose Tortola
Member
Hi all.
I'm using a dual-PC setup. The streaming PC has an AVerMedia Live Gamer 4K capture card and a Ryzen 7 2700X. I play games on the gaming PC at 4K HDR 60 fps. In OBS I use a Video Capture Device source, and in its properties I set the resolution to 1920x1080 (with the canvas also at 1920x1080). Downscaling in the source properties seemed more efficient than setting the canvas to 4K and the output to 1080p, presumably because OBS then composites a 1080p canvas every frame instead of a full 4K one.
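(As a side note on why 4K output wasn't really an option anyway: a quick bits-per-pixel sanity check in Python shows how thin 7500 kbps already is at 1080p60, and it would be four times thinner at 4K.)

bitrate = 7_500_000  # bits per second
fps = 60

# Rough quality budget: bits available per pixel per frame at each resolution.
for w, h in [(1920, 1080), (3840, 2160)]:
    bpp = bitrate / (w * h * fps)
    print(f"{w}x{h} @ {fps} fps: {bpp:.3f} bits/pixel")

# 1920x1080 @ 60 fps: 0.060 bits/pixel
# 3840x2160 @ 60 fps: 0.015 bits/pixel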
Obviously, the OBS capture can't read the HDR metadata, so the input video needs to be tonemapped to SDR. To do that, I apply two filters to the video capture source: a LUT filter (a 3D LUT) and a colour correction filter (with a version of both for each game, fine-tuned per title).
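(For anyone curious what the LUT filter is doing conceptually, here's a minimal sketch in Python/NumPy of a 3D LUT lookup. It uses nearest-neighbour lookup for brevity where a real LUT filter interpolates, and it assumes an R-major cube layout; the function and array names are made up for illustration.)

import numpy as np

def apply_3d_lut(image, lut):
    # image: float32 array, shape (H, W, 3), RGB values in [0, 1]
    # lut:   float32 array, shape (N, N, N, 3), e.g. N=33 for a 33-point cube
    n = lut.shape[0]
    # Scale each channel from [0, 1] to LUT grid indices, snap to nearest node.
    idx = np.clip(np.rint(image * (n - 1)), 0, n - 1).astype(int)
    # Look up the output colour stored at that (r, g, b) grid node.
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]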
With those settings, the stream preview on that PC looks good, with high image quality and a good colour match to the original HDR source.
I use x264, medium preset, high profile, 7500 kbps bitrate. That gives around 45% CPU usage and no frames lost to rendering lag or encoding lag. If I try the slow preset, the CPU doesn't reach 100% load, but encoding lag makes me lose about 45% of my frames. So I use medium, and have no problem at all with that configuration.
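(If anyone wants to reproduce this encode outside OBS, something like the following should roughly match those settings. It's just a sketch, assuming ffmpeg with libx264 is on the PATH; test.mkv and out_medium.mp4 are placeholder file names.)

import subprocess

# Offline approximation of the OBS settings above: medium preset, high
# profile, 7500 kbps with a constrained buffer.
subprocess.run([
    "ffmpeg", "-i", "test.mkv",
    "-c:v", "libx264",
    "-preset", "medium",
    "-profile:v", "high",
    "-b:v", "7500k", "-maxrate", "7500k", "-bufsize", "15000k",
    "-pix_fmt", "yuv420p",
    "-c:a", "copy",
    "out_medium.mp4",
], check=True)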
But... when you watch the stream on Twitch, the image quality looks much worse. You can see "colour noise"; it looks grainy more than pixelated. A grain and a colour noise that is definitely not visible in the preview on my streaming PC.
Clip with Medium profile: https://clips.twitch.tv/IgnorantWonderfulGrasshopperKappaRoss
That is a problem for me.
A friend with a very similar PC setup told me to try this configuration:
Placebo preset, high profile, tune ssim, 7500 kbps bitrate, and these custom x264 options:
8x8dct=1 aq-mode=2 b-adapt=2 bframes=1 chroma-qp-offset=2 colormatrix=smpte170m deblock=0:0 direct=auto ipratio=1.41 keyint=240 level=4.2 me=hex merange=16 min-keyint=auto mixed-refs=1 no-mbtree=0 partitions=all profile=high qcomp=0.6 qpmax=51 qpmin=10 qpstep=4 ratetol=10 rc-lookahead=30 ref=1 scenecut=40 subme=7 threads=0 trellis=2 weightb=1 weightp=2
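(To A/B test those exact options outside OBS, here's a small Python sketch; an illustrative example, not my actual workflow. OBS's options box takes space-separated key=value pairs, while ffmpeg's -x264-params wants them colon-separated, and deblock=0:0 needs its own colon swapped for the comma form x264 also accepts. test.mkv and out_custom.mp4 are placeholder names.)

import shlex
import subprocess

OBS_OPTS = (
    "8x8dct=1 aq-mode=2 b-adapt=2 bframes=1 chroma-qp-offset=2 "
    "colormatrix=smpte170m deblock=0:0 direct=auto ipratio=1.41 keyint=240 "
    "level=4.2 me=hex merange=16 min-keyint=auto mixed-refs=1 no-mbtree=0 "
    "partitions=all profile=high qcomp=0.6 qpmax=51 qpmin=10 qpstep=4 "
    "ratetol=10 rc-lookahead=30 ref=1 scenecut=40 subme=7 threads=0 "
    "trellis=2 weightb=1 weightp=2"
)

# deblock's value itself contains a colon (alpha:beta), which would clash
# with ffmpeg's colon-separated -x264-params string, so use the comma form.
pairs = [p.replace("deblock=0:0", "deblock=0,0") for p in shlex.split(OBS_OPTS)]

subprocess.run([
    "ffmpeg", "-i", "test.mkv",
    "-c:v", "libx264", "-preset", "placebo", "-tune", "ssim",
    "-b:v", "7500k", "-maxrate", "7500k", "-bufsize", "15000k",
    "-x264-params", ":".join(pairs),
    "out_custom.mp4",
], check=True)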
All the rest of the OBS configuration (sources, filters, etc.) stays the same. CPU usage stays almost the same, in some cases going up to a 52% average, but image quality (as far as the grain and colour noise go) really improves.
Clips with that custom command line and configuration:
https://clips.twitch.tv/FunRelatedBasenjiGivePLZ
https://clips.twitch.tv/EncouragingFriendlyLampOpieOP
My understanding of x264's command line options is very limited, and I'm trying to learn the difference between the medium preset and those custom settings, and why they make the grainy colour noise disappear. Not only to learn, but also to improve on this and raise image quality even further, as far as my CPU allows.
So which are the key options that make that grain disappear, and which ones would push image quality even further with the extra ~40% of CPU headroom still available?
Thank you very much in advance; any help will be really appreciated.