This display device is running at 8 bits per color component:
Code:
...
01:58:05.635: bits_per_color=8
The previous device (LG) claimed that it was running at 12 bits:
Code:
...
19:47:10.688: bits_per_color=12
But both have roughly the same gamut, close to the 8-bit space. Neither is right or wrong - they are just slightly different devices.
For some devices it is possible to configure the video mode (HDR, 10-bit, etc.) in the display settings (the device menu, accessible via the device buttons or via remote). This all depends on the manufacturer, model, and connection port: at the highest resolutions and framerates, not every interface can carry more than 8 bits per component. Look in the manual for which modes your device supports and under what conditions.
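To see why the port matters, here is a rough back-of-the-envelope sketch (my assumptions: the standard CTA-861 timing for 3840x2160@60 uses a 4400x2250 total raster, i.e. a 594 MHz pixel clock, and HDMI 2.0 carries about 14.4 Gbit/s of video data after its 8b/10b TMDS encoding; exact numbers vary by timing mode):

```python
# Assumption: CTA-861 timing for 3840x2160@60 -> 4400x2250 total raster.
PIXEL_CLOCK_HZ = 4400 * 2250 * 60        # 594 MHz pixel clock
# Assumption: HDMI 2.0 raw 18 Gbit/s, 8b/10b leaves ~14.4 Gbit/s for video.
HDMI20_DATA_GBPS = 18.0 * 8 / 10

def rgb_rate_gbps(bits_per_component: int) -> float:
    """Uncompressed 4:4:4 RGB data rate in Gbit/s at the 4K60 pixel clock."""
    return PIXEL_CLOCK_HZ * 3 * bits_per_component / 1e9

for bits in (8, 10, 12):
    rate = rgb_rate_gbps(bits)
    fits = "fits" if rate <= HDMI20_DATA_GBPS else "does NOT fit"
    print(f"{bits}-bit 4K60 RGB: {rate:.2f} Gbit/s -> {fits} in HDMI 2.0")
```

Under these assumptions 8-bit 4K60 RGB just fits (~14.26 Gbit/s), while 10-bit and 12-bit do not: that is why a display may silently fall back to 8-bit on some ports.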
When HDR support was first developed in OBS, 10-bit formats were implemented first, then 16-bit formats were added. Support for 12-bit is not implemented yet, so for OBS it is better to have 10-bit or 16-bit devices. Thus, if you can set your display device to 10-bit mode, try that instead of 12-bit. As far as I understand, Windows does all its math at 16 bits per component precision, so it can fit anything to any display device.
As for RGB10A2 textures: they can hold values in sRGB space or in Rec. 2100 (Rec. 2020/BT.2020). OBS "doesn't know" what it is capturing, so the user has to select it manually.
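The point is that the texture format only fixes the bit layout, not the color space: the same 32-bit word can hold sRGB or Rec. 2100 code values. A minimal sketch of that layout (assuming the DXGI R10G10B10A2 ordering, red in the low bits):

```python
def unpack_rgb10a2(pixel: int):
    """Split a 32-bit R10G10B10A2 word into (r, g, b, a) code values.
    Red sits in the low bits, as in DXGI_FORMAT_R10G10B10A2_UNORM."""
    r = pixel & 0x3FF          # bits 0..9
    g = (pixel >> 10) & 0x3FF  # bits 10..19
    b = (pixel >> 20) & 0x3FF  # bits 20..29
    a = (pixel >> 30) & 0x3    # bits 30..31 (only 2 bits of alpha)
    return r, g, b, a

# All bits set: maximum 10-bit value in every color channel.
print(unpack_rgb10a2(0xFFFFFFFF))  # (1023, 1023, 1023, 3)
```

Nothing in those 32 bits says whether 1023 means "sRGB white" or "Rec. 2100 peak", which is exactly why the user has to tell OBS which space the capture is in.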
For implementation details see:
https://github.com/obsproject/obs-studio/pull/6249
Because your texture is RGB10A2 ("BufferDesc.Format: 24" in the log), both values of that parameter already assume 10-bit input. The math for proper rendering (without color errors) will be chosen depending on the OBS Advanced settings and on which display device the window lies, because the user can drag the preview window to another device, for example an 8-bit one.
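For reference, the "Format: 24" in the log is an index into the Windows SDK's DXGI_FORMAT enumeration (dxgiformat.h); a small lookup of the values relevant here:

```python
# A few entries of the DXGI_FORMAT enumeration (Windows SDK, dxgiformat.h).
DXGI_FORMATS = {
    10: "DXGI_FORMAT_R16G16B16A16_FLOAT",  # 16-bit float per component
    24: "DXGI_FORMAT_R10G10B10A2_UNORM",   # the "Format: 24" from the log
    28: "DXGI_FORMAT_R8G8B8A8_UNORM",      # common 8-bit SDR format
}
print(DXGI_FORMATS[24])  # DXGI_FORMAT_R10G10B10A2_UNORM
```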
In OBS Advanced settings you specify the canvas parameters (let's say the "output" parameters). The sources placed on this canvas can be in different bit depths/spaces/color formats etc. Thus, if you set 10-bit there, you force OBS to scale everything to 10-bit, so the final output will be 10-bit. If you set 8-bit, you force all sources to be scaled on the fly to 8-bit, so the final output will be 8-bit.
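The per-component rescaling itself is simple; a sketch of the math (my own illustration, not OBS code, which also has to convert color spaces, not just bit depths):

```python
def scale_bits(value: int, from_bits: int, to_bits: int) -> int:
    """Rescale a code value between bit depths, rounding to nearest.
    E.g. 10-bit 1023 -> 8-bit 255, and 8-bit 255 -> 10-bit 1023."""
    from_max = (1 << from_bits) - 1
    to_max = (1 << to_bits) - 1
    return (value * to_max + from_max // 2) // from_max

print(scale_bits(1023, 10, 8))  # 255  (10-bit white -> 8-bit white)
print(scale_bits(512, 10, 8))   # 128  (10-bit mid-gray -> 8-bit mid-gray)
print(scale_bits(255, 8, 10))   # 1023 (8-bit white -> 10-bit white)
```

Going down to 8-bit throws away two bits of precision per component, which is why setting the canvas to 8-bit quantizes every 10-bit source on the fly.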
I would experiment with the RGB10A2 setting while keeping OBS Advanced at some 10-bit format. Color Range set to "Limited/Partial" is good for TV screens, but you may try "Full" as well.
Darker colors (sometimes mistaken for very high contrast) appear when full-range video is presented on a full-range monitor without proper scaling: the software "thinks" the video has Limited range while it is actually Full range. Keep that in mind too (you can imagine 8-bit as "limited" and 10-bit as "full"; the behavior is the same).
Edit: more about color range. A washed-out ("shaded") image means Limited-range video presented on a Full-range monitor while the software "thinks" it is Full-range video (when it is actually Limited range).
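Both mismatches follow from the same 8-bit math (limited-range luma occupies codes 16..235, full range occupies 0..255); a minimal sketch:

```python
def limited_to_full(v: int) -> int:
    """Expand 8-bit limited-range luma (16..235) to full range (0..255),
    clamping values that fall outside the limited range."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

# Mismatch 1: full-range video, software wrongly assumes Limited range.
# Everything below code 16 is crushed to black -> darker, contrasty image.
print(limited_to_full(10))   # 0 (shadow detail at code 10 is lost)

# Mismatch 2: limited-range video passed through unscaled as if Full range.
# Black stays at code 16 instead of 0 -> lifted, washed-out image.
limited_black = 16
print(limited_black != 0)    # True: "black" is displayed as dark gray
```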