The UI is set up to force the user to choose a downscale filter. I am questioning why this is the case, and I am not assuming anything here.
I cannot assume that the filter selection is being ignored, or that it is being applied, in the 1:1 mapping case, because the UI does nothing to reinforce either reading.
If indeed no downscaling is done, then great. However, I'm not just going to take someone's word for it here, especially if it's not coming from Jim.
Thank you R1CH for directing me to the source file of interest.
After review, I am no more confident that no downscaling is being applied. I say this because I found two good examples of code that, under the hood, surprised me.
The first example below defines "equalness" as a 16 pixel differential. If a user typed in 1920x1080 as their base and 1904x1064 as their output, the user isn't going to know that OBS is treating the two as the same. Who would this apply to? A QA automation engineer evaluating OBS may want to use this technique to force a downscale with a certain filter and verify that it loads correctly. But OBS isn't allowing that, so he gets unexpected results.
static inline bool resolution_close(struct obs_core_video *video, uint32_t width, uint32_t height)
{
	long width_cmp = (long)video->base_width - (long)width;
	long height_cmp = (long)video->base_height - (long)height;
	return labs(width_cmp) <= 16 && labs(height_cmp) <= 16;
}
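To see how wide that net is, here is a minimal standalone sketch of the same comparison; it is my own harness with the obs_core_video struct stripped out, not OBS code:

#include <stdio.h>
#include <stdlib.h>

/* Same tolerance check as above, with the base resolution passed in
 * directly instead of read from obs_core_video. */
static int resolution_close(long base_w, long base_h, long out_w, long out_h)
{
	return labs(base_w - out_w) <= 16 && labs(base_h - out_h) <= 16;
}

int main(void)
{
	/* 1920x1080 base vs. 1904x1064 output: both deltas are exactly 16 */
	printf("%d\n", resolution_close(1920, 1080, 1904, 1064)); /* prints 1 */
	return 0;
}

Both deltas land exactly on the 16 pixel boundary, so 1904x1064 is treated as the same resolution as the 1920x1080 base.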
Another example: if the scaled output is less than half the base canvas size in both dimensions, OBS forces bilinear scaling.
static inline gs_effect_t *get_scale_effect_internal(
struct obs_core_video *video)
{
	/* if the dimension is under half the size of the original image,
	 * bicubic/lanczos can't sample enough pixels to create an accurate
	 * image, so use the bilinear low resolution effect instead */
	if (video->output_width < (video->base_width / 2) &&
	    video->output_height < (video->base_height / 2)) {
		return video->bilinear_lowres_effect; <--------------------------------- BILINEAR
	}
	-- snip
Who would this apply to? Streamers capturing at one size and scaling down by more than 50% to another. Play a game in 4K and downscale to 720p, and you are forced into bilinear scaling. And somehow people are supposed to know this?
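Run the numbers on that 4K example: scaling 3840x2160 down to 1280x720 gives 1280 < 1920 (3840 / 2) and 720 < 1080 (2160 / 2), so both conditions in the snippet above are true and bilinear_lowres_effect is returned no matter which filter the user picked.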
Look at what the downscale filter setting reads when I just monkey with the resolution settings. Here's an example of changing from 2K to less than 50%. Notice that I set it to Lanczos. So which is being used, Lanczos or bilinear? Does the log have a bug? Is the code buggy? I mean, c'mon. No, this is not obvious.
21:09:25.560: video settings reset:
21:09:25.560: base resolution: 2560x1440
21:09:25.560: output resolution: 928x522
21:09:25.560: downscale filter: Lanczos
21:09:25.560: fps: 60/1
21:09:25.560: format: NV12
21:09:25.569: Settings changed (video)
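Check that log against the code: 928 < 1280 (2560 / 2) and 522 < 720 (1440 / 2), so this session lands squarely in the bilinear_lowres_effect branch even while the log reports Lanczos.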
Back to this code:
-- snip
if (resolution_close(video, width, height)) {
	return video->default_effect;
-- snip
If default_effect is initialized with no downscaling filter, and the effect loop therefore never invokes a downscale filter on this path, THAT is the technical answer to my question.
I've yet to prove this, and I'm not thrilled about building the OBS source and setting breakpoints to find the truth. I'll leave it to R1CH and Jim to chime in.
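In the meantime, here is a standalone sketch of the selection logic that anyone can run without a debugger. It is pieced together from the two snippets above; the wiring between the two checks is my assumption from the snipped code, not a verbatim copy of obs-video.c:

#include <stdio.h>
#include <stdlib.h>

static int resolution_close(long base_w, long base_h, long out_w, long out_h)
{
	return labs(base_w - out_w) <= 16 && labs(base_h - out_h) <= 16;
}

/* My reading of the decision order: "close enough" wins first, then the
 * under-half override, and only then the user's chosen filter. */
static const char *pick_effect(long base_w, long base_h, long out_w, long out_h)
{
	if (resolution_close(base_w, base_h, out_w, out_h))
		return "default_effect (no downscale filter?)";
	if (out_w < base_w / 2 && out_h < base_h / 2)
		return "bilinear_lowres_effect (user filter overridden)";
	return "user-selected filter (bicubic/lanczos honored)";
}

int main(void)
{
	printf("%s\n", pick_effect(1920, 1080, 1904, 1064)); /* default_effect */
	printf("%s\n", pick_effect(2560, 1440, 928, 522));   /* bilinear override */
	printf("%s\n", pick_effect(2560, 1440, 1600, 900));  /* user filter */
	return 0;
}

The three cases print exactly the behavior described above: the near-1:1 resolution falls through to default_effect, my 2K log example gets the bilinear override, and only the over-half downscale actually honors the selected filter.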
However, regardless of all of this, the real solution to the problems above is to make the Video Output UI more informative AND to account for these special cases by disabling or setting the downscale filter combobox appropriately, so that it fully reflects the actual state of the program.