I'm not aware of any capture card on the market that outputs a scaled HDMI signal. As I said, the only scaling that could be happening is on what it outputs to the computer, and even that assumes the capture card can do hardware-level scaling in the first place.
Hardware scaling requires a dedicated scaler to do the work, and that scaling will always introduce some amount of latency. So you're right to be concerned about input lag, regardless of which device is doing it (including TVs that do their own scaling).
You say you can't force upscaling to 4K on your GPU... what's your actual setup? Is it just a straight HDMI connection between the computer and the TV, or is there something else going on?
Yes, there's an HDMI cable connecting both directly.
And I can use NVIDIA's custom resolutions to set a desktop resolution of 1800p on a 4K signal, but there's no way to make that resolution appear native to my TV, which means I can only apply NVIDIA's DSR to 4K and not to the resolution I actually want.
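(For context: DSR factors multiply the pixel count of whatever resolution the driver reports as native, so each axis scales by the square root of the factor. Rough Python sketch with example numbers, just to show why the reported native resolution matters; the function name and the 2.25x factor are only for illustration:)

```python
from math import sqrt

def dsr_render_resolution(native_w, native_h, factor):
    """Resolution DSR would render at for a given DSR factor,
    based on whatever the driver believes is the native mode."""
    scale = sqrt(factor)  # the factor applies to pixel count, so sqrt per axis
    return round(native_w * scale), round(native_h * scale)

# With the TV reporting 4K as native, DSR only offers multiples of 2160p:
print(dsr_render_resolution(3840, 2160, 2.25))  # (5760, 3240)

# If the driver believed 1800p were native, the same factor would give:
print(dsr_render_resolution(3200, 1800, 2.25))  # (4800, 2700)
```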
When I use an EDID editor to delete all the 4K resolutions and add 1800p as the highest available option, I can trick my GPU driver into thinking that's my TV's native resolution, and it will then base DSR on it. The problem is that my TV won't actually accept an 1800p signal.
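(For anyone trying the same EDID route: after hand-editing a dumped EDID, each 128-byte block needs its checksum byte fixed, otherwise drivers tend to reject the override. A minimal Python sketch of that fix-up step; the file names are placeholders and it assumes the dump is a whole number of 128-byte blocks:)

```python
def fix_edid_checksum(block: bytearray) -> bytearray:
    """Recompute the checksum byte of one 128-byte EDID block.
    Byte 127 must make the sum of all 128 bytes equal 0 mod 256."""
    assert len(block) == 128, "EDID base/extension blocks are 128 bytes"
    block[127] = (256 - sum(block[:127])) % 256
    return block

with open("edited_edid.bin", "rb") as f:
    edid = bytearray(f.read())

# Fix the base block and any extension blocks after manual edits.
for offset in range(0, len(edid), 128):
    edid[offset:offset + 128] = fix_edid_checksum(edid[offset:offset + 128])

with open("edited_edid_fixed.bin", "wb") as f:
    f.write(edid)
```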
Ideally there'd be some software or hardware option that automatically upscales everything to a 4K signal, so I could base DSR on any resolution I prefer. But so far I haven't found a solution that doesn't cause additional problems.
I feel like upscaling a signal resolution shouldn't be such a problem; the GPU could probably do it in an instant, there's simply no option to do so.
And on the other hand, EDID resolutions only apply to the signal; there doesn't seem to be a way to add an upscaled resolution.
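(Just to illustrate the "the GPU could do it in an instant" part: even a naive CPU-side nearest-neighbour resize of a single 1800p frame to 2160p is little more than an index lookup. Throwaway numpy sketch, purely illustrative, no real GPU or driver path involved:)

```python
import numpy as np

src_h, src_w = 1800, 3200      # 1800p desktop frame
dst_h, dst_w = 2160, 3840      # 4K signal the TV expects

frame = np.random.randint(0, 256, (src_h, src_w, 3), dtype=np.uint8)

# Map every output pixel back to its nearest source pixel.
ys = np.arange(dst_h) * src_h // dst_h
xs = np.arange(dst_w) * src_w // dst_w
upscaled = frame[ys[:, None], xs[None, :]]

print(upscaled.shape)  # (2160, 3840, 3)
```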
Seriously, any help or advice would be much appreciated. I've been trying to figure this out for days.
Thank you in advance! :)