Is there a way to decrease the load NDI uses? Maybe lower the bitrate? I can't see any settings for NDI.
I believe that NDI is completely uncompressed, not even losslessly compressed. (*) So it takes a TON of bandwidth on the network, but the dedicated professional installations it was designed for are assumed to have that covered, so it's okay. Hence, no settings: that comes from the standard itself, not from a particular implementation of it.
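To give a sense of the scale of "a TON", here's a quick back-of-the-envelope calculation, assuming 1080p60 at 4 bytes per pixel (my assumption for illustration, not necessarily NDI's actual wire format):

```python
# Rough estimate of raw, uncompressed video bandwidth.
# Assumed parameters for illustration: 1920x1080, 60 fps, 4 bytes per pixel (RGBA).
width, height = 1920, 1080
fps = 60
bytes_per_pixel = 4

bits_per_second = width * height * bytes_per_pixel * 8 * fps
print(f"{bits_per_second / 1e9:.2f} Gbit/s")  # ~3.98 Gbit/s for a single stream
```

That's one stream needing several times what a 1 GbE link can carry, which fits the point above: the installations it targets are built with that bandwidth in mind.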
That raises the question of why it uses so much GPU if it's not actually compressing. Well, there's still some translation to do between the frame buffer and a series of network packets and back again, and that work suits a parallel process much better than a serial one. So that might be what the GPU is doing. It's not H.264, which would run on the dedicated chunk of silicon built specifically for that and stay invisible to the usage meter, but it's not a straight 1:1 copy either. So the general-purpose parallel processor gets the job, and that does show up on the meter.
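As an illustration of why that translation suits a parallel processor, here's a toy sketch (not NDI's actual packetizer) of slicing a frame buffer into packet-sized payloads; every chunk is independent of the others, so they can all be prepared at once:

```python
# Toy sketch: slicing a frame buffer into packet-sized payloads.
# Every chunk is independent, so the work maps naturally onto parallel hardware.
FRAME = bytes(1920 * 1080 * 4)   # stand-in frame buffer (RGBA, all zeros)
MTU_PAYLOAD = 1400               # rough payload size per network packet

def packetize(frame: bytes, payload_size: int):
    """Yield (sequence_number, payload) pairs; each one could be built concurrently."""
    for seq, offset in enumerate(range(0, len(frame), payload_size)):
        yield seq, frame[offset:offset + payload_size]

packets = list(packetize(FRAME, MTU_PAYLOAD))
print(f"{len(packets)} packets for one frame")  # ~5925 packets at 1080p RGBA
```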
(*) Even if the *visible* data is uncompressed, I've noticed that a mostly transparent picture does use less bandwidth than a fully-opaque one, according to the network traffic meter that I have on my taskbar. So it appears that fully-transparent pixels are not transmitted, or maybe there *is* some lossless compression going on in general, which the GPU would also be responsible for.
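I don't know what NDI actually does internally, but as an illustration of why a mostly-transparent frame could shrink on the wire: even a trivial lossless pass like run-length encoding collapses long runs of identical (e.g. fully transparent) pixels to almost nothing, while a busy opaque frame gains nothing.

```python
# Toy run-length encoding over pixels (purely illustrative, not NDI's codec).
def rle(pixels):
    runs = []
    for p in pixels:
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1       # extend the current run
        else:
            runs.append([p, 1])    # start a new run
    return runs

transparent = [(0, 0, 0, 0)] * 10_000                     # flat transparent region
opaque = [(i % 256, 40, 40, 255) for i in range(10_000)]  # varying opaque pixels

print(len(rle(transparent)))  # 1 run: collapses to almost nothing
print(len(rle(opaque)))       # 10000 runs: no savings at all
```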
But the end result remains: no loss of quality *at all*, and the protocol "just works" so that the people using it can think about other things. Fewer settings make it more likely to "just work", all the way down to no settings at all, which leaves only one configuration to verify before release.
---
Professional tools, like NDI, are different from consumer ones. In the pro world, you don't care about the amount of data you're throwing around locally, or how much processing power it takes to handle it. You use what is guaranteed to work, and you buy whatever is needed to support it.
(You're also not concerned with a constant high load. If the usage meter sits at 95% +/-3%, that's good! You can run a critical show on that. You only care that it's *always* below 100%, and it is: 95% + 3% = 98% < 100%. It's the *spikes* in load that make people nervous and want a lower average; if you don't have those, you can run a higher average and be perfectly okay.)
Consumer tools, by contrast, are designed to be gentle on whatever cheap hardware someone already has.