Different protocols can mean different bitrates, resolutions, compression, etc.
A protocol meant for rendering in a browser makes different trade-offs than a protocol (e.g. NDI) that focuses more on latency and image fidelity. And then there is the all-important codec used by the sending system: are you using the best one available? This stuff gets complex.
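To make the bitrate/resolution point a bit more concrete, here's a rough back-of-the-envelope sketch in plain Python. The numbers are purely illustrative (not taken from any specific protocol or product); it just compares how many bits per pixel per frame different kinds of pipelines have to work with.

```python
# Back-of-the-envelope comparison (illustrative numbers only, not tied to
# any specific protocol or product): bits-per-pixel is one quick way to see
# why the same picture can survive one pipeline and fall apart in another.

def bits_per_pixel(bitrate_kbps: float, width: int, height: int, fps: float) -> float:
    """Average bits available per pixel per frame at a given bitrate."""
    return (bitrate_kbps * 1000) / (width * height * fps)

# Hypothetical scenarios for comparison only.
scenarios = [
    ("Local high-bitrate recording",   50_000, 1920, 1080, 30),
    ("NDI-style high-bandwidth link", 120_000, 1920, 1080, 60),
    ("Browser-oriented stream",         4_500, 1920, 1080, 30),
]

for name, kbps, w, h, fps in scenarios:
    print(f"{name}: {bits_per_pixel(kbps, w, h, fps):.3f} bits/pixel/frame")
```

The browser-oriented case ends up with roughly an order of magnitude fewer bits to spend per pixel, which is exactly the kind of difference you won't notice on your local monitor but the receiving end might.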
If you are recording locally and playing back only on the same system forever, then yes, if you can't see a difference on your monitor it is unlikely to matter. But

"And if you can't tell with a monitor, then you can't tell."

represents a fundamental misunderstanding of what is going on technically, and is certainly NOT true when streaming to or watching on other systems. I've read a fair portion of the posts in these forums over the past 15 months, and I'm highly technical to begin with, yet some of the technical nuances of encoding, streaming protocols, etc. still go over my head. So just a caution about over-assuming what does and does not make a difference.