We all know that recording capability with the x264 encoder is pretty much defined by the CPU: its architecture, its core count, its clock speed, etc.
But what about NVENC?
Let's take the Turing generation as an example.
Does NVENC work entirely on its own, without using any of the NVIDIA GPU's core resources?
If that's the case, it means a 1650 Super and a 2080 Ti will perform exactly the same.
Or does NVENC take up some GPU core resources? That would mean NVENC on a 1650 Super and a 2080 Ti performs differently.
Let me know what you guys think! Thank you!