DayGeckoArt
Member
The premise of NVENC is to handle video encoding without much impact on game performance. There's a lot of discussion on the internet about the computer resources NVENC uses, in various forums, blogs, YouTube videos, etc., and the general consensus seems to be that it does use a bit of CPU and GPU computing power, and more if you use color formats other than NV12, but not enough to worry about.
But I recently discovered, by playing and recording a 15-year-old game, World of Warcraft, that NVENC encoding uses quite a bit of CUDA computing power and watts! I have an RTX 2060S and I play at 4K with Vsync on, limiting FPS to 60. OBS is set to 4:4:4 color and 30 fps using NVENC_HEVC with rc=constqp qp=30.
With OBS open but not recording, the GPU clock is 1200 MHz and the card draws about 70 watts. When recording, the clock increases to 1905 MHz and the draw to 140 watts, and CUDA usage goes from 0% to about 35%. So that's 70 watts just for encoding!
I'm monitoring this using HWiNFO64 with the Stream Deck plugin.
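If anyone wants to double-check these numbers without HWiNFO64, here's a minimal sketch that polls the same counters (graphics clock, board power, overall GPU and encoder utilization) directly from the driver. It assumes the pynvml package (NVIDIA's NVML Python bindings) is installed; the 1-second poll interval is just an arbitrary choice.

import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; change the index if needed

try:
    while True:
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0   # NVML reports milliwatts
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu)         # .gpu = overall GPU %
        enc_util, _period = pynvml.nvmlDeviceGetEncoderUtilization(gpu)
        print(f"clock={clock_mhz} MHz  power={power_w:.0f} W  "
              f"gpu={util.gpu}%  encoder={enc_util}%")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()

Running that while toggling recording on and off should show the same jump in clock and power that HWiNFO64 reports.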
Log: