DayGeckoArt
Member
I've been trying to figure out optimal NVENC HEVC settings for screen recording so I've done some experiments over the past few days with the settings in this doc that Koala posted in the RGB thread https://gist.github.com/asus4/f5aef0f3f46fde198436da12f0332013
I've made some surprising discoveries and I have questions!
I'm testing with 4K drone video. My settings are 4K 30fps I444 with a 40,000 kbps bitrate target, but the encoder ignores that number in these rate-control modes. Keyframe interval is 60. I have a Quadro T400 with the latest NVENC generation.
Generally, NVENC HEVC encoding uses about half the CPU power of my 4-core E5-1620 v2 Xeon. I measured this by stopping the recording and letting the video keep playing: CPU usage dropped from about 95% to roughly half that. Encoder settings also have a big impact on CPU usage, despite the popular belief that NVENC is handled entirely by the GPU.
From this forum I've learned the basic settings: Rate Control = Variable Bitrate plus a Constant Quality number:
rc=vbr cq=##
## is a number between 0 and 51; 0 is automatic, and the rest form a scale where 1 is highest quality and 51 is lowest. From repeatedly recording the same 4K video clip, I've found that cq=1 through cq=30 produce basically the same results: you get about 20 Mbps pretty much all the time, and the bitrate only drops when the frame is totally motionless. That still-frame bitrate is what the cq value actually affects, and even then the difference is something like 5 Mbps vs 8 Mbps. In other words, cq=1 to cq=30 behave basically like CBR that reduces bitrate only when there really is nothing to record.
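For anyone who wants to reproduce this outside OBS, the same rate-control options can be passed to ffmpeg's hevc_nvenc encoder. This is just a sketch, assuming an ffmpeg build with NVENC support, an NVIDIA GPU, and an input clip I'm calling input.mp4 (hypothetical filename):

```shell
# Sweep rc=vbr cq values against the same source clip.
# Requires an NVIDIA GPU and an ffmpeg build with NVENC enabled.
for cq in 25 30 33 35 40; do
  ffmpeg -y -i input.mp4 -c:v hevc_nvenc \
         -rc vbr -cq "$cq" \
         -g 60 \
         -an "out_cq${cq}.mp4"
done
```

-g 60 matches the keyframe interval above; comparing the resulting file sizes should show the same plateau from low cq values up to about cq=30.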
Around cq=33 is where the bitrate starts to vary more: it moves between about 12 and 16 Mbps, occasionally jumping to 18 Mbps. The quality is barely distinguishable, but CPU load is higher, so on my old 4-core I get some dropped frames.
At cq=35 it's 6 to 9 Mbps, a massive drop, but there are even more dropped frames. It still looks pretty good, though it is visually distinguishable from the higher-quality settings.
At cq=40 you get 4-6 Mbps, which is starting to look pretty bad but would be OK for some content.
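The per-cq bitrates above come from dividing recorded file size by clip length. A small helper like this (my own convenience function, not an OBS or NVENC feature) does that arithmetic:

```shell
# Average bitrate in Mbps from a file size in bytes and a duration
# in seconds: bytes * 8 / seconds / 1,000,000.
bitrate_mbps() {
  awk -v bytes="$1" -v secs="$2" \
      'BEGIN { printf "%.1f\n", bytes * 8 / secs / 1000000 }'
}

bitrate_mbps 150000000 60   # a 150 MB, 60 s clip -> 20.0 Mbps
```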
I also did a bit of testing with rc=vbr_hq, and its cq scale is totally different: at cq=1 you get 40 Mbps and at cq=25 you get 20 Mbps, without much variation. At cq=30 there's more variation, between 13 and 20 Mbps. Overall, rc=vbr_hq uses more CPU power, and I get dropped frames just like with the higher rc=vbr cq values.
So are these weird scales intentional? Are you supposed to get an almost constant bitrate until you cross a certain threshold? Is there another setting besides rc=vbr and rc=vbr_hq that allows more bitrate variation without dropping frames? Is it worth trying rc=constqp?