Title says it all I suppose? Or at least, most of it.
I run a Ryzen 3900X and usually CPU encode because my GPU (a GTX 1660) is usually pushed close to its limit. I know NVENC runs on its own dedicated chip, but if I'm only just managing to run the game the way I want while leaving some OBS overhead, recording with NVENC can introduce issues on the OBS side and slow my performance a tad, making the framerate inconsistent, which of course exacerbates the problem. If I'm playing a console game, well, I have enough CPU to spare to straight up run x264 slow.
But the question is, does that actually matter? It's largely well established that slowing the CPU usage preset past 'fast' gives well and truly diminishing returns, and also that differences between encoders largely normalise at bitrates of around 10,000 kbps and above. But what if you're running good-quality local recording settings on x264? With enough bits to go around, can you drop to a faster preset with next to no loss, if any at all?
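For anyone wanting to test this on their own footage rather than take the conventional wisdom on faith, here's a rough Python sketch of how I'd sweep presets at a fixed CRF on a short lossless sample clip and log encode time and output size. The file name, CRF, and preset list are just placeholders; it assumes ffmpeg is on your PATH.

```python
# Rough sketch: re-encode one short lossless sample clip at a fixed CRF across
# several x264 presets, logging encode time and output size, to see where the
# returns actually diminish. "sample_lossless.mkv" is a placeholder file name.
import subprocess
import time
from pathlib import Path

SOURCE = Path("sample_lossless.mkv")   # hypothetical lossless capture
CRF = "14"
PRESETS = ["veryfast", "faster", "fast", "medium", "slow"]

for preset in PRESETS:
    out = Path(f"test_crf{CRF}_{preset}.mkv")
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(SOURCE),
         "-c:v", "libx264", "-preset", preset, "-crf", CRF,
         "-an", str(out)],           # drop audio, we only care about video
        check=True,
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    elapsed = time.perf_counter() - start
    size_mb = out.stat().st_size / 1_000_000
    print(f"{preset:>9}: {elapsed:6.1f} s, {size_mb:8.1f} MB")
```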
I record at CRF 14 (1440p60 unless it's a console game). If I'm also playing at the same time it depends on the game, but generally CRF 14 'faster' leaves some CPU headroom, while CRF 14 'fast' can max out the CPU, and then of course you get encoding overload and the game dropping frames. I did record some footage of the same game at CRF 14 on both 'fast' and 'faster', and visually at least I probably wouldn't notice any difference unless I pixel-peeped some fast-moving particle effects. The bitrates (variable of course, but per the Windows 10 file properties) were also nigh-identical: definitely diminishing returns for the CPU usage uptick.
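To put a number on "I probably wouldn't notice any difference", one option is to score each preset's output from the sweep above against the lossless source with VMAF. A hedged sketch, assuming the files from the previous snippet exist and that your ffmpeg build includes libvmaf (it prints the aggregate score to its log as "VMAF score: ..."):

```python
# Rough sketch: compare each preset's encode against the lossless source using
# ffmpeg's libvmaf filter, so the eyeball test gets a number attached to it.
# Assumes ffmpeg was compiled with libvmaf; file names are placeholders.
import re
import subprocess
from pathlib import Path

SOURCE = Path("sample_lossless.mkv")   # same hypothetical lossless capture
CRF = "14"
PRESETS = ["veryfast", "faster", "fast", "medium", "slow"]

for preset in PRESETS:
    distorted = Path(f"test_crf{CRF}_{preset}.mkv")
    # First input is the distorted encode, second is the reference.
    result = subprocess.run(
        ["ffmpeg", "-i", str(distorted), "-i", str(SOURCE),
         "-lavfi", "libvmaf", "-f", "null", "-"],
        capture_output=True,
        text=True,
    )
    match = re.search(r"VMAF score:\s*([\d.]+)", result.stderr)
    score = match.group(1) if match else "n/a"
    print(f"{preset:>9}: VMAF {score}")
```

If 'fast' and 'faster' come back within a point or so of each other at CRF 14, that would back up the "nigh-identical" impression.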
So then I wonder: is even 'faster' overkill? Does the preset effectively mean nothing if there are enough bits to go around, and could dropping it further save me some CPU usage and temperature with next to no drop in quality?
What are your experiences with usage presets at higher bitrates / lower CRF values? Thanks in advance!