NVENC on Pascal chips (GTX 10x0 series) is roughly equivalent in quality to x264's veryfast preset, and NVENC on Turing chips (RTX 20x0 and GTX 16xx series) is roughly equivalent to x264's medium preset, sometimes the fast preset.
Exactly how NVENC achieves this isn't public knowledge; it's simply a fact supported by comparisons of encoded videos.
Bitrate is an important parameter as well, as is the actual content: a fair comparison between encoders must use exactly the same source video at exactly the same bitrate.
I did this with the Pascal NVENC here, and Nvidia did this with the Turing NVENC here. A valid comparison is not done by judging videos with your eyes but by PSNR values ("peak signal-to-noise ratio"), a metric that mathematically measures the difference between an original video and the encoded version of the same video.
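To make the metric concrete, here is a minimal sketch of the PSNR formula applied to two toy "frames" (flat lists of 8-bit pixel values). The frame data is made up for illustration; real tools like ffmpeg compute this per frame across the whole video.

```python
import math

def psnr(original, encoded, max_val=255):
    """PSNR between two equal-length sequences of pixel values.

    PSNR = 10 * log10(MAX^2 / MSE), where MSE is the mean squared
    per-pixel error. Higher PSNR = closer to the original.
    """
    assert len(original) == len(encoded)
    mse = sum((o - e) ** 2 for o, e in zip(original, encoded)) / len(original)
    if mse == 0:
        return float("inf")  # identical frames: no loss at all
    return 10 * math.log10(max_val ** 2 / mse)

# Hypothetical 8-bit frames: the lossy encode is off by 1 on every pixel.
source = [52, 55, 61, 66, 70, 61, 64, 73]
lossy  = [51, 56, 60, 67, 69, 62, 63, 74]
print(round(psnr(source, lossy), 2))  # → 48.13
```

An error of ±1 per pixel gives an MSE of 1 and thus a PSNR of about 48 dB; typical streaming encodes land much lower, and at the same bitrate the encoder with the higher PSNR is objectively closer to the source.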
Why is NVENC so good? On consumer GPUs it seems to be a byproduct of Nvidia's strategy of providing hardware for cloud gaming services that work via streaming, such as Google Stadia or Nvidia GeForce Now. Such services can only rely on hardware encoders (x264 is too CPU-intensive to be cost-efficient), and gamers will only accept them if the video quality is good enough, so acceptable quality is crucial for this kind of service. Nvidia therefore developed a hardware encoder able to compete with the reference H.264 encoder x264, and succeeded.
AMD probably also tried to enter that market with their VCE, but it seems they prioritized other GPU aspects after the initial effort: VCE's quality can't compete with NVENC's and hasn't really improved in recent years.