The 'Default' preset just chooses between High Quality and High Quality Low Latency depending on your resolution and frame rate. Currently, OBS picks High Quality for 1080p60 or greater. 'Auto' would probably have been a better name for it.
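That auto-selection can be sketched in a few lines. This is a hypothetical illustration, not OBS's actual code, and treating "1080p60 or greater" as a pixels-per-second comparison is my own assumption:

```python
def choose_default_preset(width, height, fps):
    """Hypothetical sketch of the 'Default' preset's auto-selection:
    High Quality for 1080p60 or greater, otherwise the low-latency
    variant. Comparing pixel rate (width * height * fps) is an
    assumption about how 'or greater' is decided."""
    if width * height * fps >= 1920 * 1080 * 60:
        return "High Quality"
    return "High Quality Low Latency"

# e.g. choose_default_preset(1920, 1080, 60) -> "High Quality"
#      choose_default_preset(1280, 720, 30)  -> "High Quality Low Latency"
```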
'NVDefault' is the real default preset exposed by the NVENC API. It has most of the features enabled except B-frames. There is no encoder latency, since NVENC has no frame lookahead, but frame sizes vary significantly, requiring a buffer for smooth playback.
'High Quality' is essentially NVDefault with B-frames enabled. B-frames usually improve compression efficiency, but with NVENC they do not: on my GTX 670, the High Quality preset actually produces worse results than NVDefault. I don't know if this was fixed in Maxwell.
The 'Low Latency' presets add extra processing to keep frame sizes roughly constant, so frames can be decoded and displayed in real time without a buffer. This is good for applications like remote gaming and videoconferencing, where every millisecond counts, but it significantly reduces picture quality.
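To make the buffering trade-off concrete, here is a small sketch of my own (not NVENC or OBS code): it computes how far ahead a client must buffer when variable-size frames arrive over a constant-bitrate channel. With near-constant frame sizes, as the low-latency presets aim for, the required buffer drops to zero:

```python
def prebuffer_ms(frame_sizes_bits, fps, channel_bps):
    """Minimum pre-buffer (in ms) so playback over a constant-bitrate
    channel never stalls: the worst-case gap between the bits needed by
    each frame's deadline and the bits the channel has delivered by then."""
    needed = 0.0          # cumulative bits required for playback so far
    worst_deficit = 0.0   # largest shortfall seen at any frame deadline
    for i, size in enumerate(frame_sizes_bits):
        needed += size
        delivered = channel_bps * (i + 1) / fps
        worst_deficit = max(worst_deficit, needed - delivered)
    return worst_deficit / channel_bps * 1000.0

# Constant-size frames that exactly match the channel rate need no buffer,
# while a large keyframe spike forces the client to buffer ahead:
steady = [100_000] * 60              # 60 fps * 100 kbit = 6 Mbps
spiky = [600_000] + [90_000] * 59    # same stream with a big keyframe
```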
All of the presets are fast enough to be used for real-time encoding; however, if you were encoding multiple streams at a time, you might need to use the faster presets.
In terms of quality, I would rank the presets as follows:
NVDefault > High Quality > High Quality Low Latency > Low Latency
The High Quality Low Latency preset preserves more detail than the High Quality preset, but it also generates a lot of blocky artifacts, so it ends up looking worse in most cases.
For general use, I would use the NVDefault preset. ShadowPlay uses something like it, and it has very good video quality. The High Quality preset would be useful if I dropped frames a lot or wanted very fast seeking. The Low Latency presets just weren't designed for our use case and subjectively don't look very good. Of course, you should test them yourself and see which looks better to you.