I have never seen a "hq_cbr" rate control. There is CBR and there is VBR, sometimes ABR, but that's it.
A "high quality" modifier within the encoder settings usually asks the encoder to be more thorough with motion prediction, or in general with features that will result in better prediction of an encoded image, so the quality gets better. This needs more computing time within the encoder, regardless if software or hardware encoder, so the maximum frame rate for the encoder becomes lower. If the maximum frame rate for a given encoder setting is lower than the frame rate of your output, you will get lost frames.
There are also quality-based rate controls like CQP, CRF or ICQ. This is different from a "high quality" modifier and comes on top of it.
With a bitrate-oriented rate control (CBR, VBR, ABR), the encoder is forced to remove as much detail as required to produce output with a bitrate no higher than the requested bitrate. If the footage has high motion and the requested bitrate is low, the result is a blurry mess.
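For example (a minimal sketch, assuming ffmpeg with libx264; true CBR additionally needs HRD signalling, this is the usual constrained-bitrate approximation):

```python
# Sketch: bitrate-oriented rate control with libx264 via ffmpeg.
# -b:v sets the target, -maxrate/-bufsize constrain how far the encoder may deviate.
cbr_like = [
    "ffmpeg", "-i", "input.mp4",
    "-c:v", "libx264",
    "-b:v", "2500k",      # target bitrate
    "-maxrate", "2500k",  # ceiling the encoder must stay under
    "-bufsize", "5000k",  # rate-control (VBV) buffer
    "cbr_like.mp4",
]
print(" ".join(cbr_like))
```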
On the other hand, if you choose a quality-based rate control (CQP, CRF, ICQ), you give the encoder no bitrate constraint and instead tell it to remove a constant amount of detail. The quality parameter (roughly 0–51 for typical H.264/HEVC encoders) tells it how much detail to remove: 0 removes nothing, the maximum removes almost everything. Sane values are in the range 15–35.
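A sketch of what that looks like (again assuming ffmpeg with libx264: -crf for constant rate factor, -qp for constant QP; exact option names and ranges differ for hardware encoders such as NVENC or Quick Sync):

```python
# Sketch: quality-based rate control; note that no bitrate is specified at all.
crf_cmd = [
    "ffmpeg", "-i", "input.mp4",
    "-c:v", "libx264",
    "-crf", "20",  # constant quality: lower = more detail kept, bigger file
    "crf20.mp4",
]
cqp_cmd = [
    "ffmpeg", "-i", "input.mp4",
    "-c:v", "libx264",
    "-qp", "20",   # constant quantizer: every frame quantized equally hard
    "cqp20.mp4",
]
for cmd in (crf_cmd, cqp_cmd):
    print(" ".join(cmd))
```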
The "high quality" modifier works just like with CBR: the encoder works harder to perform better prediction, thus a bit better quality.
To put the size of the possible improvements into perspective: enabling or disabling a "high quality" modifier changes quality by about 1-5 percent, which is often not even visible. Changing from CBR to CQP lets you vary quality by up to 90 percent: just by turning that one number you can go from blurry mess to quasi-lossless. You pay with disk space.