As a postscript to all of the above, my original inquiries still stand regarding my own screen recordings of my desktop etc. Is it always best to use the highest available display refresh rate (60.1Hz in this instance)? If so, what's the best frame rate to use for recordings? 30?
Generally, use the lowest settings for both size and speed that produce an acceptable result, while sticking to integer scaling factors. If your screen is 60Hz, then run your screen capture at 60 or 30, depending on how much motion you have.
A *slight* error isn't going to hurt much. A 60.1Hz display captured at 60 drops one frame every 10 seconds. Probably not noticeable. The same display captured at 30 will normally drop every other frame, but every 10 seconds it'll drop two in a row to get back in sync.
Running that display at 59.95 while capturing at 60 will duplicate one frame every 20 seconds. Capturing at 30 will occasionally grab two adjacent frames instead of skipping one. Again, probably not noticeable.
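To put rough numbers on that: the slip is just the difference between the display rate and the nearest whole multiple of the capture rate. Here's a little C sketch (the helper name and the example rates are only for illustration) that works it out:

```c
#include <math.h>
#include <stdio.h>

/* How often a frame gets dropped or repeated when a display running at
 * refresh_hz is captured at capture_fps.  The capture keeps roughly
 * every Nth display frame, where N = round(refresh / capture); whatever
 * rate error is left over accumulates until one extra frame has to be
 * dropped (error > 0) or reused (error < 0) to stay in sync. */
static void slip_report(double refresh_hz, double capture_fps)
{
    double step  = round(refresh_hz / capture_fps);  /* display frames per capture frame */
    double error = refresh_hz - step * capture_fps;  /* leftover display frames per second */

    if (error == 0.0)
        printf("%.4g Hz captured at %.4g fps: perfectly in sync\n",
               refresh_hz, capture_fps);
    else
        printf("%.4g Hz captured at %.4g fps: one frame %s every %.0f seconds\n",
               refresh_hz, capture_fps,
               error > 0 ? "dropped" : "reused",
               1.0 / fabs(error));
}

int main(void)
{
    slip_report(60.1, 60);    /* one drop every 10 s                                 */
    slip_report(60.1, 30);    /* drops every other frame, plus one extra every 10 s  */
    slip_report(59.95, 60);   /* one duplicate every 20 s                            */
    slip_report(59.95, 30);   /* grabs adjacent frames once every 20 s               */
    return 0;
}
```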
Likewise for the display end. If it's going to be displayed at 60fps, then your content should ideally be 60 to match, or an integer scale factor away from it like 30. But small errors probably won't be noticed.
I'm still curious why your system doesn't offer an exact 60, and why it gives you two that are so close. Maybe it has something to do with a convenient divisor from an internal clock somewhere???
(Kinda like MIDI runs at 31250 baud, which is pretty far from any of the standard serial rates, simply because it's exactly 1/32 of 1MHz, which was itself a practically universal system clock at the time. (Integer multiples of 1MHz are still common today for low-cost embedded stuff.) And not only is 32 an integer, it's actually a power of 2! Even better! (2^5, specifically.) That makes the divider plumb easy: run a counter at the 1MHz system rate, and every time the lower 5 bits roll over, send another bit of data.)
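In code, that divide-by-32 trick is about as simple as it gets. A toy sketch, assuming something calls on_1mhz_tick() on every tick of a 1MHz clock and with send_next_bit() as a made-up stand-in for whatever actually wiggles the serial line:

```c
#include <stdio.h>
#include <stdint.h>

/* Stand-in for whatever actually drives the serial line in a real
 * device -- here it just counts how many bits went out. */
static long bits_sent = 0;
static void send_next_bit(void) { bits_sent++; }

/* Called once per tick of a 1MHz system clock.  Every time the lower
 * 5 bits of the counter roll over (i.e. every 32 ticks), clock out one
 * more bit: 1,000,000 / 32 = 31,250 bits per second. */
static void on_1mhz_tick(void)
{
    static uint8_t counter = 0;

    counter++;
    if ((counter & 0x1F) == 0)       /* lower 5 bits just rolled over */
        send_next_bit();
}

int main(void)
{
    for (long tick = 0; tick < 1000000; tick++)   /* simulate one second */
        on_1mhz_tick();
    printf("bits sent in one simulated second: %ld\n", bits_sent);   /* 31250 */
    return 0;
}
```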
With such a close choice between 59.95 and 60.1, it's probably not doing integer division, but fractional schemes exist too, which more or less multiply the clock up to an integer multiple and then integer-divide that. My guess is that you're actually choosing the pixel clock, not the frame clock, and that the difference is 1 least-significant bit in a divisor somewhere. It'd be interesting to know for sure, though.
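For what it's worth, here's the shape of that arithmetic. The frame size below is borrowed from the common 1080p timing (2200x1125 total, porches and sync included) purely as a stand-in, since I have no idea what your panel actually uses; the point is just that 60.1 vs 59.95 works out to roughly a quarter-percent difference in pixel clock, which is the sort of step you'd expect from a one-count change somewhere in the clock synthesis:

```c
#include <stdio.h>

int main(void)
{
    /* Hypothetical frame size, borrowed from the common 1080p timing:
     * 2200 pixel-times per line and 1125 lines per frame, porches and
     * sync included.  The real numbers depend on the GPU and panel. */
    const double px_per_frame = 2200.0 * 1125.0;

    /* Work backwards: what pixel clock does each offered refresh imply? */
    double clk_hi = 60.10 * px_per_frame;    /* Hz */
    double clk_lo = 59.95 * px_per_frame;    /* Hz */

    printf("60.10 Hz -> %.3f MHz pixel clock\n", clk_hi / 1e6);
    printf("59.95 Hz -> %.3f MHz pixel clock\n", clk_lo / 1e6);
    printf("difference: %.0f kHz, or %.2f%% of the pixel clock\n",
           (clk_hi - clk_lo) / 1e3,
           100.0 * (clk_hi - clk_lo) / clk_lo);
    return 0;
}
```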
Are PAL/NTSC irrelevant factors these days? Is there ever an instance when, say, it might be better to use 50Hz/25fps in PAL territories?
You'd think so, but since everything is an incremental improvement all the way back to the original TV broadcasts, parts of the old mindsets still hang on. The framerate is part of that.
Composite -> VGA -> DVI, at least, all keep the exact same concept of sending only one color at a time and using logically separate sync signals to sweep that one color across the entire screen as it changes. All of them even keep the "porches" of undisplayed time between scan lines and between frames, which can be thought of as a "picture frame" around the intended image that you're not supposed to see. Originally, that was meant to give the TV circuitry some time to reset so it could draw the next line or frame, but it came to be used for other things too, like closed captioning encoded as digital information on one of those undisplayed lines.
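To make that "picture frame" concrete, here are the published numbers for the classic 640x480 "VGA" mode: every line is 800 pixel-times wide but only 640 are visible, and every frame is 525 lines tall but only 480 are visible; the rest is porches and sync. (Notice that the "official" 60Hz mode actually works out to 59.94, the same flavor of not-quite-60 as above.)

```c
#include <stdio.h>

/* The classic 640x480 "60Hz" VGA timing (the standard published
 * numbers).  Each line and each frame is bigger than the visible
 * picture; the extra pixel-times and lines are the sync pulses and
 * the front/back porches you never see. */
int main(void)
{
    const double pixel_clock = 25.175e6;                          /* Hz */

    const int h_visible = 640, h_front = 16, h_sync = 96, h_back = 48;
    const int v_visible = 480, v_front = 10, v_sync = 2,  v_back = 33;

    const int h_total = h_visible + h_front + h_sync + h_back;    /* 800 */
    const int v_total = v_visible + v_front + v_sync + v_back;    /* 525 */

    printf("line:  %d visible of %d total pixel-times\n", h_visible, h_total);
    printf("frame: %d visible of %d total lines\n", v_visible, v_total);
    printf("refresh: %.3f Hz\n", pixel_clock / (h_total * v_total));  /* ~59.940 */
    return 0;
}
```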
DVI is interesting because it converts each 8-bit color value into a 10-bit code, mainly to minimize transitions and keep the signal DC-balanced so the receiver can recover the bits reliably, and to embed the video-sync signals (and other information) as special out-of-band codes on the same wires during the porches. But the general concept it uses to draw the picture itself is still exactly the same as in the first TVs.
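If you're curious what that looks like, here's a rough sketch of just the first stage of DVI's TMDS encoding, written from memory of the spec: 8 data bits become 9 by XOR- or XNOR-chaining the bits (whichever gives fewer transitions), with bit 8 recording which was used. The real encoder has a second stage, skipped here, that conditionally inverts the result and adds a 10th bit to keep the line DC-balanced, plus the separate out-of-band codes used during the porches.

```c
#include <stdint.h>
#include <stdio.h>

/* Count the 1 bits in an 8-bit value. */
static int ones(uint8_t v)
{
    int n = 0;
    for (; v; v >>= 1)
        n += v & 1;
    return n;
}

/* First stage of TMDS encoding: 8 data bits in, 9 bits out.  The bits
 * are chained with XOR or XNOR, whichever produces fewer transitions,
 * and bit 8 records which scheme was used so the receiver can undo it.
 * (The second stage -- conditional inversion plus a 10th bit for DC
 * balance -- is omitted from this sketch.) */
static uint16_t tmds_stage1(uint8_t d)
{
    int use_xnor = ones(d) > 4 || (ones(d) == 4 && (d & 1) == 0);
    uint16_t q = d & 1;                        /* q[0] = d[0] */

    for (int i = 1; i < 8; i++) {
        int di   = (d >> i) & 1;
        int prev = (q >> (i - 1)) & 1;
        int qi   = use_xnor ? !(prev ^ di) : (prev ^ di);
        q |= (uint16_t)qi << i;
    }
    if (!use_xnor)
        q |= 1u << 8;                          /* bit 8 = 1 means XOR was used */
    return q;
}

int main(void)
{
    /* Encode a few arbitrary byte values, just to see the codes. */
    for (int d = 0; d < 256; d += 85)
        printf("0x%02X -> 0x%03X\n", d, (unsigned)tmds_stage1((uint8_t)d));
    return 0;
}
```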
Modern HDMI is backwards compatible with DVI, so those adapters really are just "dumb wires" from one connector to another with no logic at all. (And a number of graphics cards can output full-spec HDMI on their DVI ports: all you need is that adapter.) The actual HDMI spec is behind a paywall, but the DVI spec and its predecessors are openly available if you're curious and ambitious enough to look them up and wrap your head around them.