Question / Help 59.94 vs 60 / 29.97 vs 30 fps

EpicReset

Member
Since the NTSC standard is 59.94 fps or 29.97 fps, should I really set the fps of my sources and output to the NTSC standard?

For example my current setup is:
Code:
Sources:
-Logitech C920 webcam capturing at 29.97 NTSC fps
-Elgato HD60pro capturing at 59.94 NTSC fps

OBS: Video settings:
Base res: 1920x1080
Output res: 1920x1080
FPS: 29.97

Output Settings:
Rescale output: 1280x720
This setup has all my sources set to NTSC standards and my output set to the 29.97 NTSC standard. The Elgato Game Capture HD60pro lists the capture source as reading 59.94 fps, so I wanted to match that to prevent any sync issues.

Now... my question is: does this really matter, or is it just a matter of terminology, where 29.97 is just how it's listed but the new standard setting is actually 30 and 60 fps, considering that consoles can actually target 60 fps?

So is it just semantics, or should I really set my settings to "60" and "30" fps?
 

R1CH

Forum Admin
Developer
You should only care about fractional frame rates if you're dealing with legacy systems / analogue TV and such. 30 / 60 is fine for digital broadcast.
 

EpicReset

Member
R1CH said:
You should only care about fractional frame rates if you're dealing with legacy systems / analogue TV and such. 30 / 60 is fine for digital broadcast.

I figured as much, but my capture card says that the source input is 59.94 fps (even if I set the card to capture at "60fps"), so I'm wondering how many issues that causes when running OBS at 60 fps.

See attached screenshots. You can see I have the FPS of the capture card set to 60 fps, but the source info in the configuration settings still says the source input is 59.94... which is why I've been sticking to the NTSC 29.97 and 59.94 options on the output.
 

Attachments

  • hd60pro_settings.png (532.4 KB)
  • hd60pro_info.png (25.9 KB)

carlmmii

Active Member
If you're talking about matching exact frame rates, then any time you're dealing with true NTSC frames it's going to be either 29.97 or 59.94. This is true for broadcast television, VHS, DVD, and anything that outputs a composite, S-Video, or component signal.
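For reference, those "fractional" NTSC rates come from a 1000/1001 factor -- the exact values are 30000/1001 and 60000/1001. A quick Python sketch (nothing OBS-specific, just arithmetic) to show where the decimals and per-frame timings come from:

Code:
from fractions import Fraction

# NTSC rates are defined with a 1000/1001 factor, not as flat 30 / 60.
ntsc_30 = Fraction(30000, 1001)
ntsc_60 = Fraction(60000, 1001)

print(float(ntsc_30))         # 29.97002997...
print(float(ntsc_60))         # 59.94005994...

# Per-frame interval in milliseconds:
print(1000 / float(ntsc_60))  # ~16.683 ms per NTSC frame
print(1000 / 60)              # ~16.667 ms per true-60 frame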

HDMI is where things start to change. Most home consoles still use the NTSC standard of 59.94fps, but this is actually just a holdover. There's no requirement for HDMI to adhere to the NTSC standard, and this is very much true for computer monitors where you're likely getting a flat 60fps display now.

For capture, as R1CH said, the only time you should ever really worry about exactly matching the framerate is when dealing with retro sources, specifically for the purpose of deinterlacing. As long as the deinterlacing algorithm is behaving nicely, all it needs is to be fed the exact top/bottom field order, and it will be able to output a smooth picture. If there's any desync, though (say, from assuming there's supposed to be an extra frame because it's expecting 30 fps instead of 29.97 fps), that will result in deinterlacing artifacts until it can resync -- lines will look doubled, or bob rapidly, or the picture will just still look interlaced.

For progressive capture, the impact of a lost frame due to desync is just that -- a lost frame.

To put things in perspective a little bit more, the difference between 59.94 fps and 60 fps is 1 frame every 16.7 seconds. For the purposes of capture, as long as you're feeding a progressive-scan signal, it honestly doesn't matter whether you choose 59.94 or 60 for your output framerate. Most likely, random frame stutters from the capture chain, render hiccups, or even playback stutter are going to be much more noticeable.
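If you want to verify that number, it's just arithmetic (plain Python, no OBS involved):

Code:
# Drift between a true 60 fps clock and the NTSC 60000/1001 clock.
ntsc_60 = 60000 / 1001                   # ~59.94 fps
extra_frames_per_second = 60 - ntsc_60   # ~0.0599 frames/s
seconds_per_extra_frame = 1 / extra_frames_per_second
print(seconds_per_extra_frame)           # ~16.68 s -- roughly one frame of drift every 16.7 seconds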

Another thing to keep in mind: most playback devices that people are watching on are now 60 fps devices -- monitors, phones, even TVs. If that's the case, there is going to be a framerate conversion at some point in the chain -- all that changes is where that conversion happens.
 