Since the NTSC standard is 59.94 fps or 29.97 fps, should I really set the fps of my sources and output to the NTSC standard?
For example, my current setup has all my sources set to NTSC standards and the output set to the 29.97 NTSC standard. The Elgato Game Capture HD60 Pro lists the capture source as reading 59.94 fps, so I wanted to match that to prevent any sync issues:
Code:
Sources:
- Logitech C920 webcam capturing at 29.97 fps (NTSC)
- Elgato HD60 Pro capturing at 59.94 fps (NTSC)

OBS video settings:
- Base resolution: 1920x1080
- Output resolution: 1920x1080
- FPS: 29.97

Output settings:
- Rescale output: 1280x720
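For context on why I tried to match rates: as I understand it, the NTSC numbers are fractional (59.94 is really 60000/1001), so a 59.94 fps source on an even 60 fps clock slips by one frame roughly every 16.7 seconds. Here's a minimal Python sketch of that arithmetic (just the NTSC math, nothing OBS-specific):

Code:
from fractions import Fraction

# NTSC rates are fractional: "59.94" is really 60000/1001
# and "29.97" is 30000/1001.
ntsc_60 = Fraction(60000, 1001)  # ~59.94006 fps

# A 60000/1001 fps source captured on an even 60 fps timebase
# drifts apart by (60 - 60000/1001) frames every second.
drift_per_second = 60 - ntsc_60          # frames of drift per second
seconds_per_slip = 1 / drift_per_second  # one dup/dropped frame this often

print(f"'59.94' is exactly {float(ntsc_60):.5f} fps")
print(f"drift vs. an even 60 fps clock: {float(drift_per_second):.5f} frames/s")
print(f"one frame duplicated or dropped every {float(seconds_per_slip):.2f} s")

That periodic duplicated/dropped frame every ~16.7 seconds is the kind of stutter I was hoping to avoid by matching the rates.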
Now... my question is: does this really matter, or is it just a matter of terminology, where 29.97 is simply how the rate is listed, but the newer standard is actually 30 and 60 fps, considering that consoles can target a true 60 fps?
So is it just semantics, or should I really set my settings to "60" and "30" fps?