Something I noticed while messing around with OBS: when I stream using a DirectShow device, like DXtory, OBS won't take the device's FPS into account.
So, I had OBS set up for 120fps while DXtory was running at 30fps, and OBS ran at 120fps regardless of how fast DXtory was feeding it frames (going by the stream fps in the bottom right). The properties for the DXtory DirectShow device in OBS showed it set to 30fps, and the DXtory capture itself is variable. But when I switch to screen capture, still with OBS set to stream at 120fps, OBS only streams at however fast screen capture is capturing, which is variable.
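To make clear what behavior I mean, here's a minimal sketch (not OBS's actual code, just my guess at what's happening): the output loop ticks at the configured stream FPS and simply reuses whatever frame the source delivered most recently, so a 30fps DirectShow source gets each frame duplicated roughly 4 times at 120fps. The FPS numbers and counters are just for illustration.

```cpp
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

using Clock = std::chrono::steady_clock;

int main() {
    const int stream_fps = 120;  // what OBS is configured to output
    const int source_fps = 30;   // what the DirectShow device (DXtory) delivers

    std::atomic<int> latest_frame{0};

    // Source thread: delivers a new frame every 1/30 s, like the capture device.
    std::thread source([&] {
        for (int f = 1; f <= source_fps; ++f) {  // run for ~1 second
            std::this_thread::sleep_for(std::chrono::milliseconds(1000 / source_fps));
            latest_frame.store(f);
        }
    });

    // Output loop: ticks at 120fps no matter what, sending whatever frame is
    // newest. Each 30fps source frame ends up sent about 4 times.
    int sent = 0, duplicates = 0, prev = -1;
    auto next_tick = Clock::now();
    for (int t = 0; t < stream_fps; ++t) {  // ~1 second of output
        next_tick += std::chrono::microseconds(1000000 / stream_fps);
        std::this_thread::sleep_until(next_tick);
        int frame = latest_frame.load();
        if (frame == prev) ++duplicates;  // same source frame sent again
        prev = frame;
        ++sent;
    }
    source.join();
    std::printf("sent %d frames, %d were duplicates\n", sent, duplicates);
    return 0;
}
```

If that's roughly what's going on, the screen capture path presumably skips the fixed tick and only emits when a new frame actually exists, which would explain the variable rate I'm seeing there.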
Should OBS be respecting the FPS set by the DirectShow device it's capturing instead of overriding it? And why is screen capture fps variable, but DirectShow capture isn't?