Bug Report Capture FPS Not Followed

Bensam123

Member
Something I noticed while messing around with OBS is that when I stream using a DirectShow device, like DXtory, OBS won't take the device's FPS into account.

So I had OBS set up for 120fps, but DXtory was running at 30fps; OBS would run at 120fps regardless of how fast DXtory was feeding it frames (going by the stream FPS indicator in the bottom right). The properties for the DXtory DirectShow device in OBS said it was set to 30fps, and the DXtory capture itself is variable. But when I switch to screen capture, still with OBS set up to stream at 120fps, OBS only streams at however fast screen capture is capturing, which is variable.

Shouldn't OBS respect the FPS set by the DirectShow device it's capturing from instead of overriding it? And why is screen capture FPS variable, but DirectShow capture isn't?
 

Lain

Forum Admin
Forum Moderator
Developer
Hypothetical situation: you are capturing a game at 60fps, but your webcam can only do 30fps. If I made the application wait for a device, the entire scene would drop to 30fps, thereby slowing down the game for no good reason. The device's frames come in and can easily be discarded because they're just data in RAM. Software capture does not work in that same way.
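The "never wait, just reuse" pattern described here can be illustrated with a toy simulation (this is not actual OBS code; the function name and numbers are illustrative assumptions):

```python
# A toy simulation of a device delivering frames at its own rate while the
# render loop runs faster and simply reuses the newest frame instead of
# blocking on the device.

def count_reused_frames(render_fps, device_fps, seconds=1):
    """How many render ticks reuse an unchanged device frame."""
    assert render_fps % device_fps == 0, "kept integral for simplicity"
    ticks_per_device_frame = render_fps // device_fps
    latest = None   # most recent frame delivered by the device
    last = None     # frame shown on the previous render tick
    reused = 0
    for tick in range(render_fps * seconds):
        if tick % ticks_per_device_frame == 0:
            latest = tick       # a new device frame arrives; the old one is discarded
        if latest == last:      # nothing new: repeat the copy in RAM, never stall
            reused += 1
        last = latest
    return reused

# A 30 fps webcam under a 120 fps scene: 3 of every 4 rendered ticks just
# repeat the previous device image, and the scene never slows to 30 fps.
print(count_reused_frames(120, 30))  # 90
```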
 

Bensam123

Member
Hmmm... but you make it sound as if there's no performance hit even when it's capturing at that frame rate. For instance, I was capturing a video with DXtory last night at a whole 24fps. Software capture gave me about 24-32fps, DXtory capture gave exactly 24fps, but OBS was still recording at 120fps, with ridiculously higher processor utilization as well. We're talking 30-40% higher on an i5-3570K.

I understand what you're saying about making it wait for other devices (which would be bad), but what about making it follow the fastest source? So if there really is a device inputting at 120fps, it'll record that fast, but not otherwise. I'm not entirely clear on this either... Is there any benefit to streaming at a higher FPS than the sources are delivering? Like a 60fps stream for a game that's only running at 30fps? Does it help with combining multiple sources, like two webcams and a game?

As a workaround I can just set the stream to a lower FPS, but I do like the idea of simply having my FPS go as high as whatever is being recorded.
 

paibox

heros in an halfshel
I'm not quite sure what to make of this, honestly. If you don't want to record at 120 frames per second, do not specify a frame rate of 120 frames per second in OBS.

Adding some weird thing where OBS adjusts its frame rate to the frame rate of a certain source is not something that would be useful for much of anything at all. I'm not actually sure if you can change the base frame rate in the middle of a stream, either, though I might be wrong about that.
 

Bensam123

Member
It's not that I don't want to stream at 120fps; I don't want OBS to stream and encode at 120fps if the content in the stream is not 120fps content.

It would absolutely be useful. It helps save on computer performance and bandwidth. If my in-game FPS is varying between 45-70fps, I don't want OBS to stream and encode at 120fps. Maybe I'm asking for something impossible due to the way encoding works? I assumed it isn't, because OBS is variable with software input (based on the FPS indicator in the bottom right).
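The behavior being proposed here could be sketched as follows (a hypothetical helper, not anything OBS implements): cap the output frame rate at the fastest active source, never exceeding what the user configured in settings.

```python
# Hypothetical sketch of the proposed "follow the fastest source" behavior.

def effective_output_fps(configured_fps, source_fps_list):
    """Cap output FPS at the fastest source, bounded by the configured FPS."""
    if not source_fps_list:
        return configured_fps
    return min(configured_fps, max(source_fps_list))

print(effective_output_fps(120, [30]))      # 30: no source is faster
print(effective_output_fps(120, [30, 60]))  # 60: fastest source wins
print(effective_output_fps(120, [144]))     # 120: settings still cap it
```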
 

Grimio

Member
I don't understand what the problem here is. Did you or did you not specify 120 FPS in Settings->Video?
If you did, then you get 120 FPS, I mean, it's what you wanted, no?
 

Bensam123

Member
I've explained the reasons why it would be useful and needed, if it's possible. I assume it is possible because software capture is already doing it. I thought it was a bug because software capture does it, but hardware capture does not.
 

neto87

New Member
OK, I think I understand what's happening here, and maybe it's a bug in OBS, or at least I don't understand why it happens. If I put my settings at 120fps and use game capture, OBS will stream at 120fps, but for some strange reason if I use monitor capture the stream fluctuates between 20-30fps with Aero on and 105-110fps with Aero off. I'm using version .466a.

And Bensam123, if you don't want to stream at 120fps, you should set it lower in the settings. OBS will try to stream at whatever FPS you put in settings because you are telling it to stream like that, even if some of the sources are at 30 or 60. There is no point in slowing down the stream if some source is at a lower FPS; OBS will stream at the FPS you have in settings, or at least that is the way I think it works.

EDIT
Warchamp7 said:
Monitor capture with aero on will always be laggy unless you're on Windows 8. That's just how it works.

Yeah, I know; I should have mentioned that. That is why I also tested it with Aero off, and it's still not at 120, and my CPU isn't maxed, nor is the GPU, so I don't know what is limiting the stream FPS. Maybe it's just the way monitor capture works. Anyway, for me it's no problem, as I always stream at 30; it's just something I found while messing with 120fps.
 

Warchamp7

Forum Admin
neto87 said:
OK, I think I understand what's happening here, and maybe it's a bug in OBS, or at least I don't understand why it happens. If I put my settings at 120fps and use game capture, OBS will stream at 120fps, but for some strange reason if I use monitor capture the stream fluctuates between 20-30fps with Aero on and 105-110fps with Aero off. I'm using version .466a.

Monitor capture with aero on will always be laggy unless you're on Windows 8. That's just how it works.
 

Bensam123

Member
I don't understand why you guys wouldn't want the stream to run at whatever FPS the fastest source delivers. If you're streaming at 120fps while the content in the stream is only updating at 30fps, that's wasted bandwidth and wasted processor usage. Making it variable would suit the content.

Perhaps a tick box for both options? Simply streaming at a higher FPS than the content won't improve the quality. I understand slowing the stream down to the lowest common denominator would be bad, but that isn't what I'm talking about...

I've also mentioned a few times already that software capture has a variable capture rate: even when OBS is set to stream at 120fps, it will only stream at whatever the software capture is capturing at, which is variable. I think I asked for this feature before in the suggestion forums, but was told it was already implemented (for everything).
 

paibox

heros in an halfshel
Technically, it is already implemented for everything. Since your image doesn't change for those frames, an insignificant amount of bit rate will be used to keep displaying the, say, 30 FPS image for another three frames.

The CPU usage is a different matter entirely. To put it simply, the encoder still has to evaluate your frame to check how similar it is to the previous one. People who stream at 120 FPS usually do so only for testing purposes; you even have to tick a checkbox to enable it, since in itself it is not very useful to the majority of people. (And most people can't even see all of your 120 frames per second, since they use 60Hz LCD displays.) The reason I, for example, wouldn't want the stream to update at the frame rate of my sources is that I typically pick a good frame rate for the content I'm going to stream and how much CPU I can spare. For instance, I wouldn't want it to bounce around 60-30 when I stream Borderlands 2, since that would make the CPU usage fluctuate even more than it does during high-motion scenes and cause an undesirable experience while playing the game.
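The tradeoff described here can be illustrated with a toy cost model (the specific numbers are my own assumptions, not measurements): duplicated frames cost almost no bitrate because the encoder emits skip blocks, but they still cost CPU to analyze.

```python
# Toy cost model: changed frames cost 1.0 unit of bits and 1.0 unit of CPU;
# a duplicate costs ~2% of the bits (skip blocks) but still ~30% of the
# analysis work. These ratios are assumptions for illustration only.

def stream_costs(stream_fps, content_fps):
    changed = min(stream_fps, content_fps)         # frames with new content
    duplicates = max(stream_fps - content_fps, 0)  # repeated frames
    bits = changed * 1.0 + duplicates * 0.02
    cpu = changed * 1.0 + duplicates * 0.30
    return bits, cpu

bits_120, cpu_120 = stream_costs(120, 30)  # 30 fps content in a 120 fps stream
bits_30, cpu_30 = stream_costs(30, 30)     # same content at a matched 30 fps
print(bits_120, bits_30)  # 31.8 30.0 -- bitrate barely moves
print(cpu_120, cpu_30)    # 57.0 30.0 -- CPU grows with the duplicated frames
```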

If you absolutely want this feature, you could just implement it yourself. OBS is open source, and you are of course free to submit a patch that adds this as optional functionality or something.
 

Bensam123

Member
Aye, I was using 120fps as a more extreme example, since you most definitely can see the CPU usage difference between 30fps and 120fps... But that also means you could have all the variations between 30 and 120. 60fps may be a more common example, but the content doesn't always update that fast in-game, which makes the extra frames pointless.

For the same reason you'd want to use variable bitrate in encoding instead of constant bitrate, you'd want variable FPS. My OP in this thread was written under the impression that constant FPS wasn't intended, and that a variable system, like the one already in use for software capture, existed for hardware capture as well.

(Let's not get into a discussion of display technology, as all we're discussing right now is OBS. That's a whole other topic with a bunch of different aspects to consider.)
 