I have finally fixed the legendary INTERVAL STUTTER issue, and I wish OBS would integrate the fix into its settings.

Genshin DPS Calc

New Member
For years I was frustrated by the random stutters that appeared every few minutes or so, on both of my PCs (old and new).
After weeks of trial and error, I have finally been able to fix it 100% (at least for recording), and I can even manipulate the stutters and predict when they will happen.
I know the culprit behind those random stutters, which some people have already mentioned: the mismatch between the monitor refresh rate and the video frame rate.

ANALYZING THE CULPRIT
I set VSYNC OFF for the OBS program in the Nvidia Control Panel so I could see exactly what happens before a stutter.
When OBS is out of sync with the captured source, it stutters, but the problem is that the stutter can last much longer than it needs to.
For example:
If my monitor is 60 Hz and my OBS video frame rate is set to 61 fps, the desync happens every second, and the stutter lasts only a few dropped frames (just a blip to the eye).
But if I set my OBS video frame rate to 60.125 fps, the desync (the stutter) happens every 8 seconds, BUT it also lasts longer, as if OBS is just waiting to fall back into sync without doing anything about it.

In the most common scenario, where the monitor refresh rate is just slightly inaccurate, say 60.001 Hz, the desync happens every 1000 seconds (about 16.6 minutes), and the stutter can last as long as 20-30 seconds.
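The numbers above are just a beat-frequency calculation: the time between sync slips is the reciprocal of the mismatch between the two rates. A minimal sketch, using the example values from this post:

```python
def stutter_period_seconds(monitor_hz, obs_fps):
    """Seconds between sync slips: the reciprocal of the mismatch
    between the monitor refresh rate and the OBS frame rate."""
    return 1.0 / abs(monitor_hz - obs_fps)

print(stutter_period_seconds(60.0, 61.0))    # 1 second between slips
print(stutter_period_seconds(60.0, 60.125))  # 8 seconds
print(stutter_period_seconds(60.001, 60.0))  # ~1000 s, i.e. ~16.6 min
```

The smaller the mismatch, the rarer the slip, but (as described above) the longer each slip takes to resolve.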
The solution is either to set a custom refresh rate as close to exactly 60.000000 Hz as possible (which few people can do, and which is not even possible on some setups), or to change the video frame rate to a fractional FPS value.
I applied both solutions and can now record perfectly, with no random stutters at all.
But this involved opening displayhz.com and waiting for the measurement to settle to maximum accuracy.
Even then, GPU clock speed also affects the measured Hertz value, so I had to run my game and displayhz simultaneously to get the true refresh rate of my monitor under load (simulating actual gameplay).
Then, to enter the result (60.000114 Hz in my case) into the OBS video frame rate setting, I had to play with the math to find the right Numerator and Denominator for 60.000114 Hz. Not everyone can do this.
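Finding that Numerator/Denominator pair doesn't have to be done by hand; Python's standard library can do it directly. A sketch, where 60.000114 is the measured value from this post and `max_denominator` is an arbitrary cap I chose for the illustration:

```python
from fractions import Fraction

def fps_fraction(measured_hz, max_denominator=100000):
    """Turn a measured refresh rate into a Numerator/Denominator pair
    for OBS's fractional FPS setting. The rate is passed as a string
    so float rounding doesn't creep into the exact fraction."""
    frac = Fraction(measured_hz).limit_denominator(max_denominator)
    return frac.numerator, frac.denominator

num, den = fps_fraction("60.000114")
print(num, den)  # a ratio matching 60.000114 Hz to well under a microhertz
```

`limit_denominator` returns the closest fraction whose denominator fits the cap, which is exactly the "play around with math" step done automatically.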

This is why I spent a few days making this video:

Now, what I wish for from the OBS developers:

1. If you could integrate a Hz analyzer into the video frame rate setting that automatically calculates and applies the right Numerator and Denominator, that would be awesome! The user would just press a "Check my monitor refresh rate" button, wait a few minutes, press "Stop", and the result would be applied automatically, with the video FPS perfectly matching the monitor refresh rate.
Keep in mind that for the most accurate result, the GPU clock speed must be brought up to its running speed, to simulate a user recording a running game. (GPU clock speed also affects the sync, as shown at 8:42 in my video above.)
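The measurement itself wouldn't require anything exotic: given a long run of vsync (frame-present) timestamps, averaging across the whole window cancels per-frame jitter, and precision grows with capture length, which is why the button would need to run for a few minutes. A hedged sketch; the timestamp source here is simulated, and OBS would have to supply real ones from its graphics thread:

```python
def estimate_refresh_hz(vsync_times):
    """Estimate the true refresh rate from a non-empty run of vsync
    timestamps (in seconds). Only the first and last timestamps
    matter, so per-frame jitter in between averages out."""
    intervals = len(vsync_times) - 1
    elapsed = vsync_times[-1] - vsync_times[0]
    return intervals / elapsed

# Simulated: a monitor that actually runs at 60.000114 Hz.
ticks = [i / 60.000114 for i in range(10000)]
print(estimate_refresh_hz(ticks))
```

With real timestamps the jitter doesn't cancel perfectly, but the error still shrinks roughly in proportion to the capture length.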

2. Alternatively, if you could add variable frame rate for recording, I guess that would solve the issue completely, at least for recording. I know Windows Game DVR can record perfectly stutter-free game video because it uses a variable frame rate. Its only limitations are that the bitrate is capped at 72 Mbps and it can't go beyond 2560 px width or 1440 px height.

3. BUT none of those solutions works for streaming, because as far as I know you can't stream at an odd fractional FPS value.
So you are stuck with a constant 60 fps for streaming, and the only solution I can think of is this: once OBS detects that the video is about to go out of sync (see the sync demo at 9:15 in my video above), OBS should somehow "reset" the capture and re-align it at the middle of the sync cycle. Instead of a full-blown 20 seconds of stutter, you would get only a few dropped frames, just a blip every few minutes.
I don't know if it can be done, but I'm hopeful!
Maybe once OBS detects a couple of frame drops in a pattern, it could trigger the reset?
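A detection heuristic along those lines could be as simple as watching dropped-frame timestamps for a cluster. Everything below (the window, the threshold) is an arbitrary illustration of the idea, not anything OBS actually implements:

```python
def should_reset(drop_times, window=1.0, threshold=3):
    """Return True when `threshold` frame drops land within `window`
    seconds of the latest one -- the start of a desync episode rather
    than an isolated blip. `drop_times` must be non-empty and sorted."""
    recent = [t for t in drop_times if drop_times[-1] - t <= window]
    return len(recent) >= threshold

print(should_reset([10.0]))              # False: a single blip
print(should_reset([10.0, 10.2, 10.4]))  # True: a cluster, reset now
print(should_reset([1.0, 50.0, 120.0]))  # False: isolated drops
```

Triggering the reset on the cluster's onset is what would turn a long stutter episode into a single short blip.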

I think that's it from me. Thank you for reading, and I hope we can find a true solution to end this interval stutter issue that has plagued us for years.
 

TsuZing

New Member
Many gamers use variable refresh rate technology, which significantly reduces latency while preventing screen tearing. With variable refresh rate enabled and vertical sync enabled in the game, the monitor refresh rate always equals the game frame rate, and even when the game is not lagging, the frame rate sits slightly below the monitor's nominal refresh rate (for example, on a 144 Hz monitor the game usually runs at 143.xx fps), which leaves us stuttering forever.
In fact, newer video encoding formats support variable frame rates. I don't know the details of this technology, but is it possible to synchronize the video frame rate with the monitor refresh rate? (Although this would cause stuttering when the video is played back, that can be solved with a player that supports variable refresh rate; as far as I know, mpv on Windows supports it.)
As more and more players use high-refresh-rate monitors, variable refresh rate will become more and more popular. I think fixing this stuttering is an important feature.
 

AaronD

Active Member
That may be true, and a big benefit at one specific point in the chain - the recording is no longer jittery - but the problem does not go away. It only moves to another part of the overall system.

You now have a variable-rate recording to play back, or often, to edit. Playback *might* be able to adjust the monitor's refresh rate to match the recording, but the editor is likely using a fixed rate. Playback on a casual viewing platform, like YouTube, Twitch, etc., is guaranteed to have a fixed rate for display.

Maybe the editor and distribution platforms handle it gracefully, maybe they don't. If they don't, we'll get a bunch of complaints about OBS making a bad recording because it doesn't edit or watch well and others do.

---

By the way, there's no visual purpose to running anything higher than 60fps. You already can't see that fast. Any perceived benefit comes from anticipating, not reacting, and you can do that just as well at 60 as you can at 120, 144, etc.

Slow motion would benefit, but that's pretty much it. *Displaying* faster than 60fps is entirely placebo.

---

Digital audio has had this problem for a long time: different independent sample rates from different sources, that all need to be combined. So maybe the real solution here is to copy what audio does: take the oversampled display signal, and resample it (same terminology for the same thing) to something reasonable and fixed.

Bad resamplers are of course obvious, but a mathematically correct one is completely transparent. You'd never know it was there, even after many trips back and forth between different sample/frame rates, provided that all of them are faster than we can respond to.
 
Last edited:

TsuZing

New Member
Frames are discrete rather than continuous, which makes it difficult to resample video to other frame rates while preserving quality. The only modern solution is variable refresh rate, which lets us record and play videos at any frame rate without stuttering or resampling.
For video itself there is little benefit to a high frame rate, but for gaming the benefits of a high refresh rate are obvious. Any player of FPS games or Minecraft PVP can feel the difference between 120 Hz and 60 Hz; 120 Hz is always smoother and lower latency.
And if a player uses a 144 Hz or 165 Hz monitor but records video at 60 fps, that mismatch also causes stuttering.
 

AaronD

Active Member
Audio is discrete too. Everything digital is. Resampling anything is completely transparent (to continue to use an audio term; you can imagine what that really means and apply *that* meaning to the frame rate of a video), as long as you actually get the math right!

Not everything does get the math right, and *that's* where the artifacts come from. The process itself is not flawed, only certain implementations are.
 

rockbottom

Active Member
A better solution is to avoid 144/165 Hz monitors altogether. Get a 240 or 360 Hz monitor and run it at the highest fixed refresh rate that the installed hardware can support without overload.
 

TsuZing

New Member
Just in theory: high resolution and 240 Hz are incompatible unless you have a high-end monitor and GPU, and if the game cannot hold a stable 240 fps, the stuttering still exists.
 

TsuZing

New Member
Audio comes from reality, so it must be continuous; the ADC makes it discrete. But video does not necessarily come from reality. Games and other computer graphics are very random and discontinuous, and such videos are not suitable for resampling, which results in severe artifacts.
 

AaronD

Active Member
> Audio comes from reality, it must be continuous, ADC makes it discrete. But video does not necessarily come from reality, game and other computer graphics are very random and discontinuous...
No, they're both the same, whether they come from physical measurements or from a computer. They're both physically continuous, and anything discrete that is sampled faster than Nyquist appears continuous to us. Any content that is itself faster than we can perceive, gets blurred or lost, even if it's presented to us perfectly, so there's no point in presenting it to us. The only difference is the minimum required sample rate to appear continuous, and thus the point where the system might as well drop the details because we physically can't catch them anyway.

So, there's the highest frequency that we can perceive (20kHz for audio, and about 20Hz for video), and Nyquist from that. (must sample greater than 2x the highest desired frequency, with a mathematical/engineering understanding of "greater than", which is simply a binary yes/no, *not* bad/okay/better) Works exactly the same way for both eyes and ears. Same unit, with the same meaning: only a different number to go with that same unit and meaning.

Sampling faster than Nyquist, plus a comfortable engineering margin to make the system easy, whether for sound or for picture, is a waste of data, unless you have a specialized use for it. Slow-motion and ultrasound are common reasons to sample faster, but direct perception, even if it comes from that or a similar process, benefits nothing from sample rates beyond Nyquist.

Even though it's discrete-time sampled, that does NOT mean that you're limited to events happening at those discrete time intervals. A lot of people get hung up on that, but you really can, and almost always do actually, have events preserved *between* samples.
That's entirely an audio demonstration, but again, the exact same thing applies to video too. Both frames and pixels.

> ...these videos are not suitable for resampling, resulting in severe artifacts.
You must have had some bad resamplers. It's easy to cut corners or misunderstand the math, and get it wrong. THAT is where the artifacts come from, not the process itself.
 
Last edited:

rockbottom

Active Member

Get better hardware that can actually do what you're trying to do, or run the monitor at 180, 120, or 60 Hz; all are much better options than 144 or 165. Just do the math.
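The math being pointed at here is just divisibility: a monitor rate that is an integer multiple of the recording frame rate shows each recorded frame for a whole number of refreshes, while 144 or 165 against 60 fps forces an uneven cadence. A quick check:

```python
def refresh_cadence(monitor_hz, record_fps=60):
    """Return the ratio of monitor refreshes per recorded frame, and
    whether it's a whole number (even cadence, no pacing judder)."""
    ratio = monitor_hz / record_fps
    return ratio, ratio == int(ratio)

for hz in (60, 120, 144, 165, 180, 240):
    print(hz, refresh_cadence(hz))
# 144 -> 2.4 and 165 -> 2.75: fractional, so uneven frame pacing;
# 60, 120, 180 and 240 divide evenly.
```

This is why 180 Hz can pace a 60 fps recording better than the nominally "faster" 165 Hz.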
 

ItalicMaze

New Member
With regards to the OP, I think we've been approaching this issue the wrong way. The OBS team can only do so much, and we're probably better off redirecting our attention to Microsoft and hardware manufacturers.
 

Gora

New Member
It's great that you mentioned the possibility of resamplers making mistakes. It's a really important point, and I can see how misunderstanding the math can lead to artifacts.
 

JoobieDoobieDoo

New Member
> By the way, there's no visual purpose to running anything higher than 60fps. You already can't see that fast. Any perceived benefit comes from anticipating, not reacting, and you can do that just as well at 60 as you can at 120, 144, etc.
>
> Slow motion would benefit, but that's pretty much it. *Displaying* faster than 60fps is entirely placebo.
Oh here we go. This one again.

Play a moving white line on a blank screen, one at 165 Hz and the other at 40 Hz, and tell me they look exactly the same.
Open up a smartphone made in the last 2-3 generations, and then open up an iPhone 8. Scroll up and down on a white screen with black text on both, and tell me they don't look different at all.

It's just literally not true.
 

AaronD

Active Member
I think what you're missing there, is the required lowpass that's involved in resampling. With video, that's especially not trivial, given the 3 dimensions that you have to keep track of simultaneously: 2 in space and 1 in time.

The entire process is not to plot the (wrong, but popular) stair-step function, and then sample that at the new rate. TONS of artifacts when you do it that way! Instead, the process is to:
  1. Find a common multiple of both the input and the output sample rate.
  2. Stuff zeros (yes, zeros, not the previous value) between the original samples to get that common-multiple intermediate rate.
  3. Lowpass, to satisfy Nyquist at the new rate (both frequency and slope), and gain it up by the ratio of stuffed zeros to original samples.
  4. Pick samples out of that, at the new rate, and throw the rest away.
For video, you're sampling both in time AND in space. Not accounting for those multiple dimensions when doing that process, even if you're only resampling one dimension, does produce artifacts. But if you do it right, then both the lower rate and the higher rate produce the same persistence of vision, and therefore look identical.
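The four steps can be sketched in one dimension (an audio-style signal) in plain Python. The filter length and Hamming window below are arbitrary choices for the illustration; a real video resampler would apply the same idea across time and both spatial axes:

```python
import math

def resample(x, up, dn, taps=121):
    """Rational resampling by up/dn, following the four steps:
    zero-stuff to the common-multiple rate, lowpass with a windowed
    sinc gained by `up`, then keep every dn-th sample."""
    # Steps 1-2: common-multiple rate via zero stuffing.
    stuffed = []
    for s in x:
        stuffed.append(s)
        stuffed.extend([0.0] * (up - 1))
    # Step 3: lowpass at the tighter of the two Nyquist limits,
    # with gain `up` to make up for the stuffed zeros.
    fc = 0.5 / max(up, dn)  # normalized cutoff
    mid = taps // 2
    h = []
    for n in range(taps):
        t = n - mid
        sinc = 2 * fc if t == 0 else math.sin(2 * math.pi * fc * t) / (math.pi * t)
        ham = 0.54 - 0.46 * math.cos(2 * math.pi * n / (taps - 1))  # Hamming window
        h.append(up * sinc * ham)
    # Step 4: convolve and keep every dn-th sample, discard the rest.
    out = []
    for i in range(0, len(stuffed), dn):
        acc = 0.0
        for k, hk in enumerate(h):
            j = i + mid - k  # center the filter on position i
            if 0 <= j < len(stuffed):
                acc += hk * stuffed[j]
        out.append(acc)
    return out

# A constant signal should come out (nearly) unchanged at the new rate.
y = resample([1.0] * 50, up=3, dn=2)
```

Resampling 48 kHz audio to 44.1 kHz uses exactly this structure with up=147, dn=160; for frame rates, 144 fps to 60 fps would be up=5, dn=12.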

If you design the moving bar test so that it only appears in certain positions on the screen, trying to trick the resampler into producing artifacts: that's aliasing on purpose, and not representative of real-world content, or it's a high enough frequency to be hard to perceive anyway. Kinda like an audio sinewave at 12kHz, sampled at 48kHz. How easily can you hear that, and how important is it that you hear that? Likewise for the visual frequency of events.

If you're using the native phone screen to test this, or a tablet, or desktop PC, etc., they're not intended to produce buttery smooth motion with the native UI. (except for perhaps one, and it cheats: see below) They're intended to "just get stuff on the screen that you can interact with." Quick-and-dirty, "just put it there." Hardly any math at all, and certainly not what I described above! So phone-scrolling is not really a good test either.

Technically, those devices (all devices, really) also "just put the frames of a game or movie in front of you," with no more consideration than that, and as long as the display's refresh rate matches the content that it's showing, that's all they're expected to do. At that point - just playback - anti-aliasing the motion is supposed to already be done and included in the frames themselves, for display at whatever rate that is.

I think the insanely high frame rates that are becoming popular, are really just a way to avoid a bunch of math. It's still "quick and dirty" for each frame, completely ignoring the math to do it right, and relies on POV to "fix it up" at the VERY end of the process by effectively smearing a bunch of frames together.

Then people think it's absolutely required to have such high rates, or that they can literally see 500Hz or something like that. Neither of which is true.

If you can run the math and do it right, then you can get the exact same perception with a much lower rate, and correspondingly less data.
 

Pug

New Member
The interval stutter issue mentioned by the original poster: does everyone using OBS suffer from this? I do, on multiple computers and multiple monitors. After a period of complete smoothness I get a period of stutter until it clears, and then the cycle repeats like clockwork. Does every streamer online suffer the same? Is there no solution? The OP's solutions really just postpone the stutter, so you suffer from it longer later. I find it baffling, because I'm sure I've watched online game streamers and don't remember their streams stuttering for 5 minutes after being smooth for 2 hours, then repeating.

I just find it shocking to believe that everyone is suffering from this. And if some people aren't, why aren't they?
 

rockbottom

Active Member
OBS does not cause stuttering. Some AMD systems can suffer from intermittent stutters (a BIOS update fixes it); it could also be a settings issue, a driver, or hardware... Post a complete log with a recording/streaming session.
 

rizbon

New Member
I'm suffering from frame duplication every 7 minutes, but it is completely gone if I enable Deinterlacing Mode: Discard for my PS5 source.
Also a strange fact: if I play at 60 fps (59.94) I get buttery-smooth gameplay, but if I enable 120 Hz on the PS5 it jitters in OBS. For Windows games there is no such issue; I play at 120 fps and record at 60 with buttery-smooth frames.
 