Refresh Rate/Frame Rate

Ultramarine

New Member
I've just started using OBS and have already learned a great deal browsing through the forums here. Nonetheless, I have a couple of questions, which I'm hoping someone knowledgeable might take the time to answer:

- I'm using Windows 11. My display (a Samsung Smart TV) has a recommended resolution of 3840x2160, which has a maximum refresh rate of 30hz. My (limited) understanding is that a higher refresh rate equals smoother motion, so I've lowered the resolution to 2560x1440 (the highest available resolution to offer 60hz, although in reality it's actually a choice between 60.1hz or 59.95hz). Is this the right approach?

- What I want to record is a video streaming in my web browser (Firefox). As far as I can tell, it's streaming at 25FPS, so I'd planned to record at 25fps in OBS. However, if the display refresh rate is set at 60hz, should the screen recording frame rate be either 30 or 60, irrespective of the original video frame-rate? Will the 60.1hz rather than a straight 60hz cause issues here?

Many thanks in advance!
 

qhobbes

Active Member
You should really contact the owners/creators of the content about how to obtain copies of their content. OBS is for creating content, not copying others' content.
 

AaronD

Active Member
You should really contact the owners/creators of the content about how to obtain copies of their content. OBS is for creating content, not copying others' content.
Absolutely yes! Somehow I missed that part the first time around, that this is a common way to steal copyrighted work.

For others who might read this and have a legitimate reason (they do exist, but they're rare), please state that reason up front. If I'd been thinking a bit more clearly, I'd have given the same response and left it at that, without the help.

And after a few seconds more thought, I'm deleting that help. If it turns out that there *is* a legitimate reason, I can type some more.
 

Ultramarine

New Member
I assure you that my reason is legitimate. A friend was playing at a badminton tournament in Belgium and wanted copies of her matches for review from the live-stream; there's no issue of copyright here.

Regarding the refresh-rate/frame-rate issue, if the refresh rate is 60.1hz, would it be best to have the frame-rate also be 60fps? Or, as I'm in PAL country, would I be better going for 50hz/50fps?

Windows 11 states that 2560x1440 is available at 60hz, but once that is selected, it then only offers the choice of 60.1 or 59.95hz (along with other, lower refresh rates). The only other resolution that offers a higher refresh rate is 1920x1080, which DOES actually offer an exact refresh rate of 60hz.
 

AaronD

Active Member
I assure you that my reason is legitimate. A friend was playing at a badminton tournament in Belgium and wanted copies of her matches for review from the live-stream; there's no issue of copyright here.
Is there a recorded copy, separate from the stream?

This is a major reason to always record locally as well, at the same time and in addition to the original stream, even if the intent in the moment is only to stream. If the original stream used OBS to produce it, there's a setting to automatically record when streaming.

It's better to use a separate, higher-quality encoder to make the recording, but even just siphoning off the stream data to a file as well as sending it out is still a lot better than having to mess with this, and than having to convince people who have seen the same question many times before that you're actually not stealing someone else's content.

It can be surprising, too, who actually owns something. Your friend was playing in the tournament, but that doesn't mean that she owns the video. In fact, most of the time, she won't. Who produced it? Was it for hire? Was it sold to a distribution network in exchange for royalties? Etc. All of that affects who the likely owner is, and that's who you need to track down.

For a slightly different but similar example, most bands actually have to license their own songs in order to play them in concert, because they did just that: sold them to a distributor to get better royalties, which means that the distributor, not the band, now owns the songs. Did that happen with the video that you're looking for? Who ACTUALLY owns it?
 

Ultramarine

New Member
The entire tournament is available to replay, but each court's coverage in the early rounds is about nine hours long, so I was hoping to be able to just record her matches (which are only about 30 minutes or so) for training purposes. As the footage is freely available (though not conveniently accessible) and with no apparent commercial application, I wouldn't think ownership is an issue. Many thanks for taking the time to respond, regardless!
 

rockbottom

Active Member
Geez, you guys sound like the FBI.

If it's available for replay, try 4K Downloader and see if you can get it. I use Jaksta, but this one is free.


If it doesn't work with the site hosting the replay, match framerates as close as possible when you record.
 

AaronD

Active Member
Unfortunately, I think their lawyers are better than yours. Or ours for that matter. Regardless of what anyone thinks of them otherwise.

One person may or may not be worth destroying at a complete loss to the company, just to make an example of (like random Napster users), but publicly encouraging it makes for a much bigger and more attractive target.
 

AaronD

Active Member
If it really does STAY personal, then yes. But you know people: give them an inch and they'll take a mile, or "I have this, no idea where or how I got it, but I'll use it," etc.

Anything at all beyond your own personal archive and nowhere else at all, ever, needs to have clearly documented permission from the actual owner to do what you're actually doing with it. Again about asking for an inch and taking a mile: you can still get slammed for that, because you don't actually have permission to do *all* of what you actually did with it.

Fair Use does exist, but it will always be a legal grey area, and each instance must be proven in court. Not the sort of thing you want to find the boundary of by trial-and-error.

I do have a personal archive, along with a list of when and where I got each thing. I haven't yet, but if I want to use something in a way that is not clearly Fair Use, I have that record of where I got it, and I can start the process from there of getting permission and maybe even a better quality original file.

What I'm concerned about here is not so much the OP anymore, but others who see an unqualified encouragement to "Just go download stuff!" and then use it to plug their Patreon... That is definitely NOT Fair Use, and it's blatant copyright infringement if they just grabbed it and used it with no more consideration than that. Not the sort of thing we want to promote here.
 

rockbottom

Active Member
I won't stop helping people. I provided a link to an alternate app that may or may not do the job, nothing more. Not sure why you care what other people download or do with it.
 

Ultramarine

New Member
Many thanks for your help, rockbottom! The footage is solely for personal use, so I don't envisage any issues.

Thanks also to everyone else who has taken the time to respond.
 

Ultramarine

New Member
As a postscript to all of the above, my original inquiries still stand regarding my own screen recordings of my desktop etc. Is it always best to use the highest available display refresh rate (60.1hz in this instance)? If so, what's the best frame-rate to use for recordings? 30?

Are PAL/NTSC irrelevant factors these days? Is there ever an instance when, say, it might be better to use 50hz/25fps in PAL territories?
 

AaronD

Active Member
As a postscript to all of the above, my original inquiries still stand regarding my own screen recordings of my desktop etc. Is it always best to use the highest available display refresh rate (60.1hz in this instance)? If so, what's the best frame-rate to use for recordings? 30?
Generally, use the lowest settings for both size and speed that produce an acceptable result, while sticking to integer scaling factors. If your screen is 60Hz, then run your screen capture at 60 or 30, depending on how much motion you have.

A *slight* error isn't going to hurt much. A 60.1Hz display captured at 60fps drops one frame every 10 seconds. Probably not noticeable. The same display captured at 30fps will normally drop every other frame, but every 10 seconds it'll drop two in a row to get back in sync.

Running that display at 59.95 while capturing at 60 will duplicate one frame every 20 seconds. Capturing at 30 will grab adjacent frames then, instead of skipping one. Again, probably not noticeable.

Likewise for the display end. If it's going to be displayed at 60fps, then your content should ideally be 60 to match, or an integer scale factor away from it like 30. But small errors probably won't be noticed.
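
If it helps to see that arithmetic spelled out, here's a quick sketch of the same reasoning in Python (my own illustration of the numbers above, not anything OBS actually does internally):

Code:
# How often a frame gets dropped or duplicated when the capture framerate
# doesn't divide the display refresh rate exactly.
def mismatch(display_hz: float, capture_fps: float) -> str:
    n = max(1, round(display_hz / capture_fps))  # display frames per captured frame, ideally
    excess = display_hz - n * capture_fps        # leftover display frames per second
    if abs(excess) < 1e-9:
        return "exact match: no extra drops or duplicates"
    kind = "extra dropped frame" if excess > 0 else "duplicated frame"
    return f"one {kind} every {1 / abs(excess):.0f} seconds"

print(mismatch(60.1, 60))    # one extra dropped frame every 10 seconds
print(mismatch(60.1, 30))    # one extra dropped frame every 10 seconds
print(mismatch(59.95, 60))   # one duplicated frame every 20 seconds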

I'm still curious why your system doesn't offer an exact 60, and why it gives you two that are so close. Maybe it has something to do with a convenient divisor from an internal clock somewhere???
(Kinda like MIDI runs at 31250 baud, which is pretty far from any of the standard serial rates, simply because it's exactly 1/32 of 1MHz, which was itself a practically universal system clock at the time. (integer multiples of 1MHz are still common today, for low-cost embedded stuff) And not only is 32 an integer, but it's actually a power of 2! Even better! (2^5, specifically) That makes the divider plumb easy: run a counter at the 1MHz system rate, and every time the lower 5 bits roll over, send another bit of data.)
With such a close choice between 59.95 and 60.1, it's probably not doing integer division, but fractional schemes exist too, which more-or-less increase the clock to an integer multiple and then integer-divide that. My guess is that you're actually choosing the pixel clock, not the frame clock, and that difference is 1 least-significant bit in a divisor somewhere. It'd be interesting to know for sure, though.
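
For the curious, the relationship is roughly: refresh rate = pixel clock / (total pixels per line x total lines per frame), where the totals include the blanking. A tiny sketch with illustrative numbers (the line/frame totals below are the commonly quoted reduced-blanking timings for 2560x1440; I haven't verified what this particular TV actually reports):

Code:
# Refresh rate follows from the pixel clock and the total (active + blanking)
# frame size.  Timing numbers here are illustrative CVT reduced-blanking values
# for 2560x1440, not necessarily what the TV's EDID actually contains.
h_total = 2720   # 2560 active pixels + horizontal blanking
v_total = 1481   # 1440 active lines  + vertical blanking

for pixel_clock_hz in (241.5e6, 242.1e6):
    print(f"{pixel_clock_hz / 1e6:.1f} MHz -> {pixel_clock_hz / (h_total * v_total):.2f} Hz")
# 241.5 MHz -> 59.95 Hz
# 242.1 MHz -> 60.10 Hz

So a change of well under 1% in the pixel clock is enough to swing between the two rates you're seeing, which would fit the guess above.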

Are PAL/NTSC irrelevant factors these days? Is there ever an instance when, say, it might be better to use 50hz/25fps in PAL territories?
You'd think so, but since everything is an incremental improvement all the way back to the original TV broadcasts, parts of the old mindsets still hang on. The framerate is part of that.

Composite -> VGA -> DVI, at least, all keep the exact same concept of sending only one color at a time, and using logically-separate sync signals to sweep that one color across the entire screen as it changes. All of them even keep the "porches" of undisplayed time between scan lines and between frames, which can be thought of as effectively a "picture frame" around the intended image that you're not supposed to see. Originally, that was meant to give the TV circuitry some time to reset so it could draw the next line or frame, but it became used for other things too, like closed captioning sent as digital information on one of those undisplayed lines.

DVI is interesting because it converts 8-bit color into 10-bit codes, both for error correction and to embed the video-sync and bit-sync signals (and probably other information as well) as additional codes on the same wires during the porches. But the general concept that it uses to draw the picture itself is still exactly the same as in the first TVs.

Modern HDMI is backwards compatible with DVI, so those adapters really are just "dumb wires" from one connector to another with no logic at all. (and a number of graphics cards can output full-spec HDMI on their DVI ports: all you need is that adapter) The actual spec for HDMI is behind a paywall, but DVI and previous are open if you're curious and ambitious enough to look them up and wrap your head around them.
 

Ultramarine

New Member
Many thanks for the very informative post! It's greatly appreciated. Re: the display situation, I'm also at a loss, given that it's a 4K Samsung television and I would have expected it to offer at least 60Hz from the highest resolution down. The fact that I need to reduce it to 2560x1440 is frustrating; the fact that I then get 60.1Hz is even more so!

For what it's worth, the television is a Samsung UE43TU7100 and it's connected to the computer through an HDMI-to-DisplayPort adaptor (this may or may not be significant), as the computer has no HDMI output.

One separate question: is it generally good policy to output to a lower resolution than the base canvas? I don't need 2560x1440 video, but I've seen several posts elsewhere in the forum advocating making the base canvas and output resolution the same to obviate the need for scaling. Does the processing required for scaling outweigh the processing required for recording enormous videos? Assuming not, what scaling method is best? Bicubic or Lanczos?
 

AaronD

Active Member
One separate question: is it generally good policy to output to a lower resolution than the base canvas? I don't need 2560x1440 video, but I've seen several posts elsewhere in the forum advocating making the base canvas and output resolution the same to obviate the need for scaling. Does the processing required for scaling outweigh the processing required for recording enormous videos?
I really don't see a reason to make the canvas and output different. If you scale up, you're not adding anything useful, and if you scale down, you're calculating more than what you keep. In both cases, you're doing more work than necessary.

Maybe someone has a reason, beyond fooling an algorithm somewhere (people have said that about YouTube), but I can't think of one.

If your original content is small, set everything to be that small. If you want your result to be smaller than the original, set everything to be that small, and scale it down on the input end.

what scaling method is best? Bicubic or Lanczos?
Generally, the options are in order of quality. If you really care, you might look up what each one actually does, where it works well, and where it doesn't.

But if you're not scaling anyway, then that point is moot.
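
If you do end up scaling and want to see the character of the two filters for yourself, one quick way is to downscale a screenshot with both and compare. This uses Python with the Pillow library as a stand-in; OBS's own GPU-side scalers are separate implementations, so treat it only as a rough comparison:

Code:
# Rough side-by-side of bicubic vs Lanczos downscaling using Pillow.
# "frame_2560x1440.png" is any full-size screenshot you want to test with.
from PIL import Image

src = Image.open("frame_2560x1440.png")
target = (1280, 720)   # exactly half of 2560x1440 in each dimension

src.resize(target, Image.Resampling.BICUBIC).save("bicubic_720p.png")
src.resize(target, Image.Resampling.LANCZOS).save("lanczos_720p.png")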
 

Ultramarine

New Member
How does one "scale it down on the input end"? If I want to record my whole desktop, but not at the display resolution of 2560x1440, what would be the way of doing this?

Regarding my refresh rate issues, as far as I can tell, the television should be able to support 60Hz even at 4K. Presumably this means there's a bottleneck being caused either by the graphics card (Intel HD Graphics 630), the DisplayPort/HDMI adaptor, or the HDMI cable itself.
 