Meanmom
New Member
Hi all, my first post here!
I just found OBS a few days ago, and wow, I'm blown away. Went through the usual challenges like the "encoding overload" warning, which gave me cause to wipe my Windows install down to bare metal, hah; all is well now. Absolutely love simplewall (it replaces the Windows firewall), since now I can see every time anything tries to access the network. I followed this brilliant piece of work to clean up Windows:
beebom.com
Anyways, I've been testing OBS (totally thrilled, btw) by recording a stream, and I'm playing around with the recording quality settings to see the differences in playback quality. I recorded a 1080p60 stream at "Same as stream", "High Quality" (medium file size), and "Indistinguishable Quality" (large file size) to compare. I saved those three MP4 files to my Emby server and watched them on my HD TV (via Roku), and I noticed something curious... all three look pretty much the same! I'm old, so maybe my eyes ain't what they used to be.
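In case anyone wants to sanity-check the same thing, here's a minimal sketch I could use to confirm the three files really were encoded at different bitrates before trusting my eyes. It assumes ffprobe (part of FFmpeg) is on the PATH, and the file names are just placeholders for however your test recordings are named:

```python
# Minimal sketch: print the container-level bitrate, duration, and size
# of each test recording, using ffprobe from FFmpeg.
import subprocess

# Hypothetical file names; substitute your own three recordings.
files = ["same_as_stream.mp4", "high_quality.mp4", "indistinguishable.mp4"]

for f in files:
    out = subprocess.run(
        ["ffprobe", "-v", "error",
         "-show_entries", "format=bit_rate,duration,size",
         "-of", "default=noprint_wrappers=1", f],
        capture_output=True, text=True, check=True,
    ).stdout
    # One line per file, e.g. "duration=..., size=..., bit_rate=..."
    print(f, "->", out.strip().replace("\n", ", "))
```

Note that ffprobe reports bit_rate in bits per second, so divide by 1000 to compare against the kbps numbers in OBS's settings.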
That got me thinking... if the recording is at a higher bitrate than the stream (assuming it really is just recording the actual stream), where are all of those extra bits and bytes coming from? Is it oversampling the stream? Why do that?
I'm a typical older frugal person, "waste not, want not", so I'm just wondering why people would oversample, since OBS can record the actual stream.
Thanks for reading, and I'd love to hear your thoughts on this!
Cheers