Uxmbra
New Member
Hi - I've been using OBS for years and have run into my fair share of problems along the way, but this one in particular has me scratching my head.
I have a dual-PC setup, but recently my partner has shown some interest in gaming, and her schooling/business work could really benefit from the extra horsepower compared to her existing MacBook from god-knows-when.
I found a good deal on an old ThinkServer and upgraded it with some of the better parts available for the existing sockets, etc.
The specs of the ThinkServer now:
2x Xeon E5-2695 v4 (18C/36T each)
128 GB DDR4-2133 ECC
Intel Arc A310 (upgraded from the 2060 for AV1 recording and Quick Sync streaming)
Samsung 970 EVO M.2 1 TB boot drive (Windows Server 2022, all updates)
Elgato 4K60 Pro MK.2
Elgato Facecam (1080p60)
Logitech C922 (1080p30)
The issue is with rendered frames and the average time to render a frame.
On all my other machines I get no dropped frames unless I go absolutely mental with the quality settings. On the server, though, even with no recording active and only a single scene, OBS can sometimes drop frames and show upwards of 2 ms average time to render a frame. Yes, I know the budget at 60 fps is roughly 16.666 ms, but it seems unusual that just having OBS open with a capture-card scene can cost almost 2 ms per frame, or even drop frames, without any active recording.
I have found that disabling the preview helps marginally, but it only shaves about 0.1-0.3 ms off the average render time and makes dropped frames a bit less common.
Now, I understand it's older hardware, and I'm not especially knowledgeable about how NUMA-aware OBS is. But even a single 2695 v4 should be fine, considering it wipes the floor with some CPUs I've seen run similar setups just fine.
I'm willing to give anything a shot. I'll also upload a log file when I have access to both my PCs again.
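One experiment I'm planning, on the hunch that cross-socket memory access is part of the render-time problem: launching OBS restricted to a single NUMA node using the built-in `start` command on Windows Server. This is just a sketch - the path below assumes the default OBS install location, so adjust it for your machine:

```shell
:: Launch OBS with all of its threads confined to NUMA node 0, so the
:: process only touches memory local to the first CPU socket.
:: /D sets the working directory, which OBS requires to find its data files.
start "" /NODE 0 /D "C:\Program Files\obs-studio\bin\64bit" obs64.exe
```

If render times drop noticeably when pinned to one node, that would point at NUMA (remote memory access between the two sockets) rather than raw CPU speed.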