Nick Colombia
New Member
I have a project that I am trying to complete for my office by Jan 10th. Currently there is a software limitation on a key piece of software that we use, and the development timeline to resolve that limitation is unknown. As a result, we need near continuous recordings of a minimum of 6 1920x1200 windows (or displays) with a potential maximum of 10 1920x1200 windows (or displays) at 30 FPS. The "Indistinguishable Quality, Large File Size" setting on OBS Studio provides the same quality that we are looking for. I need the recordings to be split into 1 hour segments. These recordings do not need to be done on the same machine, but fewer machines are preferable.
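For anyone weighing in on hardware: here is my rough back-of-envelope for disk usage. The per-stream bitrate is an assumption on my part -- the "Indistinguishable Quality" preset is CRF-based, so the real number varies a lot with on-screen content -- but it gives a ballpark:

```python
# Rough storage estimate for the recording setup described above.
# ASSUMPTION: ~15 Mbps average per 1920x1200 @ 30 fps stream; the
# "Indistinguishable Quality" preset is CRF-based, so actual bitrates
# vary heavily with content.

MBPS_PER_STREAM = 15  # assumed average bitrate, in megabits/second

def gb_per_hour(streams: int, mbps: float = MBPS_PER_STREAM) -> float:
    """Gigabytes written per hour for `streams` concurrent recordings."""
    bits_per_hour = mbps * 1_000_000 * 3600 * streams
    return bits_per_hour / 8 / 1_000_000_000  # bits -> bytes -> GB

for n in (6, 10):
    print(f"{n} streams: {gb_per_hour(n):.1f} GB/hour, "
          f"{gb_per_hour(n) * 24:.0f} GB/day")
```

At that assumed bitrate, 6 streams come to roughly 40 GB/hour and 10 streams to roughly 68 GB/hour, so near-continuous recording will also need a serious storage plan, whichever machine count I land on.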
I have experimented with several potential solutions. I've ruled out any kind of cloud setup -- none worked well. I tried both Windows and Linux, and both OBS and ffmpeg. I also tried running ffmpeg on a bare-metal Linux server we had, with the same result. The only thing all of these setups had in common was the Xeon processor; I was surprised that even the bare-metal dual-Xeon machine struggled.
I am now using an AMD Threadripper 1950X with 3 GPUs as my test machine. OBS runs fantastically well on the physical machine -- 1.6% CPU at 30 fps. This led me to try a Hyper-V setup, which worked surprisingly well too (I did not think that would be the case after the cloud experiment). Each entire VM uses 7% CPU while recording at 30 fps. On both setups, I was able to use the OBS WebSocket plugin and the OBS command line tool to automatically stop and start the recording every hour (via Windows Task Scheduler). At one point I even had 3 VMs running, all recording, with no noticeable degradation in performance.
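In case it helps anyone suggest improvements, here is a minimal sketch of the hourly restart logic, assuming the obs-websocket plugin plus the obs-websocket-py client library (the host, port, and password are placeholders for whatever your install uses):

```python
# Sketch: restart the OBS recording on every hour boundary so the
# output splits into 1-hour files. ASSUMES the obs-websocket plugin
# and the obs-websocket-py package (pip install obs-websocket-py);
# host/port/password below are placeholders.
import time
from datetime import datetime, timedelta

def seconds_until_next_hour(now: datetime) -> float:
    """Seconds from `now` to the next top-of-hour boundary."""
    nxt = (now.replace(minute=0, second=0, microsecond=0)
           + timedelta(hours=1))
    return (nxt - now).total_seconds()

def restart_recording(host="localhost", port=4444, password=""):
    # Imported here so the pure scheduling math above has no dependency.
    from obswebsocket import obsws, requests  # obs-websocket-py
    ws = obsws(host, port, password)
    ws.connect()
    ws.call(requests.StopRecording())
    ws.call(requests.StartRecording())
    ws.disconnect()

if __name__ == "__main__":
    while True:
        time.sleep(seconds_until_next_hour(datetime.now()))
        restart_recording()
```

In practice I trigger the equivalent of this from Windows Task Scheduler rather than a long-running loop, but the effect is the same.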
Here are the two solutions I'm now considering:
- Use a single AMD Threadripper workstation (1950X or better) for all of the necessary recordings. Possible problems:
- Would I need to have 6-10 physical displays to do this? Or can Window Capture record windows that are maximized yet behind other windows? Alternatively, could I use RDP to create a "virtual display"?
- Would I be able to run / control multiple instances of OBS with different settings on the same machine? I have seen a few different ways of doing this, but I'm not sure any of them would allow me to automatically start / stop recordings via a script as I've been doing with the WebSocket plugin and the command line tool. Maybe I could just install multiple instances of OBS in different directories?
- Use 2 upgraded workstations, perhaps the highest-end Threadripper, and Hyper-V VMs. Possible problems:
- It's possible that the surprising performance I saw running OBS on the Hyper-V VMs was due to the GPUs rather than the processor, in which case an upgraded processor won't buy me any more performance. From everything I've read on Hyper-V, though, this doesn't seem to be the case. I don't have GPU passthrough enabled on my test machine (it doesn't even have that option, as it runs Windows 10 rather than Windows Server).
- Obviously, this would cost more than option #1.
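For the multi-instance question in option #1, OBS does have documented launch parameters (--multi, --portable, --profile, --collection, --startrecording), so one idea I'm considering is launching one portable install per display from a script, something like the sketch below. The install path and profile names are placeholders, and I haven't verified this end to end:

```python
# Sketch: launch one portable OBS instance per display, each with its
# own profile / scene collection, using OBS's documented launch flags.
# The executable path and profile names below are placeholders.
import subprocess

OBS_EXE = r"C:\obs-portable\bin\64bit\obs64.exe"  # placeholder path

def launch_cmd(profile: str, collection: str) -> list:
    """Build the command line for one OBS instance."""
    return [
        OBS_EXE,
        "--multi",            # suppress the second-instance warning
        "--portable",         # keep settings local to this install
        "--profile", profile,
        "--collection", collection,
        "--startrecording",   # begin recording immediately on launch
        "--minimize-to-tray",
    ]

if __name__ == "__main__":
    for i in range(1, 7):  # six displays
        subprocess.Popen(launch_cmd(f"Display{i}", f"Display{i}"))
```

Each instance would still need its own WebSocket port configured if I want to script start/stop per instance the way I do now.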
Am I missing anything here? Am I way off, or is there a better way to do it? I am open to any advice.
Thanks!