FieldOfSins
New Member
In order to use 6 monitors with my PC, I need to use two Nvidia cards (each card is limited to 4 displays, even with MST). But when I add a Display Capture source, it only shows monitors connected to the second card (none from the first), which leads me to believe OBS is "running" on the second GPU.
Here's my single-PC setup:
Asus X399-E with a Threadripper 2970WX and 64GB Memory running Win10 Pro.
1660 Super in 1st PCI-E 16x slot
2080 Super in 2nd PCI-E 16x slot
Elgato 4k60 Pro Mk2 Capture Card
Monitors 1, 2 & 3 are connected to the 1660; these are the displays I run OBS on.
Monitors 4, 5 & 6 (monitor 4 cloned via HDMI to the capture card) are connected to the 2080, which I use for gaming.
When I look at Task Manager, the 2080 is assigned GPU ID 0 and the 1660 GPU ID 1. I can set GPU to "1" in the output settings to ensure the 1660's NVENC encoder is used, and that works fine (I see video-encode utilization on the 1660 in Task Manager when I record/stream). When I move OBS between monitors, the GPU utilization caused by OBS's rendering shifts between the two cards in Task Manager. But regardless of which monitor OBS is displayed on or starts on (Ctrl+click Close to remember which monitor to open on), a Display Capture source only ever lists the monitors on the 2080.
I want OBS to "run" and encode on the 1660 so the 2080 isn't burdened with rendering load, and to use the capture card (plus 1660-connected display/window sources) to avoid transferring frame data between GPUs, which would negate the benefits of the Turing NVENC improvements and add extra GPU overhead.
Any ideas?