Two GPUs, Six Monitors, One OBS

FieldOfSins

New Member
In order to use 6 monitors with my PC, I need to use two Nvidia cards (as each card is limited to 4 displays, even with MST). But when I add a Display Capture, it's only showing monitors connected to the second card (none from the first), leading me to believe OBS is "running" on the second GPU.

Here's my single-PC setup:
Asus X399-E with a Threadripper 2970WX and 64GB Memory running Win10 Pro.
1660 Super in 1st PCI-E 16x slot
2080 Super in 2nd PCI-E 16x slot
Elgato 4k60 Pro Mk2 Capture Card

Monitors 1, 2 & 3 are connected to the 1660, which is where I display OBS.
Monitors 4, 5 & 6 are connected to the 2080, which I use for gaming (monitor 4 is cloned over HDMI into the capture card).

When I look at Task Manager, I see the 2080 is assigned ID 0 and the 1660 ID 1. I can set GPU to "1" in the output settings to make sure the 1660's NVENC encoder is used, and that works fine (I see video-encode utilization on the 1660 in Task Manager when I record/stream). When I move the OBS window between monitors, I can see the GPU utilization from OBS's rendering shift between the two cards in Task Manager, but regardless of which monitor OBS is displayed on or starts on (Ctrl+click on close to remember which monitor to open on), a Display Capture source only lists monitors from the 2080.
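Side note, in case it helps anyone diagnosing the same thing: as far as I know, the OBS log also records which adapter the renderer was created on, near the top of the log, with a line along the lines of

Loading up D3D11 on adapter NVIDIA GeForce RTX 2080 SUPER (0)

so that's another way to confirm which GPU OBS is compositing on.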

I want OBS to "run" and encode on the 1660 so the 2080 isn't burdened with OBS's rendering load, and to rely on the capture card (plus displays/window sources connected to the 1660) so no frame data has to be transferred between GPUs (which would negate the Turing NVENC improvements and add extra GPU overhead).

Any ideas?
 

koala

Active Member
On startup, OBS chooses a GPU to use for compositing. That's the GPU "OBS is running on", the GPU OBS creates its internal frame buffer on. As far as I can tell, OBS defaults to the GPU your main monitor is connected to. This is not necessarily the GPU where OBS displays its app window: you can move the OBS app window freely between all your monitors, but it will continue to run on the GPU it chose at startup.
You can display capture only monitors that are connected to the GPU OBS chose at startup.

There is a Windows setting, made primarily for laptop users, to choose which GPU an app will start on. It exists to let laptop users decide whether an app should run on the power-saving integrated GPU or the powerful discrete GPU.
This setting is available for desktop PCs as well: Windows Settings -> System -> Display -> Graphics (the setting at the bottom of that window). Maybe you can choose between your two discrete GPUs for OBS there as well.

If you change this setting, you lose the ability to display capture monitors from the previous GPU and gain the ability to display capture monitors from the other GPU.
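If you want to script it, the same per-app preference that Settings page writes can also be set from a command prompt through the UserGpuPreferences registry key. This is only a rough sketch and assumes OBS is installed in its default location; a value of 1 means the "power saving" GPU and 2 the "high performance" GPU, so it is still bound to how Windows classifies your two cards:

rem the OBS path below is an assumption - adjust to wherever obs64.exe actually lives
reg add "HKCU\Software\Microsoft\DirectX\UserGpuPreferences" ^
  /v "C:\Program Files\obs-studio\bin\64bit\obs64.exe" ^
  /t REG_SZ /d "GpuPreference=1;" /f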
 

FieldOfSins

New Member
Unfortunately, Windows doesn't consider the lesser GPU to be less power hungry, and therefore doesn't offer the option to select the 1660. If there were a way to force Windows to designate a GPU as the preferred power-saving option, this would probably work, so thanks for the suggestion.

GPU Windows App Selection.PNG
 

FieldOfSins

New Member
I've found a work-around (with a little inspiration from Koala) until I can figure out a better solution. Set one of the 1660 monitors to be the "main display" before opening OBS. After OBS has loaded, switch "main display" back to a monitor on the 2080.

I now have all OBS compositing, rendering, display capture sources & encoding on the 1660, with practically 0% utilization on the 2080! Kinda clunky but it works.

Will probably write a script to perform these steps and replace my OBS shortcut with it... MultiMonitorTool from NirSoft can handle setting the main display from a command line.
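Something like this as a first sketch (the \\.\DISPLAYn numbers, delays and paths are placeholders for my setup, and MultiMonitorTool.exe would need to sit next to the script or on the PATH):

@echo off
rem make one of the 1660-connected monitors the main display before OBS starts
MultiMonitorTool.exe /SetPrimary \\.\DISPLAY1
rem give Windows a moment to apply the change
timeout /t 3 /nobreak >nul
rem launch OBS from its own folder so it can find its locale/plugin files (default install path assumed)
start "" /d "C:\Program Files\obs-studio\bin\64bit" obs64.exe
rem wait for OBS to finish loading, then hand the main display back to a 2080 monitor
timeout /t 10 /nobreak >nul
MultiMonitorTool.exe /SetPrimary \\.\DISPLAY4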
 

llinfeng

New Member
Inspired by FieldOfSins' post, I swapped the cables so that the main display is attached to the stronger card (GTX 970). This also leaves me with a still-vacant DisplayPort output, which suits my new dummy DP adapter adventure :)

That said, Windows wouldn't list the GTX 970 in that per-app graphics setting while the primary display was attached to the GTX 1650; it suggested the integrated graphics as the power-saving option instead.
 

cordite

New Member
@FieldOfSins did you find a better solution than setting the primary display in a script? I just tried doing that, and while I think it worked, it royally messed with an active RDP session. I really would like OBS to capture from the card that is not hosting the primary display.
 

cordite

New Member
@cyclemat thanks for the suggestion. I haven't had a chance to try other options. I needed it for capture purposes, so for now I've just accepted that I can't capture from as many sources.
 

cordite

New Member
I don't use OBS for gameplay. I use it for remote instruction and have multiple screens that need to be captured: screens for RDP sessions, code samples, slides, etc. If I could shift my primary monitor away from the graphics card that OBS allows capture from, that would open up an additional screen for capture. Currently it's not an issue, but it might become one in the future, so I just wanted to be prepared.
 

loyukfai

New Member
Hi there, did you get anywhere at last? Is OBS still incapable of handling dual (or for that matter, triple) GPU?

Cheers.
 

FerretBomb

Active Member
Hi there, did you get anywhere at last? Is OBS still incapable of handling dual (or for that matter, triple) GPU?

Cheers.
OBS still can only handle a single GPU. This is unlikely to change for some time; as I understand it, it will require a complete re-code of some core modules to allow it to happen, and the team has an over-full plate already as it stands.
 

loyukfai

New Member
OBS still can only handle a single GPU. This is unlikely to change for some time; as I understand it, it will require a complete re-code of some core modules to allow it to happen, and the team has an over-full plate already as it stands.

Understood and thanks. Will re-think the setup with this limitation in mind.

Cheers.
 