Why do I have to set OBS to "Power saving" in Windows graphics settings in order for display capture to work?

ClimbersOfIce

New Member
Here's the thing. There is a thread called "Laptop? Black screen when capturing? Read here first", which is pinned to the top of this forum's main page.
It explains that if I want OBS display capture to actually recognize and display my second monitor, I must go (on Windows 10) to Graphics settings and set OBS to run in "Power saving" mode.

My simple question is: why?
I followed the instructions in that thread, and display capture does work now. However, OBS is now running and rendering (graphics-wise) on my onboard Intel HD Graphics instead of my NVIDIA 1650.

Needless to say, my Intel HD Graphics are much weaker than my NVIDIA. Task Manager reports about 80% Intel HD Graphics usage with OBS running alone. My NVIDIA still does the encoding with NVENC, but it no longer handles the actual rendering of video in OBS, so the Intel graphics get overwhelmed pretty easily and things slow down.

The NVENC encoder only uses about 11% of my NVIDIA's capacity. If I go back to Windows graphics settings and change OBS from "Power saving" to "High performance" (meaning it'll use my NVIDIA to render), the whole system's workload is distributed more evenly and everything works better.
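For anyone who'd rather script this toggle than click through Settings every time: as far as I know, Windows 10 stores the per-app GPU preference as a string value under HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences. Here's a minimal Python sketch; the OBS path is an assumption based on the default install location, so adjust it for your machine:

```python
import winreg

# Assumed default OBS install path -- change this if yours differs.
OBS_EXE = r"C:\Program Files\obs-studio\bin\64bit\obs64.exe"

SUBKEY = r"Software\Microsoft\DirectX\UserGpuPreferences"

def set_gpu_preference(exe_path: str, preference: int) -> None:
    """Write the per-app GPU preference Windows reads at app launch.

    preference: 1 -> "Power saving" (iGPU), 2 -> "High performance" (dGPU)
    """
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, SUBKEY) as key:
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ,
                          f"GpuPreference={preference};")

set_gpu_preference(OBS_EXE, 1)  # same effect as picking "Power saving" in the UI
```

You'll likely need to restart OBS after changing the value, since the preference is read when the app launches.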

Mind you, I'm running a desktop i5 CPU, not a laptop one, so I'm not sure why this is a problem for me, given that the thread implies it's a laptop issue.

I'm confused
 

ClimbersOfIce

New Member
Just to clarify, I would prefer the NVIDIA to be the GPU that OBS defaults to and uses, since it has a lot of power to give. The Intel HD Graphics have very limited power and reach their limit very easily.

However, if I set OBS to "High performance" in Windows graphics settings (so that it uses my NVIDIA card to render video in OBS), display capture will no longer see my second monitor.

Is there a workaround for this?
 

R1CH

Forum Admin
Developer
You could try disabling the Intel HD GPU in Device Manager and seeing if it somehow passes through, but it might be a design limitation of that model (the external monitor only connects to the iGPU).
 

ClimbersOfIce

New Member
Thanks for the response. Alas, there was no way to get my PC to project to a second monitor unless the Intel HD Graphics are enabled. In fact, OBS won't even recognize a connected second monitor if the Intel HD Graphics are disabled.

It seems the Intel HD Graphics are responsible for projecting the image onto the second monitor, as well as for detecting what's connected to the HDMI ports on the PC. In my particular case there's no way to make the NVIDIA the one that drives the second monitor; it can only be the Intel HD Graphics.
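If anyone else wants to check which GPU actually owns each display, here's a small Python sketch (just an illustration, not from the pinned thread) that calls the Win32 EnumDisplayDevices API via ctypes. On a setup like mine, the second monitor should show up under the Intel adapter:

```python
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICEW(ctypes.Structure):
    """Mirrors the Win32 DISPLAY_DEVICEW structure."""
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

user32 = ctypes.windll.user32
DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001

def enum_devices(parent=None):
    """Yield adapters (parent=None) or the monitors attached to one adapter."""
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        if not user32.EnumDisplayDevicesW(parent, i, ctypes.byref(dev), 0):
            break
        yield dev
        i += 1

# Print each active adapter followed by the monitors it drives.
for adapter in enum_devices():
    if adapter.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
        print(f"{adapter.DeviceName}: {adapter.DeviceString}")
        for monitor in enum_devices(adapter.DeviceName):
            print(f"    -> {monitor.DeviceString}")
```

If the HDMI output only ever appears under the Intel entry, the port is hard-wired to the iGPU and no software setting will move it to the NVIDIA.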
 

St.Robbie

New Member
One monitor must be using the integrated graphics and the other the card. Make sure both are using the same one, and if you can't do that, make sure both the integrated graphics and the card are enabled in your motherboard's BIOS.
 