Question / Help How to run OBS on BOTH GPUs (Integrated + nVIDIA)

Polda18

New Member
Hi. I want to record videos of my computer screen, and since I mostly use a laptop, I am concerned about performance as well as getting it to work at all. I want to record several kinds of video tutorials, the first being gaming. One of the games I play is Minecraft, and I want to start recording single-player Survival tutorials. However, I ran into an issue: Minecraft actually renders on the integrated GPU, and OBS doesn't run well on the integrated GPU, because it drags performance down. It didn't seem so severe when I was just recording the desktop; that was a full 60 fps. The trouble started when I launched Minecraft. The direct output was fine, the game ran at 60 fps (with OptiFine; without it it's 30 fps, not great but still usable). However, the recorded image became choppy as soon as the game started, at maybe 10 fps. Not good at all! And that's because OBS was running on the integrated GPU. If it ran on its natural choice, the discrete nVIDIA GPU, it would work far better. But I don't know whether Minecraft would actually use the discrete GPU then, and I might just see a black screen either way. I might run some more tests, but other than that, it feels just bad :(

So I am asking: can I somehow run OBS on both GPUs? Somehow link them together, so that OBS runs on the discrete GPU but can still see the screen even though the screen is rendered by the integrated GPU? It doesn't feel right to disable the integrated GPU just for the sake of recording, and it might even hurt performance, because the discrete GPU draws more power than the laptop may be able to handle, possibly shortening the life of the motherboard :(

I don't want to use other recording software. I have Streamlabs, which is actually based on OBS Studio; it only adds some extra features, like advanced transitions, layouts, and widgets such as Twitch subscriptions, YouTube perks and more. Other than that, it's just OBS with a redesign. The reason is that OBS is completely free and yet far better than most commercial paid recording software (like Fraps or Bandicam). Is OBS even capable of using both GPUs? Or could it be? And if not, can I make the integrated GPU share its rendered image with the nVIDIA GPU for OBS recording, so OBS can see what's happening on the screen and record it with the performance it needs? I wouldn't mind running it on the integrated GPU if I were only recording the development (programming) tutorials I am also planning. But since, in terms of computer use, I am more of a user than a developer, and especially a gamer, it makes sense that I want better performance for the gaming recordings too. And since I am naturally a rather lazy person, I don't like having to change my settings back and forth between different modes.

What makes it difficult for me: first, I don't know whether I can capture Minecraft with Game Capture (it's the Java Edition, and I don't think Java understands the high-performance GPU; it seems to force apps onto the integrated GPU instead) while OBS runs on the high-performance GPU, which would solve my issue. Second, I can't assign the game to the capture source before I run the game, which forces me to turn full-screen mode off, fiddle with my OBS settings, and turn full-screen mode back on. And a third issue: I am using the Discord overlay. Will it be visible in the Game Capture source or not? I don't actually know whether it's a separate layer on the screen or drawn directly into the game image, so depending on that, it may or may not appear in the video. Sometimes I'd like to show the Discord overlay, because it's essential for the viewer to see who's talking right now, for example. Sometimes I don't want to show it, but I can solve that by turning the overlay feature off, so it won't appear either...
 

Narcogen

Active Member
No. There is no performant way to do this.

Game capture requires OBS to be running on the same GPU as the captured app. Display capture requires running OBS on the integrated GPU. Running two instances of OBS, or splitting tasks between GPUs, either doesn't work at all on laptops or, even on desktops, just creates bottlenecks on the PCIe bus as OBS tries to copy frame data between the two GPUs.
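To put a rough number on that cross-GPU traffic, here is a back-of-the-envelope sketch (the figures are illustrative; actual traffic depends on resolution, pixel format, and frame rate):

```python
# Rough estimate of the raw bandwidth needed to shuttle uncompressed
# frames between two GPUs over the PCIe bus, e.g. 1080p60 in 32-bit BGRA.
def capture_bandwidth_mb_per_s(width, height, fps, bytes_per_pixel=4):
    """Uncompressed frame traffic in megabytes per second."""
    return width * height * bytes_per_pixel * fps / 1_000_000

print(capture_bandwidth_mb_per_s(1920, 1080, 60))  # -> 497.664, about 0.5 GB/s
```

Half a gigabyte per second of copies, sustained, on top of the game's own bus traffic, is why splitting capture and encoding across GPUs tends to bottleneck.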

About the only way to use the integrated GPU this way is to use the QuickSync encoder instead of NVENC. That will technically put some load on the Intel GPU, so it is being used, but quality won't be as good in most cases.
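If you want to see which hardware H.264 encoders your system exposes, one way is to inspect the output of `ffmpeg -encoders` (FFmpeg backs much of OBS's encoding pipeline): `h264_qsv` is QuickSync, `h264_nvenc` is NVENC. A minimal sketch, assuming an `ffmpeg` binary is on your PATH:

```python
import subprocess

def hardware_h264_encoders(encoder_listing: str) -> list[str]:
    """Pick out known hardware H.264 encoder names from `ffmpeg -encoders` output."""
    names = ("h264_qsv", "h264_nvenc", "h264_amf")
    return [n for n in names if n in encoder_listing]

if __name__ == "__main__":
    try:
        # Assumes `ffmpeg` is installed and on PATH; adjust if yours lives elsewhere.
        listing = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                                 capture_output=True, text=True).stdout
        print(hardware_h264_encoders(listing))
    except FileNotFoundError:
        print("ffmpeg not found on PATH")
```

Seeing `h264_qsv` listed means the Intel GPU's encoder is at least available; whether OBS itself offers it depends on the build and drivers.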
 

Polda18

New Member
Hmm, I need high-quality video, so that isn't good. So basically I could try to run OBS on the discrete GPU and use Game Capture, but I'd have to launch the game before I can set it up in OBS. Yeah, that might work... It's a pity there is no way to run it on both GPUs, or otherwise connect them to achieve that goal. I mean, if you run OBS on a desktop, only one GPU is really in use (the integrated GPU on the motherboard mostly sits idle, because the screen would have to be connected to it for it to be used), so both Display Capture and Game Capture work, because everything runs on the same GPU. A desktop doesn't need to save energy, since it doesn't run from a battery. It might simply be easier to record on a desktop computer than on a laptop.

But would it even be possible to implement dual-GPU usage in OBS? Display Capture will always work on the integrated GPU, including for apps that demand high performance from the strong discrete GPU (basically most modern games). Could the OBS image processing run on the discrete GPU while the app's GUI runs on the integrated one, with the two communicating with each other? I don't know how to explain it more clearly. Basically, 3D software uses both: the 3D engine computes the scene into a 2D image for the screen, which is then sent to the integrated GPU to display.

Is it really that hard to use the encoder on the powerful GPU and the screen capture on the integrated GPU? As far as I know, OBS only captures the screen 60 times per second and feeds the frames to the encoder. Now, because the app forces OBS to run on the integrated GPU, the encoder isn't fast enough to keep up with the screen at 60 fps, especially when another app is demanding graphics power at the same time. It's easy to encode apps that are light on the GPU (basic 2D apps like MS Paint, video players, or the desktop itself), but it's harder with apps that need monitor syncing and display generated images (mostly 3D) in real time, i.e. games, and the recorded video becomes choppy because there isn't enough power left to encode it at the desired frame rate :(
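That "choppy because the encoder can't keep up" effect can be sketched with a toy model: frames arrive at the capture rate, the encoder drains them at its own rate, and whatever overflows a small buffer is simply dropped (the numbers below are illustrative, not measurements of OBS):

```python
def dropped_frames(capture_fps, encode_fps, seconds, buffer_size=3):
    """Toy model: frames queue in a small buffer; the encoder drains
    `encode_fps` frames per second, and everything beyond that (plus
    what the buffer can still hold) never makes it into the video."""
    produced = capture_fps * seconds
    consumed = encode_fps * seconds
    return max(0, produced - consumed - buffer_size)

# Capturing at 60 fps but encoding at only 10 fps for one minute:
print(dropped_frames(60, 10, 60))  # -> 2997 frames lost, i.e. visibly choppy
```

With the encoder keeping pace (60 in, 60 out), the same model drops nothing, which matches the smooth desktop-only recordings described above.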

I am no development expert, nor am I an expert in programming, so I don't know how difficult it is. But I'd suggest taking a peek at how Blender, for instance, splits work across GPUs. It runs on both of them: the app itself (the actual window and the viewport UI) runs on the integrated GPU, while the 3D processing (the viewport calculations and the renderer) runs on the discrete, powerful GPU. Blender is both free and open source. OBS is both free and open source. Don't tell me it's impossible to split OBS's tasks between these two GPUs...
 

Polda18

New Member
Okay, quick heads-up. Minecraft doesn't run on the discrete GPU at all! When I used Game Capture while the game was in full-screen mode, it wasn't detected; the process didn't even show up in the list of running processes to select. So in order to capture Minecraft (and pretty much any other 3D app) at proper speed, I'd have to disable the integrated GPU, forcing the laptop to use the powerful one it is actually supposed to use for 3D games, Minecraft included. This is pretty upsetting. Realising that a 3D game (which, with the right mod, OptiFine, supports more sophisticated and therefore harder-to-process textures and even shaders) doesn't know how to use the more powerful GPU and instead forces itself onto the integrated one is an outrage. I know the game isn't particularly detailed, but it complicates things even more! The Bedrock edition actually gets this right and runs on the discrete GPU, which is also why it's cleaner and smoother. So I'll do one more test, with the integrated GPU disabled, and see whether that fixes the choppy video. But if it drags overall performance down, I'm less likely to record on the laptop, and therefore less likely to record at all. Pity :(
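For what it's worth, on Windows 10 (1803 and later) there is a per-application GPU preference that can push a program such as `javaw.exe` onto the discrete GPU without disabling the integrated one; it's the same setting exposed under Settings > System > Display > Graphics settings, stored as a value under `HKCU\Software\Microsoft\DirectX\UserGpuPreferences`. A sketch of what that entry looks like (the Java path below is an example; your launcher may bundle its own `javaw.exe` elsewhere):

```python
import sys

def gpu_preference_entry(exe_path: str, prefer_discrete: bool = True):
    """Build the (value name, data) pair Windows stores under
    HKCU\\Software\\Microsoft\\DirectX\\UserGpuPreferences:
    the value name is the full path to the .exe, and the data is
    GpuPreference=2 (high performance) or GpuPreference=1 (power saving)."""
    return exe_path, f"GpuPreference={2 if prefer_discrete else 1};"

if __name__ == "__main__":
    # Example path; point this at the javaw.exe your launcher actually uses.
    name, data = gpu_preference_entry(
        r"C:\Program Files\Java\jre1.8.0\bin\javaw.exe")
    if sys.platform == "win32":
        import winreg  # only available on Windows
        key = winreg.CreateKey(
            winreg.HKEY_CURRENT_USER,
            r"Software\Microsoft\DirectX\UserGpuPreferences")
        winreg.SetValueEx(key, name, 0, winreg.REG_SZ, data)
    else:
        print(name, "->", data)
```

The same preference can also be set per-program in the NVIDIA Control Panel; either way, the game then renders on the discrete GPU, which is what Game Capture needs when OBS runs there.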
 

Narcogen

Active Member
Laptops are less than ideal for streaming/recording. The best setup is a single GPU as powerful as you can justify, as long as you're using a single machine. Multi-PC setups that split the load between machines are also possible.
 

Polda18

New Member
Well, I am still unlikely to record on working days anyway, due to my upcoming apprenticeship, because I need money for living and for further investment in my equipment (which also includes setting up my home server for Discord bots and possibly even a game server), and for buying more games, like PUBG (which really will need the powerful GPU to record; the integrated GPU won't handle it). So if it doesn't work on the laptop, it might work on the desktop. The only issue is syncing the time I can use the computer with the time my dad wants it, because the desktop actually belongs to him.
 