Question / Help A question about "GPU" in H.264 Settings.

MrBayeasy

Member
Hey guys,

So recently I switched from dual-PC streaming to single-PC streaming because of all the headaches I've had trying to get it to run consistently and smoothly. To maintain the same performance, I've decided to put a spare GPU (1050 Ti) I had lying around into my gaming rig to act as the encoder and take the load off my main GPU (1080 Ti).

I am aware that the only thing I really need to do in OBS is set the "GPU" number in the H.264 settings. However, is there any additional setup I need to do in the Nvidia Control Panel, and more importantly, which number should I use in the available selection (0, 1, 2, 3, etc.)? I am assuming I should use "1", since I have seen people say that "0" is the main GPU I'm using and "1" would be the secondary GPU. The problem is there is no real way to check this (that I am aware of), and when I open MSI Afterburner to check, the numbers start at 1, not 0. So any direction would be greatly appreciated. I will leave pictures of my motherboard below, as well as how my config is set up.

In addition, I am going to be streaming at 1080p60 at 4.5 Mbps while at the same time recording at 1080p60 at 25 Mbps, using 2-pass encoding and 4 B-frames. Do you guys feel that the 1050 Ti can handle both at the same time, considering I won't be doing anything else on that GPU aside from letting it handle the encoding while my main GPU games? Also, since I will be using NVENC H.264 exclusively to encode, would increasing the "process priority" of OBS as a running program increase my CPU usage in a negative way, say taking it from "normal" to "above normal"? That is as high as I am willing to go, mind you; I won't be making it go any higher than that.

Thanks to all who help me with this. Thankfully I seem to have it all set up and ready to go; I'm just trying to make sure that the 1050 Ti is handling the encoding while the 1080 Ti handles the games. Have a great day!

Motherboard: Asus Z170-AR. The white-circled port has the 1080 Ti in it; the red-circled port has the 1050 Ti. The red-circled port was the next available full-sized slot because of the 1080 Ti's size. I just wanted to include this in case the "number" is contingent on the slot it's in on the motherboard, as I'm not sure if it goes in order of slots. If it DOES go in order of slots, wouldn't that make the tiny x1 slot above my 1080 Ti number "0"? Thanks again.

-MrBayeasy
 

Attachments

  • z170-ar.jpg

BK-Morpheus

Active Member
Don't forget that the scene composition + downscaling + filtering + scene rendering is done by your main GPU, so there will still be a little more load on your gaming GPU even if you use NVENC on the 1050 Ti for encoding. So an fps limit in your games is still needed to avoid lagged frames in the rendering process.

Also, I saw that you want to use 1080p 60fps via GPU/NVENC at only 4500 kbit/s. If you want to play shooters like PUBG, the stream will get very blurry with those settings.
League of Legends, Hearthstone and similar content don't need a lot of bitrate, so they might look "okay" with those settings.
 

MrBayeasy

Member
Don't forget that the scene composition + downscaling + filtering + scene rendering is done by your main GPU, so there will still be a little more load on your gaming GPU even if you use NVENC on the 1050 Ti for encoding. So an fps limit in your games is still needed to avoid lagged frames in the rendering process.

Also, I saw that you want to use 1080p 60fps via GPU/NVENC at only 4500 kbit/s. If you want to play shooters like PUBG, the stream will get very blurry with those settings.
League of Legends, Hearthstone and similar content don't need a lot of bitrate, so they might look "okay" with those settings.

I've done quite a bit of research on the whole bitrate topic, and it seems that at 4500 kbps it levels out and there is little to no visible difference between x264 and NVENC H.264. If I have it set so that the 1050 Ti is doing the work, how will my main GPU be doing any of it? The way it's supposed to be set up is so the 1080 Ti handles the games and the 1050 Ti handles encoding. It wouldn't make sense for the 1080 Ti to handle any of the OBS-related work if I'm not deliberately telling it to, especially since a GPU isn't even required to stream. Even so, are you saying that even if I get the 1050 Ti to handle the streaming/recording, there will STILL be more load on the 1080 Ti? Also, while I'm intrigued, this doesn't really answer most of my original questions.

Also, do you mean lag that is visible in the stream/recording, or lag that is only perceivable on my end?
 

Osiris

Active Member
Encoding and rendering (the scene, downscaling, etc.) are not the same thing. Rendering is preferably done on the GPU that is also being used by the game; that is the most efficient way. As for encoding, offloading it to another card is also unnecessary, since NVENC runs on a separate chip that doesn't impact the general performance of the GPU.

Also, depending on the CPU you have, if it's one with 16 PCIe lanes, then having two GPUs isn't very efficient either: both will have to run at x8 instead of one at x16.

Any fps loss you had when you were only using the 1080 Ti is not related to the encoding done by the card, but to the fact that OBS needs some GPU power for rendering the scene, and running games at uncapped framerates will cause those games to use 100% of the GPU's power, leaving nothing for OBS.
 

MrBayeasy

Member
Encoding and rendering (the scene, downscaling, etc.) are not the same thing. Rendering is preferably done on the GPU that is also being used by the game; that is the most efficient way. As for encoding, offloading it to another card is also unnecessary, since NVENC runs on a separate chip that doesn't impact the general performance of the GPU.

Also, depending on the CPU you have, if it's one with 16 PCIe lanes, then having two GPUs isn't very efficient either: both will have to run at x8 instead of one at x16.

Any fps loss you had when you were only using the 1080 Ti is not related to the encoding done by the card, but to the fact that OBS needs some GPU power for rendering the scene, and running games at uncapped framerates will cause those games to use 100% of the GPU's power, leaving nothing for OBS.

So if this is the case, how do you limit frame rates in games? Is it done via the Nvidia Control Panel? Also, if a second GPU is unnecessary, why is it even an option?

Also once again it would be great if some of my original questions could be looked at, thanks.
 

BK-Morpheus

Active Member
My guess as to why multi-GPU settings are an option is just that it can help people who already have multiple GPUs (for example for SLI) to optimize their settings.

Regarding your question about how to limit frame rates in games, there are many options:
- many games have at least Vsync or even an fps cap as a built-in graphics option
- in most games an fps limit can be set via a config file (see the example below this list)
- with the Nvidia Inspector tool you can set an fps limit that is enforced by the Nvidia driver itself (not sure if this fps limiter is accessible through the Nvidia Control Panel directly)
- Rivatuner Statistics Server (RTSS for short) provides the option of a general fps limit or even an individual fps limit for specific processes

RTSS is part of MSI Afterburner and its clones.
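To make the config-file option above concrete, here is a minimal sketch for a Source-engine game (my own illustration, not from this thread; the exact variable name and file location differ between games and engines):

Code:
// cfg/autoexec.cfg -- caps the game's render rate at 120 fps,
// leaving GPU headroom for OBS's own scene rendering
fps_max 120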

Last but not least, let's go back to your response to my concern about 1080p 60fps @ 4500 kbit/s...
I did not only want to point out that NVENC is slightly less efficient at those low bitrates (compared to x264 at the veryfast preset); I also wanted to tell you that 1080p 60fps at less than 6000 kbit/s will look blurry anyway if the content is bright, colorful, detailed and fast-moving.
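As a rough back-of-the-envelope illustration (my own numbers, not a measurement; real encoders don't scale linearly with pixel count), you can compare how many bits each pixel gets at the two resolutions:

Code:
# Rough bits-per-pixel comparison at a fixed bitrate (illustrative only).
def bits_per_pixel(bitrate_kbps, width, height, fps):
    return bitrate_kbps * 1000 / (width * height * fps)

print(round(bits_per_pixel(4500, 1920, 1080, 60), 3))  # ~0.036 bpp at 1080p60
print(round(bits_per_pixel(4500, 1280, 720, 60), 3))   # ~0.081 bpp at 720p60

The same bitrate spread over fewer pixels is part of why 720p60 tends to hold up better on fast-moving content.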
 

MrBayeasy

Member
My guess as to why multi-GPU settings are an option is just that it can help people who already have multiple GPUs (for example for SLI) to optimize their settings.

Regarding your question about how to limit frame rates in games, there are many options:
- many games have at least Vsync or even an fps cap as a built-in graphics option
- in most games an fps limit can be set via a config file
- with the Nvidia Inspector tool you can set an fps limit that is enforced by the Nvidia driver itself (not sure if this fps limiter is accessible through the Nvidia Control Panel directly)
- Rivatuner Statistics Server (RTSS for short) provides the option of a general fps limit or even an individual fps limit for specific processes

RTSS is part of MSI Afterburner and its clones.

Alright, and based on the info given in my OP, is it likely that the 1050 Ti is the "1" GPU while the 1080 Ti is the "0" GPU? Since I am planning on streaming AND recording (with fairly high/demanding settings on the recording side of things), would you guys recommend using the 1050 Ti at all, or just doing it all through the 1080 Ti? I was under the impression that using a second "encoding" GPU would take the load off of the main GPU. I've seen a few videos where people would get encoding errors when using just one GPU, but when they used a second encoding GPU the errors wouldn't happen. Those tests were done with just recording in mind, and since I'm going to be streaming at the same time too, I figured I could use all the power I could get to maintain gaming performance.
 

BK-Morpheus

Active Member
As Osiris said, many boards/CPUs will reduce PCIe lanes from x16 to x8 when two GPUs are installed.
If this is the case for you (the mainboard manual should say something about this), you will be better off with just the GTX 1080 Ti.

If the board from your picture is an ASUS Z170-Deluxe, according to the manual you will drop to PCIe x8 with two GPUs:
dualgput6oll.png


Your previous dual-PC streaming setup sounds better for your needs (no need to cap in-game fps, etc.), but you said it wasn't working well...

Right now, if your CPU can handle it, you could stream with x264 while recording with NVENC at high quality. With 6 cores / 12 threads, or even better 8 cores / 16 threads, you should be able to pull this off on one machine pretty easily.
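If you want to confirm the lane configuration without digging through the manual, here is a small sketch (assuming the Nvidia driver and nvidia-smi are installed and on the PATH; the reported "current" width can drop at idle to save power, so check it while the GPU is under load):

Code:
import subprocess

# Query each GPU's current and maximum negotiated PCIe link width (e.g. x8 vs x16).
out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,pcie.link.width.current,pcie.link.width.max",
     "--format=csv"],
    capture_output=True, text=True,
)
print(out.stdout)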
 

MrBayeasy

Member
As Osiris said, many boards/CPUs will reduce PCIe lanes from x16 to x8 when two GPUs are installed.
If this is the case for you (the mainboard manual should say something about this), you will be better off with just the GTX 1080 Ti.

If the board from your picture is an ASUS Z170-Deluxe, according to the manual you will drop to PCIe x8 with two GPUs:
dualgput6oll.png

In terms of GPU performance in games there is less than 1% difference on average, so unless you mean the encoding performance will suffer, going down to x8 isn't really an issue. I'm mainly concerned with maintaining as much performance on my end as possible while keeping a smooth and consistent stream. In regard to your previous comment about resolution and bitrate, are you suggesting that downscaling to 720p would make the image look better? I can't imagine dropping the resolution would make things look less blurry; it's going to have a degree of blurriness and lack of clarity whichever way you slice it when considering livestreaming bitrates.
 

MrBayeasy

Member
As Osiris said, many boards/CPUs will reduce PCIe lanes from x16 to x8 when two GPUs are installed.
If this is the case for you (the mainboard manual should say something about this), you will be better off with just the GTX 1080 Ti.

If the board from your picture is an ASUS Z170-Deluxe, according to the manual you will drop to PCIe x8 with two GPUs:
dualgput6oll.png


Your previous dual-PC streaming setup sounds better for your needs (no need to cap in-game fps, etc.), but you said it wasn't working well...

Right now, if your CPU can handle it, you could stream with x264 while recording with NVENC at high quality. With 6 cores / 12 threads, or even better 8 cores / 16 threads, you should be able to pull this off on one machine pretty easily.

The only thing that wasn't working consistently was my capture cards. My Elgato developed a horrendous stuttering issue that I finally determined to be driver-based, and my AverMedia Live Gamer HD 2 started to have incredibly bad signal interference when I was using my DisplayPort / playing at high refresh rates. So I was essentially limited to 60fps for my personal experience; not terrible, I know, but it really frustrated me that it was "supposed" to be working and it just wasn't. If all else fails I could go back to it. However, this is the second time I've been down for weeks at a time due to a capture card issue, and I just didn't want to have that kind of interruption again. I'll have it to fall back on, though.
 

MrBayeasy

Member
Also, lastly, once more: based on the info given, does it seem likely that the 1080 Ti is GPU 0 and the 1050 Ti is GPU 1? Thanks for all the help and info thus far, everyone.
 

BK-Morpheus

Active Member
I'm not sure about the GPU number, but I would also think that GPU 0 = 1080 Ti and GPU 1 = 1050 Ti.
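One way to check the numbering yourself, as a sketch (assuming nvidia-smi from the Nvidia driver is on the PATH; OBS's NVENC "GPU" index is generally expected to follow the driver's enumeration order, but verify by watching which card's video-encode load rises while OBS is running):

Code:
import subprocess

# Prints the GPUs in the order the driver enumerates them ("GPU 0: ...", "GPU 1: ...").
print(subprocess.run(["nvidia-smi", "-L"],
                     capture_output=True, text=True).stdout)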
720p 60fps will look less sharp than 1080p 60fps, but when you are moving fast in-game, the pixelation/blurriness will be a lot worse at 1080p 60fps, as long as you compare both resolutions at the same bitrate.
Things that increase the need for more bitrate are:
- higher resolutions
- higher framerates
- amount of moving details

PCIe x8 on its own won't have a really noticeable performance impact (neither for gaming nor for encoding via NVENC). But the problem gets real when you are gaming on PCIe x8 while also trying to grab the footage via game capture / display capture / monitor capture.
The amount of GPU bandwidth needed for both is significantly higher, and that's why x8 will be worse at that point, both for the gaming experience and for grabbing the frames fast enough for OBS. At that stage your 1050 Ti isn't doing anything yet, because the scene must be rendered before your 1050 Ti's NVENC chip can encode it.
 

MrBayeasy

Member
Is this all measured performance, or is this theoretical? I understand that in theory this makes sense, but have people tested this to see how much bandwidth is really necessary? If an x1 PCIe capture card has no problem at x1 bandwidth, it seems strange that even at x8 there would be that big of an issue. Especially since capture cards have a similar way of encoding as the hardware-based encoders in these GPUs, don't they?

Just like people used to think that running a GPU in an x8 configuration would be bad for performance: logically you'd think that less bandwidth meant less performance, but once it was tested, it was shown that in practice it basically made no difference. Is this another one of those scenarios, or has this been proven? Thanks for your help thus far, I really appreciate it.
 

BK-Morpheus

Active Member
Capture cards don't hook into your main GPU's video memory to grab the footage; they capture what's coming into the card via HDMI/DP. That's why that situation is not comparable.

My info about the impact of PCIe x8 on OBS is based on some posts from the moderators; I only use one GPU at x16.
I think they said something like the PCIe bandwidth usage will double when OBS hops in to hook the footage from your graphics card.
Maybe someone else could step in here and take a deeper look at this specific topic.
 

MrBayeasy

Member
I'm going to do some benchmarks tonight and tomorrow to see the difference. I'll do the benchmarks with OBS recording & streaming (I'll do both in tomorrow's test) and without, to measure any FPS drops. I'll also take note of any encoding lag that happens during the OBS runs. It might shed some light on how much it really impacts things. Even if that kind of capture does double the amount of bandwidth needed, it likely won't matter if there is enough overhead, as I suspect there is. I'll report back soon.
 