Question / Help 750ti or 1050/1050ti for Dedicated OBS encoding with NVENC

What about adding a second card, a 750 Ti, a 1050 or even a 1050 Ti, just for OBS encoding, so I could free my GTX 1080 for gaming alone? Would that work? Has anyone here already done that?

I know about QuickSync; please, I don't want to sound like an a-hole, but I just wanna know about a second card for OBS encoding with NVENC.

Is it doable? Would it free the main card just for gaming?

I'm asking this because I noticed the GPU option, highlighted in the image below. Don't mind the other options, I just installed OBS to capture the screen.


(screenshot: tZckXbP.png)
 

alpinlol

Active Member
NVENC runs on a separate part of the chip and only taxes performance within about a 1% range, meaning your in-game performance shouldn't decrease in any way. So there is no real gain in throwing in a second Nvidia card only for NVENC.
 
NVENC runs on a separate part of the chip and only taxes performance within about a 1% range, meaning your performance shouldn't decrease in any way. So there is no real gain in throwing in a second Nvidia card only for NVENC.

1%? That's far from my reality; there are games where I notice a precious 15 FPS drop out of a total of 60-65 FPS, using a GTX 1080 G1 Gaming.
 

alpinlol

Active Member
1%? That's far from my reality; there are games where I notice a precious 15 FPS drop out of a total of 60-65 FPS, using a GTX 1080 G1 Gaming.

Well, I guess you already did a clean driver install and made sure that anything left over from your old GPU was removed.

Also, please post a log with a recording attempt of 5-10 minutes.
 
Well, I guess you already did a clean driver install and made sure that anything left over from your old GPU was removed.
I did an OS clean install :)

The card is performing alright, like it should. I stress tested it and did all that good stuff we do when we upgrade our glorious graphics cards. The FPS drop is really noticeable in more intensive games, like Arma 3. In games like Overwatch it's not that taxing on the card, whether I use Shadowplay or OBS. Oh yes, this drop also happens when I use Shadowplay alone.

I'm using an SSD that performs alright while transferring files, so I don't think the SSD is the problem. But if you guys see the log, maybe you can tell what the problem is... I don't know...

Also, please post a log with a recording attempt of 5-10 minutes.
I can do that, but only later tonight.

But about my main question: can OBS use a second card to encode with NVENC?
 

Sapiens

Forum Moderator
This sounds like a complete waste of money and resources unless you're building a second, dedicated encoding PC, in which case you wouldn't want to use a hardware encoder at all. Putting a second GPU in your streaming PC isn't going to behave any differently than using the NVENC support on your existing 1080.
 
This sounds like a complete waste of money and resources unless you're building a second, dedicated encoding PC
How come an extra GPU dedicated to encoding is more of a waste than a whole dedicated PC just for that?
you wouldn't want to use a hardware encoder at all
No? Why not? I genuinely don't know why I wouldn't.
Putting a second GPU in your streaming PC isn't going to behave any differently than using the NVENC support on your existing 1080.
So Osiris was wrong? OBS wouldn't use the second card exclusively for encoding? In those games where I lose 15 FPS just by capturing with OBS (using NVENC) or Shadowplay, would I still lose those FPS if I used a second card dedicated to encoding? OBS has that option to specify the GPU I want to use, but it pretty much ignores it (for now)?
 

Sapiens

Forum Moderator
How come an extra GPU dedicated to encoding is more of a waste than a whole dedicated PC just for that?
It provides no benefit of any kind for OBS. If you have an existing NVENC-capable GPU, the performance will be the same. Why spend the money on a second card if you don't get anything out of it? You already have a 1080; use it.

No? Why not? I genuinely don't know why I wouldn't.
Because the main benefit of having a dedicated streaming PC separate from the gaming PC is that you can use x264 without affecting performance. Hardware encoders don't compress video as well, so building a second PC just to have it use a hardware encoder is a waste.

So Osiris was wrong? OBS wouldn't use the second card exclusively for encoding? In those games where I lose 15 FPS just by capturing with OBS (using NVENC) or Shadowplay, would I still lose those FPS if I used a second card dedicated to encoding? OBS has that option to specify the GPU I want to use, but it pretty much ignores it (for now)?
That isn't what I said at all. Will it work? Yes. Will it improve performance? No.
 

Kescarte_DeJudica

New Member
Hi!

I am going to necromance this thread, because it can still be found via a related Google search, and I would like to clear up some common myths here.

Now, first of all, it is true that NVENC does not affect in-game performance, at least not directly. Indirectly, however, it does, and that can become a big problem.

Having a dedicated streaming/recording PC with an NVENC encoder is not a waste of money, and it does provide benefits. Let me explain why.

Now, if I am running a very graphically demanding game at very high settings, which my video card would normally support just fine, and I decide to encode my videos with the same card, the encoding is not going to affect the gameplay, but the gameplay will affect how the video looks. When running a game at super high settings, your card doesn't have a lot of power left to put into the encoding. So the video you're recording will look very choppy, as if the FPS were poor. And the only way to fix this is to turn down your game settings until the video you are encoding looks normal.

If you had a separate, dedicated video card to handle this, this would not be a problem. You could run your game at whatever quality you know your card can produce, without having to worry how it will make the video turn out in the end. As long as the second card is capable of outputting the kind of quality you desire, you're golden.

Hopefully that clears up some of the myths you often hear about NVENC, and hopefully it will help you avoid making costly mistakes like the ones I have unfortunately made.

At the end of the day, NVENC can indirectly affect gameplay, because it can force you to turn down game settings to preserve video quality, and a dedicated PC with NVENC encoding can serve its purpose well.

One final note: many people will tell you that if you are going to build a secondary streaming/recording PC, it should be designed for x264 instead of NVENC, because x264 is a more efficient encoder. If you're only planning on recording and not streaming, don't do this; go for NVENC. As long as your recording bitrate is high (30,000 to 40,000 kbps) and you use a newer card (such as one from the 10xx line), the quality will be just as good as x264 and much cheaper to set up.

If you're planning on streaming, then it depends on your internet upload rate and what platform you are using. If you are streaming on Twitch, go with x264, because Twitch limits your bitrate to 6,000 kbps (which is much too low, regardless of the encoder used). If you are streaming on YouTube and your upload rate is pretty high (18 to 35 Mbps, or more), go with NVENC.
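To put those bitrate numbers in perspective, here's a quick back-of-the-envelope sketch (plain arithmetic only, nothing OBS-specific assumed) of what each bitrate means for disk usage:

```python
# rough disk-usage estimate for a constant-bitrate recording
def gb_per_hour(kbps: int) -> float:
    bits_per_hour = kbps * 1000 * 3600  # kbps -> bits per hour
    return bits_per_hour / 8 / 1e9      # bits -> gigabytes

print(gb_per_hour(40_000))  # 18.0  (high-bitrate NVENC recording)
print(gb_per_hour(30_000))  # 13.5
print(gb_per_hour(6_000))   # 2.7   (Twitch's streaming cap)
```

So a high-bitrate NVENC recording eats roughly 13-18 GB per hour, which is the trade-off for matching x264 quality on the cheap.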

Hope that helps!
 
When running a game at super high settings, your card doesn't have a lot of power left to put into the encoding
That alone explains why it's worth using a second card. It's been a while since I wrote in this thread...

I tried QuickSync for some time and, for whatever reason, it didn't work for me. It should, I think... I mean, I know it should, but it didn't, so I gave up.

Using NVENC on my GTX 1080 while gaming (yes, don't be a prude, a GTX 1080 is not some overpowered god), with the game taxing the card to render all that gorgeous 18x heavy-duty AA on textures and shadows, could impact performance. It actually did; it happened. Some games can use 100% of your card, so what's left for encoding? Think about that for a moment... Something's gotta give...

So I had a GTX 750 Ti to test and, oh boy, it lifted the heavy encoding task off the GTX 1080 like a boss. An old card, probably inexpensive in some countries...

But I couldn't dedicate the GTX 750 Ti to that, so I bought an RX 460 4GB OC from Gigabyte, probably another inexpensive card in some countries. And oh boy, I don't know why, but AMF + OBS is actually great. If you know what you're doing, even a low-end model like the RX 460 has A LOT of options to fine-tune inside OBS.

Now, keep in mind, I was using that to capture gameplay at this point, not for Twitch or some other live service. So it was good; I captured at an absurd 50,000 bitrate, fooled around with QP at 22, and it was great. But then I decided to stream.

So let me tell you my two cents about streaming with the encoding API on your graphics card: it's not great. I'm not technical, but putting it simply, without technical jargon, the quality per bitrate is better when you use the CPU, even with the veryfast preset. And now that I've started streaming live, I noticed that QuickSync can actually do something: it can encode for streaming, since I was streaming at a miserable 4,200 bitrate.

So I got a Ryzen 2700X now and I use that for streaming, 720p@60 at 4,200 bitrate, and the AMD card is used for capture.

Another thing I'm doing is keeping OBS open on a display connected to the RX 460 (before the Ryzen 2700X upgrade I used the Intel GPU for that, so the RX was also not bothered by it), so not even the OBS preview is rendered on my GTX 1080. And yes, that's actually a thing; I monitored the cards and you can actually see how the OBS preview uses a good percentage of your card.

UPDATE: I forgot to mention something.

Another thing I tried was OBS to OBS from one computer to another: AMF on the RX 460 in the gaming PC to a second PC using x264 on an i7 4770K, over wired gigabit LAN. Technically it should work, but the results were not good. I tried it using that nginx+RTMP trick someone wrote about on the forum. Something I think Linus from LinusTechTips is doing with their live video, though they probably took it to a whole new level.
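For anyone finding this later, the nginx+RTMP relay mentioned above boils down to a config along these lines (a minimal sketch using the third-party nginx-rtmp-module; the port and the `live` application name are illustrative assumptions, not details from this thread):

```nginx
# nginx.conf fragment: the gaming PC's OBS streams to rtmp://<gaming-pc-ip>/live,
# and the encoding PC's OBS pulls that same URL as a media source.
rtmp {
    server {
        listen 1935;           # default RTMP port
        application live {
            live on;           # accept live streams
            record off;        # relay only, don't write files to disk
        }
    }
}
```

The sender's extra latency and any dropped frames on the LAN show up directly in the x264 re-encode, which may be part of why the results were disappointing.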

Again, I'm not technical, I'm practical :)

So this is just an update to the thread, summing up my experience living through this debacle myself :)
 

BK-Morpheus

Active Member
rendering ≠ encoding

It would help if the second GPU rendered your OBS scene, which is what causes the performance impact you notice while gaming+streaming.
Just for encoding via NVENC, it does not really help. In fact, in most scenarios it will be worse, as many mainboards reduce the PCIe bandwidth (from 16x to 8x) when you put in an additional GPU.
Yes, for gaming alone the impact of reducing the PCIe bandwidth from 16x to 8x is not a big deal, but as OBS has to access the framebuffer, the load on the PCIe bus is way higher.
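To give a rough sense of that extra bus traffic, here's a back-of-the-envelope sketch. It assumes an uncompressed 1080p60 BGRA framebuffer being copied across the bus, and uses the usual ~0.985 GB/s-per-lane figure for PCIe 3.0; real capture paths may differ:

```python
# extra PCIe traffic from copying an uncompressed 1080p60 framebuffer
width, height = 1920, 1080
bytes_per_pixel = 4   # BGRA
fps = 60

capture_bps = width * height * bytes_per_pixel * fps
print(round(capture_bps / 1e9, 2))  # ~0.5 GB/s of framebuffer copies

# theoretical PCIe 3.0 throughput (~0.985 GB/s per lane)
print(round(16 * 0.985, 1))  # x16: ~15.8 GB/s
print(round(8 * 0.985, 1))   # x8:  ~7.9 GB/s
```

That half a gigabyte per second won't saturate an x8 link by itself, but it stacks on top of the game's own texture and geometry traffic on the same bus.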
 
rendering ≠ encoding

It would help if the second GPU rendered your OBS scene, which is what causes the performance impact you notice while gaming+streaming.
Just for encoding via NVENC, it does not really help. In fact, in most scenarios it will be worse, as many mainboards reduce the PCIe bandwidth (from 16x to 8x) when you put in an additional GPU.
Yes, for gaming alone the impact of reducing the PCIe bandwidth from 16x to 8x is not a big deal, but as OBS has to access the framebuffer, the load on the PCIe bus is way higher.

Render = showing on my screen? Explain that in video-layman terms.

If rendering = showing on my screen, well, I actually use the second graphics card for that. AFAIK Windows renders whatever it's showing on the graphics card that outputs the signal to the screen; I can actually see the graphics card's usage going (considerably) up just by placing OBS on the screen connected to that card.

Also, encoding alone is very taxing, especially if you're capturing at a higher bitrate.
 
Just for encoding via NVENC, it does not really help. In fact, in most scenarios it will be worse, as many mainboards reduce the PCIe bandwidth (from 16x to 8x) when you put in an additional GPU.

So gamers using triple SLI at 4x/4x/4x shouldn't even bother installing OBS?
 

Harold

Active Member
If you're on a 16-lane CPU, there's no benefit to having a second card at all, let alone a third.
Best case, you gain no performance; more commonly, you bottleneck your primary card between the scene composition and the game.
 

BK-Morpheus

Active Member
Render = showing on my screen? Explain that in video-layman terms.

If rendering = showing on my screen, well, I actually use the second graphics card for that. AFAIK Windows renders whatever it's showing on the graphics card that outputs the signal to the screen; I can actually see the graphics card's usage going (considerably) up just by placing OBS on the screen connected to that card.

Also, encoding alone is very taxing, especially if you're capturing at a higher bitrate.
OBS scenes usually have multiple sources in them (for example game capture, webcam, text, browser sources) and may also use downscaling from your base resolution with a filter (bilinear or maybe Lanczos), resizing of your webcam, etc.

OBS needs to render a picture of your scene, including the scaling/filtering. If it renders enough frames per second, the encoder can then encode the separate frames into a video stream.

OBS rendering happens as soon as OBS is opened, no matter whether the live preview is on or off, recording/streaming is on or off, or OBS is minimized.
Rendering is done on the GPU, no matter whether you encode with it as well or with x264.
 
OBS scenes usually have multiple sources in them (for example game capture, webcam, text, browser sources) and may also use downscaling from your base resolution with a filter (bilinear or maybe Lanczos), resizing of your webcam, etc.

OBS needs to render a picture of your scene, including the scaling/filtering. If it renders enough frames per second, the encoder can then encode the separate frames into a video stream.

OBS rendering happens as soon as OBS is opened, no matter whether the live preview is on or off, recording/streaming is on or off, or OBS is minimized.
Rendering is done on the GPU, no matter whether you encode with it as well or with x264.
So yes, according to the monitoring I did, all that rendering is done on the card that the display/monitor showing OBS is connected to; in my case, not the one running the game but the one dedicated to encoding. That actually adds a good usage percentage to the card.

One thing I noticed is that the bigger the preview is, the more GPU it uses just to show OBS on the screen.

But my question is... is rendering really as taxing as encoding at 1080p@60 with 50,000 bitrate? Because you're all talking about rendering as if it were the real heavy-duty task in OBS, not the encoding itself, and from the monitoring I did here, I'd say it really isn't what makes capturing with OBS stress the PC.
 

Osiris

Active Member
Rendering is not a heavy-duty task for the GPU, but if your game is using all of the GPU's power, then there will be nothing left for OBS. Unfortunately, there is currently no way to prioritize GPU tasks.
 

BK-Morpheus

Active Member
If you encode with NVENC, there is no real stress from the encoding itself, but as many people play their games without any FPS limit (at least the many threads in here show this), they let their GPU run at max load and therefore cripple OBS's rendering performance.
 