You have to use NVENC on the GPU that OBS is running on. Furthermore, dual GPU systems in general end up leading to much worse performance when it comes to using OBS due to the need to copy textures back and forth from GPU to GPU. Trying to run the game on one GPU and running OBS on another GPU is actually a bad idea for this reason, and isn't even really possible.
I've been using this sort of setup for almost half a year now.
Initially in 2013-2014 or so, I was using NVENC directly on the card playing the game.
This had some frame drops when streaming at 60fps.
The card at the time was a GTX 680.
When I started streaming again, I used QuickSync for some time, but decided the quality wasn't good enough.
So I used a complicated setup where I would record with QuickSync at 50,000 kbps locally, send it over RTMP to a Linux VM in my homelab, re-encode it there on the CPU with FFmpeg, and then send it to Twitch. The quality was decent in some games, but anything with lots of grass and foliage was pretty bad. I also got tired of messing with the MASSIVE amount of custom flags for FFmpeg. Still, this method was the best in terms of no FPS loss in-game.
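For anyone wanting to try something similar, the relay on the Linux VM looked roughly like this. This is just a sketch, not my exact command; the listen address, bitrate, and Twitch ingest URL/key are placeholders you'd adjust:

```shell
# Rough sketch of an RTMP relay: accept the local QuickSync recording
# over RTMP, re-encode it with x264 on the CPU, and push it to Twitch.
ffmpeg -listen 1 -i rtmp://0.0.0.0:1935/live \
  -c:v libx264 -preset medium -b:v 6000k -maxrate 6000k -bufsize 12000k \
  -g 120 -keyint_min 120 \
  -c:a copy \
  -f flv rtmp://live.twitch.tv/app/STREAM_KEY
```

In practice there were many more tuning flags than this, which is exactly the maintenance burden I got tired of.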
My current GPU is a GTX 1080. I did find that there was some noticeable FPS loss in-game when streaming directly on the 1080, but I later noticed that you can change which GPU is used for NVENC encoding, so I grabbed an old GTX 750 and have been encoding on that since. The GPU usage on the GTX 750 sits at a solid 30% the whole time; too bad it can't use more than that for NVENC. My brother recently got a new card, so I'll be swapping the GTX 750 for a GTX 1060 tonight or very soon. After a decent amount of Googling, it seems the newer the card, the better the NVENC encoding quality, especially with regards to text and motion; I look forward to seeing the improvement.
With ALL that said, even if the game itself is running at a smooth 60fps, unfortunately the stream still seems to suffer FPS loss. This is because the OBS preview uses 5-10% of the primary GPU regardless of the selected encoder, and if the preview window lags, the stream sent to Twitch is laggy too.
I've started to somewhat mitigate this by disabling the preview, but some of the primary GPU is still used even with the preview disabled (2-7%).
This can be further mitigated by minimizing OBS to the system tray instead of leaving the window open.
Though that's only viable for me because I use an Elgato Stream Deck to switch my scenes.
Unless there are further ways to improve performance that I'm not aware of?
I'm hoping this performance increase makes single-card encoding more viable. I know it's more targeted at the 20xx-series cards, but everything helps. With that said, why would you say that "it isn't even really possible" when there's a GPU selection within the NVENC settings? I've only seen performance gains from this method. Instead of both the game AND OBS lagging, it's only OBS that loses some frames; though that will happen regardless of single or dual cards due to preview lag.
Unless you're suggesting that the entirety of OBS is rendered on the second GPU? I've tried that by manually selecting the secondary GPU for OBS in the NVIDIA Control Panel. Unfortunately, OBS still uses the primary card.
There's also the Windows method where you set OBS to use a "low-powered device" (aka the integrated GPU) for rendering (not to be confused with QuickSync). This works, but it breaks webcam usage and some games.
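For reference, the per-app preference that the Windows graphics settings page writes lives in the registry, so it can also be set from a command prompt. A hedged sketch below; the OBS install path is an assumption, so adjust it to your own setup:

```shell
:: Sketch only: force OBS onto the "power saving" GPU (usually the iGPU).
:: GpuPreference=1 = power saving, GpuPreference=2 = high performance.
:: The obs64.exe path below is an assumption; use your actual install path.
reg add "HKCU\SOFTWARE\Microsoft\DirectX\UserGpuPreferences" ^
  /v "C:\Program Files\obs-studio\bin\64bit\obs64.exe" ^
  /t REG_SZ /d "GpuPreference=1;" /f
```

This does exactly the same thing as picking "Power saving" in Settings > System > Display > Graphics settings, with the same webcam/game breakage described above.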
On a side note, how does the new "Max Quality" Two-Pass differ from the already existing Two-Pass encoding option?
Why was the checkbox removed entirely? What if a user wants to use low-latency but also have Two-Pass?
I'm not necessarily upset about this change, but I am curious.
(I apologize in advance for the length of this reply. I had originally thought of making a new thread, but dumped most of it in here instead while asking questions relevant to the post.)