OBS Stream Performance: AMD vs Intel using NVENC?

Cascoh

New Member
Hey guys.. I have a Ryzen 9 3900X & an ASUS RTX 2070 Super in my rig for gaming and streaming. When I did the build the new NVENC encoder wasn't out, and that's pretty much all I've been using, with no issues at all, since its release. With that said, would it make more sense to go with Intel for the single-core performance & FPS gain, since I'm not using x264? I max most games at 144 FPS, but some titles can take a big performance hit (Warzone and Apex). Any suggestions or experiences would be a great help. I'm looking at the i9-10850K at the moment for gaming and streaming. Will I notice any differences in terms of rendering and stream quality/stability going with Intel? I'm no pro editor, but I do use Sony Vegas and Photoshop fairly often for my clips. Thanks guys!
OBS Settings:
NVENC
936p
6000 bitrate
Max Quality
Games:
Cold War, Apex, CSGO and WoW!
Any more info can be provided later, as I'm at work at the time of this post :)
 

Lawrence_SoCal

Active Member
Hey guys.. I have a Ryzen 3900x & an ASUS 2070 Super in my Rig for gaming and streaming. When I did the build NVENC wasnt out and thats pretty much all I’ve been using with no issues at all since its been out.

I don't follow. So you have an NVIDIA RTX 2070 Super, right? NVENC has been part of OBS, and usable with Turing NVENC, since before the RTX release. (Sorry, I'm not a gamer, so if there was an issue with using NVENC at the initial RTX release, I'm not aware of it.)

In general, offloading real-time video encoding to a GPU (with dedicated encoding circuitry) instead of using CPU cycles frees up those CPU cycles for use elsewhere. It does consume some PCIe bandwidth, but PCIe bandwidth constraints are typically rare. So, why aren't you using NVENC now? Or are you? And why that odd video resolution (936p)? If you search this forum on high refresh rates, you'll see cautions about matching (IIRC) screen refresh and OBS fps (or at least even multiples)... sorry, not my thing, so I don't recall the details off the top of my head.
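To illustrate the even-multiple idea (just a throwaway snippet to show the arithmetic, not anything OBS-specific):

```python
# Hypothetical check: does the OBS output fps divide evenly into the display refresh?
# If the refresh rate isn't an even multiple of the capture fps, frame pacing can
# look uneven; this just shows which combinations divide cleanly.
def is_even_multiple(refresh_hz: int, obs_fps: int) -> bool:
    return refresh_hz % obs_fps == 0

print(is_even_multiple(144, 60))  # False -> 144 Hz is not an even multiple of 60 fps
print(is_even_multiple(144, 72))  # True  -> 144 / 72 = 2 exactly
print(is_even_multiple(120, 60))  # True  -> 120 / 60 = 2 exactly
```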

And for better settings advice, follow pinned post in this forum to post an OBS log AFTER running a stream/recording for more than a couple of minutes.
 

Cascoh

New Member
I don't follow. So you have an NVIDIA RTX 2070 Super, right? NVENC has been part of OBS, and usable with Turing NVENC, since before the RTX release. (Sorry, I'm not a gamer, so if there was an issue with using NVENC at the initial RTX release, I'm not aware of it.)

In general, offloading real-time video encoding to a GPU (with dedicated encoding circuitry) instead of using CPU cycles frees up those CPU cycles for use elsewhere. It does consume some PCIe bandwidth, but PCIe bandwidth constraints are typically rare. So, why aren't you using NVENC now? Or are you? And why that odd video resolution (936p)? If you search this forum on high refresh rates, you'll see cautions about matching (IIRC) screen refresh and OBS fps (or at least even multiples)... sorry, not my thing, so I don't recall the details off the top of my head.

And for better settings advice, follow pinned post in this forum to post an OBS log AFTER running a stream/recording for more than a couple of minutes.
Okay, so I apologize if my original post wasn't clear enough. To answer you:
1. Yes, I use NVENC on the 2070. The 10-series cards had NVENC too, but it wasn't as good as it is now, so people stuck with x264. With the new NVENC encoder, people are using it for its 1:1 quality with the x264 medium preset and better FPS compared to x264 (about a 35-45% FPS increase).
2. I stream at 936p @ 6000 bitrate with a 60 FPS cap, and lock my games to 144 FPS to preserve leftover resources for OBS. I never try to record at the same refresh rate as my monitor.
3. Why 936p vs 900p or 720p? A:
1600x900 seems to cause scaling issues on the client side, since 900 isn't cleanly divisible by 8. 936p (1664x936) is a true 16:9 resolution that IS cleanly divisible by 8, so it doesn't have that problem.
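For anyone curious, here's a quick sanity check of that divisibility point (a hypothetical helper, nothing built into OBS; it just works out which 16:9 heights give encoder-friendly dimensions):

```python
# For a given height, compute the 16:9 width and check that both dimensions
# divide evenly by 8 (encoders work on 8/16-pixel blocks, so odd sizes can
# force awkward scaling). Illustrative only -- not part of OBS.
def check_resolution(height: int) -> tuple[int, bool]:
    width = height * 16 // 9
    exact_169 = (height * 16) % 9 == 0        # height yields an exact 16:9 width
    mod8_clean = width % 8 == 0 and height % 8 == 0
    return width, exact_169 and mod8_clean

for h in (720, 900, 936, 1080):
    w, ok = check_resolution(h)
    print(f"{w}x{h}: {'clean' if ok else 'problematic'}")
```

Running it shows 1600x900 fails (900 / 8 = 112.5) while 1664x936 passes, which is exactly why 936p works out better than 900p.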
Will post a LOG when I get home soon :)
 