OBS benchmarking: 1080p@60FPS, CPU vs NVENC vs Quick Sync

Skymirrh

New Member
Hello,

I didn't know where to post this, so I thought the feedback section was as good a place as any... If I was wrong, please feel free to relocate this thread :)

I did some benchmarking this afternoon out of curiosity, so in case anyone is interested, here are the results:





If anyone wants the Excel file, just ask.

You can find screenshots taken from roughly the same position in the videos (the "00:00:41" mark differs a bit depending on the encoding conditions) at the following address: http://imgur.com/a/Xs3fO#0

Random things (I am by no means an encoding expert, please feel free to correct/complete):
  • I know 1500 kbps is way too low for decent 1080p encoding; my goal was just to have a common basis for comparison without going too high in bitrate, since in my opinion it's the streamer's duty to enforce a cap on bitrate (you have to take into account your viewers' download bandwidth, not only your upload bandwidth).
  • I haven't done Ultrafast VBR as it would have yielded the same results as for the other presets.
  • I haven't done Ultrafast without OpenCL since it didn't seem to take it into account anyway. Notice how the GPU load for Ultrafast rests at 10%, just like the other presets without OpenCL, whereas with OpenCL all the other presets had a 30%+ GPU load. Maybe the CPU is not loaded enough and decides it's not worth offloading some work to the GPU?
  • OpenCL seems to shave between 5 and 10% off the CPU load. Although I don't have numbers to prove it, I usually stream at 720p and have noticed OpenCL performing better than that: maybe OpenCL works better when used in conjunction with downscaling or smaller input resolutions? I would need another benchmark to check.
  • I haven't done any presets other than High/Best Quality on NVIDIA NVENC and Intel Quick Sync, because the quality was already not that good (you don't even want to see what it's like on High Performance) and there was no load bottleneck, so there was no need to use a faster preset.
  • As expected, NVIDIA NVENC takes all pressure off the CPU and doesn't use the GPU either, since encoding is done on a dedicated hardware chip. However, this comes at the cost of quality and requires you to use a very high bitrate to achieve decent results. I wouldn't recommend using it unless you have an absolutely shitty CPU, a tremendously good upload bandwidth, and know for sure your viewers have a tremendously good download bandwidth.
  • Intel Quick Sync performs way better than NVIDIA NVENC quality-wise, and eases off the same amount of CPU load. However, I have no idea why it puts load on the NVIDIA GPU while encoding (notice the 35% load when using Intel Quick Sync). The Intel HD GPU seems happy to rest at 60% load whatever the settings. I would recommend it if your CPU can't handle the encoding and you don't need a crystal-clear image.
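If anyone wants to redo the comparison themselves, the runs above are easy to script. A minimal sketch using ffmpeg from the command line (this assumes an ffmpeg build with the libx264, h264_nvenc, and h264_qsv encoders enabled; "test.mp4", the output names, and the preset pairings are my placeholders, not exactly what OBS uses internally):

```python
# Sketch: build ffmpeg argument lists to compare encoders at a fixed bitrate.
# Assumes an ffmpeg build with libx264, h264_nvenc, and h264_qsv available;
# "test.mp4" and the output filenames are placeholders.

def encode_cmd(encoder, preset, bitrate_kbps=1500, src="test.mp4"):
    """Return an ffmpeg argument list for one benchmark run."""
    out = f"out_{encoder}_{preset}.mp4"
    return [
        "ffmpeg", "-y",
        "-i", src,
        "-c:v", encoder,
        "-preset", preset,
        "-b:v", f"{bitrate_kbps}k",  # same target bitrate for every encoder
        "-an",                        # drop audio so only video encoding is compared
        out,
    ]

# One command per encoder; run each with subprocess.run() and time it.
commands = [
    encode_cmd("libx264", "veryfast"),
    encode_cmd("h264_nvenc", "slow"),   # NVENC's higher-quality end
    encode_cmd("h264_qsv", "veryslow"), # QSV's "best quality" end
]
```

Keeping the bitrate flag identical across encoders is what makes the quality screenshots comparable, like in my tests.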
I obviously must have missed a lot of things in this short analysis, feel free to complete ;)
 

Jack0r

The Helping Squad
Forum Moderator
Maybe it would be a good idea to capture your test footage from VLC instead of using the video plugin; that way the CPU load from playing the video is removed from OBS, and your duplicate frames might also go down.
Then I would like to know what the CPU/GPU values are. Average? Max? How did you calculate them? Is the load only OBS, or full system load?
Filesize could probably be removed.
Quicksync quality (and probably speed as well) depends a lot on the Intel HD version, so you can only benchmark Intel HD 3*** Quicksync capabilities. (I have 2*** and 4*** available and the difference is huge.)
You could do 720p and maybe try one or two more x264 presets.
Quality/Stuttering should be two different fields, with quality being maybe a scale from 1-10?

Definitely thanks for the time you invested!
 

Skymirrh

New Member
Hey, thanks for your input, these are good ideas!

CPU and GPU loads are the average load as a percentage of maximum capacity, as stated in the column header ;) I measured them using CPU Usage Logger and GPU-Z. Both record full system load (so not only OBS, but that's all I had open along with Excel) and log it to text files, which I used to compute the average over the recording period.
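The averaging step itself is trivial to script. A minimal sketch, assuming the logger writes one "timestamp, load" pair per line, comma-separated, with timestamps in seconds (the real CPU Usage Logger / GPU-Z log layouts may differ, so the parsing would need adjusting):

```python
# Sketch: average the load column of a logger text file over the recording
# period. Assumes one "timestamp, load" pair per line, comma-separated;
# real CPU Usage Logger / GPU-Z logs may use a different layout.

def average_load(lines, start=None, end=None):
    """Mean load over [start, end] (timestamps in seconds); None = no bound."""
    loads = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        ts_str, load_str = line.split(",")[:2]
        ts = float(ts_str)
        if (start is not None and ts < start) or (end is not None and ts > end):
            continue  # sample falls outside the recording window
        loads.append(float(load_str))
    return sum(loads) / len(loads) if loads else 0.0

sample = ["0.0, 10", "1.0, 30", "2.0, 50", "3.0, 90"]
print(average_load(sample))            # → 45.0 (all samples)
print(average_load(sample, 1.0, 2.0))  # → 40.0 (recording window only)
```

Restricting to the recording window matters: the load spike while OBS starts up would otherwise drag the average around.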
 

FerretBomb

Active Member
Yeah, just mentioning for anyone just getting into streaming who stumbles across this thread: 1080p@60 is not a viable streaming setting with current-day technical limitations, and an i5 definitely isn't going to have the juice to do it in realtime without going to superfast/ultrafast... which will look pretty crappy anyway.

Still some valuable benchmarking, if nothing else to quantify the common-knowledge that x264 is the way to go for quality, QSV is a somewhat passable second-best (until you get to the 4500+ series, then it actually gets good), and NVENC/shadowplay is just crap in general for the time being, but might actually help if you're on an i3 potato without QSV.
 

Boildown

Active Member
How many times did you run each test? I once tried to do some OBS benchmarking, but as I tested I noticed I was getting wildly inconsistent results with the exact same settings because my test video was too short. And redoing all those tests with a video long enough to produce consistent results, about five minutes minimum, wasn't something I wanted to do at the time, so I aborted the whole endeavour.

(My test was going to be of x264 and NVEnc on various video cards; I had a GTX 260, GTX 560 Ti, GT 630 (Kepler), GTX 680, and GTX 750 Ti. I was hoping to measure the effect video card speed has on OBS and find the cutoff where increasingly faster video cards no longer provide a benefit for dedicated OBS boxes.)
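The "inconsistent results" problem above can be checked numerically before committing to a full benchmark run: repeat one configuration a few times and look at the spread of the measured loads. A minimal sketch; the 5% relative-spread threshold is an arbitrary choice of mine, not anything from the thread:

```python
import statistics

# Sketch: decide whether repeated runs of the same settings are consistent
# enough to trust. The 5% relative-spread threshold is an arbitrary cutoff.

def is_consistent(samples, max_rel_spread=0.05):
    """True if stdev/mean of the measured loads stays under the threshold."""
    if len(samples) < 2:
        return False  # a single run tells you nothing about consistency
    mean = statistics.mean(samples)
    if mean == 0:
        return False
    return statistics.stdev(samples) / mean <= max_rel_spread

print(is_consistent([61.0, 60.2, 60.8]))  # tight cluster → True
print(is_consistent([40.0, 65.0, 52.0]))  # wildly inconsistent → False
```

If the check fails, a longer test video (as suggested above, five minutes minimum) is the usual fix, since short clips let one scene change dominate the average.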
 

Skymirrh

New Member
I ran each test once, sometimes twice because I messed up and forgot to write something down in the Excel file, but I didn't see any inconsistent results... The video length was 1m 12s, and with each test the CPU/GPU load charts always had spikes at roughly the same positions, so I don't know.

Maybe you had a background process running that you had forgotten about? The first time I did some benchmarking, to compare OBS/XSplit/another software which I don't remember, I had a program running to cycle desktop backgrounds (dual-screen background cycling is not very well supported by Windows 7), switching backgrounds every 30 seconds or so. So every 30 seconds there was a CPU spike while the software stitched two random images together and set them as my desktop background :D
 