Question / Help: High CPU Use with Capture Card

I've seen a few threads about this but most of them aren't about my specific issue. Either that or they are about my specific issue but don't have enough details to yield an answer.

I do a moderate amount of streaming and recently got affiliated on Twitch (yay me). I thought I should start stepping up my game to celebrate this new status by utilizing an older machine I have for a dual PC setup, but I'm a bit confused about how OBS interacts with capture cards.

Put simply, when streaming through a capture card, specifically an Elgato HD60, OBS CPU use goes much higher than I've seen it go when simply encoding on the same machine I'm gaming on. On a still menu screen, OBS (using the capture card) sits at around 31% CPU use and, when playing, sits at around 49% to the high 60s, sometimes even 70% and above, depending on the game. This is using x264 encoding at 720p @ 60fps with a 4500 bitrate, the "veryfast" CPU preset, and a 1080p input downscaled to 720p with bicubic downscaling.
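For context on the settings, here's a quick bits-per-pixel sanity check on that bitrate (just a sketch using the common ~0.1 bpp rule of thumb streamers throw around, not anything from OBS itself; the function name is my own):

```python
# Rough bits-per-pixel figure for the settings above:
# 720p (1280x720) at 60 fps with a 4500 kbps video bitrate.
def bits_per_pixel(width: int, height: int, fps: int, bitrate_kbps: int) -> float:
    """Bitrate divided by the number of pixels pushed per second."""
    return (bitrate_kbps * 1000) / (width * height * fps)

bpp = bits_per_pixel(1280, 720, 60, 4500)
print(f"{bpp:.3f}")  # ~0.081 bits per pixel
```

So the bitrate itself is in a normal range for 720p60; the question is purely about CPU load.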

More details below:

The setup consists of a freshly built machine running an i7-7700K / GTX 1080 Ti and a dedicated stream PC that (brace yourselves) is actually a 2013-era iMac running Windows 10 via Boot Camp, with an i7-4771 and a GTX 780M.

I used to be a Mac guy, it's been a big year of changes for me.

I figured I could use my old Elgato HD60 I got a few years ago for some light capture work to take load off the main PC I game on and put the encoding work on the iMac PC (again, this machine IS running Windows 10, in case you skipped ahead).

However, when playing some games, CPU use on the iMac jumps to the high 60s, even low 70s, at 720p and 60fps. Obviously this varies from game to game, but on average CPU use never drops below about 31% while streaming on the 4771.

This seems high to me for a CPU that, while a few years old, isn't a slouch and is only doing the one thing. No other background programs are open aside from a popout chat window through Chrome. I never saw CPU use that high when encoding directly on the same PC I was playing games on.

I've since run many streams across various games (same settings in OBS every time) and consistently see the iMac stream machine hit an average of 45% - 51% CPU use on any given day/game. Games like Overwatch and Dishonored 2 see it jump to the higher 60% and 70% values I mentioned before.

For giggles I ran a short 3-minute stream on my 7700K machine and compared it to another 3-minute stream of the same game/level on the 4771: max CPU use hit 31.6% on the 7700K and 51% on the 4771. I made sure OBS was using the same settings on both machines before running the tests. Those log files can be found below.
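In case anyone wants to sanity-check my numbers the same way, this is roughly how I summarized the per-second readings from Task Manager (a hypothetical helper with made-up sample values, not my actual logs):

```python
# Hypothetical helper: given per-second CPU-use samples (percent) from a
# stream test, report min / average / max so runs on the two machines
# are compared with the same statistic.
def summarize_cpu(samples: list[float]) -> dict[str, float]:
    return {
        "min": min(samples),
        "avg": sum(samples) / len(samples),
        "max": max(samples),
    }

# Example with made-up numbers:
print(summarize_cpu([31.0, 45.5, 51.0, 49.2]))
```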

LOG FILES:

7700k test.

4771 test.

So what gives? Does OBS have to work that much harder to encode from a capture card? Would a different capture card lower CPU use? Is the difference I'm seeing in performance between the two systems (both utilizing the same capture card/settings) about standard for their difference in power?

Don't get me wrong, the stream works, but seeing CPU use hit ~60%-70% weirds me out. If I ever want to add more to the stream in the way of background programs or better settings, it doesn't seem like I'll be able to on this machine, at least at the moment.

I hope I haven't left any details out, please let me know if I did. Thanks for your time.
 