Question / Help Higher CPU usage in later versions?

Bkid

New Member
I've been using OBS 0473b Test 11 for a while now. My friend just recently got better Internet, so he's able to stream now, and when he went to download OBS, I went ahead and updated mine to the newest version (0.51b).

I noticed when changing versions that the newest version uses about 20% more CPU than the version I was using before (from ~41% to ~62%). When I was streaming my game (LoL) to test some things, I found that the game was lagging pretty badly due to OBS's high CPU usage.

My Computer: https://secure.newegg.com/WishList/MySa ... D=21661325 (Minus the graphics card since I've been poor recently :/ )

OBS_0473b_test11 Log: http://pastebin.com/UGTnc6HG

OBS_0_48_018_test Log: http://pastebin.com/rK330UPe

OBS_0_51b Log: http://pastebin.com/Sz4X7tUn

I previewed my stream for 30 seconds, watching the same video on YouTube each time. I noticed that some x264 stuff was missing from the newer version logs. Could that be related to the issue I'm having? If so, how can I get those settings back so it's not taking up so much CPU?
 

Lain

Forum Admin
Forum Moderator
Developer
If you look down at the bottom of the log, in the profiler section where it records lagged frames, you'll see that you're actually performing better in 0.51: you have shorter frame times and fewer lagged frames.

Now you may think, "shouldn't that be better?" It is, but it also causes you to encode more frames, and more frames means more x264 work, which ironically can eat up a little more CPU, especially at 1080p 60fps, which is ridiculously intense. If you look at some of the test threads with the newest builds, almost everyone reports better frame times and more efficient capturing, but some people in similar situations report higher CPU usage, because they're actually encoding more frames instead of lagging them like they used to. I made a lot of capture optimizations, and it seems this was an interesting side effect of them for some people.

35% of your frames lagged in test 11, whereas in 0.51 only 16% did. The average frame time in test 11 was 16.789 milliseconds versus 13.157 milliseconds in 0.51 (note: that's actually a really big difference; test 11's frame times were about 27% longer), so more frames are being encoded, and that's probably why it's using more CPU. That's just my best guess based on the data. In test 11, 35% of your frames weren't meeting the 60fps goal you had set, which is down significantly in 0.51. x264 is completely multithreaded, so it doesn't usually affect the frame timing itself directly, but it could potentially affect other things on your computer, so you have to take that into consideration. You're doing some seriously heavy HD streaming: 1080p at 60fps.
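To make the arithmetic concrete, here's a rough back-of-the-envelope sketch. The 60fps target and the lagged-frame percentages come from the logs above; the assumption that x264 CPU cost scales roughly linearly with the number of encoded frames is mine:

```python
# Rough estimate: frames per second that actually reach x264 when a
# given fraction of frames lag (miss the 60fps target).
TARGET_FPS = 60

def encoded_fps(lagged_fraction):
    """Frames per second actually handed to the encoder."""
    return TARGET_FPS * (1.0 - lagged_fraction)

old = encoded_fps(0.35)   # test 11: 35% lagged -> ~39 fps encoded
new = encoded_fps(0.16)   # 0.51:    16% lagged -> ~50 fps encoded

# If encoder CPU usage scales roughly with frames encoded (an assumption),
# the newer build hands x264 on the order of 29% more work.
print(f"test 11: ~{old:.0f} fps encoded, 0.51: ~{new:.0f} fps encoded")
print(f"relative increase in encoded frames: {new / old - 1:.0%}")
```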

My recommendation would probably be to turn down the framerate just a bit to leave some room for LoL, or to try adjusting process priority in the advanced settings, or to adjust LoL's priority in Task Manager. Changing priorities shifts CPU time toward one process or another, so it's something you might have to experiment with.
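If you'd rather script the Task Manager route than click through it every session, something like the sketch below works. This is purely illustrative: the use of the third-party psutil package and the "League of Legends.exe" process name are my assumptions, not anything OBS does for you.

```python
# Sketch: raise the game's process priority on Windows so it gets CPU
# time ahead of the encoder. Requires: pip install psutil
import psutil

GAME_EXE = "League of Legends.exe"  # assumed process name; adjust as needed

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        # ABOVE_NORMAL keeps the game responsive without starving
        # OBS/x264 the way HIGH or REALTIME priority might.
        proc.nice(psutil.ABOVE_NORMAL_PRIORITY_CLASS)
        print(f"Raised priority of PID {proc.pid}")
```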
 

Krazy

Town drunk
As someone who has a lot of experience with trying to do 1080p60fps, I will just chime in here and say: Don't do it.

There are several reasons for this, but the main one is that Flash Player basically can't handle it well, and it's very demanding to decode, so viewers with weaker PCs won't be able to watch smoothly.

Another is the bitrate requirement. You're looking at 3000 kbps MINIMUM for a watchable-quality stream with a game like LoL.
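To put that in perspective, here's a quick bits-per-pixel calculation; the numbers below are just arithmetic, and the idea that fast-motion games want more bits per pixel is a general rule of thumb rather than an official figure:

```python
# Bits per pixel at various bitrates for 1920x1080 @ 60fps.
WIDTH, HEIGHT, FPS = 1920, 1080, 60
pixels_per_second = WIDTH * HEIGHT * FPS

for kbps in (3000, 5000):
    bpp = kbps * 1000 / pixels_per_second
    print(f"{kbps} kbps -> {bpp:.3f} bits per pixel")

# 3000 kbps works out to roughly 0.024 bpp, which is why fast motion
# turns blocky at that rate; 5000 kbps is still only ~0.040 bpp.
```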

Even if you are ok with that and still want to do it, please for the love of god do not use Monitor Capture on Windows 7 while trying to do 1080p60fps.
 

hilalpro

Member
I would like to add that higher settings place requirements on the streamer as well as the viewer. Turning down the FPS or downscaling would benefit you both.
 

Bkid

New Member
I'll probably end up turning down the framerate a bit, and possibly downscaling (although I'd like to avoid too much of that, if possible). Also, my bitrate was kept the same from before my Internet speeds improved. I have about ~12Mbps upload, so I think it's safe to increase my bitrate a little (but I guess I shouldn't bother if I'm turning down settings anyway?).
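As a quick sanity check on the upload headroom (keeping the stream well under your total upload is a common rule of thumb, and the 128 kbps audio figure below is an assumption):

```python
# How much of a ~12 Mbps upload various video bitrates would consume,
# leaving room for audio and other traffic.
UPLOAD_KBPS = 12_000   # ~12 Mbps upstream
AUDIO_KBPS = 128       # assumed audio bitrate

for video_kbps in (3000, 3500, 5000):
    total = video_kbps + AUDIO_KBPS
    print(f"{video_kbps} kbps video -> {total / UPLOAD_KBPS:.0%} of upload")

# Even 5000 kbps video uses only ~43% of the line, so the connection
# isn't the bottleneck here; CPU is.
```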
 

Lain

Forum Admin
Forum Moderator
Developer
You could also try a 1.25 downscale with 60fps instead. It reduces resource usage a fair amount and still keeps the image at a fairly high resolution.
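For reference, here's what common downscale factors work out to from a 1920x1080 base resolution; this is just the arithmetic, and these factors all land on whole pixels:

```python
# Output resolutions for common downscale factors from a 1080p canvas.
BASE_W, BASE_H = 1920, 1080

for factor in (1.0, 1.25, 1.5, 2.0):
    w, h = int(BASE_W / factor), int(BASE_H / factor)
    print(f"{factor}x downscale -> {w}x{h}")

# 1.25x gives 1536x864 (the "864p" mentioned below); 1.5x gives 1280x720.
```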
 

Bkid

New Member
Krazy said:
As someone who has a lot of experience with trying to do 1080p60fps, I will just chime in here and say: Don't do it.

"Starcraft 2 stream @ 1080p60fps: http://www.twitch.tv/krazytrumpeter05"

xD

Also, it worked pretty well at 1.25 downscale and 60fps, but I went ahead and dropped it to 1.5 for increased performance... and so people will quit asking me what 864p is. :P

edit: Also, how do other streams give their viewers quality options to choose from? Mine is just stuck at 720p+ on Twitch, and it's not changeable.
 

Xphome

Member
Bkid said:
edit: Also, how do other streams give their viewers quality options to choose from? Mine is just stuck at 720p+ on Twitch, and it's not changeable.
That is a partner feature. The channel has to be partnered with Twitch.
 

Krazy

Town drunk
Bkid said:
Krazy said:
As someone who has a lot of experience with trying to do 1080p60fps, I will just chime in here and say: Don't do it.

"Starcraft 2 stream @ 1080p60fps: http://www.twitch.tv/krazytrumpeter05"

xD

Also, it worked pretty well at 1.25 downscale and 60fps, but I went ahead and dropped it to 1.5 for increased performance... and so people will quit asking me what 864p is. :P

edit: Also, how do other streams give their viewers quality options to choose from? Mine is just stuck at 720p+ on Twitch, and it's not changeable.

Damn, I need to change that signature, forgot I still had it.
 
Krazy said:
Even if you are ok with that and still want to do it, please for the love of god do not use Monitor Capture on Windows 7 while trying to do 1080p60fps.

Ever since I started using OBS, I have only had BAD experiences with Game Capture. I'm not sure why, but my stream is definitely worse when using Game Capture: the FPS is very inconsistent and the stream just stutters.

I've done 1080p 60fps, NO PROBLEM! The only thing is that for a quality 1080p stream, especially at 60fps, a very high bitrate is required (around 5000 I'd say).

I'm not sure why, but using ONLY Monitor Capture and playing ALL games in windowed fullscreen is the absolute only way for my stream to be 100%.

Here's a recent log file.
 

Attachments

  • 2013-04-28-1913-51.log (3.1 KB)

paibox

heros in an halfshel
Game capture puts more load on your video card, especially at 1080p60. Seeing as you already have lagged frames, it might be that whatever game you're streaming is taxing your video card too much at the settings you're running it at, resulting in even more lagged frames.

It's hard to say without seeing a log where you actually use game capture.
 

Bensam123

Member
Is it possible to offload this GPU strain to a different GPU with the GPU selector in video options?

Also, windowed mode is more CPU-dependent and game capture is more GPU-dependent? I've never heard of that before, nor has it been explained anywhere... What is the ratio of the workload split? Is window capture like 100% CPU and game capture like 90/10?

Edit: A quick test shows game capture uses less CPU (about 10%) and less GPU (about 10%) than screen capture. Tried this at 1080p@60fps on a 7870 and a FX3850. In high-action scenes the CPU gap goes up to about 20-30%; GPU usage remains unchanged.
 