Question / Help Best Settings for streaming Xbox One?

pervysage

New Member
Hey guys, just recently got myself an AVerMedia ExtremeCap U3 capture card.

My PC specs are as follows:
CPU - Intel i7 2600K @ 3.40GHz
RAM - 16GB DDR3 Corsair Vengeance
GPU - 2GB Sapphire AMD Radeon HD 6970

My internet speed:
(speedtest.net screenshot)


What would be the best settings for streaming? (streaming my Xbox One game console only at the moment).

My goal is to stream 720p @ 60fps and have a really great looking stream.

I'm a bit confused about the "Base Resolution" setting in OBS. I currently have it set to 1920x1080 and then have it downscaled by 1.50 (1280x720). Is that the correct setting if I am streaming a game console? My Xbox One display settings are set to output at 1080p; however, I think the game I usually play (Call of Duty Ghosts) runs at 720p on the console.
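The math seems to work out, at least (quick sanity check of the downscale factor, nothing OBS-specific):

    # Rough check: a 1.50 downscale of a 1920x1080 base canvas.
    base_w, base_h = 1920, 1080
    scale = 1.50
    print(int(base_w / scale), int(base_h / scale))  # 1280 720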

Under my capture card's properties, should I have it set to custom resolution or just have it use default? Should I be using the Change Output Format and Buffering options under the capture card properties?

Any other settings I should take a look at?

Thanks for your time!
 

Krazy

Town drunk
That is the correct way to downscale, yes. I would recommend trying either the Bicubic or Lanczos downscale filter to clear up the image. Whichever one you use is purely personal preference.

You'll want to go no lower than 2500 kbps for 720p at 60fps, and the closer you can get to 3000 without dropping frames or causing your game to lag, the better (if you are playing online, anyway).
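To put those numbers in perspective, here's the rough per-frame bit budget (back-of-the-envelope only; a real encoder spends bits very unevenly across frames):

    # Rough per-frame budget at a given bitrate and framerate.
    for bitrate_kbps in (2500, 3000):
        for fps in (30, 60):
            print(f"{bitrate_kbps} kbps @ {fps} fps -> {bitrate_kbps / fps:.1f} kbit/frame")

Doubling the framerate halves what each frame gets, which is why 60fps wants the top of that range.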

Depending on the capture card, you may have to choose custom resolution and match it to the source you are capturing. In this case, you would set the card to 1920x1080 and then do the downscaling in OBS. For your particular card, you shouldn't need to mess with Output Format or Buffering options.

Once you've got everything working smoothly, without dropped frames or the encoder falling behind, you can start experimenting with a slower x264 preset. This compresses the video more efficiently and can possibly save you some bitrate. Since you aren't playing a game on the same computer you are using to encode, you have a lot more CPU power free to dedicate to encoding.
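If you want a feel for what a slower preset costs in CPU before touching your live settings, an offline test like this works (just a sketch; it assumes ffmpeg is on your PATH and that sample.mp4 is a short local recording you already have):

    # Rough sketch: time an offline x264 encode of the same clip at two presets.
    # Slower presets spend more CPU time for better compression at the same bitrate.
    import subprocess, time

    for preset in ("veryfast", "medium"):
        start = time.perf_counter()
        subprocess.run([
            "ffmpeg", "-y", "-i", "sample.mp4",   # sample.mp4 = placeholder test clip
            "-c:v", "libx264", "-preset", preset,
            "-b:v", "3000k", "-maxrate", "3000k", "-bufsize", "3000k",
            "-an", f"out_{preset}.mp4",
        ], check=True)
        print(f"{preset}: {time.perf_counter() - start:.1f}s to encode")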
 

pervysage

New Member
When running at 60fps I seem to be having issues with the stream skipping a little bit and it just doesn't look smooth like a 60fps stream should. When looking at the OBS window, no frames are being dropped.

When running at 30fps, everything seems to run pretty smooth. Is 2500-3000 bitrate really enough for 60fps?
 

Krazy

Town drunk
Have you updated to the very latest version of Flash? Check on Adobe's website; the latest update actually had a pretty noticeable playback performance increase.
 

pervysage

New Member
Krazy said:
Have you updated to the very latest version of Flash? Check on Adobe's website; the latest update actually had a pretty noticeable playback performance increase.

I'm starting to get things running more smoothly. However, I have a problem with my bitrate fluctuating quite a bit, which maxes out my upload speed and drops frames.

If I have it set at 2500 and begin streaming, I have seen the bitrate jump as high as 3000+. Is it supposed to fluctuate that much even though my settings are at 2500?
 

FerretBomb

Active Member
I'd strongly recommend going to 30fps instead. Aside from certain circumstances (and 'playing an FPS' is not one of them), 60fps is just stat-jerking. It'll make your stream look poorer and more artifacted, increase the CPU load needed to encode it, and the benefit is more psychosomatic than anything else. You'll get better results all around by running at a lower framerate and using the CPU you save to drop your encoding preset by a step or even two.

There's a good reason the non-partner 'sweet spot' is 720p@30fps at 2000kbps: decent to good visual quality without sending most of your viewers into buffering hell.

Also, speedtest.net is worthless for livestreamers, aside from saying 'yes, my connection is live'.
 

Krazy

Town drunk
I have to disagree. The extra load going from 30 to 60 fps is not really that severe, nowhere near the jump from 720p to 1080p. Assuming you are using reasonable bitrates, it can absolutely improve perceived quality. The smoother motion and additional frames per second actually reduce the impact/obviousness of artifacting/pixelation.

Anyway, bitrate fluctuations like that are normal, even if you have enabled CBR. Constant Bitrate isn't truly constant; the encoder does have to be allowed SOME leeway.
 

FerretBomb

Active Member
I'd have to disagree, based on testing. The load difference between the two is less than a 10% delta when I swap (which almost matches the pixels-per-second rate, though 1080p *is* slightly heavier), and the artifacting becomes pretty severe. Then again, I suppose I'm on an older CPU, though I really don't know how much of a difference that would make, given that it can handle both without a problem.
The 30 vs. 60fps difference is mostly psychosomatic (or stat-jerking, again), unless you're dealing with a game that blits sprites for transparency.
 

Boildown

Active Member
Pretty sure it depends on which x264 settings you're using (i.e., the preset) whether it's more CPU-intensive to go from 30 to 60fps or from 720p to 1080p. At 60fps, motion vectors between frames are halved and Twitch's 2-second keyframe interval effectively covers twice as many frames, both of which mitigate but don't eliminate the loss in quality from having fewer bits per frame. At 1080p, larger macroblocks are more likely to be used, which does the same for the higher resolution. I would guess that slower presets favor the larger frame size more than the faster framerate.
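Rough numbers, just to illustrate those two effects (the pan speed is a made-up example; the keyframe interval is Twitch's 2 seconds):

    # Rough sketch: at a fixed keyframe interval and a fixed on-screen pan speed,
    # doubling the framerate doubles frames per keyframe and halves per-frame motion.
    keyint_seconds = 2            # Twitch keyframe interval
    pan_speed_px_per_s = 600      # hypothetical horizontal camera pan
    for fps in (30, 60):
        print(f"{fps} fps: keyframe every {keyint_seconds * fps} frames, "
              f"~{pan_speed_px_per_s / fps:.0f} px of motion per frame")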
 

pervysage

New Member
Krazy said:
I have to disagree. The extra load going from 30 to 60 fps is not really that severe, nowhere near the jump from 720p to 1080p. Assuming you are using reasonable bitrates, it can absolutely improve perceived quality. The smoother motion and additional frames per second actually reduce the impact/obviousness of artifacting/pixelation.

Anyway, bitrate fluctuations like that are normal, even if you have enabled CBR. Constant Bitrate isn't truly constant; the encoder does have to be allowed SOME leeway.

Should I just lower the bitrate even more to stop the spikes from maxing out my upload? Earlier you said no lower than 2500 for 720p 60fps, but most of the time when I run at 2500, the spikes push the bitrate well above that.
 

Boildown

Active Member
Personally I stream 720p at 3000 bitrate and 2000 buffer (the important part is that the buffer is smaller than the bitrate) to decrease the spikiness of the stream. My upload is nominally 5000, but in reality I get somewhat less than that, and when I'm gaming, VOIPing, and streaming all over the same link, trying 3000/3000 affects my gameplay; 3000/2000 works great. So you could try lowering your buffer and see if it works for you as well. Note that lowering your buffer will decrease quality, especially in highly complex scenes, but it's probably better than the alternative.
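The intuition, roughly, is that the buffer size caps how far the encoder can run ahead of the average rate over any short window (this is the back-of-the-envelope version of the VBV model, not exact encoder behaviour):

    # Rough sketch: worst-case kilobits emitted in a 1-second window is roughly
    # maxrate * window + buffer, so a smaller buffer means smaller bursts.
    def worst_case_kbit(maxrate_kbps, buffer_kbit, window_s=1.0):
        return maxrate_kbps * window_s + buffer_kbit

    print(worst_case_kbit(3000, 3000))  # ~6000 kbit in a bad 1-second burst
    print(worst_case_kbit(3000, 2000))  # ~5000 kbit - noticeably less spiky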
 

pervysage

New Member
Boildown said:
Personally I stream 720p at 3000 bitrate and 2000 buffer (the important part is that the buffer is smaller than the bitrate) to decrease the spikiness of the stream. My upload is nominally 5000, but in reality I get somewhat less than that, and when I'm gaming, VOIPing, and streaming all over the same link, trying 3000/3000 affects my gameplay; 3000/2000 works great. So you could try lowering your buffer and see if it works for you as well. Note that lowering your buffer will decrease quality, especially in highly complex scenes, but it's probably better than the alternative.

Interesting.

I experimented with the bitrate and with setting the buffer lower, and sure enough it works. I'm not dropping frames, and the huge bitrate spikes have been reduced greatly.

It's kind of weird, though. I started off at around 2600 bitrate and 2200 buffer and kept moving the buffer up slowly. I ended up matching the bitrate and buffer and am still not dropping frames; the spikiness of the bitrate is also still reduced.

Could this have to do with simply checking the "Custom Buffer Size" box? Doesn't OBS match the bitrate and buffer size by default when the box isn't checked? Doesn't make any sense, heh.
 

Boildown

Active Member
Yeah, that's probably it. You could have saved yourself some trouble and guesswork by posting a log file, as that setting probably would have stood out.
 

pervysage

New Member
Haha, my bad. So are you supposed to have the "Custom Buffer Size" box checked at all times and just match the bitrate and buffer?

Just curious what OBS does by default if you don't have the box checked and only have the bitrate field filled in.
 