Question / Help: Which is better?

720p30 is historically one of the best compromise points; its low bitrate requirement (2000-2500kbps) makes your stream as accessible as possible.
480p60 would work if you were playing retro games and needed 60fps to display sprite-blitting transparency properly. Otherwise, 60fps is a luxury that is frankly a waste of bitrate, since it roughly doubles the bitrate needed.
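A quick way to see why 60fps roughly doubles the bitrate needed is to compute bits per pixel (bpp), a common rule-of-thumb metric for judging whether a bitrate is adequate for a resolution/framerate combination. A minimal Python sketch using the configurations discussed in this thread (the bpp metric is a heuristic, not an official spec):

```python
def bits_per_pixel(width, height, fps, bitrate_kbps):
    """Bits the encoder can spend per pixel per frame at a given bitrate."""
    return (bitrate_kbps * 1000) / (width * height * fps)

# Same 2500kbps budget spread over different resolution/framerate combos:
print(f"720p30: {bits_per_pixel(1280, 720, 30, 2500):.3f} bpp")
print(f"720p60: {bits_per_pixel(1280, 720, 60, 2500):.3f} bpp")
print(f"480p60: {bits_per_pixel(854, 480, 60, 2500):.3f} bpp")
```

Higher bpp means more headroom for the encoder; doubling the framerate at the same bitrate halves the bpp, which shows up on screen as the blocking described below.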
 
How does one set, let's say, OBS, to stream at 720p30?

I have been doing 720p60 at 2500 bitrate. But the stream quality doesn't always seem optimal (as in, there are visible blocks on screen). So I thought going 720p30 at a similar bitrate would make viewing smoother. Am I wrong?
 
Guess I found it, but now if I could get some feedback, if hijacking this thread is not against the rules:

Base Resolution: 1080; Output Resolution: 720; Common FPS Values: 30.
Encoder: x264; Bitrate: 3000.
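For reference, those GUI choices end up in OBS's per-profile config file. A sketch of roughly what the relevant keys look like in `basic.ini` (Simple output mode); the key names here are from memory and may differ between OBS versions, so set them through Settings → Video / Output rather than editing by hand:

```ini
; Hypothetical excerpt of an OBS profile's basic.ini -- illustrative only.
[Video]
BaseCX=1920
BaseCY=1080
OutputCX=1280
OutputCY=720
FPSCommon=30

[SimpleOutput]
StreamEncoder=x264
VBitrate=3000
```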


Will this work? I'm on AMD hardware and had been streaming with the AMD encoder, but I switched to x264 to put more of the work on the CPU; my Vega 56 seems to be doing most of the work, while my Ryzen 7 2700 should be the one carrying most of the streaming load, I guess.
Thank you and sorry @Jkingstan
 
Correct. Think of bitrate like paint. The resolution is the size of the wall you have to paint, and the framerate is the number of walls. If you have half as many walls, you can do a better job and apply twice as much paint, making them all look better.
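The paint analogy can be put in numbers: at a fixed bitrate, halving the framerate doubles the bits the encoder gets to spend on each frame. A quick sketch:

```python
def bits_per_frame(bitrate_kbps, fps):
    """Bits available to encode a single frame at a fixed bitrate."""
    return bitrate_kbps * 1000 // fps

print(bits_per_frame(2500, 60))  # → 41666 bits per frame
print(bits_per_frame(2500, 30))  # → 83333 bits per frame
```

Same paint (2500kbps), half the walls (30 frames per second instead of 60), so each frame gets roughly twice the coat.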

AMD's AMF encoder is utterly abysmal. AMD cares precisely enough about AMF to say the words "we care about AMF" and nothing more. It's the reason an nVidia card is strongly recommended for any livestreaming setup at present... NVENC is amazing at this point, and has pretty much negated the need for a 2PC setup entirely, with very few exceptions. If you have an nVidia 20-series card, there is no need for CPU encoding... the on-card encoder does as good a job as, or better than, 95% of the setups out there, aside from the $5000+ dedicated encoder machines that give only a marginal improvement that would only be noticed with side-by-side video. Maybe.

You'll want to use x264 Software encoding though, not AMF, so your CPU handles the encoding, yes. It'll be much better quality, and have fewer problems.
 
Amazing, Ferret. Thanks for the tips!

That being said, would it be worth raising the bitrate to 3000? The idea of a lower bitrate would be to remove the "bandwidth barrier" for viewers with poorer internet connections, especially during this period when loads of people are stuck at home using the internet.

And yeah, I think I was using the AMD encoder because of DOOM 2016 (which I was never able to stream after I went from a GTX 970 to my current Vega 56). However, with some tweaks here and there, I did at some point manage to get it running OK on the AMD encoder, which seems to be the only way for OBS to capture DOOM. But it's a pain. I may try it again one of these days, but for now I'm on x264, since my CPU is supposed to be good at it, being a high-core-count CPU (one of the reasons I got it).

Thanks!
 
If you mean DOOM Eternal, Game Capture support for Vulkan was added in OBS v25. Not entirely sure about DOOM 2016; that one should just work (though IIRC it let you choose between OpenGL and Vulkan?).

Keeping your bitrate down is almost ALWAYS a good idea. It increases the accessibility of your stream A LOT, as more people are able to watch smoothly.
3000kbps wouldn't be too bad, but I always just shake my head at new streamers running 900p/1080p60 at 6000-8000kbps, not realizing that they're not just shooting themselves in the foot, they're blowing off their whole leg when it comes to potential growth. It's easy to get lost chasing numbers.
I'm a Twitch Partner, and if I'm playing a game that doesn't need a ton of bitrate, I'll still drop down to the 2000-2500kbps range just to open things up for more potential new viewers... the transcode stacks don't produce great quality a lot of the time.
 
Well, as far as I can tell, not many people have managed to capture DOOM 2016, but I may try again, since I haven't tried in a long time now. The problem was exactly that OBS couldn't capture the game when it was running on Vulkan (in my case, at least).

OK, got it. I may stick with 2500 bitrate on Jedi: Fallen Order for a while and adjust from there (I only got this notion while playing Watch Dogs, as people told me it looked bad because the game is fast-paced).

So I am now sitting at 720p30, 2500kbps on Fallen Order. Let's see how this rolls! Thanks a lot for the help!
 
Well yeah, but I think there's some advantage to streaming at 30fps. I tried it today, and my channel had more quality options (not sure if that's a coincidence or not). Before, at 60fps, people could only choose one stream quality; this time they had options from 720 down to 160, which is a nice feature. So I'm definitely going to stick with 30fps for a while, as I don't think it hurt the stream much.

Thanks for the help and overall comments!
 
Those quality options are 'transcodes'. Unless you're a Partner, you get them somewhat randomly as resources are available (some say Affiliates get them more often, but there's no data to support that). Your resolution/framerate might play a role in making it more or less likely that you get them, but Twitch keeps the special sauce under their hat as far as the actual specific selection criteria, to avoid having people game the system. So yep, coincidence.
 
OK, I did a test livestream.
Does it look good for 720p30?
 