Question / Help: Stream quality

Nekoyasha

New Member
Oh hai! :)
I've got a question, actually three... First of all, my comp's specs:
CPU: Intel Core i7 3770 @ 3.40 GHz
MOBO: Gigabyte GA-Z77-DS3H
RAM: Kingston 4x 4GB DDR3 HyperX
GPU: Asus Strix GeForce GTX 970
MONITORS: LG 22EA53 21.5'' (1920x1080 - I'm using this one to play games) & Iiyama PL2202W 22'' (1680x1050 - for chat and other stuff XD)
OS: Windows 7 Professional
INTERWEBZ: 150 Mbps down / 15 Mbps up

What are the best OBS settings for me? :o I'm partnered with Twitch btw :)

Previously, I was using an i5 and a GTX 660, and started using a capture card: AverMedia Live Gamer HD Lite. When I changed these components to the ones above, I didn't really notice any change in my stream's quality. BUT, when I increased the bitrate to 5k (from 3.5k), the stream looked way better.
My monitor is still connected to the AverMedia, although while streaming I'm not capturing the game through it... Can that "block" the increase in quality?

BUT (another but... sowwie ;-;) for the past month I've been having ISP problems and I'm dropping packets for some reason. I tried streaming on my laptop, but the issue remains... Dunno what's going on. Sometimes I can't even stream at a 3.5k bitrate...

So... what really increases a stream's quality? >__< Maybe I don't really need a 5k bitrate to stream in awesome quality, but I dunno how to achieve it... What's the problem in my case? ;c


Thankies for all the answers in advance~
 

FerretBomb

Active Member
Please post a logfile, using the 'upload' function in the Help menu and pasting the log here. We really need it to offer solid advice based on your current settings.

Why are you using a capture card? They don't reduce any CPU or GPU load whatsoever if you're streaming from the system you're gaming on (in fact, they can INcrease load due to additional steps in the capture pipeline).

5000kbps is excessive even as a Partner, unless you have a specific reason (and generally have cleared it with Staff, if you plan to stream 5mbps regularly).

In the absence of a logfile:
- What resolution are you streaming at/do you intend to stream at?
- What framerate?
- What x264 preset?
- Are you streaming over wifi? (Don't do this. Seriously, don't. Lots of good reasons why it's a terrible idea.)

Even with the logfile:
- How much CPU load do you normally have while streaming?
- Are you monitoring system vitals like temperature, throttling, load?
 

Nekoyasha

New Member
Posting the log file (I was streaming at 1080p@60fps).
And I wanna stream at 1080p@60fps >__<
The thing is... the quality of my stream is really bad compared to others'... No matter if I stream at 1080p@60fps, 720p@60fps or 1080p@45fps... I dunno why my stream looks so bad. Other streams are razor sharp, but not mine... And I'm wondering if that's the AverMedia's problem... I'm not using it for streaming, as I've said; my monitor is just connected to it ;c
I'm not streaming over wifi, dun worry :D

CPU is loaded ~50-60% during streaming. And yes, vitals are fine ;]

Halpu~
 

Attachments

  • 2015-10-18-2310-48.log
    22.5 KB

FerretBomb

Active Member
You don't have the hardware for 1080p@60 with decent quality (x264 Veryfast is the benchmark, and even an i7-5960X may struggle with that, depending on the game). Also, 1080p@60 needs around 6000kbps to start looking good, and that's also the range where your stream can be mistaken for a denial-of-service attack and get you banned (talk to your Partner contact beforehand). It's happier at 8000+.

45fps should not be used. Stick to 60, 30, 20, 15, or 10: even divisors of a standard 60Hz refresh. You'll get less judder, and the stream will look smoother.
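A quick sketch of why non-divisor framerates judder on a 60Hz display (illustrative math only, not anything OBS reports): each streamed frame is held for some whole number of display refreshes, and when the framerate doesn't divide the refresh rate evenly, the hold times alternate unevenly.

```python
# Frame cadence on a 60 Hz display: for each stream framerate, count
# how many display refreshes each frame is held. Uneven hold times
# (a mix of 1- and 2-refresh holds at 45fps) are what you see as judder.

def frame_hold_pattern(stream_fps, refresh_hz=60, frames=8):
    """Return how many refreshes each of the first `frames`
    stream frames occupies on the display."""
    holds = []
    for i in range(frames):
        # refresh indices at which frame i starts and stops being shown
        start = round(i * refresh_hz / stream_fps)
        end = round((i + 1) * refresh_hz / stream_fps)
        holds.append(end - start)
    return holds

print(30, frame_hold_pattern(30))  # every frame held exactly 2 refreshes
print(45, frame_hold_pattern(45))  # uneven mix of 1- and 2-refresh holds
```

At 30fps every frame sits on screen for exactly two refreshes, so motion is perfectly regular; at 45fps the holds keep flip-flopping between one and two refreshes, which the eye reads as stutter.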

That AverMedia card can only capture up to 1080p@30. But since you're not using it for capture and it's just in-line? Doesn't matter; it's literally not being used. Just sitting between the GPU and monitor should have no effect whatsoever.

1080p@30fps at 3500-4000kbps should work out nicely. 5000kbps should look pretty good, but again should probably only be used for a game that needs it (lots of on-screen motion, rapid color and lighting changes, etc). With the leftover CPU, move to the Faster, Fast, or Medium preset instead, testing ~20 minutes at a time and watching temps/throttling. Most likely the razor-sharp streams are using a slower preset, which looks better, especially at higher resolutions/bitrates. I'd recommend aiming for 80-90% CPU usage while in-game.
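A rough way to compare those options is bits-per-pixel: the same bitrate spread over fewer pixels per second leaves more bits for each pixel, which is why 1080p@30 looks sharper than 1080p@60 at the same 3500kbps. The numbers below are an illustrative back-of-envelope calculation, not anything OBS outputs.

```python
# Bits-per-pixel heuristic: bitrate divided by pixels-per-second.
# Higher bpp at the same encoder settings generally means a sharper image.

def bits_per_pixel(width, height, fps, bitrate_kbps):
    return bitrate_kbps * 1000 / (width * height * fps)

for w, h, fps, kbps in [(1920, 1080, 60, 3500),
                        (1920, 1080, 30, 3500),
                        (1280, 720, 60, 3500)]:
    print(f"{w}x{h}@{fps} at {kbps}kbps -> "
          f"{bits_per_pixel(w, h, fps, kbps):.3f} bpp")
```

Dropping from 1080p@60 to either 1080p@30 or 720p@60 roughly doubles the bits available per pixel at the same bitrate, which matches the advice above.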

As far as starting to drop frames at 5000 goes, that's most likely between you and your ISP. Network fluctuation is a thing, after all; that's the reason 1/2-2/3 of your max upstream is advised as a secondary cap, if that's lower than the rate Twitch can handle. Alternately, the Twitch ingest could be mishandling the stream and be unable to keep up with the higher rate. They ARE only rated to work properly up to 3500kbps, after all; going past that is "here there be dragons" territory.
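The headroom rule works out like this for the 15 Mbps upload reported in the first post (a sketch with illustrative numbers, not a measurement):

```python
# "Cap at 1/2-2/3 of max upstream" rule, using the OP's 15 Mbps upload.

UPLOAD_KBPS = 15 * 1000          # 15 Mbps upstream, in kbps

low_cap = UPLOAD_KBPS * 1 // 2   # conservative cap
high_cap = UPLOAD_KBPS * 2 // 3  # aggressive cap

print(f"suggested bitrate cap: {low_cap}-{high_cap} kbps")
# A 5000 kbps stream fits well inside 7500-10000 kbps of headroom,
# so sustained packet loss at 5000 points at the ISP route or the
# ingest, not a lack of raw upstream capacity.
```

In other words, if the line really delivers 15 Mbps up, raw capacity isn't the bottleneck here.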


(Disclaimer: These recommendations are aimed at Partnered casting; if you are not a Twitch partner and are reading this, it is bad advice and should not be used. Stick to 720p@30fps, 2000kbps, x264 with the slowest preset your rig can handle while in-game.)
 

Nekoyasha

New Member
Thankies for your answer :)
A couple of months ago I was streaming games at 1080p@60 with a 5k bitrate with no problems (the stream looked really, really good), but since I started losing packets, I'm kinda confused about what to do to stream in nice quality at a 3.5-4k bitrate...

Yesterday I set the x264 preset to Faster (at 1080p@60fps). The CPU was loaded ~90-97% and I got scared :p Even OBS was giving me warnings, so I turned OBS off XD

Edit: I did some tests a while ago, switching between 1080p@60/30fps and 720p@60fps (with different presets). You can check the videos here (last 7) -> http://www.twitch.tv/neko_hime/manager/past_broadcasts
 