Question / Help — OBS / AVerMedia LGX motion pixelation question

JoshOrnelas

New Member
Hi everyone!

So I've been streaming for a few months now using an AVerMedia LGX capture card (USB 3.0), and I've been wondering whether anything could make an external capture card pixelate during motion, even at higher bitrates. My current issue: I'm running everything at a standard 2400 kb/s (3500 for testing), with every other setting at the standard Twitch-recommended values, all at 720p @ 60fps (it happens at 30fps as well), and I still get pixelation during console play, but not as much during PC play.


Log file of 3500bitrate test stream: https://gist.github.com/ca91e15c44a6884d81ec

Sample footage at 3500: https://www.youtube.com/watch?v=EKYR6pRp8D4


PC Specs:

CPU: Core i7 4790k
Memory: 16GB HyperX DDR3
GPU: GTX 670
HDD: 2TB WD mechanical

Download speed is around 100 Mb/s, upload around 10-12 Mb/s.


I ask because I've noticed that while playing PC games, I get less pixelation/artifacting during motion than I do with the AVerMedia LGX at the same bitrates. Is this normal? Local recordings at 15000-20000 kb/s through OBS look fantastic, so I'm not sure whether the card is the bottleneck or my settings are.
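Rough napkin math on that gap, in case it helps anyone sanity-check my reasoning (illustrative only; encoders don't split bits evenly across frames):

```python
# Per-frame bit budget: my test stream bitrate vs. my local recordings.
stream_kbps, local_kbps, fps = 3500, 15000, 60
print(f"stream: ~{stream_kbps / fps:.0f} kbit/frame")  # ~58
print(f"local:  ~{local_kbps / fps:.0f} kbit/frame")   # ~250
```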

Thank you for your time.

-J
 

Boildown

Active Member
Need a longer log file. You only streamed for about 80 seconds; five minutes would give more reliable results.

Assuming I can believe what I see in your log file, your stream is fine. x264 encoding at low bitrates just looks like that when there's motion. You can try moving to the Faster preset instead of Very Fast, but make sure you're not getting more than 1% duplicated frames, or the trade-off won't be worth it.
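If you want to check that after a longer test, a quick back-of-the-envelope check looks like this (the counts below are placeholders; substitute the frame totals your OBS log actually reports):

```python
# Placeholder numbers; read the real totals from the end of your OBS log.
total_frames = 18000        # e.g. a 5-minute stream at 60 fps
duplicated_frames = 150

pct = 100 * duplicated_frames / total_frames
print(f"{pct:.2f}% duplicated frames")
if pct > 1:
    print("Faster is too heavy for this CPU; go back to Very Fast.")
```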

Also, you can encode at 30fps instead of 60fps. That allocates more bits to each frame of video, improving quality.
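To put rough numbers on it (the encoder doesn't literally split bits evenly, but the proportions hold):

```python
# Same bitrate, half the frame rate: each frame gets roughly twice the bits.
bitrate_kbps = 3500
for fps in (60, 30):
    print(f"{fps} fps -> ~{bitrate_kbps / fps:.0f} kbit per frame")
```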
 

JoshOrnelas

New Member
Understood, I can post a longer log file when I get back home later. Unfortunately it looks about the same at 30fps, but I've also been scrubbing through the forum for settings, and I've noticed people talking about bilinear/bicubic vs. lanczos scaling. I've been using lanczos for everything, so I'll see whether switching away from it gives me any improvement. Thanks for the feedback! I'll run a few tests when I get back home, and if I don't get better results, I'll post the longer log file.
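For anyone who wants to eyeball the filter differences outside OBS, something like this with Pillow gets close (these are Pillow's filter implementations, not OBS's shaders, so treat it as an approximation; the file paths are placeholders):

```python
# Downscale a captured frame with each filter and compare the results by eye.
from PIL import Image

frame = Image.open("frame_1080p.png")   # placeholder screenshot path
for name, flt in [("bilinear", Image.BILINEAR),
                  ("bicubic",  Image.BICUBIC),
                  ("lanczos",  Image.LANCZOS)]:
    frame.resize((1280, 720), flt).save(f"frame_720p_{name}.png")
```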

-J
 

JoshOrnelas

New Member
Made a few changes: went from lanczos to bicubic, changed the CPU preset to Faster (I'll probably switch back to Very Fast; there's no real difference there), tried 30fps with the same results, went back to 60fps since 60fps is what I'm trying to fix, and experimented with a higher buffer to see what happens.

Basically, same results: I'm getting blocky blur everywhere. I've also run tests with deblock=1:2 (and various other values) just to see what would happen. Pretty much the same thing.

This log covers a test of over five minutes, running with everything standard apart from the change to bicubic and the Faster preset.

https://gist.github.com/56c94ccf6dc6e8606716dba0df0a558a

If everything seems fine, then I guess it's just the nature of running the capture card with OBS. I still figure it's something in my settings, though, because I see people with less powerful machines run games at the same resolution, framerate, and bitrate, and their streams look much, much cleaner.

Once again, thank you for your time.

-J

Edit: I'm an idiot and made a log using 32-bit OBS. Here's the log from 64-bit.

https://gist.github.com/93e3ace059cc4a3dbf80312bf823a206

I can also provide a log from OBS Studio if that helps.

-J
 

Boildown

Active Member
23:59:19: Scene buffering time set to 400

Try setting that to 700. 400 was the old default, but 700 has been found to work better.

But I don't think that will have any dramatic effect. You may just have too-high expectations for how good real-time-encoded video can look, or you may be overly critical of your own video (because you know what it looked like originally) compared to other people's (where you don't).
 