Question / Help 2 Issues

Techno2k9

New Member
Hi,

I use OBS as my only recording software, both for PC games and for playing games on an Xbox One or PS4, but recently I have noticed two issues.

The first issue is that when I open OBS I have to open the properties for my capture card and change the resolution from "Device Default" to "Custom" and then back to "Device Default" in order for it to be captured at 1080p; however, according to the log files it's only actually receiving 720p.

The second issue is that if I capture from the capture card itself the quality is really low, but if I open the card's own software, put it full screen, and then capture that video window, the quality is really good.



Now, the first issue does not present itself when using the card's own capture software, and neither does the second.

I have found something of a way around the second issue when capturing from the card itself, but it adds another problem, which means it's not a viable solution.

The workaround is to change the recording Preset from "Default" to "High Quality", the Profile from "Main" to "High", and the Level from "Auto" to "5.1", and then enable buffering in the capture properties for the device. This gives the quality that has always been available before, but it also introduces a sound bug where the audio from gunshots ends up anywhere from a couple of milliseconds to a few seconds ahead of the video, while all the mic and party chat audio from the console stays perfectly in time. This happens on multiple games, on both the Xbox One and the PS4.

Now, when I record PC games it's spot on; the video and sound quality are where they should be.


So on to hardware:

CPU: Intel Core i7-7700K @ stock

Motherboard: Asus ROG Strix Z270H

Memory: 32GB Corsair 2400MHz DDR4 @ stock

GPU: GTX 980Ti

SSD: Western Digital Blue 500GB (will be replaced when funds are available, as it's slow compared to my old Samsung 840 Evo)

HDD: Western Digital Red 3TB

Capture Card: Avermedia LGP Lite (would have a better one, but currently no funds to replace it)



Now the recording settings:

Encoder: NVENC

Format: MP4

Audio Tracks: 6

Bit-Rate: 50,000 (Overkill, I know, but I don't like low-quality videos)

Preset: Default

Profile: Main

Level: Auto

B-Frames: 2




Those settings are what I have always used with OBS, and they worked flawlessly for console gameplay until now.

I don't yet have a video uploaded that shows the difference between the two recordings, but I can upload one. The closest example is this: https://youtu.be/QNNxsTXHOLY (excuse the language from my friend). In that video it's blurry at the start and then goes clear right at the end, and that is basically the difference between capturing directly from the card through OBS and recording through the capture card's own software.

This is what those settings give me when I record PC gameplay. https://youtu.be/fek40Yr4jkY


Now, YouTube makes it look worse, but viewed from the PC the recordings look spotless, with no video quality or sync issues; the console recordings, however, just look really bad.

I do intend to replace the card, because I want a 60 fps card, but currently I cannot do that, so the only workaround that actually works is to record the video window of the Avermedia software using OBS, as that has the options I want to use.

I was just wondering if anyone else has come across these issues and maybe found a fix for them.
 

Attachments

  • 2017-09-18 14-42-45.txt
    10.1 KB · Views: 8

DeMoN

Member
Bit-Rate: 50,000 (Overkill, I know, but I don't like low-quality videos)
That's very low for NVENC, and that's with just the Main profile.
Come back with at least 200,000. But bitrate-based encoding is shit anyway; I would switch to CQP with a factor of 19 or lower.

Uploading just 1080p to YouTube is the next fail.
Use at least 2048x1152 to get at least their 1440p encoding preset, which has three times the bitrate.
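If you want to try that combination outside of OBS first, here is a minimal sketch of what it could look like as an ffmpeg re-encode driven from Python (the filenames are placeholders, and the exact NVENC options available depend on your ffmpeg build):

import subprocess

# Placeholder filenames - substitute your own recording.
src = "console_capture_1080p.mp4"
dst = "console_capture_cqp19_1152p.mp4"

# Re-encode with NVENC in constant-QP mode (roughly what OBS calls CQP 19)
# and upscale to 2048x1152 so YouTube assigns its higher encoding tier.
subprocess.run([
    "ffmpeg", "-y",
    "-i", src,
    "-vf", "scale=2048:1152:flags=lanczos",
    "-c:v", "h264_nvenc", "-rc", "constqp", "-qp", "19",
    "-c:a", "copy",
    dst,
], check=True)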
 

Fenrir

Forum Admin
That's very low for NVENC, and that's with just the Main profile.
Come back with at least 200,000. But bitrate-based encoding is shit anyway; I would switch to CQP with a factor of 19 or lower.

Uploading just 1080p to YouTube is the next fail.
Use at least 2048x1152 to get at least their 1440p encoding preset, which has three times the bitrate.

50k bitrate is more than enough, even for NVENC.

The YouTube comment is just nonsense. Can you please cite your source for this?
 

DeMoN

Member
50k bitrate is more than enough, even for NVENC.
NVENC is not enough even at 130 mbit. There's a reason why I avoid ShadowPlay.


The YouTube comment is just nonsense.

@Fenrir Really? What a disappointing reply. Woohoo. OK...

I personally use 2800x1750 (it's a 16:10 resolution; for 16:9 monitors you could use 3200x1800). 1750 pixels of height is enough to get their 4K encoding preset (the label on the player stays 1440p, but internally you get ID 315, which is their 4K VP9 HFR encoding).

Put my video
https://www.youtube.com/watch?v=rzKQcuTp1mo
into this
https://www.h3xed.com/blogmedia/youtube-info.php
and compare the bitrate of the normal 1440p encode (ID 308) with the bitrate of the ID 315 2800x1750 encode, and compare the labels.

OK, let's choose a 1080p comparison:
https://www.youtube.com/watch?v=UGu0ENA-RaI
Same thing - copy this URL into youtube-info.
Compare the bitrate, resolution, and label of ID 303 with ID 308.
Here is also the file size difference:
https://abload.de/img/unbenannt5134ck7f.png
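If you'd rather check it from a script than the web tool, a small youtube-dl snippet (just a sketch, assuming the youtube_dl Python package is installed) lists the same format IDs, resolutions and bitrates:

import youtube_dl

URL = "https://www.youtube.com/watch?v=rzKQcuTp1mo"

# Fetch the metadata only; nothing is downloaded.
with youtube_dl.YoutubeDL({"quiet": True}) as ydl:
    info = ydl.extract_info(URL, download=False)

# Print every video format YouTube produced for this upload:
# format ID, resolution, fps, average bitrate (kbit/s) and codec.
for f in info["formats"]:
    if f.get("vcodec") not in (None, "none"):
        print(f["format_id"], f.get("width"), f.get("height"),
              f.get("fps"), f.get("tbr"), f.get("vcodec"))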

But yeah, I just talk nonsense.
THANK YOU for telling me that I talk nonsense! I love you too. Maybe ask or test it next time instead of throwing out such an insinuation?
Sorry, but this made me a bit angry :(

1920x1080 will never look good with its weak bitrate. This Rise of the Triad video is at 45 fps. At 60 fps the quality difference would be even bigger, because the bitrate of their 303 encoding would be even more inadequate.
 
Last edited:

Fenrir

Forum Admin
@DeMoN You seem really, really confused, so I don't have anything further to say on this subject.

@Techno2k9 For high-quality, no-fuss recordings, use the recording quality presets under Settings > Output when in Simple output mode. In the Recording section, change the Recording Quality to Indistinguishable Quality, and then select your encoder.
 

DeMoN

Member
@Fenrir What's up with you? I gave you so much info and proof, but you still deny it? What do you need now? Screenshots?

And that coming from a moderator; that's really sad.

But I imagine you didn't even do what I told you... to see the proof...
Oh, you could even try it yourself, but... would that be too easy?
 

Xaymar

Active Member
@Fenrir What's up with you? I gave you so much info and proof, but you still deny it? What do you need now? Screenshots?

And that coming from a moderator; that's really sad.

I personally use 2800x1750 (it's a 16:10 resolution; for 16:9 monitors you could use 3200x1800). 1750 pixels of height is enough to get their 4K encoding preset (the label on the player stays 1440p, but internally you get ID 315, which is their 4K VP9 HFR encoding).

You do not get higher video quality by uploading a larger-resolution video anywhere; all videos will be resized before encoding, essentially resulting in wasted bandwidth and storage space.

This myth started some time ago with people misunderstanding what TotalBiscuit actually meant - people took it as getting better 1080p, while in reality all it did was give viewers the option to choose the higher quality settings (1440p, 2160p, ...).

1920x1080 will never look good with its weak bitrate. This Rise of the Triad video is at 45 fps. At 60 fps the quality difference would be even bigger, because the bitrate of their 303 encoding would be even more inadequate.

1920x1080x60 at NV12 with 50 mbit is more than enough to represent any image feature with a decent encoder. Even x264 at crf=11 will rarely, if ever, jump above 50 mbit, and you will not notice any difference between 1080p game footage at 50 mbit and at 100 mbit. The difference is much clearer at lower bitrates like 2500 vs 3500 vs 6000 vs 12000 for the same frame size.

This is based on data gathered from encoding with x264, NVENC, AMF and QuickSync. Of these, QuickSync on Haswell and AMF were the only ones requiring higher bitrates to actually produce good quality, both eventually peaking at around 120 mbit for intense game footage.
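If you want to reproduce that kind of measurement yourself, here is a rough sketch (filenames are placeholders) that runs an x264 CRF 11 encode of a lossless clip with ffmpeg and then reads back the resulting average bitrate with ffprobe:

import subprocess

src = "lossless_capture.mkv"   # placeholder: a lossless source recording
dst = "crf11_test.mp4"

# Encode the video stream with x264 at CRF 11; audio is dropped (-an).
subprocess.run(["ffmpeg", "-y", "-i", src,
                "-c:v", "libx264", "-crf", "11", "-an", dst], check=True)

# Read the overall average bitrate of the result, in bits per second.
probe = subprocess.run(
    ["ffprobe", "-v", "error", "-show_entries", "format=bit_rate",
     "-of", "default=noprint_wrappers=1:nokey=1", dst],
    capture_output=True, text=True, check=True)
print("average bitrate: %.1f mbit/s" % (int(probe.stdout.strip()) / 1_000_000))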

Additionally, uploads to YouTube will always murder your quality. A 50 mbit upload will be indistinguishable from a 200 mbit upload at the same resolution and framerate.
 

DeMoN

Member
Xaymar said:
You do not get higher video quality by uploading a larger-resolution video anywhere; all videos will be resized before encoding, essentially resulting in wasted bandwidth and storage space
DO WHAT I WROTE IN MY POST, DAMMIT.

Holy moly. You guys are getting annoying.

That's the WHOLE POINT of the slight upscale - to get the better-encoded versions from YouTube. You can upload 1080p to YouTube at 500 mbit and it will still look like garbage after YouTube re-encodes it, but it doesn't on their ID 315 encode.
What's so hard to understand about that?
And play a round of DiRT 4, or another bush-heavy game, and you won't believe how much bitrate you'll need for that, even with x264 CRF 20. ShadowPlay at 130 mbit and 2560x1600 already failed to stay free of visible quality loss in Project CARS 1 without motion blur.

TEST IT YOURSELF: upload a 1080p video and a 2048x1152 video, and the 2048x1152 one will LOOK BETTER ON YOUTUBE BECAUSE OF THE HIGHER BITRATE THEY GIVE IT.

It can't be that hard to understand, for fuck's sake.
 
Last edited:

Xaymar

Active Member
@DeMoN Please take the time to actually read the post before replying in a fit of rage. This place will be better for everyone if you calm down and actually write something that isn't in all-caps.

Here is a list of footage that was used to test what I just said:
  • Action Sequence with large color variations, high speed movement and lots of little detail at 3840x2160x60 RGB.
  • Forest Sequence with extreme amounts of foliage, transparent effects, postprocessing and color grading at 3840x2160x60 RGB.
  • 1920x1080x60 footage of a full 2h30m session of ARMA 3 on the Apex maps.
  • 1920x1080x60 footage of a full PUBG match.
  • 3840x2160x60 footage of a GTA 5 session.
All of these showed the same result; the ones with foliage were even able to compress much further with CRF-based x264 (average bitrate as low as 4 mbit with no quality loss).
 

DeMoN

Member
4mbit with no quality loss. Your eyes are dead.

And sorry for my rant, but it's getting annoying to be treated here like a little child when I have put a lot of effort, over years, into getting the most quality out of YouTube by analyzing their encoding myself. I made the effort to show you a tool so you could see for yourself how YouTube encodes their videos, and then apparently you didn't even use that tool. That's not fair.

__
Also, we now have a mix of YouTube re-encoding, local recording, and local encoding, and I'm losing track of what we are talking about.
 

Xaymar

Active Member
4mbit with no quality loss. Your eyes are dead. And sorry for my rant, but it's getting annoying to be treated here like a little child

The bitrate is the result of x264 CRF compression of lossless footage, with the CRF value set to 11 (way below what you'd normally use). The highest average bitrate was observed in the Action Sequence and the GTA 5 session, at 37.5 mbit. After that come ARMA 3 and PUBG at around 11 mbit, and finally the Forest Sequence at 4 mbit.

The flaw lies with your example, which is comparing quality options on YouTube, instead of the actual bitrate difference. If you uploaded two videos, one 1920x1080x60 and the other 2560x1440x60, then downloaded the 1080p60 quality option for both, you would see that they are almost identical in bitrate, with only very small variations from YouTube's rescaling.

Additionally, no matter what resolution you upload at, you will always get the same bitrate for each quality level after YouTube transcodes the file. Here is a list of the approximate video bitrates you get at various resolutions:
  • 144p30 = ~100 kbit/s
  • 240p30 = ~200 kbit/s
  • 360p30 = ~350 kbit/s
  • 480p30 = ~650 kbit/s
  • 720p30 = ~1250 kbit/s
  • 720p60 = ~1500 kbit/s
  • 1080p30 = ~2500 kbit/s
  • 1080p60 = ~3500 kbit/s
  • 1440p30 = ~5500 kbit/s
  • 1440p60 = ~8000 kbit/s
  • 2160p60 = ~21000 kbit/s
You would only get a quality benefit if the user can actually watch a higher quality. The "auto" setting will automatically switch between quality settings depending on screen resolutions and buffering speed, so unless a user explicitly sets their quality preset higher they will not see a difference.
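To put those numbers side by side, here is a quick bits-per-pixel-per-frame calculation over the list above (pure arithmetic on the approximate figures; the widths are assumed to be the standard 16:9 ones):

# (width, height, fps): approximate kbit/s from the list above
tiers = {
    (256, 144, 30): 100,    (426, 240, 30): 200,    (640, 360, 30): 350,
    (854, 480, 30): 650,    (1280, 720, 30): 1250,  (1280, 720, 60): 1500,
    (1920, 1080, 30): 2500, (1920, 1080, 60): 3500,
    (2560, 1440, 30): 5500, (2560, 1440, 60): 8000,
    (3840, 2160, 60): 21000,
}

for (w, h, fps), kbps in tiers.items():
    bpp = (kbps * 1000) / (w * h * fps)  # bits per pixel per frame
    print("%dx%d@%d: %.3f bits/pixel/frame" % (w, h, fps, bpp))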

However, back to the original issue now.

I don't yet have a video uploaded that shows the difference between the two recordings, but I can upload one. The closest example is this: https://youtu.be/QNNxsTXHOLY (excuse the language from my friend). In that video it's blurry at the start and then goes clear right at the end, and that is basically the difference between capturing directly from the card through OBS and recording through the capture card's own software.

Your video doesn't really show the issue; it looks fine to me. Do you perhaps have the quality option set to "Auto" instead of the maximum possible?
 
Last edited:

DeMoN

Member
The flaw lies with your example, which is comparing quality options on YouTube, instead of the actual bitrate difference
You didn't use the youtube-info tool, did you? Put my video URL in there and you'll see the bitrate YouTube gave it at every quality level.
If you uploaded two videos, one 1920x1080x60 and the other 2560x1440x60, then downloaded the 1080p60 quality option for both you would see that they are almost identical in bitrate
At what point did I say anything different?
But 2048x1152 is already enough to get their 1440p encoding. You'll now see in youtube-info that there is not just the ID 303 variant (1920x1080p60 VP9); you'll see another variant, 2048x1152p60 VP9, with the ID number 308, and the 308 preset is their 1440p encoding. But you'll also notice that the player still shows the 1080p label as the maximum. Now go back to youtube-info and you'll notice that the ID 308 2048x1152p60 VP9 entry also says its label will be 1080p, but it HAS the higher bitrate. You can also open my video, switch to 1080p, right-click in the video, select "Stats for nerds", and you'll see that it plays the 308 variant, not 303 (or, in the case of the first video, the ID 315 variant).

And that's why I recommend at least upscaling the video to 2048x1152 if you can't capture higher than 1080p directly, because the normal ID 303 encode has far too low a bitrate.

That's all I wanted to say...

PS: Before mid-July 2016, 4K used a quantizer-based encode (CRF or CQP, one of the two), and 1440p60 was 15 mbit instead of 10 mbit, but since then they have unfortunately reduced it to 10 mbit for 1440p and ~20 mbit for full 4K, and 2800x1750 only gets around 15 mbit :(

But still better than ID 303 with its 3500 kbit...
 