ToastyPillow
New Member
There's just so much conflicting information out there that I don't know what to do anymore. So I've finally decided to just make my own thread. I will keep this post as clear and concise as I can. I will split it into two halves.
I will start by simply posting some info about my rig.
Processor: Intel Core i7-3930K @ 3.20GHz
GPU: 2x GeForce GTX 770
RAM: 15.94 GB
Monitors: 2x LG 1920x1080 @ 60Hz
Okay, so as far as I'm aware, my rig (although starting to show its age) is still more than capable of great-quality recordings. The key word there is recordings; right now I am *only* trying to do local recordings to upload to YouTube. No streaming for now. Specifically, I am recording gameplay videos for games such as Battlefield or Assassin's Creed. It's just for fun, a hobby, but if I'm going to do it, I want to make sure the videos are good quality, in HD, and up to my own standards.
Unfortunately I have run into a LOT of confusion, and I have many questions. Please note that while I am more than willing to learn, I am not an extremely tech-savvy guy, so I'd appreciate it if you could do your best to explain everything as simply as you can, or at least include an additional explanation/answer in layman's terms.
1. Many people say that, when using the x264 encoder, lowering the "CPU Usage Preset" (i.e., choosing a slower preset) results in better quality at the cost of more CPU usage. However, I've come across numerous people saying the exact opposite for *local recordings*: that a higher (faster) preset results in better quality for local recordings only. What's up with that? Who is correct, and why?
2. What does deblocking mean? What does it do, or look like? For example, what does a video with good deblocking look like, in comparison to a video with poor deblocking?
3. What is a good CRF tuner for gameplay?
4. Why is bitrate important for local recordings? I thought bitrate and buffer size were only for live-streaming, not recording gameplay footage offline? Having asked that: what exactly *is* bitrate?
5. Why does it force me to set a Downscale Filter if I'm not downscaling?
6. Lastly, I'm absolutely lost when it comes to the color settings under the Advanced tab. How do I know what Color Format to use? What does YUV Color Space / Range mean? What do I set these to, and why?
---
Now, the second half of my post. After I give some background information, I will list my settings.
Using CBR for "Rate Control", with Bitrate and Buffer Size both set to 2500, I was getting weird footage. The game in question was Assassin's Creed 1, so not a very demanding game for my PC. During high-movement scenes, like running around and parkour sequences, the textures - and really the gameplay in general - would get even blockier and more pixelated. That's the best way I can describe it: blocky and pixelated. I tested with Rocket League, another game that isn't very demanding, with the same results and the same generally poor image quality. Can anybody explain what was causing this, and/or how to fix it?
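For what it's worth, here's my rough back-of-the-envelope math on that bitrate (assuming this is even the right way to think about it - please correct me if not): 2500 kbps spread across 1920x1080 at 60 fps works out to only about 0.02 bits per pixel.

# Rough bits-per-pixel estimate for my old CBR settings.
# This is just my own back-of-the-envelope math, not anything OBS reports.
bitrate_kbps = 2500          # Bitrate / Buffer Size I had set in OBS
width, height = 1920, 1080   # recording resolution
fps = 60                     # recording frame rate

bits_per_pixel = (bitrate_kbps * 1000) / (width * height * fps)
print(f"{bits_per_pixel:.3f} bits per pixel")  # prints ~0.020

If that number actually means something, I'm guessing it explains the blockiness during fast movement, but I'd appreciate confirmation from someone who actually knows.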
Since then I've changed Rate Control to CRF. I have it set to 10, because I was told the lower the better (just like with the CPU Usage Preset). While the general image quality seems to have improved in Rocket League (I haven't retested Assassin's Creed yet), the footage does not look smooth. I don't know how else to explain it other than that it doesn't look smooth. I mean, it doesn't look like an old silent film either, but a lot of YouTubers have video quality so smooth that it looks exactly like you're playing the game yourself.
And *this* is where more confusion crept in. I read several threads where people said to put the preset back to veryfast or even superfast. So I did, and it does appear smoother. But now I'm not sure if it's all in my head and I'm just tripping myself out. Maybe I'll need to privately upload some test footage of both games and link the videos here to see what other people think. I guess it won't really matter what I think if 100 other people think it looks fine.
My settings, as promised. I'm running the latest version of OBS Studio (0.16.4) on Windows 7.
-Output-
Mode: Advanced
Type: Standard
Format: mp4
Encoder: x264
Rate Control: CRF (value set to 10)
CPU Usage Preset: veryfast
Profile: high
Tune: None
VFR: unchecked
x264 Options: N/A
-Video-
Base Resolution: 1920x1080
Output Resolution: 1920x1080
Downscale Filter: Bicubic (Sharpened scaling, 16 samples)
FPS: 60
Aero: Disabled
-Advanced-
Process Priority: Above Normal
Renderer: Direct3D 11
Color Format: NV12
YUV Color Space: 601
YUV Color Range: Partial
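Just to make sure I'm describing my setup accurately, here is roughly how I *think* those Output settings would translate into x264 options, written out as an ffmpeg call. This is purely my own sketch for discussion - OBS doesn't literally run this, and the input file name is made up - so please tell me if I've misunderstood any of the mappings:

# My rough understanding of how the OBS Output settings above map to x264
# options, expressed as an ffmpeg call. Only a sketch for discussion;
# OBS does not actually run this command, and the input file is hypothetical.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "captured_gameplay.avi",  # hypothetical capture file as input
    "-c:v", "libx264",
    "-preset", "veryfast",          # CPU Usage Preset: veryfast
    "-crf", "10",                   # Rate Control: CRF, value 10
    "-profile:v", "high",           # Profile: high
    "-pix_fmt", "yuv420p",          # my guess at the NV12 / partial-range setting
    "-r", "60",                     # FPS: 60
    "recording.mp4",                # Format: mp4
], check=True)

If that mapping is wrong anywhere, knowing where would probably teach me a lot on its own.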
---
My ultimate goal here is to get these questions answered. If more information, such as a log file or a test recording, is needed, please just let me know. Help is greatly appreciated, and I'll try to be equally helpful in getting my issues resolved.