Question / Help From my drawing board, summary and planning: 4K playing, 1080p streaming.

Hello everybody.

I'm new here, so let me first of all say hi and thank you all for all the information that I found in this forum. And also excuse me: English is not my first language, so I will definitely make some grammar and spelling mistakes.

I'm new not only here but also to streaming, so after reading tons of threads here and playing a little with OBS Studio, I think I've got some points down, but I also have some doubts. So I call on your knowledge and help to understand and plan my streaming configuration.

I have no issue (yet), so no log file to include. I just want to understand some options crystal clear: what to look for and what to expect from my build when streaming. So I'll try to write down here my own brainstorm from what I've learned so far (with my questions and doubts marked in red):

  • Beginning with my current setup:
I'm playing on a PC with a 5930K (3.75 GHz), 32 GB DDR4 RAM (3000 MHz), SLI 2x Titan X (Pascal) GPUs, all custom watercooled. My Internet connection is 300 Mbps up / 300 Mbps down.

I play almost every single game at 4K >60 fps with all in-game graphics settings maxed out, and HDR when the game has the option, on a 4K HDR10 single-monitor setup.

Sometimes I need to find a custom SLI profile to achieve that because of the lack of SLI profiles for some games. But this PC does a really decent job achieving that resolution, FPS and settings; in a lot of games, such as BF1, it goes up to >100 fps. I don't use V-Sync, so my GPUs are the most stressed component in the PC, with the CPU having some spare power.

  • What I would like to do:
I would like to stream using that single PC. The same PC to play the game and stream.

I'm not planning to use a camera. I just want to stream the game and a web browser source to display Streamlabs layers (in order to show some customization with username, stream alerts and all that).

And I want to stream it on Twitch at 1080p 60fps, 6000kbps bitrate.

  • And that's what I know about streaming so far:
Yes, 1080p 60fps 6000kbps is too much for a beginner. I know.

And that will make my stream buffer for possible viewers without a good internet connection. But I don't care about that. According to the latest Speedtest.net global index report, in my country (where the largest group of targeted viewers is), the average fixed broadband download speed is 96.3 Mbps and the average mobile download speed is 32.9 Mbps, so viewers' bandwidth doesn't seem to be a problem. My Internet bandwidth doesn't seem to be a problem either. So I would like to show the maximum image quality in the stream.

I know I must forget about HDR while streaming. It's really sad... but no streaming service has an option for it as far as I know, the required bandwidth would be insane, and HDR displays are an overpriced niche today... so there's no point in streaming HDR.

And streaming on Twitch also makes me forget about streaming 4K. Some time ago, I did some streaming on YouTube in 4K directly from Nvidia Shadowplay, and I think YouTube allowed viewers to reduce the resolution to fit their available bandwidth and their screens. But Twitch's bitrate and resolution limits are what they are, and I'm new there, so my viewers will have to receive my stream at the same resolution and FPS that I use.

That sets my target at 1080p, 60 fps, 6000 kbps bitrate, to try to show the maximum image quality as I'm playing the game.

I know it will directly impact my PC's performance and reduce the number of possible viewers. But there are tons of people streaming, and I think everyone needs to be somehow different to have something to show to those possible viewers. And I love to play every game at maximum graphic quality, with a detailed image and maxed graphics settings, even knowing that none of that is necessary to enjoy a game; for me it adds more pleasure to gaming. So I would like to show the image that I really enjoy to all my possible viewers.

So, 1080p 60 fps 6000 kbps bitrate is not negotiable, as long as my PC can maintain that stream while also maintaining good game performance.

  • How to achieve that 1080p, 60fps and 6000kbps stream from my PC:
  1. First of all, I need to rescale the game's image, as I play in 4K and I want to stream 1080p.
The best stream-wise way to do it would be setting my game's resolution to 1080p, which would also leave a lot of spare power on my GPUs for stream rendering... but that would be my last option, because I want to play the game the way I like to play it, not only stream it.
So I need to rescale the image at some point within OBS Studio. There are three possible ways to do that, right?
  • In a scene, when I add a game capture source, I can rescale that source from 4K to 1080.
  • In OBS Settings -> Video, I can set the scaled resolution to 1080.
  • In OBS Settings -> Output, Streaming tab, I can set "Rescale Output" to 1080.
I've read here that the flow is source scale => video scale => output scale, with video scale done on the GPU before the encoder, so it's much more efficient than output scale.
How efficient is the source rescale? I understand that it is also done before sources are laid out on the canvas, so it's done on the GPU like the video scale, right? So, performance-wise, do they have the same impact? And the same image quality?
I would like to maintain the best image quality with the rescale, but without it costing too much performance; that's why I prefer to use the GPU to rescale. I know that rescaling at the source or video stage will cap my maximum output at 1080, for streaming and also for recording. But I'm planning everything for streaming; when I want to record 4K gameplays to upload to YouTube, I'll use other settings.
By scaling the game capture to 1080, I think I will also be able to use 1080 web browser sources to insert Streamlabs 1080 layers. Everything in a 1080 canvas.
If I use a 4K canvas and scale the video to 1080, I'll need to use 4K web browser sources to fit the game capture's 4K without up-scaling the web browser sources, right? I'm afraid that up-scaling web source graphics will show a worse image than down-scaling the game capture. And a 4K canvas will also consume more power to process, right?
And also talking about game capture, I saw that there is an "SLI capture mode (beta)". I've read here that activating "Multi adapter compatibility" is far less efficient; does SLI capture mode have the same problem? As I play on a PC with SLI, I do not want OBS Studio to capture only half of my frames (the ones rendered on only one of the two GPUs I'm using), so do I need to activate SLI capture mode? What would be the difference between activating it or not?
With all that, I was planning a game capture source rescaled to 1080p with SLI capture mode (beta) enabled, a canvas size of 1080p and an output resolution of 1080p. Would that be alright? Is there any better / more efficient way to do the rescale?

2. With a proper 1080p output, I need to encode it to send it to Twitch.
As the GPUs take the heaviest load on my PC while playing games (both at 95% - 100% load), I thought it would be more efficient if the encoding work is done by my CPU.
I see no point in setting my stream encoder to NVENC instead of x264 if I have to lower my game settings in order to leave more GPU power to OBS Studio. I would like to keep the game's graphics settings as high as I can... so I understand that I need to encode with x264, right?
Also, I've read that NVENC can be more efficient but has less quality at lower bitrates. So, should x264 show a better image at 6000 kbps?
I understand that it will depend on the preset chosen, and that the preset will obviously impact CPU performance directly. I don't want to lose the game's FPS or get stutter from loading my CPU too much. Heat is not a problem, as everything is custom watercooled. So, what preset do you think / recommend for the best balance between quality and load?
With all that, I was planning x264, 6000 kbps bitrate, fast preset. Would that be alright? Is there any other way to show a better image without exceeding my PC's performance?
And that's all about my stream's drawing board.

If you have read everything, thank you very much; I know I wrote too much. But I'm a total noob and I really want to understand and be sure of what I'm planning.

Maybe this brainstorm and your help could be useful to people in a similar situation as mine, so thanks in advance for your attention and your help.
 

koala

Active Member
Capturing and recording/streaming from the same PC will always have an impact on the captured application, because capturing and encoding are extremely stressful for the hardware. You have to live with this. In some cases you have to trim back tuned game configurations to get a good stream. Truly invisible capturing is only possible with a capture card and streaming from a second PC.

Rescaling by changing the size of a source or rescaling with Settings->Video takes place on the GPU and probably has the same quality. You can set rescaling methods for both. It's probably the same GPU subroutine used for both and requires the same GPU utilization.

Rescaling in the encoder settings takes place on the CPU and its impact is huge. The larger the resolution, the larger the impact, since this rescaling is probably single-threaded in comparison to GPU rescaling, which is massively multiprocessing (shader-based). The difference in efficiency is probably an order of magnitude - I assume a rescaling of a 1920x1080 source takes perhaps 0.1 ms on GPU, which is negligible, while the same rescaling on CPU will take perhaps 1-5 ms, depending on the power of your CPU and the CPU load from other applications.
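To put those ballpark figures against the per-frame time budget at 60 fps, here is a quick sketch (the 0.1 ms and 1-5 ms numbers are the rough estimates above, not measurements):

```python
# Rough frame-budget check for the rescale estimates above.
fps = 60
frame_budget_ms = 1000 / fps  # about 16.67 ms available per frame at 60 fps

gpu_rescale_ms = 0.1  # ballpark GPU (shader) rescale cost from above
cpu_rescale_ms = 5.0  # worst-case ballpark CPU rescale cost from above

gpu_share = gpu_rescale_ms / frame_budget_ms
cpu_share = cpu_rescale_ms / frame_budget_ms

print(f"frame budget: {frame_budget_ms:.2f} ms")
print(f"GPU rescale: {gpu_share:.1%} of the budget, CPU rescale: up to {cpu_share:.1%}")
```

Under these assumptions the GPU rescale is noise, while a worst-case CPU rescale can eat close to a third of the time available per frame, which is why the encoder-side rescale hurts so much.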

About SLI game capture: unfortunately, this is not efficient. See this thread from not long ago: https://obsproject.com/forum/threads/the-ongoing-saga-with-sli.96068/
I will not comment further on it. If you insist on using SLI and do not accept trimming down your game, I recommend using a capture card and streaming from a second PC.

About nvenc, you can safely use nvenc on a heavily loaded GPU. It will not add to the GPU load, because nvenc is implemented as hardware circuit (ASIC) and does not use any computing resources on the GPU. If a GPU is 100% loaded from the game, every encoding will suffer, nvenc as well as x264, because OBS is not able to composite the video in time. This is due to OBS not getting enough GPU resources for itself for scene compositing. This does not have anything to do with the computing power needed for encoding. So, if you want the least impact on your machine from the encoding process, use nvenc. It produces about the same quality as the veryfast preset of x264.

By the way, streaming 1080p@60 fps with bitrate 6000 is reasonable and will produce acceptable quality, but it depends on the kind of game you want to stream. If it has high complexity, such as foliage and highly detailed textures, this might not be enough for perfect quality. People recommend to go up to 9000 in this case. For shooters, which don't usually have that high graphical complexity, 6000 should be ok. RPG games, on the other hand, with their highly sophisticated and crafted environment, may need more.
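A quick way to reason about these bitrate recommendations is bits per pixel: the bitrate divided by the pixels encoded per second. This is a sketch; the ~0.05 bpp threshold is a common rule of thumb for H.264, not an OBS or Twitch figure:

```python
# Bits per pixel (bpp) = bitrate / (width * height * fps).
# Rough H.264 rule of thumb: below ~0.05 bpp, complex scenes start to get blocky.
def bits_per_pixel(bitrate_kbps: int, width: int, height: int, fps: int) -> float:
    return (bitrate_kbps * 1000) / (width * height * fps)

print(bits_per_pixel(6000, 1920, 1080, 60))  # about 0.048: borderline for complex games
print(bits_per_pixel(9000, 1920, 1080, 60))  # about 0.072: more comfortable
print(bits_per_pixel(6000, 3840, 2160, 60))  # about 0.012: why 4K at 6000 kbps is hopeless
```

This matches the advice above: 6000 kbps at 1080p60 sits right at the edge, so simple shooters look fine while foliage-heavy games want more.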
 
Capturing and recording/streaming from the same PC will always have an impact on the captured application, because capturing and encoding are extremely stressful for the hardware. You have to live with this. In some cases you have to trim back tuned game configurations to get a good stream. Truly invisible capturing is only possible with a capture card and streaming from a second PC.
Thank you very much for your answer.

Yes, I know that using the same PC will impact its performance. I don't mind the stress suffered by the hardware; I just try to build a PC that can handle such stress.

I understand and accept that my FPS will go down while playing and streaming. But as long as my framerate and frametimes are good, I don't care. I play well over 60 fps in all games, so seeing the FPS count reduced even to 60 will be ok for me... as long as it doesn't produce stutter, etc.

Rescaling by changing the size of a source or rescaling with Settings->Video takes place on the GPU and probably has the same quality. You can set rescaling methods for both. It's probably the same GPU subroutine used for both and requires the same GPU utilization.

Rescaling in the encoder settings takes place on the CPU and its impact is huge. The larger the resolution, the larger the impact, since this rescaling is probably single-threaded in comparison to GPU rescaling, which is massively multiprocessing (shader-based). The difference in efficiency is probably an order of magnitude - I assume a rescaling of a 1920x1080 source takes perhaps 0.1 ms on GPU, which is negligible, while the same rescaling on CPU will take perhaps 1-5 ms, depending on the power of your CPU and the CPU load from other applications.
Thanks for confirming this.

If there's no difference in quality nor in performance between source scale and video scale, because both are done exactly the same way on the GPU, I prefer source scaling and setting the canvas, web browser sources and video output at 1080.


About SLI game capture: unfortunately, this is not efficient. See this thread from not long ago: https://obsproject.com/forum/threads/the-ongoing-saga-with-sli.96068/
I will not comment further on it. If you insist on using SLI, and do not accept any trimming down your game, I recommend using a capture card and stream from a second PC.
I understand your point.

But I've been a multi-GPU user for 12 years, and a dual-GPU setup is sometimes the only way to achieve the performance you need. As a long-term SLI user (but not only an SLI user), I have measured the benefit of SLI myself. And as I said in my previous post, sometimes you have to add or tune the SLI profile yourself because of the lack of decent SLI profiles, or even no SLI profile at all, for some games. And as an overclocking and benching fan, I have tons of proof of almost doubling the FPS count thanks to SLI.

I understand people who don't like multi-GPU setups. And they have a valid point. I would never recommend a dual-GPU setup if there is a single GPU that can deliver the same power (FPS). But when you need more power than the most powerful GPU can deliver, then you need more than one GPU.

And to reach 4K, HDR, over 60 FPS with all graphics settings maxed out in every game, even today with the 2080 Ti on the market, you need two GPUs. Same for multi-monitor setups with extreme ultra-wide resolutions. Obviously you can lower some settings and do the trick, but that's not the "all graphics settings maxed out" thing that I'm talking about.

We can talk all day long about SLI, but I understand that's not the point of this forum. SLI is still a valid configuration, much more efficient in games than people say (even knowing that it doesn't always work)... but anyway, it's the setup that delivers the graphics power I need to play at the resolution and settings I want.

So going to a single-GPU gaming setup is not an option for me.

Using a second PC with a capture card can work, and it's probably the best option. But I play at 4K over 60 FPS; even forgetting about HDR, that's a huge amount of data to capture. The Elgato 4K60 Pro can capture it, but it can handle only 60 FPS and doesn't have a built-in H.264 encoder. The Elgato HD60 Pro has a built-in H.264 encoder, but only 1080 resolution, and my PC's video output to my monitor is 4K... maybe some active HDMI splitter with a built-in downscaler, giving one output at 1080p while maintaining another at the original 4K over 60 FPS, could make it work. But this is something I'm not considering at this moment.

I would like to try to achieve a decent 1080p, 60 FPS, 6000 kbps bitrate stream while playing at 4K over 60 FPS on this PC, with the best quality of both the game and the stream that it can deliver.

My question about SLI capture mode was more because I don't understand whether "normal" capture mode (non-SLI) can capture all the frames rendered on both GPUs, or if half of the frames (from one of the two GPUs) will not be captured unless I enable SLI capture mode.

I understand that SLI capture mode downloads the backbuffer from VRAM to RAM, and that makes it far less efficient than the "normal" OBS Studio capture method. But this PC has 2 GPUs on PCIe 3.0 x16, a 40-lane PCIe CPU and 32 GB of 3000 MHz DDR4 RAM, so I was wondering if this PC can use the SLI capture method without a noticeable impact on performance, or how big that impact would be... or even whether it is necessary to use SLI capture mode at all to capture all the frames.
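To put a number on that worry, here is a rough estimate of the raw bandwidth such a VRAM-to-RAM copy would need for 4K at 60 fps. This is a sketch assuming uncompressed 8-bit BGRA frames (4 bytes per pixel), which may not match what OBS actually transfers:

```python
# Estimated bandwidth for copying 4K frames from VRAM to system RAM,
# assuming uncompressed 8-bit BGRA (4 bytes per pixel).
width, height, fps = 3840, 2160, 60
bytes_per_pixel = 4

copy_gb_per_s = width * height * bytes_per_pixel * fps / 1e9
pcie3_x16_gb_per_s = 15.75  # theoretical PCIe 3.0 x16 one-way throughput

print(f"backbuffer copy needs about {copy_gb_per_s:.2f} GB/s "
      f"({copy_gb_per_s / pcie3_x16_gb_per_s:.0%} of a PCIe 3.0 x16 link)")
```

The raw numbers fit comfortably within the link, which suggests the inefficiency would come from the GPU pipeline stalling while the transfer synchronizes, rather than from bandwidth itself; bandwidth figures alone can't answer the question.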

EDIT: just to understand my PC specs a bit more: this video is not from my PC, but my PC's specs are identical to one of the PCs being compared in it:
https://www.youtube.com/watch?v=WKgW8iEZe3w


About nvenc, you can safely use nvenc on a heavily loaded GPU. It will not add to the GPU load, because nvenc is implemented as hardware circuit (ASIC) and does not use any computing resources on the GPU. If a GPU is 100% loaded from the game, every encoding will suffer, nvenc as well as x264, because OBS is not able to composite the video in time. This is due to OBS not getting enough GPU resources for itself for scene compositing. This does not have anything to do with the computing power needed for encoding. So, if you want the least impact on your machine from the encoding process, use nvenc. It produces about the same quality as the veryfast preset of x264.
So the NVENC workload won't add more GPU rendering load. Then, if OBS Studio is using x264 (CPU) or NVENC (GPU) and taking the sources from a web browser and RAM (VRAM backbuffered to RAM due to SLI capture mode), how much GPU power does OBS Studio need, and what is that GPU power needed for?

I understand that streaming while playing on the same PC will impact PC performance, as said at the beginning of this post. And I assumed I will lose some FPS, and I don't mind losing some FPS performance while playing and streaming (this PC has enough headroom over 60 to accept some losses), but only if the resulting framerate and frametimes, with my CPU and GPUs taking load from both the game and OBS Studio, are still good. The main problem would be if this game+OBS workload results in some stuttering in the image. I want a good image for both streaming and playing, so my question here is how to achieve that, even stressing my PC more, but without stutter.

So, what should I use? x264 with fast/medium preset? NVENC?...

By the way, streaming 1080p@60 fps with bitrate 6000 is reasonable and will produce acceptable quality, but it depends on the kind of game you want to stream. If it has high complexity, such as foliage and highly detailed textures, this might not be enough for perfect quality. People recommend to go up to 9000 in this case. For shooters, which don't usually have that high graphical complexity, 6000 should be ok. RPG games, on the other hand, with their highly sophisticated and crafted environment, may need more.
You're totally right.

But I thought that Twitch only allows a maximum bitrate of 6000 kbps... If I can go any higher, I'll raise the bitrate.

YouTube allows much higher bitrates and resolutions, but YouTube is making some weird moves with content creators that are making a lot of streamers and viewers skip it and go directly to Twitch, Streamcraft, Mixer, etc.

I'm planning to stream on Twitch. So I have no option other than taking Twitch's maximum limits as my intended resolution and bitrate.

Once again, thank you very much for taking your time to read and answer my questions!!
 
Sorry for double post, but I've tried some different configurations, and here are the logs:

- NVENC coding, game capture scaled to 1080, SLI capture mode enabled, canvas 1080, output 1080: https://obsproject.com/logs/eUWeFHjNI8OPIG0D

- x264 fast coding, game capture scaled to 1080, SLI capture mode enabled, canvas 1080, output 1080: https://obsproject.com/logs/Oju4-pafUnBom3dR

- x264 very fast coding, game capture scaled to 1080, SLI capture mode enabled, canvas 1080, output 1080: https://obsproject.com/logs/LkbGoubXQB_18qGE

- x264 faster coding, game capture scaled to 1080, SLI capture mode enabled, canvas 1080, output 1080: https://obsproject.com/logs/RqCD9fSD5DraxF17

- x264 faster coding, game capture scaled to 1080, SLI capture mode disabled, canvas 1080, output 1080: https://obsproject.com/logs/EahyWHkhlrwdKNeZ

- x264 faster coding, game capture in 4K, SLI capture mode enabled, canvas 4K, output 1080: https://obsproject.com/logs/T5OEdgWJ6VRm--Cq

- NVENC coding, game capture in 4K, SLI capture mode enabled, canvas 4K, output 1080: https://obsproject.com/logs/FgQfsJoz2J2cogta

I was watching my own stream on a tablet via mobile internet just to see how it looks. And it seemed to me that the last option was the best-looking and most stutter-free... but the number of lagged frames due to rendering lag/stalls in its log is the biggest one. x264 with the veryfast preset looked really bad. With the faster preset it looks good enough. x264 with a 4K canvas and 1080p output sometimes seemed to have really noticeable stutter in the stream. A bit less stutter, but still some, with x264 fast at 1080 canvas and 1080 output.

But those are my impressions when watching the stream... I really don't know if they match what the logs say.

With all this data, what OBS Studio configuration would you choose?

Thanks in advance!!
 

koala

Active Member
In every log, you have very consistently lost frames due to rendering lag (14-37%). Nobody who tries to get a good stream can be satisfied with any of that. With this, you are asking which configuration is the least bad, not the best good.

For technical details, see the discussion in the SLI-discussing thread I quoted above.
It seems, even with two monster cards running at x16 speed, a good capture cannot be achieved.

The only idea I have is, for your tests, to remove or deactivate every source except your game capture source, to prevent non-capturing sources from tainting your tests. Although unlikely, it is possible that your browser sources are producing the lags and not the game capture; to eliminate this possibility, remove these sources.
 
I've re-done my tests: only one scene, and only one source (game capture). This time I also set Twitch to record the stream, so any of the tests can be watched there to really see what impact every single setup has on the overall stream's image quality.

1. NVENC, Game Capture 4K, SLI capture mode disabled, Canvas 4K, output 1080: 65 FPS average.
2. NVENC, Game Capture 4K, SLI capture mode enabled, Canvas 4K, output 1080: 68 FPS average.
3. NVENC, Game Capture 1080, SLI capture mode disabled, Canvas 1080, output 1080: 68 FPS average.
4. NVENC, Game Capture 1080, SLI capture mode enabled, Canvas 1080, output 1080: 67 FPS average.
5. x264 faster, Game Capture 4K, SLI capture mode disabled, Canvas 4K, output 1080: 66 FPS average.
6. x264 faster, Game Capture 4K, SLI capture mode enabled, Canvas 4K, output 1080: 60 FPS average.
7. x264 faster, Game Capture 1080, SLI capture mode disabled, Canvas 1080, output 1080: 71 FPS average.
8. x264 faster, Game Capture 1080, SLI capture mode enabled, Canvas 1080, output 1080: 63 FPS average.
9. x264 fast, Game Capture 1080, SLI capture mode disabled, Canvas 1080, output 1080: 62 FPS average.
10. x264 fast, Game Capture 1080, SLI capture mode enabled, Canvas 1080, output 1080: 68 FPS average.
The FPS numbers differ a lot because this was actual gameplay, not a synthetic test, so in every single run there were a lot of factors that could alter the number. Use them only as a rough index of in-game performance degradation. The normal FPS in this game, map, mode and settings on this PC without streaming is about 75-80 FPS average.
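For what it's worth, the ten averages above can be summarized against that 75-80 FPS baseline like this (a quick sketch using the 77.5 midpoint):

```python
# FPS averages from tests 1-10 above, compared against the ~75-80 FPS
# no-streaming baseline (using the 77.5 midpoint).
baseline = 77.5
fps = [65, 68, 68, 67, 66, 60, 71, 63, 62, 68]  # tests 1..10 in order

drops = [(baseline - f) / baseline for f in fps]
best, worst = min(drops), max(drops)
print(f"best: test {drops.index(best) + 1} ({best:.0%} drop)")
print(f"worst: test {drops.index(worst) + 1} ({worst:.0%} drop)")
print(f"average drop: {sum(drops) / len(drops):.1%}")
```

With these numbers, the in-game cost of streaming ranges from roughly 8% (test 7) to roughly 23% (test 6), about 15% on average, though as noted this is gameplay noise as much as configuration difference.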

If you watch the videos, please look for quality differences. And also be advised that all of them were recorded without a decent mouse... so try to differentiate bad mouse sensor response from stutter (look at the game's background to see whether the character movements were due to inconsistent frametimes, with the background also affected, or whether only the character is affected and the background moves normally). In the last two tests there's no doubt...

And doing only one test with each option can give non-significant numbers; I should repeat every single test as many times as possible to try to find representative data for each OBS Studio setting on my PC. So please forgive the lack of scientific methodology and let's try to trust these first shots.

I've already drawn my own preliminary conclusion, but I'm not an expert, so it could well be wrong. The more opinions it is tested against, and the better if they come from experts like the people on this forum, the better the final answer will be.

In every log, you have very consistently lost frames due to rendering lag (14-37%). Nobody who tries to get a good stream can be satisfied with any of that. With this, you are asking which configuration is the least bad, not the best good.
Yes, you're right.

But I want to start by finding the best settings to stream from this PC at 1080p 60 FPS 6000 kbps bitrate with a good-looking image... and then I'll try to fine-tune my game settings to lower the number of lost frames due to rendering lag as much as I can without compromising the game's image quality.

Thanks!!
 
Last edited:

koala

Active Member
I'm sorry, but every single video has unacceptable quality due to the existence of lagged frames. They are all choppy. I don't know how you rate video quality, but lost frames are an absolute no-no.
You cannot begin to rate visual encoding quality as long as you still have lagged frames. You may presume they are something minor you can deal with later, but they are not. In fact, they are probably a showstopper for you. The measures you have to take will probably be so drastic that you have to completely redo all encoding tests - after you fix the lost frames.
 
I'm sorry, but every single video has unacceptable quality due to the existence of lagged frames. They are all choppy. I don't know how you rate video quality, but lost frames are an absolute no-no.
You cannot begin to rate visual encoding quality as long as you still have lagged frames. You may presume they are something minor you can deal with later, but they are not. In fact, they are probably a showstopper for you. The measures you have to take will probably be so drastic that you have to completely redo all encoding tests - after you fix the lost frames.
Once again, thanks a lot for answering.

Obviously, I rate video quality based on a lot of factors, and one key factor is framerate.

But if I'm playing at, e.g., 60 fps and losing 10% to lagged frames while capturing, I suppose that I'm still capturing 54 FPS, a decent number close to the maximum of 60 FPS that Twitch allows. Maybe this is total nonsense; that's why I'm asking.
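The arithmetic behind that assumption, for reference (with the caveat that it treats the losses as evenly spread, which they usually aren't, so 10% lagged frames tends to look like stutter bursts rather than a steady 54 fps):

```python
# Effective captured frame rate after losing a fraction of frames to
# rendering lag, assuming the losses were spread evenly (they aren't).
def effective_fps(target_fps: float, lagged_fraction: float) -> float:
    return target_fps * (1 - lagged_fraction)

print(effective_fps(60, 0.10))   # 54.0
print(effective_fps(60, 0.093))  # the 9.3% case from the test below
```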

I've tried to look for those lagged frames in every single one of the videos above. Sometimes I can find them and I see inconsistent frametimes. Other times, I can't see them. That's why I'm asking people with more expertise here.

And as I stated before, it's my fault, but I used a cheap wireless mouse with no mousepad (really, I used it on my couch arm... the desk is a bit messy right now). And that "bad mouse" added a lot of choppy camera movements to all those videos. Sometimes it's not easy to see the difference, but you can tell when it is a choppy mouse movement and when the video itself is choppy.

I did some stream tests some time ago to other platforms such as YouTube, at higher resolutions and bitrates. Not needing to reduce the resolution or bitrate resulted in a smooth, efficient stream and an easy configuration. I did it using NVIDIA Shadowplay; maybe there were also lagged frames and all that... but it looked really good.

Now this compulsory downscaling to fit Twitch's maximum resolution and bitrate is driving me crazy.

EDIT: I cannot connect to Steam right now... maintenance, I suppose. So I cleaned my desktop, connected my "gaming" mouse and, using my mousepad, repeated the test with another game:

NVENC, Game Capture 1080, SLI capture mode enabled, Canvas 1080, output 1080
Log: https://obsproject.com/logs/PF7tR8_MR1giy-W8
Video: https://www.twitch.tv/josetortola/v/332574191

This test seems to show a consistent pattern: 9.3% of lagged frames due to rendering lag/stalls here, 9.2% in the previous test with another game but the same OBS Studio settings.

And yes, I would like to have a zero-lagged-frames result right from the beginning... but when I look at this video, it's difficult for me to say that this is a "choppy video". Do you see that choppiness in this one too?

I can be totally wrong, but I just want to learn how to get the best out of my hardware: whether I should use SLI capture mode or not, NVENC or x264, source scale or video scale. I want to know what my hardware's limits are. I thought this could be a good starting point to plan my stream, and after that I would lower some of the game's graphics options, or even cap FPS at 60, to try to achieve a stream with the best quality my hardware can give and no stutter (or "choppy video").

Sorry if I'm mistaking some concepts.
 
Last edited:
1. NVENC, Game Capture 4K, SLI capture mode disabled, Canvas 4K, output 1080: 65 FPS average.
2. NVENC, Game Capture 4K, SLI capture mode enabled, Canvas 4K, output 1080: 68 FPS average.
3. NVENC, Game Capture 1080, SLI capture mode disabled, Canvas 1080, output 1080: 68 FPS average.
4. NVENC, Game Capture 1080, SLI capture mode enabled, Canvas 1080, output 1080: 67 FPS average.
5. x264 faster, Game Capture 4K, SLI capture mode disabled, Canvas 4K, output 1080: 66 FPS average.
6. x264 faster, Game Capture 4K, SLI capture mode enabled, Canvas 4K, output 1080: 60 FPS average.
7. x264 faster, Game Capture 1080, SLI capture mode disabled, Canvas 1080, output 1080: 71 FPS average.
8. x264 faster, Game Capture 1080, SLI capture mode enabled, Canvas 1080, output 1080: 63 FPS average.
9. x264 fast, Game Capture 1080, SLI capture mode disabled, Canvas 1080, output 1080: 62 FPS average.
10. x264 fast, Game Capture 1080, SLI capture mode enabled, Canvas 1080, output 1080: 68 FPS average.

This is my first time analyzing this kind of logs... but here are my conclusions:

- Source scale seems to be done with much less impact on PC resources than video scale. The amount of lagged frames due to rendering lag/stalls is about 25% on average when the downscale is done at the video stage (canvas 4K, output 1080), vs. about 10% on average when it's done at the source (game capture rescaled to 1080, canvas 1080, output 1080).
- SLI capture mode makes lagged frames due to rendering lag/stalls go up, but not by much. There are two weird measurements: the difference between SLI capture OFF and ON using NVENC and a 1080 game source is -6%, and using x264 faster and a 4K game source it's +15.4%. But in the rest of the tests, the difference between SLI capture OFF and ON is about +2% lagged frames on average.
- NVENC has no skipped frames due to encoding lag, but its image quality is poor at only 6000 kbps. With x264 fast, the image quality (per frame, not the whole video) looks good... but the number of skipped frames due to encoding lag takes off to 45%, making the video stutter. x264 faster shows some skipped frames due to encoding lag, but only about 2.5%.

So, I guess my PC's best (or "least bad") OBS Studio streaming settings are x264 faster, game capture scaled to 1080 with a 1080 canvas and 1080 output... and, due to what seems a more fluid video in option 8 than in option 7, SLI capture mode enabled. But any other thoughts about this and comments about all those videos' quality would be very appreciated.

I'll do more tests with more games and those OBS Studio settings. It seems that I also have to lower some of the game's graphics settings to reduce the amount of lagged frames due to rendering lag/stalls. I want to aim for zero lost frames, but what values of lagged frames due to rendering lag/stalls and/or skipped frames due to encoding lag do you think are acceptable?
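For comparing runs like these, a small script can pull the percentages out of the log files instead of eyeballing them. This is a sketch; the regex assumes summary lines shaped like the ones quoted in this thread and may need adjusting for other OBS versions:

```python
import re

# Extract lagged/skipped frame percentages from OBS log text.
# Assumes summary lines shaped like:
#   Number of lagged frames due to rendering lag/stalls: 520/5600 (9.3%)
#   Number of skipped frames due to encoding lag: 12/5600 (0.2%)
PATTERN = re.compile(
    r"Number of (lagged|skipped) frames due to [^:]+: (\d+)/(\d+) \(([\d.]+)%\)"
)

def lag_stats(log_text: str) -> dict:
    stats = {}
    for kind, bad, total, pct in PATTERN.findall(log_text):
        stats[kind] = {"frames": int(bad), "total": int(total), "percent": float(pct)}
    return stats

sample = "Number of lagged frames due to rendering lag/stalls: 520/5600 (9.3%)"
print(lag_stats(sample))
```

Running it over all the saved logs would make the per-setting comparison in the bullet points above repeatable rather than hand-collected.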

Thank you very much once again.
 
Last edited:
One last question.

Thinking about using a capture card, the perfect scenario is using it in a second PC. But just to discard options: would using a capture card in the same PC add any benefit at all?

I've looked at the AVerMedia Live Gamer 4K and the Elgato 4K60 Pro, and I think the AVerMedia option has more potential. Both can do HDR passthrough (something Elgato also announced yesterday on Twitter for the 4K60 Pro) and higher bandwidth, but the AVerMedia can also record it. That could be a good point in the future.

But both of them rely on GPU power to encode the video... so I guess that either of them used in the same computer, streaming and gaming on the same PC, would add every single frame rendered by both SLI cards to the source, but wouldn't help with the workload at all, or would even make it worse?

So I looked at the Elgato HD60 Pro, which has a built-in H.264 encoder. That would require an HDMI splitter and downscaler to keep 4K to my monitor and 1080 to the capture card. But I guess the built-in H.264 encoder would leave on the PC "only" the workload of rendering sources into the scene and encoding the stream, right? That's not much help for a single PC gaming and streaming at the same time... but it would show the game with all the frames from both SLI GPUs.

Sorry for brainstorming again; I'm just trying to learn how it all works.
 