Question / Help SLI streaming stutters

AlfaPrime

New Member
Hey guys, so I just finished setting up my stream and it runs perfectly smooth with SLI off. The problem is that when I turn SLI on, the stream suffers from stuttering (not buffering, stutter!). It simply isn't fluid with SLI; the only fix is turning it off.
I want to know if I have to change anything specific in my settings:

SPECS:

BenQ 144Hz monitor
i7 4790K @ 4.6GHz
2x GTX 970 in SLI
16GB 2400MHz RAM
100Mb down / 20Mb up

Some Stream Settings:
5000 Kbps bitrate (no, it's not too much, since it works fine with SLI off)
I use NVENC instead of x264 since I don't really like losing FPS, so this is the best option for me.

If you guys think you need any more information just ask away.
Thanks in advance!
 

Harold

Active Member
And it's not OBS's fault.
It's the fault of SLI and of users who install SLI in 16-lane systems.
 

Osiris

Active Member
The problem is not only the 16 lanes; the problem is that OBS can only run on one GPU, so it only gets the frames from that GPU, not the other one. Since SLI alternates frames between the two cards, OBS only sees half of them. You can use game capture's multi-adapter compatibility option, but that comes with a performance hit.
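To make that concrete, here is a minimal D3D11 sketch (illustrative names and setup assumed by me, not OBS's actual code) of why a capture device is tied to exactly one adapter:

```cpp
#include <d3d11.h>
#include <dxgi.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Hypothetical helper: create a D3D11 device bound to a single DXGI adapter.
ComPtr<ID3D11Device> create_capture_device(UINT adapter_index)
{
    ComPtr<IDXGIFactory1> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(adapter_index, &adapter); // pick exactly ONE GPU

    ComPtr<ID3D11Device> device;
    D3D11CreateDevice(adapter.Get(), D3D_DRIVER_TYPE_UNKNOWN, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, nullptr);

    // Everything this device touches lives in adapter_index's VRAM; with SLI
    // alternate-frame rendering, the frames the OTHER GPU renders never pass
    // through this device, so a capture running here misses every other frame.
    return device;
}
```

The capture path is per-adapter, not per-game, and that's the whole issue in a nutshell.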
 

pandablaster83

New Member
This is just ridiculous.
You're right to feel that way; it is OBS's fault after all. They didn't write the code to support SLI properly, or refresh rates over 60Hz for that matter. It's not the number of PCIe lanes, since SLI is almost always run at 8x/8x, therefore still not going over the 16-lane limit. Maybe someday they'll add support for basic features.
 

Osiris

Active Member
You are talking nonsense. It is not OBS's fault; you can use SLI if you want with multi-adapter compatibility.
 

Osiris

Active Member
Not sure what you mean by proper support; using shared texture capture with two active GPUs is literally impossible.
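Roughly, here is why, as a hedged D3D11 sketch (open_shared_frame is a made-up helper, not OBS's code): the zero-copy path relies on opening a shared texture handle, and that only works from a device on the same adapter that created the texture.

```cpp
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Hypothetical helper: try to open the game's shared frame texture on the
// capture side. `shared` is the HANDLE the game's device exported.
ComPtr<ID3D11Texture2D> open_shared_frame(ID3D11Device *capture_device,
                                          HANDLE shared)
{
    ComPtr<ID3D11Texture2D> tex;
    HRESULT hr = capture_device->OpenSharedResource(shared, IID_PPV_ARGS(&tex));
    if (FAILED(hr)) {
        // The capture device sits on a different adapter than the device
        // that created the texture: there is no zero-copy path, so capture
        // has to fall back to downloading frames (multi-adapter compat).
        return nullptr;
    }
    return tex; // same adapter: read the frame straight from VRAM, zero-copy
}
```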
 

pandablaster83

New Member
Maybe like that, but I don't see why it's impossible to pick up each frame both GPUs are rendering, in the order they're being rendered. That shouldn't be impossible.
 

Osiris

Active Member
It is impossible with shared texture capture, and that is the most efficient way of capturing out there, other than maybe Nvidia's NvFBC, but that can't be used by 3rd parties.
What you describe is what multi-adapter compatibility does: it downloads the frames to system memory instead of using them directly from VRAM. But that comes with a performance penalty.
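For the curious, the download path is roughly this D3D11 pattern (a sketch with illustrative names, assumptions mine, not OBS's actual implementation): copy the frame into a CPU-readable staging texture, then map it.

```cpp
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Hypothetical helper: pull one rendered frame out of VRAM into system RAM.
void download_frame(ID3D11Device *dev, ID3D11DeviceContext *ctx,
                    ID3D11Texture2D *frame)
{
    D3D11_TEXTURE2D_DESC desc;
    frame->GetDesc(&desc);
    desc.Usage          = D3D11_USAGE_STAGING;      // CPU-readable copy
    desc.BindFlags      = 0;
    desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
    desc.MiscFlags      = 0;

    ComPtr<ID3D11Texture2D> staging;
    dev->CreateTexture2D(&desc, nullptr, &staging);

    ctx->CopyResource(staging.Get(), frame);        // GPU -> staging copy

    D3D11_MAPPED_SUBRESOURCE mapped;
    ctx->Map(staging.Get(), 0, D3D11_MAP_READ, 0, &mapped); // stalls until the copy lands
    // mapped.pData now holds the pixels in system RAM; the capture side can
    // re-upload them on its own GPU. The copy plus the stall is the penalty.
    ctx->Unmap(staging.Get(), 0);
}
```

Doing that 60+ times a second on full-resolution frames is where the performance hit comes from.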
 

AlfaPrime

New Member
Osiris said:
It is impossible with shared texture capture, and that is the most efficient way of capturing out there, other than maybe Nvidia's NvFBC, but that can't be used by 3rd parties.
What you describe is what multi-adapter compatibility does: it downloads the frames to system memory instead of using them directly from VRAM. But that comes with a performance penalty.

And how big of a performance hit are we talking about? How can I test that?
 

Osiris

Active Member
I am not sure how much of a performance hit it is, but you can easily check by enabling multi-adapter compatibility in game capture and watching the increase in CPU usage.
 