The GPU does matter. OBS uses the GPU to composite your scene, and once you start adding a lot of sources and building a complex scene, it needs a capable GPU to keep up, especially if your goal is 60fps.
That didn't occur to me (I was thinking merely of the encoding portion).
This was true, but as of 19.0.0 (the changes are already done; they're just pending the 19.0.0 release, due in the next few weeks at most) the BMD cards are very solid with OBS. There were some audio downmix problems and some sync issues that we were able to identify and correct. I've been using the BMD Intensity Pro 4K for a few months now, with no issues at all since those fixes.
This is great to hear. It sounds like reviews have picked up since the card's troublesome launch. How do you feel about the tiny fan on the heatsink? I've heard it's intolerably loud, which could be a problem for streaming since I run rather quiet setups. Do you send audio to your streaming rig over HDMI as well? (I presume the BMD can do that.)
This is where things kinda fall apart. 144fps gaming plus any capture card is going to be a huge headache for you. The method you described is a super hacky workaround for the limitations of capture cards, but it's your best bet: you set the capture card itself up as another monitor, then project OBS's output to it and capture that.
That's unfortunate. People seemed to suggest the OBS Preview method was a simple workaround that didn't have problems.
Is the continued limitation, even on the expensive pro/industrial capture cards, a matter of pure data transfer capacity? Since the card should just be passing the signal through for the CPU to encode, theoretically it should handle whatever you throw at it, right? So I presume the problem is that even with HDMI 2.0 you're looking at 18Gbps at *best*, which is only just enough for 4k@60 at 8-bit, and capture cards with DisplayPort inputs only open that up to about 25Gbps at *best*, I think? So until the industry comes out with a new transmission protocol/format (and it filters down into capture hardware), capture cards literally aren't going to get past 4k@30 or so.
So the only way to work around those limitations is to reduce what you're sending to the capture card to a signal it can carry, e.g. gaming at 2560x1440@120hz but sending 1080p@60hz. That requires you to define the capture card as a display of that resolution and send to it as such, which I presume can cause tearing issues? *Or* previewing via OBS at 1080p@60hz and sending *that*... which can also introduce tearing and other complications?
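For reference, here's the back-of-the-envelope math I've been working from. It's a rough sketch under my own assumptions: 8-bit RGB (24 bits/pixel), active pixels only, and approximate usable link rates after 8b/10b encoding overhead; real links also spend bandwidth on blanking intervals, so the true requirements run somewhat higher than these figures.

```python
# Rough uncompressed-bandwidth check for the video modes discussed above.
# Assumptions (mine): 8-bit RGB (24 bits/pixel), active pixels only.

def uncompressed_gbps(width: int, height: int, fps: int, bpp: int = 24) -> float:
    """Raw pixel data rate in Gbps for an uncompressed video mode."""
    return width * height * fps * bpp / 1e9

# Approximate usable data rates after 8b/10b encoding overhead (my numbers).
links = {
    "HDMI 1.4": 8.16,
    "HDMI 2.0": 14.4,
    "DP 1.2": 17.28,
    "DP 1.3/1.4": 25.92,
}

modes = {
    "1080p@60": (1920, 1080, 60),
    "1440p@60": (2560, 1440, 60),
    "1440p@120": (2560, 1440, 120),
    "4k@30": (3840, 2160, 30),
    "4k@60": (3840, 2160, 60),
}

for name, (w, h, fps) in modes.items():
    need = uncompressed_gbps(w, h, fps)
    fits = [link for link, cap in links.items() if need <= cap]
    print(f"{name:>9}: ~{need:5.2f} Gbps  fits: {', '.join(fits) or 'nothing listed'}")
```

If that arithmetic is right, 1080p@60 (~3 Gbps) and even 1440p@60 (~5.3 Gbps) fit comfortably within an HDMI 1.4-class link, while 1440p@120 (~10.6 Gbps) and 4k@60 (~11.9 Gbps) need HDMI 2.0 or better. So the caps on shipping cards look to me more like a receiver/firmware support limit than pure cable bandwidth, but I could be missing something.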
Do I understand, then, that the only clean way to do this is to play games at exactly the resolution and refresh rate the capture card supports? And I presume G-Sync has to be disabled?
As an aside, the BMD supports 1080p@60 and 4k@30, but it doesn't explicitly list 2560x1440 as a supported resolution at all. Does that mean it outright can't handle it, even though it should, by the math above, have the data transfer capacity for 2560x1440 at close to 60hz?
Sorry for the 20 questions, and sorry for continuing them *here*. I've been researching this for quite a while, and it's just so hard to find anything concrete. At best you get cryptic YouTube videos from streamers who somehow got some sort of setup mostly working, don't really understand it themselves, and then try to explain how it all works. Actual facts and reasons seem elusive, so all I've been able to do is patch things together in my head based on my software dev background.
Anyway, thanks very much for your time again. It is very much appreciated.