I hate to say that I had to make an account just to ask this question, but this has been burning in my mind for far too long, and I can't find help anywhere else. So, here goes...
My former main PC was a 5820K with a 1080 Ti; where possible, I often just gamed and streamed from the same computer. After all, with GPU encode, what was the problem, right? Well... long story short, Yakuza Kiwami 2 was the harbinger of issues. If I tried to stream and game on that computer at the same time, my framerates got cut in half, even with GPU encoding enabled. The experience was not unlike some years back, when my then-main PC was a Core 2 Quad with a GTX 285 and I tried to stream to Twitch from it because it was my only PC at the time. It was very weird to see that kind of linear performance cut. Shocking, I know.
Thankfully, since those days I've been able to keep a separate build in the living room, for media and, later, capture and encoding duties for my consoles (retro and current). *That* build is an AMD FX CPU with a 780 Ti as of this writing. For what it's worth, it has handled those duties fine, and the GPU encode made seamless work of upscaling the retro captures and even passing the HDMI capture straight out to Twitch. It was around then that I first heard of NDI, and since I couldn't justify the cost of a 4K-capable capture card, nor did I have internet service with enough upload to stream at that quality, it seemed reasonable to use NDI to send the game feed over to this PC (scaled to 1080p to minimize distortion) and have it treated as just another capture source. That was the plan.
So, the good news. Starting with a higher-fidelity source like that, the resulting stream as seen on Twitch was really sharp and crisp. Also, the performance hit from streaming on the same computer was palpably gone with NDI: at worst I was losing maybe 1-2 fps versus just gaming. It was easily the best the stream has ever looked. The bad news is... after a while, the system would simply fall flat on its face with the encode, sometimes dropping the connection altogether.
As a sanity check, I tried one thing. In between then and building my new main PC, I had another system with a 2600K, slower memory, a slower GPU, and only Wi-Fi, so I put it on the network, installed OBS with the NDI plugin, and tried that out. Even with those disadvantages, the problems just disappeared. Everything was on the same LAN, and yet the older i7 handled it fine. It made no sense to me, since I was presuming NDI would use NVENC anyway; heck, both systems were configured to use NVENC!
So it's leaving me to wonder what NDI is doing, and what I can do about it. Unfortunately, I can't really use the i7 for the NDI capture/encode because the aforementioned FX is in an mATX cube case that fits much better with the home theater arrangement, and I'm trying to minimize the number of towers to appease the wife (I'm sure you can understand that ;) ). The result on the stream was inarguable, and if I ever come across another game demanding enough to challenge my present main PC, I want to keep NDI as an option.
So, the tl;dr: these symptoms make me think I need to consider a platform upgrade for the box that handles the NDI capture/encode/stream to the site. Am I correct in that hypothesis? If so, what makes the bigger difference for NDI: more cores, or more threads? Does it use the GPU at all, and can it be configured to lean on the GPU more?
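For context on that last question, here's a rough monitoring sketch I'm thinking of running on the FX box the next time it chokes, just to see whether the OBS process is pegging the CPU while the NDI source is live. To be clear, this is my own guesswork, not anything official: it assumes Python 3 with psutil installed, and that the OBS binary is named obs64.exe (adjust for your install).

```python
# watch_obs_cpu.py - rough sketch: sample OBS CPU usage while an NDI source is live.
# Assumes Python 3 with psutil (pip install psutil) and an OBS process named
# "obs64.exe" (that name is an assumption; change it to match your install).
import time
import psutil

PROCESS_NAME = "obs64.exe"  # assumption; edit to match your OBS binary

def find_obs():
    """Return the first running process whose name matches PROCESS_NAME, or None."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == PROCESS_NAME:
            return proc
    return None

obs = find_obs()
if obs is None:
    raise SystemExit(f"Could not find a running process named {PROCESS_NAME}")

# Prime the counters so the first readings aren't meaningless zeros.
obs.cpu_percent(None)
psutil.cpu_percent(interval=None, percpu=True)

while True:
    time.sleep(2)
    # Process cpu_percent() is relative to a single core, so divide by the core
    # count for a whole-machine percentage; the per-core list shows whether one
    # thread is maxed out even when the overall number looks modest.
    obs_pct = obs.cpu_percent(None) / psutil.cpu_count()
    per_core = psutil.cpu_percent(interval=None, percpu=True)
    print(f"OBS: {obs_pct:5.1f}% of total CPU | per-core: {per_core}")
```

My thinking is that if the 2600K sits comfortably during the same test while the FX has cores pinned at 100%, that would at least tell me whether more/faster cores is the right fix, or whether I should be looking at something else entirely (network, plugin version, etc.).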
Granted, if I have to upgrade, I'd be looking at cost-effective options first. But I'd be curious if anyone has gone to the other extreme and, like... set up a Threadripper build and just thrown all the cores at the task to see what happens.