With OpenCL, GPU encoding (such as Quick Sync and the AMD/Nvidia equivalents) has mostly fixed quality settings, but I'd honestly take either of them over nothing. It would actually be pretty cool to see how all of these would influence streaming when put together: offloading the little bit of encoding the primary computer has to do, then shooting the stream over the network to handle the rest.
I'm hoping someone with more experience with RTMP relays will stop in. Perhaps I'll have to make a new topic, as this one probably won't garner a lot of new views for this particular issue.
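For anyone else poking at RTMP relays in the meantime, here's roughly the idea in a minimal sketch (assuming ffmpeg is installed; the URLs are placeholders for your own source and destination):

```python
# Minimal RTMP relay sketch: pull a stream from one RTMP server and
# re-publish it to another without re-encoding (-c copy).
# Both URLs are placeholders -- substitute your own source and destination.
import subprocess

SRC = "rtmp://gaming-pc.local/live/stream"    # hypothetical LAN source
DST = "rtmp://live.twitch.tv/app/STREAM_KEY"  # hypothetical destination

subprocess.run([
    "ffmpeg",
    "-i", SRC,     # read the incoming RTMP stream
    "-c", "copy",  # pass audio/video through untouched (no CPU-heavy encode)
    "-f", "flv",   # RTMP expects an FLV container
    DST,
])
```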
EDIT:
Still working on this... I'm running into a few issues. Does anyone have any experience with Adobe Media Server? Using a bitrate of 900000, 120fps, quality of 10, sound at 320, 1680x1050, the ultrafast preset, and crf=0, I'm getting about 20% OBS processor usage. I end up with a bitrate around 17-150Mbps, which is nothing unreasonable. With crf=0 excluded I get about 8-20Mbps of bandwidth usage. I'm still messing around and trying to figure things out, but processor usage actually seems to go up with a higher bitrate and crf=0 for some reason (even though the encoder should be doing less work).
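A quick back-of-the-envelope calculation of the raw bandwidth at those settings puts the crf=0 numbers in perspective; even lossless output in the tens of Mbps is a big compression ratio over raw, so the encoder is still doing real work:

```python
# Back-of-the-envelope check: raw (uncompressed) bandwidth at the settings
# above, to put the 17-150 Mbps lossless figures in perspective.
width, height, fps = 1680, 1050, 120
bits_per_pixel = 12  # YUV 4:2:0, the usual x264 input format

raw_bps = width * height * fps * bits_per_pixel
print(f"Raw video: {raw_bps / 1e6:.0f} Mbps")  # ~2540 Mbps

# Even lossless crf=0 output in the tens of Mbps is a large compression
# ratio, so the encoder is still working hard at these settings.
print(f"Compression at 150 Mbps: {raw_bps / 150e6:.0f}:1")  # ~17:1
```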
What I do have issues with is that Adobe Media Server seems to start dropping connections at really high bit rates, and I don't know why... It's possible it's not made for bitrates above 15Mbps, but I sorta doubt that.
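If anyone wants to rule out the network itself before blaming AMS, a raw TCP throughput sanity check along these lines should do it (the port and transfer size here are arbitrary choices, not anything AMS-specific):

```python
# Quick LAN throughput sanity check: run with "recv" on the server,
# then run with the server's hostname on the desktop.
import socket
import sys
import time

PORT, CHUNK, TOTAL = 5001, 64 * 1024, 512 * 1024 * 1024  # send 512 MB

def receive():
    srv = socket.socket()
    srv.bind(("", PORT))
    srv.listen(1)
    conn, _ = srv.accept()
    got, start = 0, time.time()
    while True:
        data = conn.recv(CHUNK)
        if not data:  # sender closed the connection
            break
        got += len(data)
    secs = time.time() - start
    print(f"{got * 8 / secs / 1e6:.0f} Mbps over {secs:.1f}s")

def send(host):
    sock = socket.create_connection((host, PORT))
    sent, buf = 0, b"\0" * CHUNK
    while sent < TOTAL:
        sock.sendall(buf)
        sent += CHUNK
    sock.close()

if __name__ == "__main__":
    if sys.argv[1] == "recv":
        receive()
    else:
        send(sys.argv[1])
```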
EDIT 2:
So, an update on this... Adobe Media Server seems to crap out with bit rates higher than 20Mbps and will disconnect clients. You need a media server to act as a relay for LAN encoding, the other option being Xsplit's delay server (which is a premium Xsplit service).
I found that using 60fps, my native resolution, and crf=24, I could produce a picture indistinguishable from the source. It ended up with a bit rate of about 8-20Mbps, meaning AMS wouldn't disconnect the connected client (I set a bit rate of 20000, buffer 10000, and quality 10). On the server end I tried playing it in smplayer and capturing that into OBS, but it didn't work out very well. I ended up using Xsplit to capture the stream from the media server instead.
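If you want to check what a given crf actually averages before pointing it at AMS, something like this will report the overall bitrate of a local test recording (a sketch using ffprobe; the filename is a placeholder):

```python
# Sketch: ask ffprobe what bitrate a finished recording actually averaged,
# to check whether a given crf stays under the ~20 Mbps ceiling AMS
# seemed to tolerate. The path is a placeholder for a local test recording.
import json
import subprocess

def average_bitrate_mbps(path):
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", path],
        capture_output=True, text=True, check=True,
    )
    info = json.loads(out.stdout)
    return int(info["format"]["bit_rate"]) / 1e6  # overall audio+video rate

print(f"{average_bitrate_mbps('test_recording.flv'):.1f} Mbps")
```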
However, my server didn't have enough horsepower to produce an encode of similar quality to my normal stream; trying to stream at 720p@30fps wasn't working out. I have a Phenom II X3 710 in my server. I suspect OBS would have lower utilization, but since you can't use RTMP streams as a source in OBS, I had to forgo that option.
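One possible workaround for OBS not accepting RTMP as a source would be letting ffmpeg on the server do the pull-and-transcode itself. A sketch of that, with placeholder URLs and bitrate (this is essentially the same transcode my server struggled with, so your mileage may vary):

```python
# Sketch: ffmpeg on the server pulls the LAN stream, downscales to 720p30,
# and pushes to the public ingest. URLs and bitrate are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "rtmp://ams-server.local/live/stream",  # hypothetical LAN relay
    "-vf", "scale=-2:720",  # downscale to 720p, keeping aspect ratio
    "-r", "30",             # drop to 30fps for the public stream
    "-c:v", "libx264", "-preset", "veryfast", "-b:v", "3000k",
    "-c:a", "copy",         # audio is cheap; pass it through
    "-f", "flv", "rtmp://live.twitch.tv/app/STREAM_KEY",
])
```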
On the desktop side of things, the biggest change was going from the veryfast preset to the ultrafast preset, since there was ample bandwidth available. This reduced OBS's utilization by 50-70% depending on what was happening on screen, so this really does free up quite a bit of resources.
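If anyone wants to put their own rough numbers on the preset difference, timing the same clip through both presets works as a crude proxy for CPU cost (the clip name is just a placeholder for any local recording):

```python
# Rough sketch of the preset comparison: encode the same clip with
# veryfast and ultrafast and compare wall-clock time as a CPU proxy.
import subprocess
import time

def encode_time(preset):
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-i", "test_clip.mp4",
         "-c:v", "libx264", "-preset", preset, "-crf", "24",
         "-an", "-f", "null", "-"],  # discard output; we only want timing
        check=True,
    )
    return time.time() - start

for preset in ("veryfast", "ultrafast"):
    print(f"{preset}: {encode_time(preset):.1f}s")
```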
However, it doesn't seem very practical right now, as the software doesn't seem mature enough with regard to ultra-high bit rates. I could have Adobe Media Server configured wrong, but it seems it's not capable of handling extremely high bit rates from a single source (20Mbps+). And while OBS offered quite a reduction in processor usage, that came purely from selecting a lighter encoder preset.
So even though there were copious amounts of bandwidth available for the encoding process, the software couldn't fully take advantage of it. I'd suspect even lower processor usage if this were built into OBS, either for receiving an RTMP stream or for streaming directly from one OBS client to another. Xsplit has this functionality built in, but streaming with the higher bit rate settings yielded the same processor usage as streaming with OBS at a normal bit rate. I was unable to test the delay server since I don't have a premium Xsplit subscription, but I don't believe it would be worth it for reducing the workload on the primary computer.
It's still possible, though... I partly blame my slow server for not being able to encode a 720p stream with Xsplit.
In a nutshell:
-LAN streaming shows a 50-70% processor usage reduction for OBS in its current state
-Streaming software isn't currently optimized for extremely high bit rates (input, output, and transcoding)
-LAN streaming shows a lot of promise even with my duct-taped-together configuration
-There isn't an easy-to-use option right now (I haven't tried Xsplit premium)
-Sending streams visually indistinguishable from the source doesn't require a lot of bandwidth. Even operating at crf=0, bit rates didn't get higher than 200Mbps.
--Raising the bitrate didn't reduce CPU usage, which could in part be due to streaming software not being able to properly utilize extremely high bandwidth. Changing crf to 0, 18, or 24 only changed the quality of the stream, not processor usage.
I would very much like this to be an easy-to-use option in OBS; it would benefit anyone with a second computer capable of encoding, and it would instantly obsolete capture cards used purely to offload the primary computer's streaming workload. I can't imagine receiving an RTMP stream, or sending one to another client on the network, being that much work either, and that's without even optimizing the client for high bit rates.
Although there are a handful of people who use capture cards for actual input from a console or other outside source, the majority of capture-card PCs are used strictly to remove the encoding workload from the primary PC. In that regard, I see the current work to make every make and model of capture card compatible as a very low priority. Something like this would be a universal option open to everyone, and would replace all of those capture cards in that usage scenario anyway.