So in an effort to get around the degradation of my capture card (C985), and not wanting to purchase an XXXX dollar card, I've been messing around with LAN encoding again. It seems as though it's matured enough (MonaServer, QuickSync, the video source plugin) to give it another shot. Theoretically it should be able to outperform a good capture card without the compatibility issues.
A few things I've noticed: MonaServer is quite a bit more robust than the Adobe Media Server I used back in the day. It seems to only crash when you shove upwards of 400+ Mbps through it on one connection for a prolonged period of time, which is pretty ridiculous. I'm starting to think the crash happens because it hits the practical maximum of my 1 Gbps connection (not all cards can reach a true 1 Gbps). I didn't really put this together at first, but it seems as though even when you use a loopback address (localhost, 127.0.0.1), the traffic still ends up being seen by the adapter and thrown at the OS. I'm not entirely sure about this yet, but it's something I noticed: my streaming PC showed it was using close to the full 1 Gbps even though the upload from the gaming PC wasn't anywhere close.
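If anyone wants to sanity-check that loopback observation on their own setup, here's a minimal sketch in Python (assuming the psutil package is installed; the interface names it prints will obviously differ per machine) that samples per-adapter receive counters while the stream is running, so you can see which interface the RTMP traffic is actually being counted against.

    # Minimal sketch: sample per-interface receive counters while the RTMP
    # stream is running to see which adapter the loopback traffic is counted
    # against. Assumes the psutil package is installed (pip install psutil).
    import time
    import psutil

    INTERVAL = 5  # seconds between the two samples

    before = psutil.net_io_counters(pernic=True)
    time.sleep(INTERVAL)
    after = psutil.net_io_counters(pernic=True)

    for nic, stats in after.items():
        delta = stats.bytes_recv - before[nic].bytes_recv
        mbps = (delta * 8) / (INTERVAL * 1_000_000)
        print("{}: ~{:.1f} Mbps received".format(nic, mbps))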
QuickSync operating at 1080p@60 will stutter in high-motion scenes unless you turn the encoding preset down from 1 to 2, because the encoder can't keep up at the highest-quality setting. I have a 4690K, so your mileage may vary; earlier Intel processors had less powerful encoders.
ICQ/CQP functions very similarly to CRF: setting it to 21 produces results much like CRF=21. This is where the weird stuff starts happening.
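For anyone who wants to reproduce the ICQ vs. CRF comparison outside of OBS, here's a rough sketch driving ffmpeg from Python. It assumes an ffmpeg build with QSV support on the PATH, and "input.mp4" is just a placeholder clip; depending on the build, ffmpeg's -global_quality may not map exactly to the ICQ mode OBS uses, so treat it as a ballpark comparison rather than the exact same pipeline.

    # Rough sketch: encode the same clip with QuickSync at (roughly) ICQ 21 and
    # with x264 at CRF 21, then compare size/quality. Assumes an ffmpeg build
    # with QSV support is on the PATH; "input.mp4" is a placeholder.
    import subprocess

    SRC = "input.mp4"

    # QuickSync: "-preset veryslow" roughly corresponds to the highest-quality
    # target usage (preset 1); drop to "faster"/"veryfast" if 1080p60 stutters.
    subprocess.run([
        "ffmpeg", "-y", "-i", SRC,
        "-c:v", "h264_qsv", "-preset", "veryslow", "-global_quality", "21",
        "qsv_q21.mp4",
    ], check=True)

    # Software x264 at CRF 21 as the reference point.
    subprocess.run([
        "ffmpeg", "-y", "-i", SRC,
        "-c:v", "libx264", "-preset", "veryfast", "-crf", "21",
        "x264_crf21.mp4",
    ], check=True)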
Higher bitrate does not cause the encoder (QuickSync) to stutter; however, it does use more CPU cycles due to network overhead. OBS was using about 4% of my processor doing 1080p@60 with QuickSync at ICQ 21. Setting ICQ to 1 pushed that to about 11% on my gaming PC, and ICQ 1 would peak at around 400 Mbps in high-motion scenes in an FPS.
On the receiving PC, increasing the bitrate of course also results in an increase in CPU utilization by all of the associated programs.
The weird bit is that OBS would use more CPU cycles the higher the incoming bitrate (through RTMP into the video source plugin). Quite a bit more. Bitrate seemed to have a bigger impact on OBS's decoding performance than FPS did, and I'm not exactly sure why; I would have assumed high-FPS content should take more CPU cycles to decode than merely higher-bitrate content.
With ICQ=1, the receiving OBS would use around 90-100% of my 8350; with ICQ=11 it uses around 70% of my processor. Using my capture card with a 1080p@30 source it also sits at about 70%. Changing the FPS of the source material doesn't affect this at all.
This is all operating at the same encoder preset, same resolution, same downscaling, etc.; all else being equal.
Now, I assume the difference in CPU utilization is due to decoding overhead inside OBS, though it's weird that changing the FPS of the source material doesn't change it. The other possibility is that the capture card feed is actually missing information (because I have a crappy capture card), and the RTMP feed gives OBS more detail it now has to encode.
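One way to separate pure decode cost from everything else OBS is doing would be to benchmark a bare decode of the recordings outside of OBS. A rough sketch below, assuming ffmpeg is on the PATH; the filenames are placeholders for a high-bitrate (ICQ=1) clip, a lower-bitrate (ICQ=11) clip, and a capture-card recording.

    # Rough sketch: decode each clip to a null sink and let ffmpeg's -benchmark
    # flag report CPU time, to see whether the high-bitrate recording really is
    # more expensive to decode than the higher-FPS / lower-bitrate ones.
    # Assumes ffmpeg is on the PATH; filenames are placeholders.
    import subprocess

    CLIPS = ["icq1_1080p60.flv", "icq11_1080p60.flv", "capture_1080p30.flv"]

    for clip in CLIPS:
        print("--- decoding", clip)
        subprocess.run(
            ["ffmpeg", "-benchmark", "-i", clip, "-f", "null", "-"],
            check=True,
        )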
Technically speaking, graphics cards and many built-in iGPUs have had hardware decoders in them for quite some time now. Is it possible for OBS to leverage them? This may be a more niche scenario, but it's pretty easy to use LAN encoding nowadays (MonaServer runs under Windows, doesn't need *nix, doesn't need setup), so I imagine once word gets around it'll become a bigger deal.
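As a quick way to see whether those hardware decoders could already take this load (outside of OBS, at least), ffmpeg can be pointed at the QuickSync or DXVA2 decode path directly. A rough sketch below, assuming an ffmpeg build with QSV/DXVA2 support and reusing the placeholder clip from above; watch CPU usage in Task Manager while each one runs.

    # Rough sketch: decode the same clip via software, QuickSync, and DXVA2,
    # then compare CPU usage while each runs. Assumes an ffmpeg build with
    # QSV/DXVA2 support on the PATH; the filename is a placeholder.
    import subprocess

    CLIP = "icq1_1080p60.flv"

    decode_paths = {
        "software":  ["ffmpeg", "-benchmark", "-i", CLIP, "-f", "null", "-"],
        "quicksync": ["ffmpeg", "-benchmark", "-c:v", "h264_qsv", "-i", CLIP, "-f", "null", "-"],
        "dxva2":     ["ffmpeg", "-benchmark", "-hwaccel", "dxva2", "-i", CLIP, "-f", "null", "-"],
    }

    for name, cmd in decode_paths.items():
        print("---", name)
        subprocess.run(cmd, check=True)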
I'm curious about other people's thoughts and experiences regarding this. People may not even need capture cards in the future and could still get all the benefits of a separate capture PC. Although this isn't completely built into OBS yet, it's pretty close and really easy to do.
I do want to try all of this out with VCE, but I haven't had time to test it yet. I assume the results would be very similar, since at an insanely high bitrate you run into limitations other than the encoder (namely OBS itself).