Turns out ReLive makes for a great alternative to QuickSync+nginx when paired with an RTMP server.
Why use ReLive over OBS? Simple: you remove literally all the overhead OBS adds on the gaming PC.
And let's be honest, if you're doing scene composition and the like, you're probably doing it on your stream PC anyway (because consoles and such).
Just as a comparison, I used the "indistinguishable" setting of the OBS VCE plugin over a loopback connection, to rule out any bandwidth-related problems (in this day and age, with a USB 3.1 crossover cable, an uncongested 10 Gbps connection is feasible). This resulted in around 14,000 kbps of bandwidth during high motion and 30% CPU usage on an Intel i5-4690. To put that in perspective: my usual benchmark for stream performance is restreaming a 1080p60 Sonic the Hedgehog video, and the combined load of an OBS client streaming 1080p60, an OBS client decoding 1080p60 from RTMP, and YouTube playing a 1080p60 video pushed me to 100% CPU usage and absolutely tanked my framerate. Since no frames were being dropped, the decoded stream played back at 50% speed.
In rolls ReLive, which uses a whopping 4% CPU to output a constant 10 Mbps stream (the maximum quality setting you can choose for streaming). That's a 26-percentage-point drop in CPU usage.
As far as I'm aware, ShadowPlay doesn't support custom RTMP streaming yet and has no plans to, so if you use Nvidia you're still stuck running OBS on the gaming PC and then either transcoding your stream with ffmpeg on the RTMP server or piping it to a second OBS client.
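For reference, that transcode step on the RTMP server boils down to a single ffmpeg command. This is just a sketch: the local stream path (`live/stream`), the ingest URL, and `STREAM_KEY` are placeholders you'd swap for your own setup.

```shell
# Pull the incoming RTMP stream from the local server, re-encode it with
# x264 at a streaming-friendly bitrate, and push it to the service ingest.
ffmpeg -i rtmp://localhost/live/stream \
  -c:v libx264 -preset veryfast -b:v 6000k -maxrate 6000k -bufsize 12000k \
  -c:a aac -b:a 160k \
  -f flv rtmp://live.twitch.tv/app/STREAM_KEY
```

(If the incoming stream is already at the bitrate you want, `-c copy` instead of re-encoding keeps CPU usage near zero.)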
Now, as for why I listed this as feedback:
I noticed in my testing that OBS still has an extremely high delay on its RTMP input. As a point of reference, VLC's already-slow RTMP input is a whole 10 seconds faster than OBS. I have no idea if this can be fixed, or if it's even on the drawing board, but that'd be a gem.
Another thing: I recall Jim saying he had code for an RTMP server for OBS? Is there any timeframe for when that could be added? My current setup basically involves Mona starting up on Windows startup and then just pressing my stream hotkey (since ReLive is driver-level, I don't have to actually run anything on the game PC). While that works fine, removing one extra step makes my life easier and makes the setup slightly less intimidating for new users.
To be honest, we don't even really need a full server; we just need an RTMP option that listens on port 1935 for incoming streams (although I guess you could call that a server, by a loose definition).
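To give a feel for how little is needed, ffmpeg's native RTMP protocol already supports exactly this listen-on-1935 behaviour. A sketch (the `live/stream` path is arbitrary and just has to match what the sender pushes to):

```shell
# Act as a minimal RTMP "server": listen on port 1935 for one incoming
# stream and remux it to a local FLV file without re-encoding.
ffmpeg -listen 1 -i rtmp://0.0.0.0:1935/live/stream -c copy -f flv capture.flv
```

Something equivalent inside OBS's RTMP input would make the external server optional entirely.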
Lastly, I noticed very quickly that something isn't adding up with hardware decoding. Why am I getting 30% CPU usage from an incoming stream when using an RX 470, which has a hardware decoder on the silicon? (In fact, it has a hardware H.265 decoder as well, so it officially out-specs OBS's encoding capabilities.) Why is it not being used?
Do you just not support AMD's hardware decoder yet? If so, are there plans to add it?
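For comparison, ffmpeg can already offload decoding to the GPU on Windows via DXVA2, which works on AMD's UVD decode block. This is only a sketch for benchmarking the decode path, assuming the same local stream URL as above; `-f null -` discards the decoded frames:

```shell
# Decode the incoming RTMP stream on the GPU via DXVA2 instead of the CPU.
# Useful for checking how much CPU the decode itself actually needs.
ffmpeg -hwaccel dxva2 -i rtmp://localhost/live/stream -f null -
```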