Kurtalicious
Member
I have a 2-PC setup where the gaming PC sends the stream to an RTMP server, which uses ffmpeg to transcode it and send it off to Twitch.
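For reference, the relay side runs roughly like the sketch below. The URLs, bitrate, and stream key here are placeholders rather than my exact settings, and I've wrapped the ffmpeg call in Python just to show the shape of the command:

```python
# Rough sketch of the relay-side transcode on the RTMP server.
# Source URL, bitrates, and stream key below are placeholders,
# not my exact configuration.
import subprocess

INGEST_URL = "rtmp://127.0.0.1/live/stream"          # local RTMP ingest the gaming PC pushes to (placeholder)
TWITCH_URL = "rtmp://live.twitch.tv/app/STREAM_KEY"  # replace STREAM_KEY with your own key

cmd = [
    "ffmpeg",
    "-i", INGEST_URL,      # pull the incoming feed
    "-c:v", "libx264",     # re-encode video on the server
    "-preset", "veryfast",
    "-b:v", "4500k",       # example bitrate, tune to your upload
    "-maxrate", "4500k",
    "-bufsize", "9000k",
    "-pix_fmt", "yuv420p",
    "-g", "60",            # keyframe every 2 seconds at 30 fps
    "-c:a", "aac",
    "-b:a", "160k",
    "-f", "flv",           # Twitch expects FLV over RTMP
    TWITCH_URL,
]
subprocess.run(cmd, check=True)
```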
I've used both AMD (R9 380) and Nvidia (GTX 970 FTW+) encoders successfully, with beautiful high-quality video at 1080p30 and no frame drops, stutter, distortion, etc. I've read in a lot of places that streaming with the GPU's encoding hardware has minimal impact on performance (about 5%), because it is separate from the hardware that renders the in-game image. However, that's not what I experience in practice: with both AMD and Nvidia, my in-game fps drops significantly when streaming with their GPU encoding hardware.
Right now I am mostly streaming League of Legends (a GPU-intensive game). When not streaming I can get anywhere from 125-200+ fps in game all day. However, when I fire up the hardware encoder and stream, my in-game fps drops to around 60-80 fps depending on what is going on in game (sometimes as low as 45 fps if the fights have a lot of animations).
Is this normal? If not, how can I improve performance?