Hey everyone!
I have a dedicated streaming box with an AMD R7 and 32 GB of memory, running Windows 10.
I'm able to stream to Twitch.tv with the following settings:
- 1080p at 60 FPS
- 6 Mbps bitrate
- Medium encoder preset
Streaming with those settings, CPU usage peaks at around 40%.
I wanted to try adding a replay buffer, so I set the buffer length to 30 seconds and made sure it starts when I go live.
With the buffer enabled, CPU usage hits 100% and I start dropping frames.
Is this expected? I didn't think it would be, since RAM should be the only extra cost, but it's as if the system is doing two encodes now.
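
For reference, here's the back-of-the-envelope math behind my assumption (a rough Python sketch; it assumes the buffer just holds the already-encoded video stream and ignores audio overhead):

```python
# Rough estimate of the RAM a 30-second replay buffer should need,
# assuming it only stores the already-encoded 6 Mbps video stream.
bitrate_mbps = 6      # video bitrate from my stream settings
buffer_seconds = 30   # replay buffer length

buffer_mb = bitrate_mbps / 8 * buffer_seconds  # megabits/s -> megabytes/s
print(f"~{buffer_mb:.1f} MB")  # ~22.5 MB, trivial next to 32 GB of RAM
```

That's nothing RAM-wise, which is why the jump to 100% CPU surprised me.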