Hello everyone.
I "rebuild" my dedicated Streaming PC. Hardware is a Dell T20 with a 3.2GHz Xeon 1225v3 and 12GB RAM on Win 10 Pro Version 1704 (first creators update). It uses an avermedia Live Gamer HD lite 1 (HMDI capture card).
With Voicemeeter Banana running with VBAN (network audio stream for my mic from my recording PC) and OBS open plus the Xbox app, I have a CPU usage of around 10%.
OBS is version 19.0.3. My video/canvas resolution is 1080p with no downscaling; the scale filter is set to bicubic, 16 samples.
I tested x264 at 720p30 with 2500 kbit/s and 4500 kbit/s, using the veryfast and faster presets. On faster, my CPU spiked to 100% usage and the stream/recording was choppy at times.
Previously, I used an RTMP server (nginx with ffmpeg) on the streaming PC under Ubuntu, and my recording PC (with an Elgato HD60) sent it an RTMP stream at 1080p30 and 40 Mbit/s encoded via Quicksync. With that setup I could stream 720p30 at the medium preset, IIRC with a bitrate between 2000 and 2500 kbit/s. In another test I streamed to Twitch at 2500 kbit/s in 720p30 at veryfast and to YouTube at 5000 kbit/s in 1080p30 at veryfast.
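For reference, the Ubuntu relay was set up roughly like this with the nginx-rtmp-module (reconstructed from memory, so the application name, the STREAM_KEY placeholder, and the exact ffmpeg flags are illustrative, not my literal config):

    rtmp {
        server {
            listen 1935;
            application ingest {
                live on;
                # re-encode the incoming 1080p30 / 40 Mbit Quicksync feed
                # down to 720p30 x264 and push it to Twitch
                exec ffmpeg -i rtmp://localhost/ingest/$name
                    -c:v libx264 -preset medium -b:v 2500k -s 1280x720 -r 30
                    -c:a copy
                    -f flv rtmp://live.twitch.tv/app/STREAM_KEY;
            }
        }
    }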
My question is: did my Ubuntu setup use so little CPU because the Quicksync stream from my recording PC was already H.264, so the "heavy workload" of converting the source material into H.264 was done by Quicksync on the recording PC? And does my streaming PC on Windows 10 with the AVerMedia Live Gamer HD now have to spend more CPU power encoding the raw capture-card feed into H.264 itself?
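To put rough numbers on the two inputs (this only compares raw data volume, not x264's actual encoding cost, and it assumes NV12 as the capture format, which may not match what the Live Gamer HD actually delivers):

    # Rough data-rate comparison of the two pipelines, before x264 runs.
    # Assumes NV12 (12 bits per pixel), a common raw capture format.
    width, height, fps = 1920, 1080, 30
    bits_per_pixel = 12

    raw_capture = width * height * fps * bits_per_pixel  # bits/s from the card
    h264_ingest = 40_000_000                             # my old 40 Mbit RTMP feed

    print(f"raw 1080p30 capture:  {raw_capture / 1e6:.0f} Mbit/s")  # ~746 Mbit/s
    print(f"Quicksync H.264 feed: {h264_ingest / 1e6:.0f} Mbit/s")  # 40 Mbit/s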
Is there a website or calculator that can roughly estimate what kind of CPU power you need for certain resolutions like 720p/900p/1080p?
I hope you understand my question. Thank you in advance.
I will provide a log in the evening.
I "rebuild" my dedicated Streaming PC. Hardware is a Dell T20 with a 3.2GHz Xeon 1225v3 and 12GB RAM on Win 10 Pro Version 1704 (first creators update). It uses an avermedia Live Gamer HD lite 1 (HMDI capture card).
With Voicemeter Banana on with VBAN (network audiostream for my mic on my recording PC) and OBS open + Xbox app I have a CPU usage of around 10%.
OBS is version 19.0.3. My video/canvas resolution is 1080 with no downscaling, bicubic 16 samples.
I tested with x264 720p30@2500KBit and 4500KBit. The presets I used were veryfast and faster. On faster my CPU spiked at 100% usage and the stream /recording was choppy at times.
Before I used an RTMP-server solution with nginx and ffmpeg on my streaming PC on Ubuntu and my recording PC (using an Elgato HD60) send a RTMP stream in 1080p30 with 40MBit via Quicksync. There I could stream with 720p30@medium preset IIRC with a bitrate between 2000-2500KBit. Another test was streaming to twitch in 2500KBit in 720p30@veryfast and to youtube in 1080p30@veryfast with 5000KBit.
My question is: Did my Ubuntu setup not use much performance because the Quickync stream from my recording PC being in "h264"? So the "heavy workload" for converting source material into h264 was done by Quicksync? Now my stream PC with Win 10 and Avermedia Live Gamer HD has to use more power for converting to h264 from the capture card?
Is there a website /calculator which can "roughly" tell what kind of cpu power you need for certain resolutions like 720p/900p/1080p?
I hope you understand my question. Thank you in advance.
I will provide a log in the evening.
Attachments
Last edited: