Question / Help Forced to use REALLY low bitrate for local stream?

digitaldeity_

New Member
Hey everyone. I'm trying to Frankenstein together a live streaming solution that runs entirely over the local network.

The problem I'm having is that even though my IP cam runs perfectly fine on its own, once I run it through OBS as a sort of proxy I get massive pixelation, frame drops, and general lag in the video... UNLESS I set my bitrate below 500 kbps.

So here's a little more background on the setup:

IP Cam -> RTSP stream -> OBS w/ Video Source plugin -> Ubuntu 14.04 NGINX server -> VLC
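For reference, the NGINX leg of a chain like this is usually handled by the nginx-rtmp-module. A minimal sketch of such a relay config (the `live` application name and port are assumptions for illustration, not taken from this thread) might look like:

```nginx
# Minimal RTMP relay sketch -- assumes NGINX was built with nginx-rtmp-module
rtmp {
    server {
        listen 1935;        # default RTMP port
        chunk_size 4096;

        application live {
            live on;        # accept live streams
            record off;     # pure relay, nothing written to disk
        }
    }
}
```

With something like this, OBS would push to rtmp://server/live/STREAMKEY and VLC would open the same URL.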

If I load the stream directly from the camera, it works flawlessly, so it's not the camera, nor VLC.
The stream preview in OBS also looks very good.

OBS Host is running Win 2K12 Server
CPU use is <25% peak
Server has 2x 1Gbit LAN in a NIC Team, <0.1% use
Server has 16GB RAM, 2.5GB in use, 13GB free

NGINX host is a VMware VM, 1 core, 2 GB RAM.
Using htop I see:
CPU: 1%
RAM: 63 MB used
It's connected to a vSwitch on a 10 Gbit link.

It doesn't *appear* to be a CPU, RAM, or network limitation, so I'm not sure where else to dig into this. Any bitrate over 500 kbps gives me yellow or red squares, and I start dropping frames like crazy.

Considering I have at least a 1 Gbit connection between all points of the stream, why does my bitrate have to be so low?
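For perspective, a quick back-of-the-envelope check (a sketch, using the 1 Gbit/s link speed described above) shows how little of the link even a generous stream bitrate consumes:

```python
# Rough sanity check: what fraction of a 1 Gbit/s LAN link
# does a given stream bitrate actually use?

LINK_BPS = 1_000_000_000  # 1 Gbit/s LAN link

def link_utilization(stream_kbps: int) -> float:
    """Return the fraction of the link used by a stream of `stream_kbps`."""
    return (stream_kbps * 1000) / LINK_BPS

# A 500 kbps stream uses 0.05% of the link; even 25,000 kbps (25 Mbps)
# uses only 2.5% -- so raw LAN bandwidth is clearly not the bottleneck.
print(f"{link_utilization(500):.4%}")     # 0.0500%
print(f"{link_utilization(25_000):.4%}")  # 2.5000%
```

Which is exactly why a sub-500 kbps ceiling points at something other than the network.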

Thoughts on where to look?
 
Please post your logfile
 
Your APU appears to be really struggling at 720p regardless of the bitrate you set. When you took the resolution down to 640x480, things seemed to improve a lot.
 
What Sapiens said, but your A10 can also use VCE, a hardware encoder built into your APU (the GPU portion).
Try this build, especially since you are streaming over LAN to an RTMP server:

https://obsproject.com/forum/threads/obs-branch-with-amd-vce-support.13996/

This build should allow you to stream 1280x720@30fps with as much bitrate as you'll need. 25000 is the max for 720@30 with the above fork of OBS.
 
Right now that APU-based system is only for testing and proof of concept; eventually we're planning to implement a new server, likely with either a low-power Xeon (E5-2603, 6-core 1.6 GHz) or the new Avoton Atoms (C2750/C2758, 8-core 2.6 GHz/2.4 GHz).

Would you foresee any problems once we move to the newer system specs? Should we aim for higher clock speed or more CPU threads?

Thanks for all your help guys... (and/or gals)
 
Did I read that right? 25000, not 2500?
 
Did i read that right? 25000? not 2500?
I said it right, but worded it incorrectly: I should have said local record, not stream. When local recording, you can do as much as 25,000 bitrate (25 Mbps); when streaming, you'd want that to be around 2,500-3,000 bitrate (2.5-3.0 Mbps).
 
Thanks for all the help, guys... Any thoughts on the above CPUs? We'll be ordering our final production machine soon; any pitfalls I should know about beforehand?
 
Thanks for all the help guys... Any thoughts on the above CPUs? We'll be ordering our final production machine soon... any pitfalls I should know about beforehand?
Unsure if an Atom processor, even with 8 cores, would have the power. The Xeon E5-2603 is a quad-core @ 1.8 GHz, not hex-core at 1.6. Anyway, the Xeon is more like an i5, which will be fine for 720@30.
 
Unsure if an atom processor, even with 8 cores, would have the power, the xeon e5-2603 is a quad core @1.8 GHz not hexcore at 1.6. Anyway, the xeon is more like an i5 which will be fine for 720@30

It's actually an E5-2603 v3, so it is hex-core... I forgot that all changed with v3:
http://ark.intel.com/products/83349/Intel-Xeon-Processor-E5-2603-v3-15M-Cache-1_60-GHz

Thanks though, I'll stick with the Xeon in my plans... At least that way I also have the option to add a second processor.
 