Camera Bitrate versus OBS Stream Bitrate

jp-livecontrol

New Member
What is the advantage / disadvantage of setting the camera bitrate to 8192 if we are only streaming out of OBS at 2500? Is there a benefit to the higher bitrate in the cameras? What is the visual difference between say 8192 and 4096 from the camera feeding into OBS, compared to the visual difference between increasing OBS from 2500 to 4096 instead? Thanks!
 

carlmmii

Active Member
I'm confused... is there a drawback to not using a higher bitrate for your camera?

There are two stages of compression you're referencing. The first occurs between the camera and OBS, where the data sent from the camera is decoded into the actual frames that get composited into the scene. You want this to be as high quality as possible without sacrificing anything else, because any quality lost here gets compressed further in the final encode (garbage in -> garbage out).

In a general sense, if you're talking about the same compression algorithm, when re-encoding you always want the source bitrate to be higher than your final bitrate. That way the re-compression of artifacts left over from the original compression doesn't make as much of a difference in the final output. There will always be some residual quality loss, and how much is very much a case-by-case matter, but in general you want as high a quality as is reasonable going into the final encode.

Now, it's that "reasonable" that is the tricky bit. Do you need perfect image quality if your webcam is only taking up a small corner of your scene, compressed down to a 2500 kbps stream? Probably not. If there's a tradeoff in your situation where using a lower camera bitrate gives you some kind of benefit, then just test: do multiple encodes (using your target bitrate for OBS's output) of the same scene, changing only the camera's bitrate. See whether it makes a difference in the final output, and whether that difference is worth the tradeoff.
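If you want to automate that comparison, here's a rough sketch of the idea using ffmpeg from Python (assuming ffmpeg is on your PATH; reference.mp4 is a placeholder for a high-quality clip of your scene, and the bitrates are just examples). It simulates the two stages -- a "camera" encode at several bitrates, then the 2500 kbps "stream" encode -- and reports the SSIM of each final output against the original:

import subprocess

CAMERA_BITRATES_K = [2048, 4096, 8192]   # camera-side bitrates to test (kbps)
STREAM_BITRATE_K = 2500                  # final OBS output bitrate (kbps)

def run(cmd):
    # Run a command and return its stderr (ffmpeg prints its stats there).
    return subprocess.run(cmd, capture_output=True, text=True).stderr

for cam_k in CAMERA_BITRATES_K:
    cam_file = f"cam_{cam_k}.mp4"
    out_file = f"out_{cam_k}.mp4"
    # Stage 1: simulate the camera encode at the chosen bitrate.
    run(["ffmpeg", "-y", "-i", "reference.mp4", "-an", "-c:v", "libx264",
         "-b:v", f"{cam_k}k", "-maxrate", f"{cam_k}k",
         "-bufsize", f"{2 * cam_k}k", cam_file])
    # Stage 2: simulate the final OBS encode at the stream bitrate.
    run(["ffmpeg", "-y", "-i", cam_file, "-an", "-c:v", "libx264",
         "-b:v", f"{STREAM_BITRATE_K}k", "-maxrate", f"{STREAM_BITRATE_K}k",
         "-bufsize", f"{2 * STREAM_BITRATE_K}k", out_file])
    # Compare the final output back against the original reference clip.
    stderr = run(["ffmpeg", "-i", out_file, "-i", "reference.mp4",
                  "-lavfi", "ssim", "-f", "null", "-"])
    ssim = [line for line in stderr.splitlines() if "All:" in line]
    print(f"{cam_k} kbps camera ->", ssim[-1] if ssim else "no SSIM line found")

If the SSIM numbers come out basically identical across the camera bitrates, the extra camera bitrate isn't buying you anything at 2500 kbps; if they diverge, the higher source quality is surviving the re-encode.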
 

jp-livecontrol

New Member
The question you posed is actually what prompted the conversation - we are running two 4K PTZ cameras and a computer on a dedicated gigabit network. If we set the cameras to 1080p, H.264, with a bitrate of 8192, we periodically see a ghosting/smearing effect when we pan and tilt. This has been alleviated by reducing the bitrate to 4096 or even 2048. I'm not sure why we are seeing the ghosting, considering we're well below the overall throughput of the network. We were using VLC for ingestion into OBS and have recently switched to GStreamer. This minimizes the issue but does not completely eliminate it. I am open to any other information that could help with this issue.

Thanks!
 

carlmmii

Active Member
Ok, so it sounds like above a certain bitrate, VLC (or whatever ingest program you're using) is missing occasional keyframes, which causes the smearing. If there's any kind of buffer setting for the cameras or the ingest software, raising it should help (see the sketch below). If not, then you're at the mercy of the network reliability and the software.
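For example, if you're pulling the cameras over RTSP, the receiving-side jitter buffer is the usual thing to experiment with. Here's just a sketch using the GStreamer Python bindings -- the RTSP URL is a placeholder and 500 ms is only a starting value -- but the same rtspsrc properties (latency, protocols) apply to whatever pipeline description your ingest is using:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# rtspsrc's "latency" property is the receive-side jitter buffer in
# milliseconds; raising it gives late or reordered packets more time to
# arrive before frames are handed to the decoder.
pipeline = Gst.parse_launch(
    "rtspsrc location=rtsp://camera.example/stream protocols=tcp latency=500 "
    "! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink"
)
pipeline.set_state(Gst.State.PLAYING)

# Keep the pipeline running until an error or end-of-stream shows up.
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.ERROR | Gst.MessageType.EOS)
pipeline.set_state(Gst.State.NULL)

A bigger latency value trades a bit of delay for more tolerance of late packets, which is usually where the smearing comes from.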
 

BluePeer

Member
Ghosting mostly results from wrong settings:
- a "smart codec" mode being active
- too small a buffer
- mismatched I-frame interval, FPS, max rate, or other misconfiguration of the basic settings
- different working scenarios need different settings depending on the setup

The issues you see with this are mostly things running close to their limits (for example network capability: if the bitrate of the cams bounces up and down because of a bottleneck on the transfer devices, then buffers can run out of specification and that results in issues).

Given your observation that the issue seems tied to the bitrate, there is a high chance it is actually caused by too small a buffer setting on the receiver.
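To rule out the mismatch cases, it helps to check what the camera is actually sending rather than what its web UI claims. A quick sketch with ffprobe from Python (the RTSP URL is hypothetical, and bit_rate may come back as N/A on a live stream):

import json
import subprocess

# Hypothetical RTSP URL; replace with the camera's actual stream address.
URL = "rtsp://camera.example/stream"

# Ask ffprobe for the basic video-stream parameters (codec, resolution,
# frame rate, nominal bitrate) so they can be compared against the
# camera's configuration and the receiver's expectations.
result = subprocess.run(
    ["ffprobe", "-v", "error", "-rtsp_transport", "tcp",
     "-select_streams", "v:0",
     "-show_entries", "stream=codec_name,profile,width,height,r_frame_rate,bit_rate",
     "-of", "json", URL],
    capture_output=True, text=True, check=True)

print(json.dumps(json.loads(result.stdout), indent=2))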
 

jp-livecontrol

New Member
We are running on a gigabit network dedicated to this equipment. OBS is running on an i7 with 16 GB of RAM, and the only other devices on the network are the two cameras, set to a bitrate of 8192. And we still see the smearing/artifacts. The keyframe interval is set to 2 in OBS. Using VLC as the ingestion method in OBS shows the smearing. GStreamer in OBS using TCP prevents the smearing and provides lower latency, but periodically freezes and the source needs to be restarted. Where would the buffer setting be in OBS to prevent the smearing/ghosting effect?
 