JasonVP
Member
Hey folks -
I know the 1/2-second maximum on the "Render Delay" filter for the Video Capture source is/was intentional. Could we consider changing that to whatever length the end user needs? This, of course, comes with the assumed caveat that the longer the delay, the more RAM gets used.
Here's why:
I'm still not impressed with the NDI plugin. Its audio syncing issues are just plain obnoxious, and even with the current build, others and I are still having problems with it. I don't want to use it. Full stop. Instead, I'd rather stream from the gaming PC to the streaming PC over RTMP. I've got NGINX with the RTMP module built in running on the streaming rig (roughly the config sketched below); that's all super easy for me to do. No issues. The problem I'm running into is the processing delay between the two machines: it's about 2 seconds. With the camera connected to the streaming rig, I'd end up reacting to an event 2 seconds before the stream viewers see it.
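For reference, the nginx side of that is nothing exotic; it's roughly something like this (the "live" application name and port 1935 are just the usual defaults, nothing special):

# Minimal nginx-rtmp-module sketch; adjust names/ports to taste
rtmp {
    server {
        listen 1935;
        application live {
            live on;
            record off;
        }
    }
}

The gaming PC pushes to rtmp://<streaming-rig>/live/<key>, and OBS on the streaming rig just pulls that same URL in as a Media Source. That round trip is where the roughly 2 seconds of delay comes from.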
The fix for this is simply letting me bump the Render Delay up to 2 seconds. Yes, this will eat more RAM; I get that (rough math below). I also have 64GB of RAM on the streaming rig, so I'm not concerned. So can we reconsider that 1/2-second limit? Make it infinite, with the caveat that longer delays mean more system resources get eaten? In other words: give me more rope and I'll worry about not hanging myself.
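To put a rough number on the RAM cost (back-of-envelope only; I'm assuming the filter buffers full uncompressed BGRA frames, which may not match what OBS actually does internally):

# Back-of-envelope RAM estimate for buffering 2 seconds of rendered video.
# Assumes uncompressed BGRA frames (4 bytes/pixel) at 1080p60; OBS's real
# internal buffering may differ, so treat this as an upper-bound guess.
width, height, bytes_per_pixel = 1920, 1080, 4
fps = 60
delay_seconds = 2.0

frame_bytes = width * height * bytes_per_pixel    # one rendered frame
buffered_frames = fps * delay_seconds             # frames held in the delay buffer
total_bytes = frame_bytes * buffered_frames

print(f"~{total_bytes / 1024**3:.2f} GiB")        # roughly 0.93 GiB

Call it about 1GB for 2 seconds of 1080p60 delay; against 64GB that's a rounding error.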
Thanks.