papadiscobravo
New Member
I work at a small college and have been asked to help with the first in-person theater production: small cast, smaller crew, no audience in the house, streaming only (or on-demand if necessary). For artistic and practical reasons, we don't want to do live switching with one of our Blackmagic ATEM ME1 units, but rather composite all three cameras' shots into a single video frame.

I mocked this up in OBS in a few minutes and started a test stream to YouTube. The three images were out of sync with each other. Nothing is serving as a master clock, timecode generator, or frame synchronizer. I would've thought the three signals might drift apart by a few frames over the course of many minutes, but they were off from each other by a few seconds right from the start, and the offset was visible in the OBS preview window, not only on YouTube. For the test I used what I happen to have at home: a C100 mkII and two capture devices (a Magewell and a Blackmagic Intensity Shuttle). I used the internal camera on a new MacBook Pro as the second video source, and ran an older MacBook Pro's camera through one of the capture devices as the third. No single one of the three sources was consistently ahead of or behind the others.

I won't have physical access to the actual cameras we intend to use (two Panasonic AG-AC90s and a Canon XA10) until early January, so I wanted to get a conversation started: on a theater budget, how would you suggest approaching this problem? Thanks for any thoughts you might have.
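To give the discussion something concrete to poke at, one idea I've been considering: measure each chain's latency once (film a single clap or flash on all inputs at the same time and count the frames of offset in a recording), then pad the faster sources with OBS's Render Delay filter so everything lands together. Below is a rough sketch of an OBS Python script along those lines. The source names and millisecond values are placeholders for whatever the measurement turns up, and "gpu_delay" is my reading of the Render Delay filter's internal id, so treat this as a sketch, not something I've battle-tested:

import obspython as obs

# Delay (ms) to add to each source so the faster chains match the
# slowest one. Names and numbers are placeholders; the real values
# would come from filming one clap/flash on every input and counting
# frames of offset in the recording.
DELAYS_MS = {
    "camera-1-magewell": 0,    # slowest chain gets no added delay
    "camera-2-internal": 120,
    "camera-3-shuttle": 250,
}

def script_load(settings):
    for name, delay_ms in DELAYS_MS.items():
        source = obs.obs_get_source_by_name(name)
        if source is None:
            continue  # source not present in this scene collection
        filter_settings = obs.obs_data_create()
        obs.obs_data_set_int(filter_settings, "delay_ms", delay_ms)
        # "gpu_delay" is (I believe) the Render Delay filter's internal id
        delay_filter = obs.obs_source_create_private(
            "gpu_delay", "sync delay", filter_settings)
        obs.obs_source_filter_add(source, delay_filter)
        obs.obs_source_release(delay_filter)
        obs.obs_data_release(filter_settings)
        obs.obs_source_release(source)

One caveat I'm aware of: Render Delay tops out at 500 ms per filter (though I suppose filters could be stacked), so if the offsets really are a few seconds this alone won't cover it, which is part of why I'm asking.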