Hi all. I've been working with a local venue recording DIY bands using OBS and several NDI-connected handheld cameras. Due to the current "thing", we're trying to keep staffing for shoots to a minimum. While we intend to keep one camera operator, the rest of the gear will be set up static.
I'm trying to figure out how to introduce handheld motion into the static camera footage for live streams (cutting between static and handheld is a bit jarring IMHO). At the moment I can track a "dot on the wall" and re-apply that handheld camera's motion to anything we edit in post, but I can't find a way to do the same through OBS while streaming.
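For reference, the sort of thing I have in mind during a stream would be OBS's built-in Python scripting: replay a pre-recorded 2D track as positional jitter on the static camera's scene item. This is only a rough, untested sketch; the source name, file path, and JSON offset format are placeholders, not an existing tool.

```python
# Rough sketch: replay a pre-recorded 2D track as positional jitter on the
# static camera's scene item via OBS's Python scripting (obspython).
# Assumes the track is a JSON list of per-frame {"x": px, "y": px} offsets;
# the source name and file path below are placeholders.
import json
import obspython as obs

SOURCE_NAME = "Static Cam"                 # placeholder scene item name
MOTION_FILE = "/path/to/slow_song.json"    # placeholder motion loop

motion = []       # list of (x, y) pixel offsets
frame = 0
base_pos = None   # the scene item's untouched position


def find_item():
    scene_src = obs.obs_frontend_get_current_scene()
    if scene_src is None:
        return None
    scene = obs.obs_scene_from_source(scene_src)
    item = obs.obs_scene_find_source(scene, SOURCE_NAME)
    obs.obs_source_release(scene_src)
    return item


def tick():
    global frame, base_pos
    item = find_item()
    if not item or not motion:
        return
    if base_pos is None:
        base_pos = obs.vec2()
        obs.obs_sceneitem_get_pos(item, base_pos)
    dx, dy = motion[frame % len(motion)]   # loop the track
    pos = obs.vec2()
    pos.x = base_pos.x + dx
    pos.y = base_pos.y + dy
    obs.obs_sceneitem_set_pos(item, pos)
    frame += 1


def script_load(settings):
    global motion
    with open(MOTION_FILE) as f:
        motion = [(p["x"], p["y"]) for p in json.load(f)]
    obs.timer_add(tick, 33)   # ~30 updates per second


def script_unload():
    obs.timer_remove(tick)
```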
Ideally we'd build a library of multi-minute looped motions from 2D tracks and use whichever suits the moment: "Slow Song", "Fast Song", "Jim after three beers", etc. It would also be ideal to relate the furthest extent of the loaded motion to the amount of camera push-in needed (so the frame edges never show), perhaps just pre-calculated and embedded in the motion file as metadata.
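The push-in relationship is just "scale the source up enough that the largest excursion of the track never exposes the canvas edge", which could be computed once per loop and stored alongside the offsets. A rough sketch of that step, again assuming the placeholder JSON offset format above (the wrapped {"pushin_scale": ..., "frames": [...]} layout is just an illustration, not any existing standard):

```python
# Rough sketch: pre-compute the push-in (uniform scale) a motion loop needs
# so its largest excursion never reveals the canvas edge, and embed it in the
# motion file as metadata.
import json

def embed_pushin(path, canvas_w=1920, canvas_h=1080):
    with open(path) as f:
        frames = json.load(f)
    max_x = max(abs(p["x"]) for p in frames)
    max_y = max(abs(p["y"]) for p in frames)
    # A centered source scaled by s overhangs (s - 1) * W / 2 per side,
    # so that overhang must cover the largest offset on each axis.
    scale = max(1.0 + 2 * max_x / canvas_w, 1.0 + 2 * max_y / canvas_h)
    with open(path, "w") as f:
        json.dump({"pushin_scale": round(scale, 4), "frames": frames}, f)
```

The OBS-side script could then read pushin_scale when it loads a loop and apply it to the scene item before playing back the offsets.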
I have a bit of a background in APIs and Python from a previous career (visual effects TD) and would be happy to contribute in any way if such a resource or method doesn't already exist. Given the nature of the venue, I'm sure they'd agree to share any library of motions we build as content (in fact I'd insist).
Some(what silly) samples of our work:
Mixed Gems on Red Gate TV: Featuring Milk, Sigh, Wut, and Shitlord Fuckerman, Oct. 31 2020 @ Red Gate (www.youtube.com)
Any pointers appreciated!
Cheers,
Christopher