Joe Fitzpatrick
New Member
Hi,
Last week a friend asked if I could put together a way for him to control OBS Studio via DMX (a lighting control protocol) and Art-Net (a protocol for sending DMX over an Ethernet connection). Overall, it came together quite quickly. I already have a DMX and Art-Net engine that builds for OS X, Linux, and Windows (http://scrootchme.com/x2/), so I created an OBS plugin.
I'm not sure if I did this the correct way, but I linked against both libobs and the frontend API library. Triggering scene changes through the frontend API was very straightforward, as was triggering the transport controls (recording start/stop, etc.). Audio Mixer control was slightly trickier. I ended up creating 5 obs_fader instances. Then, in response to the first OBS_FRONTEND_EVENT_SCENE_CHANGED event, I enumerated the sources and attached those faders to whatever Mic/Aux and Desktop Audio sources exist. Mapping the DMX input onto the obs_fader deflection then appears to control the UI mixer well.
The only thing that I was not able to figure out was how to remotely control the transition time, at least not in a way that is friendly to the existing UI. That mechanism seems internal to the main window handling itself. If I missed something, I'd appreciate someone letting me know.
But my main question actually came up today. While helping my friend set up DMX control, it occurred to both of us that the reverse could be quite useful. I already created a Tools-menu settings dialog to configure the DMX base channel, Art-Net input subnet, etc. for remote control. Providing a mechanism to map Scenes (by name) to lighting cues is very easy. But you would normally want lighting transitions to align with the start of a scene change, and the existing frontend API events only appear to fire once a transition has finished and the scene change is complete.
It seems like an obvious gap in the frontend API events, but it also appears that I should be able to get the event activity I need from signals on obs core and/or specific objects. Since scenes can be nested and reused, though, it is not clear to me what the cleanest mechanism is to observe; I think I am only interested in the highest-level scene/source changes. But when I add signal handlers for source_transition_start, source_transition_stop, and source_activate, the order and count of messages I output to the log is a bit baffling. Part of my confusion is that it seems I would need to include obs-internal.h to make the conditional decisions I want based on the triggered signals, and for long-term compatibility that seems like something best avoided. Any feedback anyone could provide in this area would be most appreciated. Sorry for the long post!
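In case it helps anyone suggest a fix, this is roughly how I'm hooking the core signals, sticking to public accessors only (no obs-internal.h). This is a sketch, not my exact code; on_transition_start and connect_signals are names of my own choosing, and the filtering idea of comparing the signalling source against obs_frontend_get_current_scene() is just one assumption about how to single out the top-level change:

```c
#include <obs-module.h>
#include <obs-frontend-api.h>

/* Handler for the core "source_transition_start" signal. The calldata
 * carries the source (the transition) under the "source" key. To try to
 * distinguish top-level scene changes from nested ones, this logs the
 * frontend's current program scene alongside it. */
static void on_transition_start(void *data, calldata_t *cd)
{
    UNUSED_PARAMETER(data);

    obs_source_t *source = calldata_ptr(cd, "source");

    /* obs_frontend_get_current_scene() returns a new reference,
     * so it must be released when we are done with it. */
    obs_source_t *cur = obs_frontend_get_current_scene();

    blog(LOG_INFO, "transition_start from '%s' (current scene: '%s')",
         obs_source_get_name(source),
         cur ? obs_source_get_name(cur) : "(none)");

    obs_source_release(cur);
}

/* Attach the handler to the global libobs signal handler. */
void connect_signals(void)
{
    signal_handler_connect(obs_get_signal_handler(),
                           "source_transition_start",
                           on_transition_start, NULL);
}
```

This compiles against libobs and obs-frontend-api, so it won't run standalone; I include it mainly to show which signals and accessors I am relying on.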