Zoom Webinar/Live Streaming

louiiee8

New Member
I have a hybrid event coming up and I need help figuring out how to set it up. Due to Covid restrictions, the event host is only allowing a limited number of attendees, so we are setting up a live stream via Zoom Webinar for those who cannot attend in person. There are a couple of angles to how I intend to pull this off. I am new to live streaming, so bear with me as I try to explain.

First, I want to have a camera set up in person at the event to capture the live speakers, so those watching at home can see and hear them speak. Currently I have a Blackmagic Pocket Cinema Camera 4K that I am connecting to my Mac over HDMI, using an Elgato Cam Link 4K to make the connection.

Second, since the event is an award ceremony, we wanted the honorees to speak at the event, but due to Covid not every honoree can make it. So we pre-recorded each honoree's speech before the event, and I want to play those pre-recorded videos into the live stream via Zoom. People watching the live stream will be able to see and hear the pre-recorded videos, which I intended to display by doing a screen share in Zoom.

Some of the software I have downloaded onto my Mac includes: OBS, Soundflower, and VB-Cable A & B. Is there anything else I need or don't need?

I believe I have to use OBS because I want to seamlessly transition from the in-person camera (BMPCC 4K) at the event to the pre-recorded videos, which will be cued up on a Mac and, like I said, screen shared on Zoom. I was able to display the camera feed from my BMPCC in OBS, as well as a Window Capture to display the pre-recorded videos.

My biggest issue seems to be audio. I can't figure out how to get both the audio from the in-person camera (BMPCC 4K) and the audio from the pre-recorded videos on my Mac to output through Zoom. My current audio settings are:
Mac Sound Input:
Soundflower (2ch)

OBS Audio:
Desktop Audio = Soundflower
Mic/Aux Audio = VB-Cable B
Monitoring Device = VB-Cable A

Zoom
Speaker = VB-Cable B
Microphone = VB-Cable A
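For what it's worth, the routing listed above can be modeled as a directed graph to check for feedback. Assuming OBS's audio monitoring passes the Mic/Aux source to the Monitoring Device (that depends on each source's "Audio Monitoring" setting), Zoom's output on VB-Cable B goes into OBS, out on VB-Cable A, and back into Zoom's mic, which is the classic echo loop. A minimal sketch, with the edges taken from the settings in the post:

```python
# Model the posted routing as a directed graph: an edge means
# "audio flows from X to Y". The OBS -> VB-Cable A edge assumes the
# Mic/Aux source has audio monitoring enabled.
edges = {
    "Zoom-out":    ["VB-Cable B"],   # Zoom Speaker = VB-Cable B
    "VB-Cable B":  ["OBS"],          # OBS Mic/Aux = VB-Cable B
    "Soundflower": ["OBS"],          # OBS Desktop Audio = Soundflower
    "OBS":         ["VB-Cable A"],   # OBS Monitoring Device = VB-Cable A
    "VB-Cable A":  ["Zoom-in"],      # Zoom Microphone = VB-Cable A
    # Whatever goes into Zoom's mic is heard remotely; remote speech
    # also arrives on Zoom's speaker output, closing the loop.
    "Zoom-in":     ["Zoom-out"],
}

def find_loop(start, edges):
    """Depth-first search for a path from `start` back to itself."""
    stack = [(start, [start])]
    while stack:
        node, path = stack.pop()
        for nxt in edges.get(node, []):
            if nxt == start:
                return path + [nxt]
            if nxt not in path:
                stack.append((nxt, path + [nxt]))
    return None

loop = find_loop("Zoom-out", edges)
print(" -> ".join(loop))
# -> Zoom-out -> VB-Cable B -> OBS -> VB-Cable A -> Zoom-in -> Zoom-out
```

The usual fix is to make sure Zoom's own output never feeds back into the signal you send to Zoom's mic, i.e. break one of the edges in that cycle.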

After hours of researching, watching videos, reading forums, I can't seem to find something that is exactly what I am looking for (unless I just don't understand due to my lack of live streaming knowledge).

Hoping that was a good enough explanation. Anyone who can help me out would be greatly appreciated.

Here is the current log file: https://obsproject.com/logs/a46TF2IRxS7MZEXO
&
Not sure if it's any different, but here is also the latest log file: https://obsproject.com/logs/NfTDSnkZyWXUwcrJ

Thank you!
 

onlexOBS

New Member
Hey, did you find a good solution? I am basically going to do the same thing, so I was looking for a good guide on how to host a hybrid event, with regard to the cameras, microphones, and so on.
 

AaronD

Active Member
It's not straightforward! I have a hybrid meeting rig myself that is mature enough now to "just use" without much thought, but it took a LOT of twice-a-month meetings, and testing between those meetings, to work out the wrinkles. I knew from the beginning that it was going to be that complicated, so I made a script to set it all up and tear it down, the same way every time, so I only have to think it through when I'm *modifying* the rig, not when I'm running it.

Documentation is attached, including the scripts, audio processing sessions, and diagrams. It's designed for Ubuntu Studio 22.04 LTS (Linux), so some of what it does is specific to that particular system, but it should be easy enough to figure out and adjust for your rig.
I also made it work for both a live meeting and a studio recording, since there were enough similarities in the way that I thought about it. Same main script, different options passed to it.

From the beginning, I had *two* instances of OBS on the same machine, in a Master / Slave configuration. That's not something that OBS does natively; it's just how I set them up. obs ... --multi ... in the script prevents them from complaining about each other, and some other options make each one load a different set of settings.
  • The Master produces the feed to the remote people, as if it were any other livestream, except that OBS itself doesn't actually stream. It uses the Virtual Camera to pass it to the meeting app, which does its thing like normal.
  • The Slave window-captures the meeting (so the actual meeting window can be behind my controls, not shown directly), and takes a direct feed from the Master, and shows either of those two scenes to the local display via a full-screen projector. It also records that.
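(Not the author's actual script, which is in the attachment, but a sketch of the two-instance launch. `--multi`, `--profile`, `--collection`, and `--startvirtualcam` are documented OBS launch parameters; the profile and collection names here are invented.)

```python
import shlex

def obs_cmd(profile, collection, extra=()):
    """Build an OBS launch command for one instance. --multi stops the
    two instances from warning about each other; --profile/--collection
    point each instance at its own settings."""
    argv = ["obs", "--multi", "--profile", profile, "--collection", collection]
    argv += list(extra)
    return argv

# Master feeds the meeting app via the Virtual Camera, so start it there.
master = obs_cmd("Master", "HybridMaster", ["--startvirtualcam"])
slave = obs_cmd("Slave", "HybridSlave")

# Print the lines a launch script would run (swap print for
# subprocess.Popen to actually start them).
print(shlex.join(master))
print(shlex.join(slave))
```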
Audio stays *outside* of OBS as much as possible, handled instead in a DAW (Digital Audio Workstation: essentially a complete sound studio all in one app). For this rig, I'm only interested in the live mixer part:
  • It takes all of the mics directly, cleans them up and broadcast-masters them, and sends that mix to the meeting and to OBS Slave for recording.
  • It takes the meeting audio, cleans it up and broadcast-masters it just like the mics, and sends that to the local speakers and to OBS Slave for recording.
  • It also "ducks" the local mics under the meeting return, so that the remote people don't get an echo. This is because there's enough going on at this end that the meeting's own echo cancellation can't work here. It does work on the remote end, because those people still have exactly what it's designed to work with.
  • It takes any audio that OBS Master produces (videos, etc.), and sends that as-is to the meeting, to the local speakers, and to OBS Slave for recording.
    • Likewise for any other apps, like maybe a background music player that is used as if it were standalone. The audio sink that OBS Master connects to is also set to be the default system audio sink; so *any* app that plays audio, ends up in that input to the DAW, and handled accordingly.
  • It takes automation commands (see below), to turn different things on or off, which keeps the various destination signals clean, depending on what's happening at the time. (mics off during a video, for example, and back on afterward)
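The ducking step above is the key anti-echo trick, and its core logic is simple enough to sketch. This is not the attached DAW session, just an illustration; the threshold and attenuation values are made up:

```python
def duck(mic_gain_db, meeting_level_db, threshold_db=-40.0, duck_db=-12.0):
    """Return the mic-bus gain in dB: attenuate the local mics by
    `duck_db` whenever the meeting-return level is above `threshold_db`,
    so remote speech isn't re-sent to the remote end as an echo.
    (Values are illustrative; a real ducker also smooths the gain
    change with attack/release times.)"""
    if meeting_level_db > threshold_db:
        return mic_gain_db + duck_db
    return mic_gain_db

print(duck(0.0, -20.0))  # remote person talking: mics pulled down, -12.0
print(duck(0.0, -60.0))  # meeting return quiet: mics at unity, 0.0
```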
The Advanced Scene Switcher plugin is used in OBS Master to decode the naming convention that I made for the scenes and send the appropriate control signals: WebSocket to OBS Slave, and OSC (Open Sound Control) to the DAW. It's also used in OBS Slave to receive the WebSocket signals and switch between the two scenes. So all of the automation logic is in the Master, using Adv. SS, and everything else just does what it's told from there.
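A scene-name convention like that could be decoded along these lines. The tokens and OSC addresses below are invented for illustration (the real convention is in the attachment); `SetCurrentProgramScene` is the scene-switch request name in the obs-websocket v5 protocol:

```python
def decode_scene(name):
    """Parse a hypothetical scene name like 'Video_MicsOff_SlaveScene2'
    into (transport, message, value) control actions: OSC to the DAW
    for mic muting, WebSocket to OBS Slave for scene switching."""
    tokens = name.lower().split("_")
    actions = []
    if "micsoff" in tokens:
        actions.append(("osc", "/mics/mute", 1))
    if "micson" in tokens:
        actions.append(("osc", "/mics/mute", 0))
    for t in tokens:
        if t.startswith("slavescene"):
            actions.append(("websocket", "SetCurrentProgramScene",
                            t[len("slavescene"):]))
    return actions

# A video scene mutes the mics and tells the Slave to show scene 2:
print(decode_scene("Video_MicsOff_SlaveScene2"))
```

Advanced Scene Switcher can do this kind of mapping with its macro conditions and actions, so no external script is needed; the point is just that all the logic hangs off the scene name.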
From an operator's perspective, they're just producing a live stream to the remote people, using OBS Master, and telling OBS Slave to start and stop recording at the appropriate times. Everything else is automatic.
 

Attachments

  • Live_Show_Meeting.zip
    513.6 KB · Views: 26