Route Audio to Zoom on Linux with Pipewire

jroc

New Member
Having spent the entire afternoon working on this and not coming across a direct solution that matched what worked for me, I thought I'd post here in case it helps others.

I am on a fresh install of Fedora 35, KDE spin, running the X11 display server, but I assume this will work on any distro running PipeWire.

I first set up the virtual video camera for use in Zoom, as described in many posts on the internet: install the v4l2loopback package appropriate for your distro (for Fedora you need to get it from the RPM Fusion repo), then in a terminal run sudo modprobe v4l2loopback to load the module so OBS can see it.
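For reference, the Fedora steps boil down to something like this (a sketch; the RPM Fusion line is the standard one from their setup docs, and the exact v4l2loopback package name may vary by release):

# Enable the RPM Fusion free repository if it isn't already enabled
sudo dnf install https://mirrors.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm

# Install the loopback module (package name may differ, e.g. v4l2loopback or kmod-v4l2loopback)
sudo dnf install akmod-v4l2loopback

# Load the module so OBS's virtual camera has a device to write to
sudo modprobe v4l2loopback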

Next I installed qjackctl. This is a utility that lets you graphically see and configure the 'wiring diagram' between audio sources (outputs) and sinks (inputs). There are a few other utilities you could use for this - I use qjackctl because I am familiar with it.

Fire up OBS. Fire up Zoom and open a Zoom meeting. Run qjackctl and click on the 'Graph' button. You'll now get a display of all the inputs/outputs on your system, with lines between them showing what is routed to what. You'll likely need to resize the window and drag the objects around to untangle the mess of wires. You should see Zoom and OBS objects on the graph. To 'wire' the output of OBS into Zoom, drag a wire out of the 'OBS Monitor' object's 'monitor_FL' and 'monitor_FR' ports over to the Zoom input ports. Zoom should now be getting any audio that OBS is producing or processing.
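If you prefer the command line to dragging wires, PipeWire's pw-link tool can make the same connections. The Zoom node/port names below are placeholders - list the real ones on your system first:

# List output ports (what can be wired from) and input ports (what can be wired to)
pw-link -o
pw-link -i

# Wire the OBS monitor ports into Zoom's input ports
# ("ZOOM VoiceEngine" is a guess at the node name; use whatever pw-link -i shows)
pw-link "OBS Monitor:monitor_FL" "ZOOM VoiceEngine:input_FL"
pw-link "OBS Monitor:monitor_FR" "ZOOM VoiceEngine:input_FR"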

You can, of course, continue to tune the wiring for your OBS session. Maybe you'd want to remove any wire that connects your mic to Zoom and instead wire the mic into OBS, so that Zoom gets its audio only from OBS. Depending on your needs you can mess around with the wiring. One thing you will need to consider is how you'll listen to audio that isn't your own mic: if you want to hear a video you're playing into OBS, make sure the web browser is wired to feed into 'Built-in Audio Analog Stereo.'
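The same tool can undo and redo wires from a script. Something along these lines would take the mic away from Zoom and give it to OBS instead (all node and port names here are placeholders - check pw-link -l for the ones on your system):

# Remove the direct mic -> Zoom links
pw-link -d "Built-in Audio Analog Stereo:capture_FL" "ZOOM VoiceEngine:input_FL"
pw-link -d "Built-in Audio Analog Stereo:capture_FR" "ZOOM VoiceEngine:input_FR"

# Feed the mic into OBS so Zoom only hears what OBS passes along
pw-link "Built-in Audio Analog Stereo:capture_FL" "OBS:input_FL"
pw-link "Built-in Audio Analog Stereo:capture_FR" "OBS:input_FR"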

Note that you can also create JACK inputs in OBS by adding a 'JACK Input Client.' When you do, it will show up in the qjackctl graph and you can then wire other audio sources to feed into it, and thus into an OBS scene - for example, when you want your web browser's audio to be present only in a specific scene and always on.

Hope this helps anyone else struggling with this as I was.
 

bnordgren

New Member
I was messing with this a while back, but a YouTube video called my attention to the fact that OBS's "monitor" outputs represent the raw microphone feed into OBS. Processed audio and video go to the stream, the recording, or both. However, the patchbay then ends up doing essentially the same task as picking the microphone directly in Zoom.
See: https://www.youtube.com/watch?v=wr9PdkX93WM

Windows users can solve this with NDI, but I bet there's a Linux solution that includes ffmpeg, a virtual camera via v4l2loopback (https://stackoverflow.com/questions...mp-stream-and-convert-into-a-live-webcam-v4l2), and a virtual device à la PipeWire (https://gitlab.freedesktop.org/pipewire/pipewire/-/wikis/Virtual-Devices).

I haven't cracked this yet, but this is the approach I'm taking given the limitations alleged in the video above.
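For anyone who wants to try that route, the individual pieces might look roughly like this (the sink name, input file, and /dev/video10 are all placeholders, not a tested pipeline):

# A virtual audio device (works under pipewire-pulse as well as plain PulseAudio)
pactl load-module module-null-sink sink_name=bridge-sink sink_properties=device.description=Bridge_Sink

# A spare v4l2loopback device for the video leg
sudo modprobe v4l2loopback video_nr=10

# Push a video stream into it with ffmpeg so Zoom can select it as a webcam
ffmpeg -re -i input.mkv -map 0:v -pix_fmt yuv420p -f v4l2 /dev/video10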
 

visierl

New Member
Did this ever get solved? I need to do the same thing on Ubuntu 24.04 (maybe too soon), where I need an NDI-like audio feed that is synced to my virtual camera output for input into Zoom. The reason I am using Zoom is that it is streaming for dummies: my audiences understand it, and I can control distribution. I am surprised I can't easily find an established solution to this for Linux yet; the world has been using Zoom for quite a while now.
 

visierl

New Member
BTW, I was using NDI on a MacBook more or less successfully but the MacBook in question is very long in the tooth and simply doesn't have the horsepower to handle OBS production.
 

visierl

New Member
Okay, I solved my problem using PulseAudio. I may someday revisit it with JACK or PipeWire, but for now there is a solution. Here is what I wound up with.

First off, my setup and the specific problem I was trying to solve. I host house concerts and have a two-camera live stream that I produce in OBS and run over Zoom. I know that Zoom is far from the best platform technically, but I use it for a few reasons. Primarily, Zoom is the one thing I know my remote audience can use. Zoom also allows me to limit my audience to people who know about the show and, important for some artists, avoid creating a permanent recording in the wild that the artists do not control. Finally, because Zoom sessions are finite, not permanently hosted anywhere, and access-controlled, there are no copyrighted-material restrictions on them; I can freely allow my artists to perform anything I want in a limited-access "private party" setting without stepping outside of fair use.

My audio feed comes already processed from a sub-mix on the front-of-house mixing console, so my sound needs inside OBS are simple: one audio stream, delivered to my laptop through a sound card and sent on from OBS to Zoom via an OBS Monitor Output.

The problem I had with this was that, while I could adjust the audio delay on the source (300 ms) to get excellent audio/video sync in recordings made directly from OBS (and, presumably, in any stream sent to a supported streaming service), the monitor output is, as others have noted, a monitor of the audio sources at the input to OBS, not a monitor of the final mix. As a result, the audio in my Zoom feed ran 300 ms ahead of my video.

So, the setup without the solution was as follows:

Sound Card -> OBS (ALSA Capture Source) -> Monitor Output -> Pulse (loopback, sink): obs-monitor -> Pulse (remap, source): obs-monitor -> Zoom

As an aside, I use an ALSA capture source for the sound card because when I tried to use a PulseAudio capture source, the source did not show up in the OBS mixer. Not sure what is going on there...

I solved my sync problem by adding a Pulse loopback device with a 300 ms delay into that path, so that it now looks like:

Sound Card -> OBS (ALSA Capture Source) -> Monitor Output -> Pulse (loopback, sink): obs-monitor -> Pulse (remap, source): obs-monitor ->
Pulse (loopback, 300 ms delay, sink): obs-monitor-delayed -> Pulse (remap, source): obs-monitor-delayed -> Zoom

Writing this, I realize I could probably just add the 300 ms delay to the original 'obs-monitor' loopback. But what I have gives me access to the OBS monitor output both delayed and not delayed, and a second loopback can source either one and feed my headphones. That is convenient for watching the overall stream from OBS during production too, though not definitive (if the video latency into Zoom turned out to be higher than it is through OBS, this would be misleading).

I am currently setting this up with a simple shell script because I am having trouble getting Pulse to read my ~/.config/pulse/default.pa on Ubuntu 24.04. Here is the script that sets up the whole path (and also a headphone monitor loopback to help with debugging this mess). If default.pa works for you, the content would be the same lines with the leading pactl stripped from each command:

# Set up source from the OBS Monitor Output for Zoom Audio
pactl load-module module-null-sink \
sink_name=obs-monitor \
sink_properties=device.description=OBS_Monitor
pactl load-module module-remap-source \
master=obs-monitor.monitor \
source_name=obs-monitor-remap \
source_properties=device.description=OBS_Monitor

# Set up a delay line on the OBS Monitor Output to control audio
# latency into consumers of the OBS Monitor output. Clients can either
# pick off the non-delayed monitor output (obs-monitor-remap) or the
# delayed monitor output (obs-monitor-delayed).
#
# The sink that the delay line (delayed loopback) dumps into
pactl load-module module-null-sink \
sink_name=obs-monitor-delayed \
sink_properties=device.description=Delayed_OBS_Monitor

# The source carrying the audio dumped into the above sink for input
# into a client
pactl load-module module-remap-source \
source_name=obs-monitor-delayed \
master=obs-monitor-delayed.monitor \
source_properties=device.description=Delayed_OBS_Monitor

# The delay line itself, taking from obs-monitor.monitor (the internal
# source that carries the OBS Monitor output) and delivering it to
# obs-monitor-delayed (the sink that will become the source for Zoom
# or other clients).
pactl load-module module-loopback \
latency_msec=300 \
source=obs-monitor.monitor \
source_dont_move=true \
sink=obs-monitor-delayed \
sink_dont_move=true \
sink_input_properties=device.description=OBS_Delay_Line_Loopback \
source_output_properties=device.description=OBS_Delay_Line_Loopback

# Set up a monitoring loopback so I can debug things with
# headphones... Default to the OBS monitor output delayed, since that
# should look and feel natural when using OBS.
pactl load-module module-loopback \
sink_input_properties=device.description=PA_Monitoring_Loopback \
source=obs-monitor-delayed.monitor \
source_output_properties=device.description=PA_Monitoring_Loopback
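
To check what the script loaded, or to tear it down before re-running it, the modules can be listed and unloaded by index:

# Show loaded modules (the null sinks, remaps, and loopbacks appear here)
pactl list short modules

# Unload one by the index shown in the first column, e.g.:
# pactl unload-module 42
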
Not sure whether this will be of any use to anyone, but I thought I would post it here in case some other lost soul comes across it.
 