By the way, how well does that work in general? Resolution, encoder settings, resulting quality, system load, thermal performance, etc.
I have a project in the back of my mind, not started yet, that might use something like that: record 3D and stream 2D from a pair of cameras on a robot. Barely any processing before the encoder - just the raw camera feeds side by side for the 3D recording, and maybe a transparency gradient for the 2D stream - but it does need to keep up with the data rate and run those two different encodings at once. Probably 1920x1080p60 from each camera, so that's the streaming 2D size and framerate. The 3D recording is the same but twice the width: 3840x1080p60.
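For a sense of scale, here's a quick back-of-the-envelope on the raw (pre-encoder) data rate. This assumes the cameras deliver uncompressed 4:2:2 (YUY2-style, 2 bytes per pixel); the real cameras might use a different format, or do MJPEG on-camera, which would be far lower:

```python
# Rough raw-data-rate estimate for two 1920x1080p60 cameras.
# Assumption: uncompressed 4:2:2 packed YUV (YUY2/UYVY), 2 bytes/pixel.

WIDTH, HEIGHT, FPS = 1920, 1080, 60
BYTES_PER_PIXEL = 2   # 4:2:2 packed
CAMERAS = 2

per_camera = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS   # bytes per second
total = per_camera * CAMERAS

print(f"per camera: {per_camera / 1e6:.0f} MB/s")   # ~249 MB/s
print(f"both:       {total / 1e6:.0f} MB/s")        # ~498 MB/s
```

Roughly half a gigabyte per second of raw video has to reach the encoders, which is the part that really stresses the platform, not the compositing itself.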
I'd probably need two simultaneous instances of OBS, just because of the different frame sizes. If camera access is exclusive, as it was on almost all systems until recently for performance reasons, then the recording instance might do *all* of the compositing at the larger frame size, then send the too-big 2D version through the virtual camera to the streaming instance, which would only crop it. Anyway......
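The crop step in that plan is trivial arithmetic, but here's a sketch of it anyway, assuming OBS's Crop/Pad filter convention (each value is pixels removed from that edge) and assuming the left camera is the one to stream; both are assumptions on my part:

```python
# Crop values to recover a 1920x1080 2D view from the 3840x1080
# side-by-side frame arriving via the virtual camera.
# Convention assumed: left/top/right/bottom = pixels removed from each edge.

SBS_W, SBS_H = 3840, 1080   # recorded side-by-side frame
OUT_W, OUT_H = 1920, 1080   # desired 2D streaming frame

crop = {
    "left": 0,
    "top": 0,
    "right": SBS_W - OUT_W,   # 1920: drop the other camera's half
    "bottom": SBS_H - OUT_H,  # 0: full height kept
}

# Sanity check: what remains matches the streaming frame size.
assert SBS_W - crop["left"] - crop["right"] == OUT_W
assert SBS_H - crop["top"] - crop["bottom"] == OUT_H
print(crop)   # {'left': 0, 'top': 0, 'right': 1920, 'bottom': 0}
```

Since the streaming instance does nothing but this fixed crop, it can be set up once and never touched again.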
The only other device that might be even semi-reliably present would be a phone hotspot, or maybe a laptop serving the same function, but neither is reliable enough to count on for anything more than a streaming passthrough and occasional maintenance. (It doesn't always stream, but it does always record.) So the platform itself does need to do everything.
It'll have a GUI, not headless, but the GUI is already taken by another app. It's perfectly okay, though, for me to set up OBS graphically and then switch to the other app while OBS just runs. No further changes required. It's more a record of what happened than a live production.