Alik Send
New Member
Hi
First of all, thank you very much for this big, genuinely high-quality product: good architecture, documentation, and code. It has been a great open-source experience for me.
Now, on to the subject.
I'm trying to build a small app that captures video from a Blackmagic card, adds a logo, titles, etc., streams over RTMP, and shows the output on a display. I want to control it through a simple web interface; the configuration will be partially hardcoded, partially loaded from a JSON file, and partially set from the web interface (titles and so on).
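Roughly, the pipeline looks like this (a simplified sketch of the idea, not my actual code: encoders, audio, the logo/title overlays, and error handling are omitted, and the `decklink-input`, `rtmp_custom`, and `rtmp_output` ids plus the server URL are my placeholders):
Code:
/* Simplified sketch: capture from the Blackmagic card, stream over RTMP. */
#include <obs.h>

static void build_pipeline(void)
{
	obs_startup("en-US", NULL, NULL);
	/* obs_reset_video() / obs_reset_audio() go here, see below */
	obs_load_all_modules();

	/* "decklink-input" is my guess at the decklink capture source id */
	obs_source_t *capture =
		obs_source_create("decklink-input", "card", NULL, NULL);
	obs_set_output_source(0, capture);

	/* stream to a custom RTMP server (placeholder URL/key) */
	obs_data_t *ssettings = obs_data_create();
	obs_data_set_string(ssettings, "server", "rtmp://example.com/live");
	obs_data_set_string(ssettings, "key", "stream-key");
	obs_service_t *service =
		obs_service_create("rtmp_custom", "service", ssettings, NULL);
	obs_data_release(ssettings);

	obs_output_t *stream =
		obs_output_create("rtmp_output", "stream", NULL, NULL);
	obs_output_set_service(stream, service);
	/* in real code, video/audio encoders must be created and attached
	 * to the output before starting it */
	obs_output_start(stream);
}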
I studied the API and created an application that fits my needs, but I want to run it inside Docker, and there I ran into problems.
I use macOS in everyday life and developed my app there, but I can't run it in Docker on macOS: it crashes in `obs_init_hotkeys` because it wants an X server. OK, I can comment that out, but then I get another crash when trying to reset video.
Not a problem. The "production server" will run Arch Linux, so I installed it in a VirtualBox VM, mounted the directory from the Mac into the VM, logged in over SSH, created a monitor configuration, and ran `X vt1`. But I still can't get my app working: I get a `Segmentation fault` at the `Initializing OpenGL...` step. Using the `printf` and `blog` "debugger", I found that it crashes in libobs-opengl -> gl-x11.c, in the `gl_context_create` function, at the call `glXChooseFBConfig(display, DefaultScreen(display), ctx_visual_attribs, &frame_buf_config_count);`. I wrote a small C app to play with it and try to pin down the error.
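That standalone repro looked roughly like this (a minimal sketch, not my exact test program; the attribute list only mirrors the shape of the one in gl-x11.c, the exact values there may differ). It isolates exactly the call that segfaults in `gl_context_create`:
Code:
/* glx_repro.c: isolate the crashing glXChooseFBConfig() call.
 * Build: gcc glx_repro.c -o glx_repro -lX11 -lGL
 * Run with DISPLAY set, e.g.: DISPLAY=:0 ./glx_repro */
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
	Display *display = XOpenDisplay(NULL);
	if (!display) {
		fprintf(stderr, "XOpenDisplay failed (is DISPLAY set?)\n");
		return 1;
	}

	/* attribute list of the same shape as the one in gl-x11.c */
	static const int ctx_visual_attribs[] = {
		GLX_STENCIL_SIZE, 0,
		GLX_DEPTH_SIZE, 0,
		GLX_BUFFER_SIZE, 32,
		GLX_DOUBLEBUFFER, True,
		GLX_X_RENDERABLE, True,
		None
	};

	int frame_buf_config_count = 0;
	GLXFBConfig *configs = glXChooseFBConfig(display,
			DefaultScreen(display), ctx_visual_attribs,
			&frame_buf_config_count);

	printf("glXChooseFBConfig returned %p, %d config(s)\n",
			(void *)configs, frame_buf_config_count);

	if (configs)
		XFree(configs);
	XCloseDisplay(display);
	return 0;
}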
When I tried to run the OBS build itself inside Docker, it ran OK. Well, not quite: it crashes when it tries to load scenes, right after answering "No" to (or going through) the Auto-Configuration Wizard prompt, but it passes the `Initializing OpenGL...` phase successfully (or partially successfully, see the log):
Code:
info: ---------------------------------
info: Initializing OpenGL...
info: Loading up OpenGL on adapter VMware, Inc. llvmpipe (LLVM 5.0, 256 bits)
info: OpenGL loaded successfully, version 3.3 (Core Profile) Mesa 17.2.8, shading language 3.30
error: glGetIntegerv(GL_MAX_TEXTURE_ANISOTROPY_MAX) failed, glGetError returned 0x500
[... the error line above repeats 20 times in total ...]
info: ---------------------------------
info: video settings reset:
	base resolution:   800x600
	output resolution: 800x600
	downscale filter:  Bicubic
	fps:               30/1
	format:            NV12
	YUV mode:          601/Partial
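For reference, the settings in that last section come from my `obs_reset_video` call, roughly like this (a simplified sketch of my code, error handling omitted):
Code:
#include <obs.h>

/* values match the "video settings reset" section of the log above */
static bool reset_video(void)
{
	struct obs_video_info ovi = {0};

	ovi.graphics_module = "libobs-opengl";
	ovi.fps_num        = 30;
	ovi.fps_den        = 1;
	ovi.base_width     = 800;
	ovi.base_height    = 600;
	ovi.output_width   = 800;
	ovi.output_height  = 600;
	ovi.output_format  = VIDEO_FORMAT_NV12;
	ovi.colorspace     = VIDEO_CS_601;
	ovi.range          = VIDEO_RANGE_PARTIAL;
	ovi.scale_type     = OBS_SCALE_BICUBIC;
	ovi.adapter        = 0;
	ovi.gpu_conversion = true; /* see the list of attempts below */

	return obs_reset_video(&ovi) == OBS_VIDEO_SUCCESS;
}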
So I conclude that the UI code does something that my app doesn't. I walked through the code, but found nothing that seemed (to my mind) important for this situation.
I also tried different solutions:
Following https://www.opengl.org/discussion_boards/showthread.php/194751-i-MX6-glXChooseFBConfig-Segmentation-Fault-(Segfault)?p=1275981&viewfull=1#post1275981 I tried changing `ctx_visual_attribs` to `[None]` or `NULL` (see the sketch after this list).
I tried setting `gpu_conversion = false` (on macOS I used `true`).
I tried running the container with `--device /dev/dri`.
I tried running not plain X but Xfce.
None of this helped, so I'm looking forward to your help.
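Concretely, the `ctx_visual_attribs` change from the first item above looked like this (a sketch of my edit to `gl_context_create` in gl-x11.c; per the GLX spec, both an empty attribute list and `NULL` mean "no filtering, match every FBConfig"):
Code:
/* variant 1: attribute list containing only the terminator */
static const int ctx_visual_attribs[] = { None };

GLXFBConfig *configs = glXChooseFBConfig(display, DefaultScreen(display),
		ctx_visual_attribs, &frame_buf_config_count);

/* variant 2: NULL attribute list (GLX treats it like an empty one) */
configs = glXChooseFBConfig(display, DefaultScreen(display), NULL,
		&frame_buf_config_count);
Both variants still segfaulted the same way.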
I posted my code on GitHub: https://github.com/aliksend/obs-proof (yes, the CMakeLists is stolen from the UI, with my cuts). You can build it with `docker build . -t obs_proof`, run my app with `docker run --rm -it -e DISPLAY=:0 -v /tmp/.X11-unix:/tmp/.X11-unix obs_proof`, and run OBS with `docker run --rm -it -e DISPLAY=:0 -v /tmp/.X11-unix:/tmp/.X11-unix obs_proof ./obs`.