Hey Everyone,
I'm running into a sync issue and I'm not entirely sure whether the cause is OBS or something else. Here is my workflow:
I set up a camera tripod 8' in front of me and another one 8' directly to my right, and I put my iPad on the ground, with both phones screen up. My right hand hits play on the iPad, which sends audio to my Bluetooth speaker, while my left hand pushes record on both phone screens. They don't start at exactly the same time, since one is an iPhone and the other is an Android, so I have to stagger my taps to get the two recordings to actually start together.
The first time I did this, my delay was roughly 50-100 ms. I did this another night, except this time, after I hit record, I clapped my hands really loudly so there would be an obvious spike in the waveform to cut on.
The recording on the Android camera, for some reason, was split into 3 separate files. I stitched them together with ffmpeg:
for f in *.mp4; do echo "file '$f'" >> list.txt; done && ffmpeg -f concat -safe 0 -i list.txt -c copy IMG_2.mp4
The contents of list.txt:
file 'VID_20240226_011704187.mp4'
file 'VID_20240226_011704187_02.mp4'
file 'VID_20240226_011704187_03.mp4'
I'm hesitant to say this worked flawlessly, but it appears to have done so...
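If anyone wants to sanity-check the stitch, the simplest thing I can think of is comparing the container duration of the output against the three inputs (IMG_2.mp4 and the VID_* names are just my files from above); the output should come out to roughly the sum of the parts:
# print each file's duration in seconds; the stitched file should be about the sum of the three inputs
for f in VID_20240226_011704187*.mp4 IMG_2.mp4; do echo "$f"; ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 "$f"; done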
I'm having a hard time viewing frame counts in QuickTime (I'm on a Mac), so I opened each video file in Audacity so I could see and zoom in on the waveform. The waveform view in QuickTime doesn't let me zoom in to the level of detail needed to make an exact cut.
In Audacity I zoomed in as far as possible on the hand clap, put my cursor on the FIRST sample dot that dips below the X axis, and marked the time, which was 1.699 seconds.
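(If it's easier for anyone to verify, the audio track can also be pulled out of the video into a WAV first, so the waveform being inspected is an exact copy of what's in the file; a rough command, using my camera-one filename and an arbitrary output name:)
# -vn drops the video stream; pcm_s16le writes plain uncompressed WAV audio
ffmpeg -i IMG_5734.mp4 -vn -c:a pcm_s16le clap-check.wav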
I trimmed the file using this command:
ffmpeg -i IMG_5734.mp4 -ss 00:00:01.699 -to 00:54:00 -c:v copy -c:a copy 1-trim.mp4
The second file, in Audacity, came out to 1.221 seconds for the same spot in the waveform. The command I used to trim that file:
ffmpeg -i 2-720p.mp4 -ss 00:00:01.221 -to 00:53:51 -c:v copy -c:a copy 2-trim.mp4
One file is slightly shorter because that's the first camera I hit stop recording on.
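One thing I'm not sure about: since both trims use -c copy, I believe ffmpeg can only start the cut on a keyframe, so the actual start points may not land exactly on 1.699 and 1.221, and the two clips could end up offset from each other by a fraction of a second. If that turns out to matter, a frame-accurate (but re-encoded) trim would look roughly like this; the encoder settings are just a guess on my part:
# decoding instead of stream-copying lets ffmpeg cut on the exact timestamp rather than the nearest keyframe
ffmpeg -i IMG_5734.mp4 -ss 00:00:01.699 -to 00:54:00 -c:v libx264 -crf 18 -preset veryfast -c:a aac 1-trim-exact.mp4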
I opened the files in QuickTime to make sure my ffmpeg cuts worked as expected, and they seemed to...
In OBS I have a scene containing my webcam and two VLC media sources, one per camera file. In my OBS preferences I have keypad 6 set as play/pause for both VLC Source 1 and 2. Getting this to play via hotkey was a little buggy, but I eventually got it so that when I hit keypad 6, both files start playing at the exact same time.
My result is confusing. During the performance you can still see what appear to be the videos de-syncing and re-syncing multiple times. I'm not sure whether this is due to ffmpeg stitching incorrectly, or something in OBS not actually triggering the files to play at the same time, or something causing one of the files to play faster than the other. IDK.
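One thing I haven't ruled out is variable frame rate: phone cameras (the Android especially) often record VFR, and I'd expect that to show up as exactly this kind of slow drift in and out of sync. A rough way to check is comparing the nominal and average frame rates ffprobe reports for each camera file (filenames are mine from above); if the two values differ noticeably, that recording is probably VFR:
# r_frame_rate is the nominal rate, avg_frame_rate the actual average; a mismatch suggests VFR
ffprobe -v error -select_streams v:0 -show_entries stream=r_frame_rate,avg_frame_rate -of default=noprint_wrappers=1 1-trim.mp4
ffprobe -v error -select_streams v:0 -show_entries stream=r_frame_rate,avg_frame_rate -of default=noprint_wrappers=1 2-trim.mp4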
This particular highlight is from the second part of the stitch. ffmpeg stitched 3 files together: two twenty-something-minute files and a roughly 2-3 minute file at the end. You can see they clearly appear to be synced:
https://www.twitch.tv/videos/2075074441
However, if we look at this highlight from the first part of the stitch, superimposed over the other camera angle, there seems to be a noticeable delay. You can see me clearly picking up the meteor hammers with one camera out of sync with the other by <100 ms. Due to the nature of this activity, even a 10 ms delay is very noticeable. I drop one of the meteors at 42-43 seconds in, and it appears to hit the ground at different times in each frame... From 2:03 to 2:07 the blue and yellow ones are spinning in an exact circle, and this looks synced between frames/cameras, when just a couple of minutes earlier it was not. Towards the end of that highlight it appears to desync again...
The second example I have is here:
https://www.twitch.tv/videos/2075084452
It looks entirely synced for the duration of the highlight. I drop it towards the end, which reinforces that impression.
I'm trying to capture 3 dimensions and display them on a 2-dimensional screen. I do this by having Camera One capture my X-Y movement, while Camera Two captures my Z-Y movement. It makes sense that Z and X do not always line up, but Y should line up exactly between the two frames, and that is not what I believe I am seeing. To clarify, there is only one of each color. Any symmetries you see between colors are between the two camera angles, and the height *should* be identical.
Is this sync problem an OBS bug? Or is it just a quirk of my performance / my brain incorrectly parsing a 3rd dimension on a 2-dimensional screen?
Both clips are from the same OBS broadcast. Thanks again for taking the time to read through my workflow and help me confirm what's going on with my files.