Jim said:
- Bandwidth scaling - I'll look into it. My current setup should be compatible with dynamic setting changes, so I'll see what I can do.
I suspect that what FMLE does is maintain a dynamically sized send buffer: when the buffer fills up, it starts reducing the bitrate until the buffer empties, at which point it increases the bitrate again. Or there might be some RTMP magic behind it, I don't know.
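Written out as code, the feedback loop I'm picturing would be something along these lines; the struct, function and thresholds are all made up, not anything actually from FMLE:

// Made-up sketch of the buffer-driven bitrate feedback I'm guessing FMLE uses.
// None of these names or thresholds come from FMLE; it's just the guess in code form.
#include <cstddef>

struct SendBuffer {
    std::size_t bytesQueued;   // data still waiting to go out over RTMP
    std::size_t capacity;      // current (dynamically sized) buffer limit
};

// Called periodically by the encoder; returns the bitrate to use for the next interval.
int AdjustBitrate(const SendBuffer& buf, int currentKbps, int minKbps, int maxKbps)
{
    const std::size_t nearlyFull = buf.capacity * 9 / 10;   // arbitrary threshold

    if (buf.bytesQueued >= nearlyFull) {
        // Upload can't keep up: back the bitrate off until the buffer drains.
        currentKbps = currentKbps * 3 / 4;
        if (currentKbps < minKbps) currentKbps = minKbps;
    } else if (buf.bytesQueued == 0) {
        // Buffer has emptied: creep the bitrate back up towards the configured maximum.
        currentKbps += 100;
        if (currentKbps > maxKbps) currentKbps = maxKbps;
    }
    return currentKbps;
}

The real thing presumably reacts to RTMP acknowledgements or socket backpressure rather than a simple byte counter, but the shape of the loop would be the same.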
Jim said:
- Video filters - I've thought about this as well. It should be fairly easy to implement at some point, but I want to keep it on the GPU if possible, so my idea is that I would like to make it so that you can run an image source through extra post-process pixel shaders if desired. I'd prefer to keep image processing off the CPU if possible, and shaders would be easier anyway. Image processing on the CPU is just super CPU/memory consuming.
Personally, I have much more CPU to burn than GPU, since StarCraft II only uses two threads, and I have 8 cores with HyperThreading, but only a single AMD HD5870 GPU. I realise that that's a fairly uncommon setup, but I think you'll have to concede at some point that certain more complex filters (like spatiotemporal denoisers) simply can't be implemented in shaders.
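Not that I'm against the shader route for the simpler filters; the chaining part sounds straightforward, something roughly like this (a made-up interface just to illustrate the ping-pong idea, not anything from OBS):

// Hypothetical sketch of chaining optional post-process pixel shader passes over a
// source, ping-ponging between two scratch render targets. The names are invented
// for illustration; none of this is OBS code.
#include <vector>

struct Texture {};   // stand-in for a GPU texture / render target

struct PixelShaderPass {
    virtual ~PixelShaderPass() = default;
    // Draw a full-screen quad sampling src through this pass's pixel shader into dst.
    virtual void Apply(const Texture& src, Texture& dst) = 0;
};

// Run the source image through every enabled filter pass; each pass reads the
// previous pass's output, so the two scratch targets alternate as destinations.
const Texture& RunFilterChain(const Texture& source,
                              const std::vector<PixelShaderPass*>& passes,
                              Texture& scratchA, Texture& scratchB)
{
    const Texture* current = &source;
    Texture* target = &scratchA;
    for (PixelShaderPass* pass : passes) {
        pass->Apply(*current, *target);
        current = target;
        target = (target == &scratchA) ? &scratchB : &scratchA;
    }
    return *current;   // the last pass's output (or the untouched source if no passes)
}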
Jim said:
Also, what was going on with your video capture?
I'm not entirely sure what the underlying cause was, as I had another identical game board that synced just fine. The video source was 320x240 CGA, and the capture card interpreted it as 640x480 VGA.
The image was horizontally stretched, starting at 4x and gradually tapering off to actual pixels, and it spilled over from the even lines into the odd lines, so one half of the image was on the odd lines and the other half on the even lines. To make matters worse, the scanlines "started" in the middle of the frame, so the left edge of the captured frame actually showed the far right of the image.
To fix it, I wrote an Avisynth script that first separated the even and odd lines of the image, then horizontally cropped the frame into three segments: the part that was 4x stretched (which I squeezed back to its original width), the middle part that was 2x stretched, and the last part that rapidly tapered off to actual pixels (which was too complex to correct). I then stitched the three parts back together and resized the result to 640x480.
Because of the offset (image starting in the middle of the frame) there were parts of the image missing on the left and right in the reconstructed frame, but thankfully I could set the game to single player mode so that all the in-game action would occur between the missing parts of the image.
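If it helps to picture it, the repair was roughly equivalent to the following, written here as C++/OpenCV over a single frame rather than the actual Avisynth script; the segment boundaries and the way the two halves go back together are placeholders from memory:

// Rough C++/OpenCV equivalent of the Avisynth repair described above. The column
// boundaries A and B and the field ordering are made up; treat it as an illustration.
#include <opencv2/opencv.hpp>
#include <vector>

int main()
{
    cv::Mat frame = cv::imread("broken_capture_frame.png");   // one 640x480 frame

    // Step 1: separate the even and odd scanlines into two half-height images,
    // since half of the picture ended up on each set of lines.
    cv::Mat even(frame.rows / 2, frame.cols, frame.type());
    cv::Mat odd (frame.rows / 2, frame.cols, frame.type());
    for (int y = 0; y < frame.rows; y += 2) {
        frame.row(y)    .copyTo(even.row(y / 2));
        frame.row(y + 1).copyTo(odd .row(y / 2));
    }

    // Step 2: crop out the three horizontal segments and undo their stretch.
    // A and B are placeholder boundaries between the 4x, 2x and "tapering" regions.
    const int A = 256, B = 512;
    auto fixField = [&](const cv::Mat& field) {
        cv::Mat seg4x   = field(cv::Rect(0, 0, A,              field.rows));
        cv::Mat seg2x   = field(cv::Rect(A, 0, B - A,          field.rows));
        cv::Mat segTail = field(cv::Rect(B, 0, field.cols - B, field.rows));

        cv::Mat fixed4x, fixed2x;
        cv::resize(seg4x, fixed4x, cv::Size(A / 4,       field.rows));  // squeeze the 4x part
        cv::resize(seg2x, fixed2x, cv::Size((B - A) / 2, field.rows));  // squeeze the 2x part
        // The tail tapered too irregularly to correct, so it's left as-is.

        cv::Mat stitched;
        cv::hconcat(std::vector<cv::Mat>{fixed4x, fixed2x, segTail}, stitched);
        return stitched;
    };
    cv::Mat halfA = fixField(even);
    cv::Mat halfB = fixField(odd);

    // Step 3: put the two halves back together and resize the result to 640x480.
    cv::Mat combined, result;
    cv::vconcat(halfA, halfB, combined);
    cv::resize(combined, result, cv::Size(640, 480));
    cv::imwrite("reconstructed_frame.png", result);
    return 0;
}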
I wish I'd taken a screenshot, because I don't have the culprit video source with me any more (it was borrowed from a friend). I do have the Twitch recording of the live stream though, if you're interested. The video feed at the top right is the one I stitched together.
The shaking in the bottom-left feed is because it was captured with an AVerMedia capture card, which doesn't support capturing resolutions lower than 640x480, so I had to run the 320x240 signal through a scan doubler to bring it up to 640x480. The scan doubler's output isn't exactly stable, but at least it works (and I wouldn't be able to capture 3 CGA sources otherwise).
I'm not using OBS, obviously; I'm using VHMultiCam (the free predecessor to XSplit) fed into FMLE, with ffdshow injected into VHMultiCam's DirectShow graph using DirectShowSpy and GraphEdit (you have to stop the graph, insert the filter, connect it between the capture source and VHMultiCam's frame grabber filter, and then restart the graph and hope you haven't crashed VHMultiCam in the process).
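For anyone wondering, the splice amounts to roughly this; it's a hand-written illustration rather than what the tools actually do internally, and it assumes you've already got hold of the running graph, the ffdshow filter instance and the relevant pins:

// Illustrative sketch of splicing a filter into a running DirectShow graph.
// Assumes the IGraphBuilder* for VHMultiCam's graph was obtained elsewhere
// (e.g. via the Running Object Table) along with the pins listed below.
#include <dshow.h>
#pragma comment(lib, "strmiids.lib")

HRESULT SpliceFilterIntoGraph(IGraphBuilder* graph,
                              IPin* captureOut,      // capture source's output pin
                              IPin* grabberIn,       // VHMultiCam frame grabber's input pin
                              IBaseFilter* ffdshow,  // already-created ffdshow filter
                              IPin* ffdshowIn,       // ffdshow's input pin
                              IPin* ffdshowOut)      // ffdshow's output pin
{
    IMediaControl* control = nullptr;
    HRESULT hr = graph->QueryInterface(IID_IMediaControl, (void**)&control);
    if (FAILED(hr)) return hr;

    // 1. Stop the graph -- pins can't be rewired while it's running.
    control->Stop();

    // 2. Break the existing capture-source -> frame-grabber connection.
    graph->Disconnect(captureOut);
    graph->Disconnect(grabberIn);

    // 3. Add ffdshow to the graph and wire it in between the two filters.
    hr = graph->AddFilter(ffdshow, L"ffdshow raw video filter");
    if (SUCCEEDED(hr)) hr = graph->ConnectDirect(captureOut, ffdshowIn, nullptr);
    if (SUCCEEDED(hr)) hr = graph->ConnectDirect(ffdshowOut, grabberIn, nullptr);

    // 4. Restart the graph and hope VHMultiCam survived the surgery.
    if (SUCCEEDED(hr)) hr = control->Run();

    control->Release();
    return hr;
}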
Jim said:
- Audio filters - Yes! This is definitely something I was going to look into: someone else mentioned wanting to get rid of his audio hum, and I immediately thought of VST plugins (since I also use them myself in apps like Ableton Live) -- I'll see what I can do here as well. CPU is pretty tight, but the audio thread should have room for a little bit of extra optional processing.
That's so awesome to hear! This is actually the one feature I was least expecting you to take seriously, and so far I've heard of no other streamers using VSTs. I'm sure people will start messing with it as soon as it stops being such a faff.
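For what it's worth, the host-side plumbing for a VST 2.x effect isn't huge. Roughly something like this, assuming the Steinberg VST 2.4 SDK headers and a made-up plugin DLL name; none of it is OBS code, just the general shape:

// Very rough sketch of hosting a VST 2.x effect (needs the Steinberg VST 2.4 SDK
// header aeffectx.h). The plugin DLL name is hypothetical.
#include <windows.h>
#include "aeffectx.h"

// Minimal host callback: the plugin asks the host questions through this.
static VstIntPtr VSTCALLBACK HostCallback(AEffect*, VstInt32 opcode, VstInt32,
                                          VstIntPtr, void*, float)
{
    return (opcode == audioMasterVersion) ? 2400 : 0;   // "this host speaks VST 2.4"
}

typedef AEffect* (*VstEntryProc)(audioMasterCallback);

int main()
{
    // Load the plugin DLL and find its entry point (newer plugins export
    // "VSTPluginMain", older ones just export "main").
    HMODULE dll = LoadLibraryW(L"SomeHumRemover.dll");   // hypothetical plugin
    if (!dll) return 1;
    VstEntryProc entry = (VstEntryProc)GetProcAddress(dll, "VSTPluginMain");
    if (!entry) entry = (VstEntryProc)GetProcAddress(dll, "main");
    AEffect* fx = entry ? entry(HostCallback) : nullptr;
    if (!fx) return 1;

    // Tell the plugin about the audio format and switch it on.
    const int blockSize = 512;
    fx->dispatcher(fx, effOpen, 0, 0, nullptr, 0.0f);
    fx->dispatcher(fx, effSetSampleRate, 0, 0, nullptr, 44100.0f);
    fx->dispatcher(fx, effSetBlockSize, 0, blockSize, nullptr, 0.0f);
    fx->dispatcher(fx, effMainsChanged, 0, 1, nullptr, 0.0f);   // resume

    // On the audio thread: hand each block of samples to the plugin.
    float left[blockSize] = {}, right[blockSize] = {};
    float outL[blockSize] = {}, outR[blockSize] = {};
    float* inputs[2]  = { left, right };
    float* outputs[2] = { outL, outR };
    fx->processReplacing(fx, inputs, outputs, blockSize);

    fx->dispatcher(fx, effMainsChanged, 0, 0, nullptr, 0.0f);   // suspend
    fx->dispatcher(fx, effClose, 0, 0, nullptr, 0.0f);
    FreeLibrary(dll);
    return 0;
}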
Jim said:
I can't promise any of these things right away, but all of them should be fairly doable.
Great! I can't wait to be able to start using OBS for serious streaming. Keep up the good work :)