Hey there. Unfortunately there are fundamental problems with writing deinterlacing as an effect filter. Someone else already wrote a deinterlacing effect filter in a similar way, but ran into the same problems you're probably beginning to see, which made it unacceptable to merge.
First problem: async frames are not guaranteed to be consecutively sent to an effect filter. This matters whenever a frame is skipped (if OBS' framerate is lower than the source's) or the last frame is repeated (if OBS' framerate is higher than the source's). In the former case, there's no way to get the "last" frame in texture form. In the latter case, you could write ways to detect it (like you did), but it's still not ideal. The last frame must be stored in texture form and be accessible to the deinterlacer at all times, whether or not that last frame was actually played on time.
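To illustrate the caching requirement, here's a minimal sketch of the idea: the deinterlacer keeps both the current and previous frames resident, updated for every decoded frame rather than only rendered ones. Note this is not the actual libobs API; the `fake_texture` type and function names are made up stand-ins (in libobs the textures would be `gs_texture_t*`).

```c
#include <assert.h>
#include <stdbool.h>

/* Stand-in for a GPU texture handle; purely illustrative,
 * not a real libobs type. */
typedef struct {
	int frame_id;
	bool valid;
} fake_texture;

/* The deinterlacer keeps the current AND previous frames resident,
 * regardless of whether the previous frame was actually rendered. */
typedef struct {
	fake_texture cur;
	fake_texture prev;
} deint_cache;

/* Called for every decoded frame, even ones the renderer skips. */
static void deint_cache_push(deint_cache *c, int frame_id)
{
	c->prev = c->cur;
	c->cur.frame_id = frame_id;
	c->cur.valid = true;
}

/* Both textures must be available before a weave/blend pass can run. */
static bool deint_cache_ready(const deint_cache *c)
{
	return c->cur.valid && c->prev.valid;
}
```

Because the cache is fed at decode time rather than render time, the previous frame is still there even when playback drops or repeats frames, which an effect filter can't guarantee.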
Second problem: being an effect filter means it can be ordered in the filter list in a way that breaks deinterlacing. For example, a user could unintentionally apply a crop before the deinterlacing filter, and because a crop that removes an odd number of rows swaps which scanlines belong to which field, the output would be messed up. Deinterlacing needs to occur before effect filters are applied.
Both of those things mean that deinterlacing unfortunately really needs to be handled by libobs itself to guarantee proper playback. Proper deinterlacing is not trivial, which is why it's been delayed quite a bit.
I do plan on personally handling this very soon just to get it out of the way, but I wanted to get API and plugin issues sorted first before doing so. Having to wait on it sucks, but I'd rather have it done right than done on a specific schedule.
And yeah, it'd be better to write API functions exposing what you want to do rather than include obs-internal.h directly.
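As a rough sketch of that pattern (the names here are hypothetical, not actual libobs functions): the struct stays opaque in the public header, and a small accessor exposes only the field a plugin legitimately needs, so plugins never depend on internal layout:

```c
#include <stdint.h>
#include <stddef.h>

/* --- public header side: the struct is opaque to plugins --- */
typedef struct example_source example_source_t;
uint32_t example_source_get_flags(const example_source_t *src);

/* --- internal side (what obs-internal.h would hold) --- */
struct example_source {
	uint32_t flags;
	/* ...many other internal fields plugins must not touch... */
};

/* Accessor keeps the internal layout private and can stay stable
 * even if the struct changes between versions. */
uint32_t example_source_get_flags(const example_source_t *src)
{
	return src ? src->flags : 0;
}
```

The accessor also gives libobs a place to NULL-check and to keep ABI stability when internal fields move around.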
Anyway, just wanted to explain why it's taken so long to add it.