Filter interface for connecting AviSynth to the processing chain.

colorist-64

New Member
Idea for PC systems: OBS provides a filter that passes the video stream to a custom script, which is processed by the AviSynth+ installed on the Windows system; OBS then receives the processed frames back from the script.

The filter calls a user-named text script file, in which the user himself calls the function for receiving raw frames from OBS. The filter then takes the processed frames and passes them down the filter chain. The user's task is to write, inside the script, the algorithm for eliminating tearing and restoring dropped frames. In this case, the user himself is responsible for keeping video and audio synchronized.

A custom script can output frames in various formats, including YV12.
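For reference, YV12 is a planar 4:2:0 format: a full-resolution luma (Y) plane followed by V and U planes subsampled 2×2, which works out to 1.5 bytes per pixel at 8 bits. A small Python sketch of the buffer arithmetic (my own illustration, not OBS or AviSynth code):

```python
def yv12_frame_size(width: int, height: int) -> int:
    """Return the byte size of one 8-bit YV12 frame.

    YV12 stores a full-resolution Y plane followed by V and U planes,
    each subsampled 2x2, i.e. 1.5 bytes per pixel overall.
    """
    y_plane = width * height                      # luma, full resolution
    chroma_plane = (width // 2) * (height // 2)   # each chroma plane is quarter size
    return y_plane + 2 * chroma_plane

# A 1920x1080 frame occupies 1920*1080*1.5 bytes:
print(yv12_frame_size(1920, 1080))  # → 3110400
```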

An example of such an interface implementation is available in ffdshow from the K-Lite codec pack. There is no need to process frames in the filter itself; it is enough to embed the user script into the processing chain.
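The arrangement described above, where the user script is just one more stage between the frame grabber and the encoder, could be sketched like this in Python. All names here (`user_script_stage`, `encoder_stage`) are hypothetical placeholders of my own, not real OBS or ffdshow APIs:

```python
from typing import Callable, Iterable, Iterator, List

Frame = bytes  # a raw frame as it arrives from the capture stage

def run_chain(frames: Iterable[Frame],
              stages: List[Callable[[Frame], Frame]]) -> Iterator[Frame]:
    """Pass each frame through every stage of the filter chain in order."""
    for frame in frames:
        for stage in stages:
            frame = stage(frame)
        yield frame

# The user-supplied script slots in as just another stage; this placeholder
# would normally hand the frame to AviSynth+ and read the result back.
def user_script_stage(frame: Frame) -> Frame:
    return frame  # identity here; real processing lives in the user's script

def encoder_stage(frame: Frame) -> Frame:
    return frame  # stand-in for handing the frame to the encoder

processed = list(run_chain([b"frame0", b"frame1"],
                           [user_script_stage, encoder_stage]))
```

The point of the sketch is only that the filter itself stays trivial: it moves frames in and out, and all the interesting work happens inside the pluggable stage.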

I don't see a way to do this yet. It seems to me that such a relatively simple filter is universal and does not require special attention. There are probably solutions in Python, and if there are, I would always be glad to learn about them.

PS. There are several script-based solutions for restoring frames that run in real time on powerful modern systems.
 

colorist-64

New Member
Hello to all!
After rereading my post, I realized that some clarifications were needed.

There are situations that cannot be solved in principle with the help of the OBS developers' FAQ. Often, dropped or duplicate frames are not caused by weak hardware. These duplicates are very noticeable in games like American/Euro Truck Simulator, which show uniform forward motion in the frame. Jerks and breaks in motion greatly spoil the enjoyment of the game. Therefore, replacing duplicate frames with synthesized or restored frames can improve the perception of the broadcast.

The main reasons for these duplicates:
1. Short-term overload of the PCIe bus and waiting for a response from drives due to problems in the game itself, when the game is busy relaying outdated data in mods or handling internal critical errors without crashing.
2. A mismatch between the game FPS and the FPS of the capture system.
3. Problems with the network interface or the Internet provider.
4. A problem on the viewer's side, whose Internet connection cannot ensure the timely arrival of data packets from the stream relay provider.

Point 4 has a solution on the viewer's side, although it is not always available to them. But there is an opportunity to fix the problems on the streamer's side. Most of the solutions, as I said, are described in the OBS FAQ. But a number of problems remain that are not solved by simply increasing the power of gaming hardware. I will describe this in detail.

Developers of mods for ATS/ETS2 do not always make optimal maps, which leads to a drop in FPS in certain areas of the terrain. This cannot be eliminated even with the latest graphics cards currently available. In addition, without capping the FPS, duplicate frames or jerks will inevitably appear, since the lifetime of a frame in the game is not constant and the motion in the frame has varying timestamps on the real-time scale. Enabling vertical sync can eliminate motion jerks, but it does not solve the problem when the FPS falls below the capture frame rate. This situation often appears with non-optimized mods, and there is a way to overcome it.
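The mismatch between game FPS and capture FPS can be illustrated with a small simulation of my own (not OBS code): a fixed-rate capture that, at each tick, grabs the most recently rendered game frame. It shows that a 48 fps game captured at 60 fps necessarily repeats 12 frames per second:

```python
def captured_frame_indices(game_fps: int, capture_fps: int, n_capture: int):
    """Simulate a fixed-rate capture sampling a lower-rate game: at
    capture tick k (time k / capture_fps) the most recent game frame
    is floor(k * game_fps / capture_fps).  Integer math avoids
    floating-point rounding exactly at the tick boundaries."""
    return [k * game_fps // capture_fps for k in range(n_capture)]

# A 48 fps game captured at 60 fps for 0.25 s (15 capture ticks):
idx = captured_frame_indices(48, 60, 15)
# consecutive equal indices are duplicated frames in the captured stream
duplicates = sum(1 for a, b in zip(idx, idx[1:]) if a == b)
print(idx)         # → [0, 0, 1, 2, 3, 4, 4, 5, 6, 7, 8, 8, 9, 10, 11]
print(duplicates)  # → 3 duplicates in 0.25 s, i.e. 12 per second (60 - 48)
```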

We all know of TVs that increase the uniformity and smoothness of motion in the frame, concealing short chains of lost frames. But there is also a software solution that is not covered by commercial patents: the special free libraries in AviSynth+. Although AviSynth+ is positioned as a scripting language for video with a predetermined number of frames, it can be used in real time by feeding it a limited chain of buffered frames to process. In a previous post, I pointed out a method that can be used. But, unfortunately, there is no hook to insert this processing into the stream between the OBS frame grabber and the codec.
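The "limited chain of buffered frames" idea can be sketched as a sliding window: a temporal filter sees each frame together with a few neighbours, at the cost of a small output delay. A minimal Python sketch of my own, assuming frames arrive as a stream:

```python
from collections import deque

def windowed(frames, radius=1):
    """Yield each frame together with `radius` neighbours on each side,
    so a temporal filter (duplicate detection, interpolation) gets a
    small look-ahead without buffering the whole stream.  The output
    lags the input by `radius` frames; the few edge frames at the very
    start and end are dropped in this minimal sketch."""
    buf = deque(maxlen=2 * radius + 1)
    for f in frames:
        buf.append(f)
        if len(buf) == buf.maxlen:
            yield tuple(buf)

windows = list(windowed([0, 1, 2, 3, 4], radius=1))
print(windows)  # → [(0, 1, 2), (1, 2, 3), (2, 3, 4)]
```

This is exactly the trade-off of real-time use: the filter can look a few frames ahead, and the stream is delayed by that same amount.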

I was able to locally obtain a frame rate of 200+ FPS when analyzing duplicates and recovering lost frames on an AMD 2400G processor. The libraries also allow access to graphics card resources to speed up certain functions, such as searching for motion vectors and blending pixels to create a synthetic frame. This solution is good in that it makes it possible not only to recover missing frames, but also to do fine processing of the video. Motion smoothing in TVs is implemented by shaders (if I understand correctly), but I did not find shader-based recovery of lost frames.
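As a toy illustration of frame synthesis, here is the simplest possible interpolator: a pixel-by-pixel blend of the two neighbouring frames. Real tools (such as the motion-vector functions mentioned above) use motion-compensated interpolation instead of a plain blend; this sketch, with frames represented as flat lists of 8-bit values, only shows the idea:

```python
def blend_frames(prev, nxt):
    """Synthesize a replacement for a dropped frame by averaging the
    two neighbouring frames pixel by pixel.  Motion-compensated
    interpolators shift pixels along motion vectors before blending,
    which avoids the ghosting a plain average produces on fast motion."""
    return [(a + b) // 2 for a, b in zip(prev, nxt)]

# a dropped frame between two 4-pixel frames:
print(blend_frames([10, 20, 30, 40], [30, 20, 10, 0]))  # → [20, 20, 20, 20]
```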

That is why there are requests here to connect AviSynth and OBS. I am not a software developer, so it is difficult for me to assess the complexity of this solution. But at first glance, it seems to me that the trick could be done by moving the frame processing outside OBS using a filter. Perhaps it would be a Python script, I don't know.

The problem with duplicates in video, which cannot be fixed by simply increasing the power of the hardware, can be eliminated by such a trick. Modern multithreaded processors are quite capable of helping, especially if GPU encoding is used and the CPU is no more than 50% busy.

I tried to implement an AviSynth connection by transferring the video stream to an external network interface and then capturing the frames on another computer using FFmpeg and the K-Lite libraries, where it is possible to embed your own scripts. However, I ran into a number of problems with data corruption in the frames, since nothing but UDP would work.

I will say that OBS is a very good solution for those who are able to compose an original scene in the frame. OBS is clear and has rich functionality, which allows you to use it not only as broadcasting software, but also for recording video if you don't have a smartphone (yes, I'm old-school! :) but do have a USB camera.

Always with you.
 

colorist-64

New Member
Sorry, I forgot to add that I use a GTX 1660 Super graphics card with 6 GB of VRAM to run games and to encode the stream. The integrated GPU of the 2400G processor is disabled, but in other situations it could be used simultaneously for auxiliary computations.
 

theistaks

New Member
Sorry, I forgot to add that I use a GTX 1660 Super graphics card with 6 GB of VRAM to run games and to encode the stream. The integrated GPU of the 2400G processor is disabled, but in other situations it could be used simultaneously for auxiliary computations.

I tried to implement an AviSynth connection by transferring the video stream to an external network interface and then capturing the frames on another computer using FFmpeg and the K-Lite libraries, where it is possible to embed your own scripts. However, I ran into a number of problems with data corruption in the frames, since nothing but UDP would work.
 

colorist-64

New Member
I tried to implement an AviSynth connection by transferring the video stream to an external network interface and then capturing the frames on another computer using FFmpeg and the K-Lite libraries, where it is possible to embed your own scripts. However, I ran into a number of problems with data corruption in the frames, since nothing but UDP would work.
That is my answer, and it describes one of my attempts to move video processing outside the limits of OBS. It has been verified that, after processing, the ordinary FFmpeg utility can send video and audio directly over RTMP, with the broadcast keys specified on the command line.

This broadcast worked well when an actual file was used as the source. But the problem arose when trying to pass video and audio from OBS to another computer, which was supposed to process the video to restore the dropped frames.

At present, saving video and audio from OBS to some network port is the only way to get at the video frames. OBS does not provide any other means for the user to access the video frames in real time before they reach YouTube or Twitch. Other tools make it possible to forward video to another computer, but using AviSynth in the processing chain is not provided for there.
 