How does the 'rtmp_output' plugin get video frame data from the textures?

choes

New Member
Hello guys,

I'm reading the source code of obs-studio, and I'm confused about where the rtmp_output plugin gets its video data.

The av_capture_input plugin puts the captured frame data into async_frames by calling obs_source_output_video(). We can then get the frame in obs_source_update_async_video() by calling obs_source_get_frame(), and copy the frame data into the async_texture.
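
For reference, this is roughly what that push looks like from the source plugin's side (a simplified sketch with placeholder values and a placeholder "source" variable, not the actual av_capture code):

Code:
/* sketch: an async source handing a captured frame to libobs */
struct obs_source_frame frame = {0};
frame.format      = VIDEO_FORMAT_NV12;  /* whatever the device delivers */
frame.width       = 1280;
frame.height      = 720;
frame.data[0]     = y_plane;            /* placeholder pointers into the captured buffer */
frame.data[1]     = uv_plane;
frame.linesize[0] = 1280;
frame.linesize[1] = 1280;
frame.timestamp   = os_gettime_ns();

/* libobs queues the frame onto source->async_frames */
obs_source_output_video(source, &frame);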

However, I notice that obs_video_thread uses convert_textures as the source when calling gs_stage_texture(). I'm a novice at OpenGL, and I'm confused about the relationship between the async_texture and the convert_textures.
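
As far as I can tell, gs_stage_texture() copies a GPU texture into a staging surface so the CPU can read it back; my rough, simplified reading of that pattern is below (names like convert_texture are placeholders):

Code:
/* simplified GPU -> CPU readback via a staging surface */
gs_stagesurf_t *stage = gs_stagesurface_create(width, height, GS_RGBA);

gs_stage_texture(stage, convert_texture);  /* copy the texture into the staging surface */

uint8_t  *data;
uint32_t  linesize;
if (gs_stagesurface_map(stage, &data, &linesize)) {
        /* data now points to CPU-readable pixels that can be copied out */
        gs_stagesurface_unmap(stage);
}

gs_stagesurface_destroy(stage);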

So, how does the rtmp_output plugin get video frame data from these textures?

Maybe I misunderstand the source code; please correct me if so.

Thanks,
Choes
 

Lain

Forum Admin
Forum Moderator
Developer
rtmp_output gets encoded data, not raw video data. If you look at its capability flags, you'll notice it's an output that only takes encoded data -- it has to be linked to audio/video encoders before it can be used. The raw video/audio data goes to the encoders, then the encoders pass the encoded packets to rtmp_output automatically. rtmp_output then muxes the packets into FLV and sends it out to the stream.
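
To make that concrete, the front-end wiring looks roughly like this (a simplified sketch, error handling and settings omitted; "service" stands in for an obs_service_t holding the server URL and stream key):

Code:
/* create encoders and attach them to libobs' raw video/audio outputs */
obs_encoder_t *venc = obs_video_encoder_create("obs_x264", "video encoder", NULL, NULL);
obs_encoder_t *aenc = obs_audio_encoder_create("ffmpeg_aac", "audio encoder", NULL, 0, NULL);
obs_encoder_set_video(venc, obs_get_video());
obs_encoder_set_audio(aenc, obs_get_audio());

/* create the encoded output and link the encoders to it */
obs_output_t *output = obs_output_create("rtmp_output", "stream output", NULL, NULL);
obs_output_set_video_encoder(output, venc);
obs_output_set_audio_encoder(output, aenc, 0);
obs_output_set_service(output, service);  /* rtmp_output also needs a service */

/* encoded packets are then delivered to rtmp_output's encoded_packet callback */
obs_output_start(output);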
 

choes

New Member
Hello Jim,
Thanks for your reply, but I'm still confused about how the encoders get raw video data from the textures.
How does the encoder (e.g. obs_x264) get its raw video data?
 

Lain

Forum Admin
Forum Moderator
Developer
They get it through their "encode" callback, defined in libobs/obs-encoder.h:162. obs_x264 sets it in obs-x264.c:755, and it points to the obs_x264_encode function in obs-x264.c:674.
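
For illustration, the callback has roughly this shape (a trimmed sketch of an obs_encoder_info registration; the names are made up and most callbacks are omitted):

Code:
/* sketch of an encoder registration; obs_x264 sets more callbacks than this */
static bool my_encode(void *data, struct encoder_frame *frame,
                      struct encoder_packet *packet, bool *received_packet)
{
        /* frame->data[] / frame->linesize[] hold the raw planes libobs passes in;
           the encoder fills *packet and sets *received_packet when output is ready */
        return true;
}

struct obs_encoder_info my_encoder = {
        .id     = "my_encoder",
        .type   = OBS_ENCODER_VIDEO,
        .codec  = "h264",
        .encode = my_encode,
        /* get_name / create / destroy omitted for brevity */
};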
 

choes

New Member
Hello Jim,
Maybe I didn't express my intention clearly. I'm not good at English; thanks for your patience.
I'd like to figure out the data flow from the source plugin (e.g. av_capture_input) to the encoder plugin.

I noticed that av_capture_input puts the captured data into source->async_frames by calling obs_source_output_video(), that obs_source_update_async_video() gets the closest frame by calling obs_source_get_frame(), and that the data is then stored in source->async_texture by calling gs_texture_set_image().
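
To check my reading, the upload step looks to me roughly like this (simplified to a single plane; the real code in obs-source.c handles planar formats and frame caching):

Code:
/* simplified: upload the newest async frame into the source's texture */
struct obs_source_frame *frame = obs_source_get_frame(source);
if (frame) {
        gs_texture_set_image(source->async_texture, frame->data[0],
                             frame->linesize[0], false);
        obs_source_release_frame(source, frame);
}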
Does the encoder plugin get the data from the source plugin's async_texture? If so, how does it get the data from this texture?
 