I would like to know if it's possible to apply a LUT filter to grabbed video frames and then extract the result as an image. Essentially, I would like to grab a video frame at a preset interval, scale it, apply a LUT to the frame, and then save the result as an image file (JPEG/PNG/BMP).
I can capture the frame from a video filter and scale it using the video_scaler* API. I can also initialize the LUTs using the code in the original filter. But how do I apply the LUT to the raw frame data and then extract the result as an image?
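For reference, here's roughly what I had in mind for the CPU side. This is just a sketch, not working code from the filter: it assumes the scaled frame is RGBA8 and that the LUT has been loaded into a flat array of lut_size³ RGB float triplets (red varying fastest), with all names being placeholders for whatever the capture/scale step actually produces:

```c
/*
 * Sketch only: apply a 3D LUT to a raw RGBA8 frame on the CPU using
 * nearest-neighbour sampling, then dump the result as a binary PPM.
 * `frame`, `width`, `height`, `lut`, and `lut_size` are placeholders.
 */
#include <stdint.h>
#include <stdio.h>

/* lut: flat array of lut_size^3 RGB float triplets, red index fastest */
static void apply_lut_rgba8(uint8_t *frame, int width, int height,
                            const float *lut, int lut_size)
{
    for (int i = 0; i < width * height; i++) {
        uint8_t *px = frame + i * 4;

        /* map 0..255 to a LUT grid index (nearest neighbour) */
        int r = px[0] * (lut_size - 1) / 255;
        int g = px[1] * (lut_size - 1) / 255;
        int b = px[2] * (lut_size - 1) / 255;

        const float *c = lut + 3 * (r + lut_size * (g + lut_size * b));

        px[0] = (uint8_t)(c[0] * 255.0f + 0.5f);
        px[1] = (uint8_t)(c[1] * 255.0f + 0.5f);
        px[2] = (uint8_t)(c[2] * 255.0f + 0.5f);
        /* alpha (px[3]) left untouched */
    }
}

/* write the RGB channels of an RGBA8 buffer as a binary PPM */
static int write_ppm(const char *path, const uint8_t *frame,
                     int width, int height)
{
    FILE *f = fopen(path, "wb");
    if (!f)
        return -1;

    fprintf(f, "P6\n%d %d\n255\n", width, height);
    for (int i = 0; i < width * height; i++)
        fwrite(frame + i * 4, 1, 3, f); /* drop the alpha byte */

    fclose(f);
    return 0;
}
```

Is this the right general approach, or should I be rendering the filter's output to a texture and reading that back instead of redoing the lookup on the CPU? For the output format I'd swap the PPM writer for a proper PNG/JPEG encoder; the PPM is just to keep the example self-contained.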
Thanks