Capturing video from a camera on a Raspberry Pi and filtering in OpenGL before encoding

I need a way to capture video from the camera interface in a Raspberry Pi, run it through a filter written as OpenGL shaders, and then send it to a hardware encoder.

This blog post talks about using OpenGL shader filters on the camera output with raspistill. This is the corresponding source code. In that example, however, the output does not go to the video encoder, and it works only on still images, not on video. Also (I'm not quite sure), I think this is because it hooks into the preview path; see these bits: raspitex_state ("A pointer to the GL preview state") and state->ops.redraw = sobel_redraw.

The blog also talks about the "fastpath", can anyone explain what this means in this context?

1 answer

Texture conversion will work on any opaque MMAL buffer, i.e. camera preview (up to 2000x2000 resolution), stills, and video. However, the sample code only does the GL plumbing for the stills preview. I think someone posted a patch on the RPi forums to make it work with RaspiVid, so you could use that.

Fastpath basically means not copying the buffer data to ARM memory and doing a software conversion. For GL rendering, it just means passing a GL handle so that the GPU driver can access the buffer directly.
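For illustration, the zero-copy path used by raspitex wraps the opaque MMAL buffer in an EGLImage via the Broadcom EGL_IMAGE_BRCM_MULTIMEDIA extension and binds it to an external texture. A rough sketch (the function name bind_opaque_buffer is mine; display and mmal_buf are assumed to be set up elsewhere, and error handling is omitted — this only builds against the Pi userland headers on the device):

```c
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <interface/mmal/mmal.h>

/* Sketch: expose an opaque MMAL buffer to GL without copying it to ARM
 * memory.  The returned texture can then be sampled with a
 * samplerExternalOES in the fragment shader. */
static GLuint bind_opaque_buffer(EGLDisplay display,
                                 MMAL_BUFFER_HEADER_T *mmal_buf)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, tex);

    /* The Broadcom extension turns the opaque buffer handle into an
     * EGLImage the GPU can sample directly - this is the fastpath. */
    EGLImageKHR img = eglCreateImageKHR(display, EGL_NO_CONTEXT,
        EGL_IMAGE_BRCM_MULTIMEDIA, (EGLClientBuffer)mmal_buf->data, NULL);

    glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES, img);
    return tex;
}
```

The slow path, by contrast, would copy the frame into ARM memory and upload it again with glTexImage2D, which costs two full-frame copies per frame.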

Currently, there is no fastpath for getting the OpenGL output back into the encoder. Instead, you would probably have to read the rendered frame back with glReadPixels, convert it to YUV, and pass the converted buffer to the encoder yourself.

If you do that, it would be worth raising the question on the RPi forums, since there may be a better way.
