On Android, I am trying to do some OpenGL processing on camera frames, show the processed frames in the camera preview, and then encode them into a video file. I am using GLSurfaceView and GLSurfaceView.Renderer for the OpenGL work, and FFMPEG to encode the video.
I have successfully processed the frames with a shader. Now I need to encode the processed frames into the video. GLSurfaceView.Renderer provides the onDrawFrame(GL10 ...) method. In this method I read the current frame with glReadPixels() and then put it into a queue for video encoding. On its own, glReadPixels() is too slow: the frame rate drops to single digits. I tried to speed it up using Pixel Buffer Objects, but that does not work; after hooking up the PBO, the frame rate stays the same. This is my first time using OpenGL, and I don't know where to start looking for the problem. Am I doing this right? Can someone point me in the right direction? Thanks in advance.
public class MainRenderer implements GLSurfaceView.Renderer, SurfaceTexture.OnFrameAvailableListener {
    ...

    public void onDrawFrame(GL10 gl10) {
        ...
    }
}
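For reference, here is a minimal sketch of the double-buffered PBO readback I am describing, assuming a GLES 3.0 context (the int-offset glReadPixels overload requires API level 24). The class and field names (PboReader, pboIds, acquireFrame) are just placeholders, not my actual code:

import android.opengl.GLES30;
import java.nio.ByteBuffer;

public class PboReader {
    private final int width, height, bytes;
    private final int[] pboIds = new int[2];
    private int index = 0;          // PBO that receives this frame's read
    private boolean firstFrame = true;

    public PboReader(int width, int height) {
        this.width = width;
        this.height = height;
        this.bytes = width * height * 4;   // RGBA8
        GLES30.glGenBuffers(2, pboIds, 0);
        for (int id : pboIds) {
            GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, id);
            GLES30.glBufferData(GLES30.GL_PIXEL_PACK_BUFFER, bytes, null, GLES30.GL_STREAM_READ);
        }
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);
    }

    // Call at the end of onDrawFrame(), after the frame has been rendered.
    // Returns the previous frame's pixels, or null on the very first call.
    public ByteBuffer acquireFrame() {
        int nextIndex = (index + 1) % 2;

        // Start an asynchronous read into the current PBO. With a
        // PIXEL_PACK_BUFFER bound, the last argument is a byte offset.
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pboIds[index]);
        GLES30.glReadPixels(0, 0, width, height,
                GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, 0);

        ByteBuffer frame = null;
        if (!firstFrame) {
            // Map the other PBO, whose transfer was started on the previous
            // frame, so the GPU has had a full frame to finish it.
            GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pboIds[nextIndex]);
            ByteBuffer mapped = (ByteBuffer) GLES30.glMapBufferRange(
                    GLES30.GL_PIXEL_PACK_BUFFER, 0, bytes, GLES30.GL_MAP_READ_BIT);
            if (mapped != null) {
                frame = ByteBuffer.allocateDirect(bytes);
                frame.put(mapped);      // copy out before unmapping
                frame.rewind();
            }
            GLES30.glUnmapBuffer(GLES30.GL_PIXEL_PACK_BUFFER);
        }
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);

        firstFrame = false;
        index = nextIndex;
        return frame;
    }
}

The idea is that in onDrawFrame() I would call acquireFrame() after drawing and, whenever it returns a non-null buffer, push that buffer into the encoding queue, so the readback of one frame overlaps with the rendering of the next.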