OpenGL ES 2 does not work on Android API 23

In my application, I do image manipulation. My code is based on THIS example.

The only change I made was to modify FRAGMENT_SHADER so that the output is converted to shades of gray; it looks like this:

    private static final String FRAGMENT_SHADER =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +      // highp here doesn't seem to matter
            "varying vec2 vTextureCoord;\n" +
            "uniform samplerExternalOES sTexture;\n" +
            "void main() {\n" +
            "  vec4 tc = texture2D(sTexture, vTextureCoord);\n" +
            "  gl_FragColor.r = tc.r * 0.3 + tc.g * 0.59 + tc.b * 0.11;\n" +
            "  gl_FragColor.g = tc.r * 0.3 + tc.g * 0.59 + tc.b * 0.11;\n" +
            "  gl_FragColor.b = tc.r * 0.3 + tc.g * 0.59 + tc.b * 0.11;\n" +
            "}\n";
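
By the way, the three assignments all compute the same weighted sum, so the shader body could equivalently be written with a single dot product. This variant is only an illustration using the same 0.3/0.59/0.11 weights, not something the example requires (note it also sets alpha explicitly, which the version above does not):

    // Equivalent grayscale shader body using one dot product instead of three
    // identical expressions; illustrative only, not taken from the example.
    private static final String GRAY_SHADER_BODY =
            "void main() {\n" +
            "  vec4 tc = texture2D(sTexture, vTextureCoord);\n" +
            "  float gray = dot(tc.rgb, vec3(0.3, 0.59, 0.11));\n" +
            "  gl_FragColor = vec4(gray, gray, gray, 1.0);\n" +
            "}\n";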

Problem: I recorded a video with a Galaxy S7 device. After recording, I took the recorded video and read its first frame on two different devices (a Galaxy S7 and a Galaxy S3), and I found that the pixel values are completely different between the two devices.

Does anyone know why this happens, and what I can do to solve it? My algorithm fails because of these differences.

Update:

This is an example of the differences that I have.

Galaxy S3 (part of the matrix): 213, 214, 212, 214, 216, 214, 213, 212, 212, 212, 213, 214, 214, 214, 213, 213, 214, 214, 214, 214, 212, 213, 212, 213, 212, 214, 214, 212, 212, 210, 211, 210, 211, 210, 211, 211, 214, 211, 214, 213, 213, 214, 214, 216, 216, 216, 215, 215, 216, 212, 213, 213, 214, 213, 213, 212, 211, 209, 209, 207, 208, 208, 210, 211, 209, 207, 209, 210, 217, 219, 216, 209, 209, 210, 210, 210, 211, 209, 207, 205, 205, 206, 210, 210, 220, 211, 202, 210, 211, 206, 206, 209, 210, 211, 213, 219, 222, 216, 217

number of non-zero pixels: 1,632,816

Sum of all pixels: 3.1834445E8

Galaxy S7 (the same part of the matrix as the Galaxy S3): 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164

number of non-zero pixels: 1,063,680

Sum of all pixels: 1.6593408E8
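
To make clear what those numbers mean, a computation like the following produces them. This is only a minimal sketch, assuming the extracted frame is available as an android.graphics.Bitmap; FrameStats and printStats are illustrative names rather than code from the example or my project:

    import android.graphics.Bitmap;
    import android.graphics.Color;

    public class FrameStats {

        // Counts non-zero gray pixels and sums their values for one frame.
        // The shader writes the same value to R, G and B, so reading the red
        // channel is enough.
        public static void printStats(Bitmap frame) {
            int width = frame.getWidth();
            int height = frame.getHeight();
            int[] pixels = new int[width * height];
            frame.getPixels(pixels, 0, width, 0, 0, width, height);

            long nonZero = 0;
            long sum = 0;
            for (int p : pixels) {
                int gray = Color.red(p);
                if (gray != 0) {
                    nonZero++;
                }
                sum += gray;
            }
            System.out.println("number of non-zero pixels: " + nonZero);
            System.out.println("sum of all pixels: " + sum);
        }
    }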

Update 2:

I found that the resulting image is completely corrupted, even though the video itself was recorded fine. This is the good frame from the Galaxy S3: [image]

And this is the frame I got from the Galaxy S7 (same frame number): [image]

I have no idea what is going on here, but I do know that the corrupted image looks the same on all the Marshmallow devices I tested (Galaxy S6, S7 and a Huawei).

1 answer

After a week of hard work, I finally found a solution!

As I said, my code is based on the bigflake example. In that example it is possible to flip the frame vertically, and that is what the example does.

Changing that flip flag to false fixed the problem.

I would be very grateful if someone could explain why the frame is flipped in the first place, and why it was decided to set it to true by default.

Thanks for your help!

This is the change I made, in case it was not clear enough:

    outputSurface.drawImage(false);
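
For context, this is roughly where that call lives in the decode loop. Only a sketch: CodecOutputSurface, awaitNewImage(), drawImage() and saveFrame() come from the bigflake example sources (assumed to be in the project), while FrameRenderHelper and renderDecodedFrame are just an illustrative wrapper, not code copied from it:

    import android.media.MediaCodec;

    import java.io.IOException;

    public class FrameRenderHelper {

        // Renders one decoded frame to the output surface and saves it as PNG.
        static void renderDecodedFrame(MediaCodec decoder, int outputBufferIndex,
                                       CodecOutputSurface outputSurface,
                                       String pngPath) throws IOException {
            // Send the decoded buffer to the SurfaceTexture behind the surface.
            decoder.releaseOutputBuffer(outputBufferIndex, true /* render */);
            outputSurface.awaitNewImage();
            // The example calls drawImage(true), which flips the frame
            // vertically; passing false leaves the frame as decoded.
            outputSurface.drawImage(false);
            outputSurface.saveFrame(pngPath);
        }
    }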