Similar to a problem I posted about before, I am trying to correctly display normals in my GLSL application. For the purposes of this explanation I am using the ninjaHead.obj model provided with RenderMonkey for testing (you can grab it here). In the RenderMonkey preview window everything looks great:
and the vertex and fragment shader code it generates are, respectively:
Vertex:
uniform vec4 view_position;
varying vec3 vNormal;
varying vec3 vViewVec;

void main(void)
{
   gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;

   // World-space lighting
   vNormal = gl_Normal;
   vViewVec = view_position.xyz - gl_Vertex.xyz;
}
Fragment:
uniform vec4 color;
varying vec3 vNormal;
varying vec3 vViewVec;

void main(void)
{
   float v = 0.5 * (1.0 + dot(normalize(vViewVec), vNormal));
   gl_FragColor = v * color;
}
I based my GLSL code on this (the shader maps the dot product of the view vector and the normal from [-1, 1] into [0, 1] and scales the colour by it), but I'm not getting the expected results...
My vertex shader code:
uniform mat4 P;
uniform mat4 modelRotationMatrix;
uniform mat4 modelScaleMatrix;
uniform mat4 modelTranslationMatrix;
uniform vec3 cameraPosition;

varying vec4 vNormal;
varying vec4 vViewVec;

void main()
{
    vec4 pos = gl_ProjectionMatrix * P * modelTranslationMatrix
             * modelRotationMatrix * modelScaleMatrix * gl_Vertex;
    gl_Position = pos;

    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_FrontColor = gl_Color;

    vec4 normal4 = vec4(gl_Normal.x, gl_Normal.y, gl_Normal.z, 0.0);
    // (continuation reconstructed: rotate the normal and derive the view vector)
    vNormal = modelRotationMatrix * normal4;
    vViewVec = vec4(cameraPosition, 1.0) - pos;
}
And my fragment shader code:
varying vec4 vNormal;
varying vec4 vViewVec;

void main()
{
    // (body reconstructed to mirror the RenderMonkey fragment shader above)
    float v = 0.5 * (1.0 + dot(normalize(vViewVec), vNormal));
    gl_FragColor = v * gl_Color;
}
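For reference, this is roughly how the uniforms for the vertex shader are set on the Java (JOGL) side (a simplified sketch; variable names like shaderProgram and projectionMatrix are illustrative, not my exact code):

int loc;
gl.glUseProgram(shaderProgram);

// 4x4 matrices are uploaded as column-major float[16] arrays
loc = gl.glGetUniformLocation(shaderProgram, "P");
gl.glUniformMatrix4fv(loc, 1, false, projectionMatrix, 0);
loc = gl.glGetUniformLocation(shaderProgram, "modelRotationMatrix");
gl.glUniformMatrix4fv(loc, 1, false, rotationMatrix, 0);
loc = gl.glGetUniformLocation(shaderProgram, "modelScaleMatrix");
gl.glUniformMatrix4fv(loc, 1, false, scaleMatrix, 0);
loc = gl.glGetUniformLocation(shaderProgram, "modelTranslationMatrix");
gl.glUniformMatrix4fv(loc, 1, false, translationMatrix, 0);
loc = gl.glGetUniformLocation(shaderProgram, "cameraPosition");
gl.glUniform3f(loc, cameraX, cameraY, cameraZ);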
However, my render produces this ... 
Does anyone know what might cause this and/or how to fix it?
EDIT In response to kvark's comments, here is the model rendered without any normal/lighting calculations, to show that all triangles are being displayed.
And here is the model with the normals used as colours. I believe the problem has been found! Now the question is why it comes out this way, and how to fix it. Suggestions are welcome!
SOLUTION Well, all problems are resolved! Thanks go to kvark for all his useful insights, which have definitely improved my programming practice, but I'm afraid the answer is down to me being a MASSIVE tit... I had a bug in the display() function of my code that set the glNormalPointer offset to a garbage value: it passed the buffer object handle where a byte offset was expected. It looked like this:
gl.glEnableClientState(GL.GL_NORMAL_ARRAY);
gl.glBindBuffer(GL.GL_ARRAY_BUFFER, getNormalsBufferObject());
gl.glNormalPointer(GL.GL_FLOAT, 0, getNormalsBufferObject());
But it should have been like this:
gl.glEnableClientState(GL.GL_NORMAL_ARRAY);
gl.glBindBuffer(GL.GL_ARRAY_BUFFER, getNormalsBufferObject());
gl.glNormalPointer(GL.GL_FLOAT, 0, 0);
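With a VBO bound to GL_ARRAY_BUFFER, the last argument of glNormalPointer is interpreted as a byte offset into the bound buffer, not a pointer or a buffer name, so passing the buffer handle made the normals point at whatever happened to sit at that offset. For anyone hitting the same thing, here is a minimal JOGL sketch of the whole normal-VBO setup (the normals array and the buffer id variable are placeholders; needs java.nio.ByteBuffer, ByteOrder and FloatBuffer):

// One-time setup: upload the per-vertex normals into a VBO
FloatBuffer data = ByteBuffer
        .allocateDirect(normals.length * 4)   // 4 bytes per float
        .order(ByteOrder.nativeOrder())
        .asFloatBuffer();
data.put(normals);
data.rewind();

int[] ids = new int[1];
gl.glGenBuffers(1, ids, 0);
int normalsBufferObject = ids[0];

gl.glBindBuffer(GL.GL_ARRAY_BUFFER, normalsBufferObject);
gl.glBufferData(GL.GL_ARRAY_BUFFER, normals.length * 4, data, GL.GL_STATIC_DRAW);

// Every frame, in display(): point the normal array at the start of the VBO.
// The third argument is a byte offset into the bound buffer, hence 0,
// not the buffer handle.
gl.glEnableClientState(GL.GL_NORMAL_ARRAY);
gl.glBindBuffer(GL.GL_ARRAY_BUFFER, normalsBufferObject);
gl.glNormalPointer(GL.GL_FLOAT, 0, 0);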
So I suppose the lesson here is twofold. NEVER mentally copy-and-paste code to save time on a Friday afternoon. And when you are sure that the part of the code you are looking at is right, the problem is probably somewhere else!