Odd effect with GLSL normals

Similar to a problem I posted before, I am trying to correctly display normals in my GLSL application. For the purposes of my explanation I am using the ninjaHead.obj model provided with RenderMonkey for testing (you can grab it here). In the RenderMonkey preview window everything looks great (RenderMonkey render), and here are the vertex and fragment shaders it generated:

Vertex:

```glsl
uniform vec4 view_position;

varying vec3 vNormal;
varying vec3 vViewVec;

void main(void)
{
   gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;

   // World-space lighting
   vNormal = gl_Normal;
   vViewVec = view_position.xyz - gl_Vertex.xyz;
}
```

Fragment:

```glsl
uniform vec4 color;

varying vec3 vNormal;
varying vec3 vViewVec;

void main(void)
{
   float v = 0.5 * (1.0 + dot(normalize(vViewVec), vNormal));
   gl_FragColor = v * color;
}
```
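The shading term in this fragment shader is a simple hemisphere-style factor: it remaps the angle between the view vector and the normal from [-1, 1] to [0, 1]. As a minimal sketch of the same math (the vectors are made-up illustration values, not from the model):

```python
import math

def normalize(v):
    # scale a vector to unit length
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(view_vec, normal):
    # same formula as the fragment shader:
    # 0.5 * (1.0 + dot(normalize(vViewVec), vNormal))
    return 0.5 * (1.0 + dot(normalize(view_vec), normal))

print(shade([0.0, 0.0, 1.0], [0.0, 0.0, 1.0]))   # surface facing the viewer -> 1.0
print(shade([1.0, 0.0, 0.0], [0.0, 0.0, 1.0]))   # edge-on -> 0.5
print(shade([0.0, 0.0, -1.0], [0.0, 0.0, 1.0]))  # facing away -> 0.0
```

Note that the shader assumes vNormal arrives unit-length; only the view vector is normalized per fragment.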

I based my GLSL code on this, but I'm not getting the expected results...

My vertex shader code:

```glsl
uniform mat4 P;
uniform mat4 modelRotationMatrix;
uniform mat4 modelScaleMatrix;
uniform mat4 modelTranslationMatrix;
uniform vec3 cameraPosition;

varying vec4 vNormal;
varying vec4 vViewVec;

void main()
{
   vec4 pos = gl_ProjectionMatrix * P * modelTranslationMatrix
            * modelRotationMatrix * modelScaleMatrix * gl_Vertex;
   gl_Position = pos;
   gl_TexCoord[0] = gl_MultiTexCoord0;
   gl_FrontColor = gl_Color;

   vec4 normal4 = vec4(gl_Normal.x, gl_Normal.y, gl_Normal.z, 0.0);

   // World-space lighting
   vNormal = normal4 * modelRotationMatrix;

   vec4 tempCameraPos = vec4(cameraPosition.x, cameraPosition.y, cameraPosition.z, 0.0);
   //vViewVec = cameraPosition.xyz - pos.xyz;
   vViewVec = tempCameraPos - pos;
}
```

My fragment shader code snippet:

```glsl
varying vec4 vNormal;
varying vec4 vViewVec;

void main()
{
   //gl_FragColor = gl_Color;
   float v = 0.5 * (1.0 + dot(normalize(vViewVec), vNormal));
   gl_FragColor = v * gl_Color;
}
```

However, my render produces this... (OpenGL render)

Does anyone know what might cause this and / or how to make it work?

EDIT In response to kvark's comments, here is the model rendered without any normal/lighting calculations, to show that all triangles are being displayed. (Flat-shaded render)

And here is the model shaded with the normals used as colors. I believe the problem has been found! Now the question is why it happens, and how to fix it. Suggestions are welcome! (Normal shading render)

SOLUTION Well, all problems are now resolved! Thanks to kvark for all his useful insights, which have definitely improved my programming practice, but I'm afraid the answer comes down to me being a MASSIVE tit... I had an error in the display() function of my code that was passing a garbage value as the glNormalPointer offset. It looked like this:

```java
gl.glEnableClientState(GL.GL_NORMAL_ARRAY);
gl.glBindBuffer(GL.GL_ARRAY_BUFFER, getNormalsBufferObject());
gl.glNormalPointer(GL.GL_FLOAT, 0, getNormalsBufferObject());
```

But it should have been like this:

```java
gl.glEnableClientState(GL.GL_NORMAL_ARRAY);
gl.glBindBuffer(GL.GL_ARRAY_BUFFER, getNormalsBufferObject());
gl.glNormalPointer(GL.GL_FLOAT, 0, 0);
```

So I guess there's a lesson here: NEVER mindlessly Ctrl+C and Ctrl+V code to save time on a Friday afternoon. And... when you are sure the part of the code you are looking at is right, the problem is probably somewhere else!

  • What is your P matrix? (I assume it's the view transformation: world -> camera.)

  • vNormal = normal4*modelRotationMatrix; Why did you swap the order of the operands? By doing this you multiply the normal by the inverse rotation, which you really don't want. Use the standard order instead: modelRotationMatrix * normal4.

  • vViewVec = tempCameraPos - pos. This is completely wrong. pos is your vertex in homogeneous clip space, while tempCameraPos is in world space (I assume). You need the result in the same space as your normal (world space), so use the world-space vertex position in this equation (modelTranslationMatrix * modelRotationMatrix * modelScaleMatrix * gl_Vertex).
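The second point above can be checked numerically: multiplying a vector on the left of a matrix (v*M, as in the question) is the same as multiplying by the transpose, which for a pure rotation matrix is its inverse. A minimal pure-Python sketch (the rotation and normal are made-up illustration values):

```python
import math

def mat_vec(m, v):
    # m is a row-major 3x3 matrix, v a 3-vector: computes M * v
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def vec_mat(v, m):
    # computes v * M, treating v as a row vector (equivalent to transpose(M) * v)
    return [sum(v[k] * m[k][j] for k in range(3)) for j in range(3)]

# A 90-degree rotation about the Z axis (stand-in for modelRotationMatrix)
a = math.pi / 2
rot = [[math.cos(a), -math.sin(a), 0.0],
       [math.sin(a),  math.cos(a), 0.0],
       [0.0,          0.0,         1.0]]

normal = [1.0, 0.0, 0.0]

print(mat_vec(rot, normal))  # correct order: rotates +X to +Y, roughly [0, 1, 0]
print(vec_mat(normal, rot))  # swapped order: applies the inverse rotation, roughly [0, -1, 0]
```

The swapped order silently rotates every normal the wrong way, which matches the symptoms in the question.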


You seem to be mixing GL versions a bit? You pass matrices manually via uniforms, but use fixed-function state to pass vertex attributes. Hm. Anyway...


I honestly don't like what you're doing with your normals. Take a look:

```glsl
vec4 normal4 = vec4(gl_Normal.x, gl_Normal.y, gl_Normal.z, 0.0);
vNormal = normal4 * modelRotationMatrix;
```

A normal only carries directional data, so why use a vec4 for it? It's more elegant to just use a vec3. Also, look at what happens next: you multiply the normal by the 4x4 model rotation matrix... and on top of that your normal's fourth component is 0, so it isn't a regular point in homogeneous coordinates. I'm not sure that's the main problem here, but I wouldn't be surprised if this multiplication gave you garbage.

The standard way to transform normals is to multiply the vec3 by the 3x3 submatrix of the model matrix (since you are interested in orientation, not translation). To be exact, the "right" approach is to use the inverse transpose of that 3x3 submatrix (this becomes important when you use scaling). In older versions of OpenGL it was precomputed for you as gl_NormalMatrix.
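Why the inverse transpose matters under scaling can be shown with small numbers. In this pure-Python sketch (the non-uniform scale matrix, normal, and tangent are made-up illustration values), transforming the normal with the model matrix itself breaks its perpendicularity to the surface, while the inverse transpose preserves it:

```python
def mat_vec(m, v):
    # row-major 3x3 matrix times 3-vector
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

def inverse3(m):
    # cofactor-expansion inverse of a 3x3 matrix
    c = [[m[(i + 1) % 3][(j + 1) % 3] * m[(i + 2) % 3][(j + 2) % 3]
          - m[(i + 1) % 3][(j + 2) % 3] * m[(i + 2) % 3][(j + 1) % 3]
          for j in range(3)] for i in range(3)]
    det = sum(m[0][j] * c[0][j] for j in range(3))
    return [[c[j][i] / det for j in range(3)] for i in range(3)]

dot = lambda a, b: sum(x * y for x, y in zip(a, b))

scale = [[2.0, 0.0, 0.0],   # non-uniform model scale: stretch along X
         [0.0, 1.0, 0.0],
         [0.0, 0.0, 1.0]]

normal  = [1.0, 1.0, 0.0]   # surface normal (unnormalized)
tangent = [-1.0, 1.0, 0.0]  # a vector lying in the surface, perpendicular to the normal

t2 = mat_vec(scale, tangent)                           # transformed tangent
n_wrong = mat_vec(scale, normal)                       # normal transformed by M: wrong
n_right = mat_vec(transpose(inverse3(scale)), normal)  # inverse transpose: right

print(dot(t2, n_wrong))  # -3.0: no longer perpendicular to the surface
print(dot(t2, n_right))  # 0.0: perpendicularity preserved
```

For a pure rotation the inverse transpose equals the matrix itself, which is why the shortcut in the snippet below is fine when there's no scaling.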

So, instead of the above, you should use something like

```glsl
// (...)
varying vec3 vNormal;
// (...)

mat3 normalMatrix = transpose(inverse(mat3(modelRotationMatrix)));
// or, if you don't need scaling, this should work too:
// mat3 normalMatrix = mat3(modelRotationMatrix);
vNormal = normalMatrix * gl_Normal;
```

This is definitely one thing that needs to be fixed in your code - I hope it solves your problem.

