How to update normals after positioning vertices in a vertex shader?

Short version: I move vertices around in the vertex shader, but when I compute the normals from the vertex positions, they come out based on the original, unmoved positions. Shouldn't the vertex shader know where the new vertices are?

Long version: I'm writing a custom shader in three.js r58, based on the built-in normalmap shader. I pass a flat normal texture (0.5, 0.5, 1.0 lavender) to the shader as tNormal, and then reposition the PlaneGeometry vertices in the vertex shader to bend the plane into a sphere.

Here's the vertex shader bit where the normals are computed:

    vec3 newPosition = mix( position, goalPosition, mixAmount );
    vec4 mvPosition = modelViewMatrix * vec4( newPosition, 1.0 );
    vViewPosition = -mvPosition.xyz;

    vNormal = normalize( normalMatrix * normal );

    // tangent and binormal vectors
    vTangent  = normalize( normalMatrix * tangent.xyz );
    vBinormal = cross( vNormal, vTangent ) * tangent.w;
    vBinormal = normalize( vBinormal );

This code works as expected on the default, undeformed geometry. However, when I warp the plane into a sphere, the shader no longer behaves correctly: the fragment shader seems to think the sphere is still a plane, or something like that.

Here's the sphere with a pointLight showing specular highlighting:

[image: specular highlight on the bent sphere]

Demo: http://meetar.imtqy.com/planebending/plane_bending.html

And the highlight only appears if the original plane is facing the pointLight; at other rotations the sphere is black, no matter where the camera is, which suggests that the plane's normals are still being calculated as though the vertices had never moved.

I set geometry.dynamic = true, along with geometry.normalsNeedUpdate and geometry.tangentsNeedUpdate, and I call geometry.computeTangents(), but none of it helps.

I was under the impression that the quantities used to calculate normals in GLSL (such as normalMatrix and tangent) take any vertex manipulation done in the vertex shader into account. What am I missing? Is this something specific to three.js?

Edit:

Checking in the console, I see that every face of the deformed plane still has a world-space normal of (0, 0, 1), just as in its undeformed state, unlike the SphereGeometry instance, whose face normals vary with their orientation.

Here is another demonstration: these two objects have the same material. On the left is a SphereGeometry, on the right a deformed PlaneGeometry. When the plane is deformed, the normals are not updated to reflect the new orientation of the faces, and the specular highlight is not rendered properly.

[image: SphereGeometry (left) and deformed PlaneGeometry (right) sharing the same material]

Demo: http://meetar.imtqy.com/planebending/plane_bending_06.html

opengl-es shader webgl glsl
1 answer

Short version: you can't!

Long version: even though the vertex shader moves the vertices, it only knows the updated position of the vertex it is currently processing; to compute a new face normal it would also need the updated positions of the neighboring vertices, and WebGL does not give it access to those.
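To make that concrete: a face normal is the cross product of two edge vectors, so all three deformed corner positions of the triangle are needed, while a vertex shader only ever sees one vertex at a time. A minimal GLSL sketch of the calculation that would be required, where a, b and c are hypothetical deformed corner positions that a vertex shader has no way of obtaining:

    // Face normal of a triangle from its three deformed corners.
    // A vertex shader receives only one of these positions per invocation,
    // so it cannot perform this computation without extra information.
    vec3 faceNormal( vec3 a, vec3 b, vec3 c ) {
        return normalize( cross( b - a, c - a ) );
    }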

The normals the vertex shader works with are all derived from the original normals, which are passed in alongside the original vertex positions. Those calculations are just the standard normal handling a fragment shader needs (accounting for light position, model-view transforms, and so on), and they all assume the vertices keep their original relationships to one another in object space.

It's relatively easy to recompute the normals on the CPU and pass them to the vertex shader, but if you have to do it in the vertex shader itself, there are some sneaky ways to fake it, for example with bump maps; and if the vertices move according to some parametric calculation, you can generate a couple of neighbors "on the fly" and derive the normal from them, which is definitely cheating, but works (see the sketch below).
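Here is a minimal GLSL vertex-shader sketch of that last trick. It is not the questioner's shader: it assumes the deformation can be written as a pure function of the original position (a made-up ripple stands in for the real bend, since the plane-to-sphere morph in the question uses a per-vertex goalPosition attribute that fake neighbors can't see), and it relies on the position attribute and matrices that three.js supplies to a ShaderMaterial.

    varying vec3 vNormal;

    // Stand-in parametric deformation: ripple the plane along z.
    // Replace with whatever pure function of `position` describes the bend.
    vec3 deform( vec3 p ) {
        return vec3( p.x, p.y, 0.2 * sin( 5.0 * p.x ) );
    }

    void main() {
        float eps = 0.01; // small object-space step for the fake neighbors

        vec3 p  = deform( position );                          // this vertex
        vec3 px = deform( position + vec3( eps, 0.0, 0.0 ) );  // generated neighbor along x
        vec3 py = deform( position + vec3( 0.0, eps, 0.0 ) );  // generated neighbor along y

        // Normal of the deformed surface from the two generated neighbors
        vNormal = normalize( normalMatrix * normalize( cross( px - p, py - p ) ) );

        gl_Position = projectionMatrix * modelViewMatrix * vec4( p, 1.0 );
    }

The step size eps trades accuracy against stability; if the deformation has an analytic derivative, using it directly gives a cleaner normal, and the same finite-difference vectors can be reused to rebuild vTangent and vBinormal.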
