Access to barycentric coordinates inside a fragment shader

In the fragment shader, values are interpolated naturally. For example, suppose I have three vertices, each with a color: red for the first vertex, green for the second, and blue for the third. If I draw a triangle with them, the expected result is the familiar smoothly shaded triangle.

Obviously, OpenGL calculates the interpolation coefficients (a, b, c) for each point inside the triangle. Is there a way to access these values explicitly, or will I need to pass the positions of the three vertices to the fragment shader and compute the barycentric coordinates of the point myself? I know this is feasible, but I thought OpenGL might provide something.

1 answer

I am not aware of any built-in functions for obtaining barycentric coordinates. But you do not need any calculations in the fragment shader either.

You can pass the barycentric coordinates of the triangle vertices as attributes into the vertex shader. The attribute values for the three vertices are simply (1, 0, 0), (0, 1, 0), and (0, 0, 1). Then pass the attribute value through to the fragment shader (using a varying variable in legacy OpenGL, or an out variable in the vertex shader matched by an in variable in the fragment shader in core OpenGL). The interpolated value that the fragment shader receives is then the barycentric coordinate of the fragment.
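As a minimal sketch of that approach, assuming a core-profile GLSL 3.30 context; the names position, barycentric, vBarycentric, and the mvp uniform are illustrative, not from the original answer:

    // vertex shader: pass the per-vertex barycentric coordinate through
    #version 330 core
    layout(location = 0) in vec3 position;
    layout(location = 1) in vec3 barycentric; // (1,0,0), (0,1,0) or (0,0,1)

    uniform mat4 mvp; // assumed model-view-projection matrix

    out vec3 vBarycentric;

    void main()
    {
        vBarycentric = barycentric;
        gl_Position = mvp * vec4(position, 1.0);
    }

    // fragment shader: the interpolated value is the barycentric coordinate
    #version 330 core
    in vec3 vBarycentric;
    out vec4 fragColor;

    void main()
    {
        // the three components sum to 1.0 for every fragment inside the
        // triangle; visualizing them directly reproduces the red/green/blue
        // gradient from the question
        fragColor = vec4(vBarycentric, 1.0);
    }

Feeding (1, 0, 0), (0, 1, 0), and (0, 0, 1) as the barycentric attribute of the three vertices makes vBarycentric exactly the interpolation weights (a, b, c) the question asks about.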

Note that this is exactly the same mechanism as the color interpolation in your example: with red (1, 0, 0), green (0, 1, 0), and blue (0, 0, 1) as the vertex colors, the interpolated color the fragment shader receives already is the barycentric coordinate of the fragment.
