Cocoa and OpenGL: how do I set a GLSL vertex attribute from an array?

I am new to OpenGL and I seem to be running into some difficulties. I wrote a simple GLSL shader that transforms vertices by bone matrices, to allow simple skeletal animation. Each vertex is influenced by at most two bones: the bone indices and their corresponding weights (each stored as the x and y components of a vec2) index into the array of transformation matrices. They are declared as attribute variables in my shader and set with glVertexAttribPointer.
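
Conceptually, the data involved looks something like this (a rough sketch only; the names and the use of plain C arrays are illustrative, not my exact code):

#include <OpenGL/gl.h>   // Cocoa / macOS OpenGL header

enum { kMaxBones = 32 };                      // matches boneMatrices[32] in the shader

GLsizei  vertexCount;                         // number of vertices handed to the draw call
GLfloat *positions;                           // 3 floats per vertex (x, y, z)
GLfloat *boneIndices;                         // 2 floats per vertex: first and second bone index
GLfloat *boneWeights;                         // 2 floats per vertex: blend weight for each bone
GLfloat  currentBoneMatrices[kMaxBones * 16]; // one 4x4 matrix per bone, uploaded as a uniform array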

Here is where the problem arises... I manage to set the uniform array of matrices correctly: when I check those values in the shader, they are all uploaded and contain the right data. However, when I try to set the bone-index attribute, the vertices get multiplied by arbitrary transformation matrices! They jump to seemingly random positions in space (which differ on every run), so I assume the indices are being set incorrectly and my shader is reading past the end of the bone-matrix array into adjacent memory. I'm not sure why, because after reading everything I could find on this issue, I was surprised to see the same (or at least very similar) code in the examples, and it seemed to work for them.

I have been trying to solve this for a long time, and it is really starting to frustrate me... I know the matrices are correct, and when I hard-code the index in the shader to an arbitrary integer, it reads the correct matrix values and works as it should, transforming all the vertices by that matrix. But when I use the code I wrote to set the attribute variables, it doesn't seem to work.

The code I use to set the variables is as follows:

// this works properly...
GLuint boneMatLoc = glGetUniformLocation([[[obj material] shader] programID], "boneMatrices");
glUniformMatrix4fv( boneMatLoc, matCount, GL_TRUE, currentBoneMatrices );

GLfloat testBoneIndices[8] = {1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0};

// this however, does not...
GLuint boneIndexLoc = glGetAttribLocation([[[obj material] shader] programID], "boneIndices");
glEnableVertexAttribArray( boneIndexLoc );
glVertexAttribPointer( boneIndexLoc, 2, GL_FLOAT, GL_FALSE, 0, testBoneIndices );
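
For reference, here is a sketch of the same attribute setup with the returned location checked first (glGetAttribLocation returns a GLint, and -1 means the attribute is not active in the linked program; everything else is as above):

GLint boneIndexLoc = glGetAttribLocation([[[obj material] shader] programID], "boneIndices");
if (boneIndexLoc != -1) {
    glEnableVertexAttribArray( (GLuint)boneIndexLoc );
    glVertexAttribPointer( (GLuint)boneIndexLoc,
                           2,            // vec2 boneIndices: two floats per vertex
                           GL_FLOAT,
                           GL_FALSE,     // no normalization
                           0,            // tightly packed
                           testBoneIndices );
}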

And my vertex shader looks like this ...

// this shader is supposed to transform the vertices by a skeleton, a maximum
// of two bones per vertex with varying weights...
uniform mat4 boneMatrices[32];   // matrices for the bones
attribute vec2 boneIndices;      // x for the first bone, y for the second
//attribute vec2 boneWeight;     // the blend weights between the two bones

void main(void)
{
    gl_TexCoord[0] = gl_MultiTexCoord0;  // just set up the texture coordinates...

    vec4 vertexPos1 = 1.0 * boneMatrices[ int(boneIndices.x) ] * gl_Vertex;
    //vec4 vertexPos2 = 0.5 * boneMatrices[ int(boneIndices.y) ] * gl_Vertex;

    gl_Position = gl_ModelViewProjectionMatrix * (vertexPos1);
}

This is really getting to me, and any help would be appreciated.

-Andrew Ready

Tags: attributes, shader, opengl, glsl, vertex
1 answer

OK, I figured it out. With glDrawArrays, OpenGL reads the arrays straight through, one triangle per three vertices (nine position values per triangle: three vertices with three components each). Because there is no index buffer, vertices are repeated between triangles: if two adjacent triangles share a vertex, it appears twice in the array. So my cube, which I initially thought had 8 vertices, actually has 36!

Six faces, two triangles per face, three vertices per triangle: multiplied out, that is 36 independent vertices instead of 8 shared ones.

The whole problem was that my attribute array was too small. As soon as I expanded my test array to cover all 36 vertices, it worked perfectly.
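
In code terms, that means sizing every attribute array for the vertices actually drawn. A minimal sketch, reusing the attribute name from the question and a hypothetical programID (position and weight setup omitted):

#include <OpenGL/gl.h>

enum { kCubeVertexCount = 36 };   // 6 faces * 2 triangles * 3 vertices

static void drawSkinnedCube(GLuint programID)
{
    // with glDrawArrays and no index buffer, every enabled array needs one
    // entry per drawn vertex, not per unique vertex
    GLfloat testBoneIndices[kCubeVertexCount * 2];   // vec2 per drawn vertex

    for (int i = 0; i < kCubeVertexCount; ++i) {
        testBoneIndices[i * 2 + 0] = 1.0f;   // first bone index (placeholder value)
        testBoneIndices[i * 2 + 1] = 1.0f;   // second bone index (placeholder value)
    }

    GLint boneIndexLoc = glGetAttribLocation(programID, "boneIndices");
    if (boneIndexLoc != -1) {
        glEnableVertexAttribArray((GLuint)boneIndexLoc);
        glVertexAttribPointer((GLuint)boneIndexLoc, 2, GL_FLOAT, GL_FALSE, 0,
                              testBoneIndices);
    }

    // ... position data is supplied the same way, then all 36 vertices are
    // drawn directly, one entry of every enabled array per vertex:
    glDrawArrays(GL_TRIANGLES, 0, kCubeVertexCount);
}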

