VBO glDrawElements and glVertexAttribPointer on GLES2.0 does not display anything

I can display the texture using shaders, glVertexAttribPointer and glDrawArrays, for example:

Inits

    const GLfloat squareVertices[] = {
        -0.5f, -0.33f,
         0.5f, -0.33f,
        -0.5f,  0.33f,
         0.5f,  0.33f
    };
    const GLfloat squareTex[] = {
        0, 0,
        1, 0,
        0, 1,
        1, 1
    };

    glEnableVertexAttribArray(PositionTag);
    glEnableVertexAttribArray(TexCoord0Tag);
    glVertexAttribPointer(PositionTag, 2, GL_FLOAT, GL_FALSE, 0, squareVertices);
    glVertexAttribPointer(TexCoord0Tag, 2, GL_FLOAT, GL_FALSE, 0, squareTex);

And for the draw

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

But I'm having trouble converting this to VBOs and glDrawElements. This is the code I have so far, but nothing is displayed:

Header

    typedef struct MyVertex {
        float x, y, z;    // Vertex position
        float nx, ny, nz; // Normal
        float s0, t0;     // Texcoord0
    } MyVertex;

    #define BUFFER_OFFSET(i) ((char *)NULL + (i))

Inits

    glGenBuffers(1, &VertexVBOID);
    glBindBuffer(GL_ARRAY_BUFFER, VertexVBOID);

    MyVertex pvertices[4];
    // Fill the pvertices array
    pvertices[0].x = -0.5f; pvertices[0].y = -0.33f; pvertices[0].z = 0.0;
    pvertices[0].nx = 0.0;  pvertices[0].ny = 0.0;   pvertices[0].nz = 1.0;
    pvertices[0].s0 = 0.0;  pvertices[0].t0 = 0.0;
    pvertices[1].x = 0.5f;  pvertices[1].y = -0.33f; pvertices[1].z = 0.0;
    pvertices[1].nx = 0.0;  pvertices[1].ny = 0.0;   pvertices[1].nz = 1.0;
    pvertices[1].s0 = 1.0;  pvertices[1].t0 = 0.0;
    pvertices[2].x = -0.5f; pvertices[2].y = 0.33f;  pvertices[2].z = 0.0;
    pvertices[2].nx = 0.0;  pvertices[2].ny = 0.0;   pvertices[2].nz = 1.0;
    pvertices[2].s0 = 0.0;  pvertices[2].t0 = 1.0;
    pvertices[3].x = 0.5f;  pvertices[3].y = 0.33f;  pvertices[3].z = 0.0;
    pvertices[3].nx = 0.0;  pvertices[3].ny = 0.0;   pvertices[3].nz = 1.0;
    pvertices[3].s0 = 1.0;  pvertices[3].t0 = 1.0;

    glBufferData(GL_ARRAY_BUFFER, sizeof(MyVertex) * 4, NULL, GL_STATIC_DRAW);
    glBufferSubData(GL_ARRAY_BUFFER, 0, sizeof(MyVertex) * 4, pvertices);

    glGenBuffers(1, &IndexVBOID);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IndexVBOID);

    int pindices[6];
    pindices[0] = 0; pindices[1] = 1; pindices[2] = 2;
    pindices[3] = 2; pindices[4] = 1; pindices[5] = 3;

    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(int) * 6, NULL, GL_STATIC_DRAW);
    glBufferSubData(GL_ELEMENT_ARRAY_BUFFER, 0, sizeof(int) * 6, pindices);

Draw

    glBindBuffer(GL_ARRAY_BUFFER, VertexVBOID);

    glEnableVertexAttribArray(PositionTag);
    glEnableVertexAttribArray(NormalTag);
    glEnableVertexAttribArray(TexCoord0Tag);

    glVertexAttribPointer(PositionTag, 3, GL_FLOAT, GL_FALSE, 32, BUFFER_OFFSET(0));
    glVertexAttribPointer(NormalTag, 3, GL_FLOAT, GL_FALSE, 32, BUFFER_OFFSET(12));
    glVertexAttribPointer(TexCoord0Tag, 2, GL_FLOAT, GL_FALSE, 32, BUFFER_OFFSET(24));

    // glDrawRangeElements(GL_TRIANGLES, x, y, z, GL_UNSIGNED_SHORT, BUFFER_OFFSET(0));
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IndexVBOID);
    glDrawElements(GL_TRIANGLES, 3, GL_INT, 0);

2 answers

According to here, GL_INT is not a valid type for indices in glDrawElements. Try using unsigned int for your indices (and, of course, GL_UNSIGNED_INT in glDrawElements). You could still feed glDrawElements from int data, but since it expects an unsigned type, it is more consistent to make the array unsigned int in the first place.

EDIT: Having looked at the specification (going by your tags, I used the ES 2.0 spec), it appears to restrict the index type to unsigned byte and unsigned short. I don't know whether iOS relaxes this, but we can at least conclude that the indices should be unsigned. On the other hand, I did not find an explicit statement that an invalid type argument generates GL_INVALID_ENUM, though it would be reasonable to expect one.
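
To make this concrete, here is a minimal sketch of what the index path could look like with that fix applied, reusing the asker's IndexVBOID; the indices become GLushort with GL_UNSIGNED_SHORT as the type, and, assuming the whole quad should be drawn, the count is 6 rather than 3:

    // Sketch only: unsigned short indices, which ES 2.0 always accepts.
    GLushort pindices[6] = { 0, 1, 2, 2, 1, 3 };

    glGenBuffers(1, &IndexVBOID);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IndexVBOID);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(pindices), pindices, GL_STATIC_DRAW);

    // At draw time: 6 indices for the two triangles, with a matching unsigned type.
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IndexVBOID);
    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, 0);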


Your code doesn't look badly wrong, so this time the devil is in the details. My guess is that your struct's field layout and alignment don't match the offsets and stride you pass to OpenGL.

I suggest you use the offsetof() macro from stddef.h to compute the field offsets portably (and sizeof() for the stride) instead of hard-coding them.
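
For example, a sketch of the attribute setup with offsetof() and sizeof() replacing the hand-counted 32/12/24 constants, assuming the asker's MyVertex struct, attribute tags, and BUFFER_OFFSET macro:

    #include <stddef.h>  // offsetof

    glBindBuffer(GL_ARRAY_BUFFER, VertexVBOID);
    glVertexAttribPointer(PositionTag, 3, GL_FLOAT, GL_FALSE,
                          sizeof(MyVertex), BUFFER_OFFSET(offsetof(MyVertex, x)));
    glVertexAttribPointer(NormalTag, 3, GL_FLOAT, GL_FALSE,
                          sizeof(MyVertex), BUFFER_OFFSET(offsetof(MyVertex, nx)));
    glVertexAttribPointer(TexCoord0Tag, 2, GL_FLOAT, GL_FALSE,
                          sizeof(MyVertex), BUFFER_OFFSET(offsetof(MyVertex, s0)));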
