How to create your own vertex format with OpenGL

I'm writing my own engine using OpenTK (basically just OpenGL bindings for C#, so gl* becomes GL.*), and I'm going to store many vertex buffers with several thousand vertices each. Therefore I need my own compact vertex format, since a Vec3 of floats would simply take up too much space. (I'm talking about millions of vertices here.)

What I want to do is create my own vertex format with this layout:

Byte 0: Position X
Byte 1: Position Y
Byte 2: Position Z
Byte 3: Texture Coordinate X
Byte 4: Color R
Byte 5: Color G
Byte 6: Color B
Byte 7: Texture Coordinate Y

Here is the C# code for the vertex:

    public struct SmallBlockVertex
    {
        public byte PositionX;
        public byte PositionY;
        public byte PositionZ;
        public byte TextureX;
        public byte ColorR;
        public byte ColorG;
        public byte ColorB;
        public byte TextureY;
    }

A byte per axis is plenty for the position, since I only need 32^3 unique positions.
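One caveat about the C# side (the attribute below is not in the original code; it is the standard .NET interop way to pin down a struct's memory layout): the runtime is free to reorder or pad struct fields unless told otherwise, while the GPU will read these eight bytes strictly in order. A sketch:

```csharp
using System.Runtime.InteropServices;

// LayoutKind.Sequential with Pack = 1 guarantees the fields are laid out
// in declaration order with no padding, so the struct is exactly 8 bytes
// and matches the byte offsets passed to GL.VertexAttribPointer.
[StructLayout(LayoutKind.Sequential, Pack = 1)]
public struct SmallBlockVertex
{
    public byte PositionX;
    public byte PositionY;
    public byte PositionZ;
    public byte TextureX;
    public byte ColorR;
    public byte ColorG;
    public byte ColorB;
    public byte TextureY;
}
```

For a struct of eight single bytes there is no padding to worry about in practice, but being explicit costs nothing and protects the format if fields are ever changed.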

I wrote my own vertex shader, which takes two vec4s as input, one for each set of four bytes. My vertex shader:

    attribute vec4 pos_data;
    attribute vec4 col_data;
    uniform mat4 projection_mat;
    uniform mat4 view_mat;
    uniform mat4 world_mat;

    void main()
    {
        vec4 position = pos_data * vec4(1.0, 1.0, 1.0, 0.0);
        gl_Position = projection_mat * view_mat * world_mat * position;
    }

To try to isolate the problem, I made the shader as simple as possible. The shader compilation code has been tested with immediate-mode rendering and works, so that can't be the problem.

Here is my function that generates, binds, and fills the vertex buffer with data and sets the attribute pointers.

    public void SetData<VertexType>(VertexType[] vertices, int vertexSize)
        where VertexType : struct
    {
        GL.GenVertexArrays(1, out ArrayID);
        GL.BindVertexArray(ArrayID);
        GL.GenBuffers(1, out ID);
        GL.BindBuffer(BufferTarget.ArrayBuffer, ID);
        GL.BufferData<VertexType>(BufferTarget.ArrayBuffer,
            (IntPtr)(vertices.Length * vertexSize), vertices,
            BufferUsageHint.StaticDraw);
        GL.VertexAttribPointer(Shaders.PositionDataID, 4,
            VertexAttribPointerType.UnsignedByte, false, 4, 0);
        GL.VertexAttribPointer(Shaders.ColorDataID, 4,
            VertexAttribPointerType.UnsignedByte, false, 4, 4);
    }

From what I understand, this is the correct procedure:

1. Create a vertex array object and bind it
2. Create a vertex buffer and bind it
3. Fill the vertex buffer with data
4. Set the attribute pointers

Shaders.*DataID are set with this code after compiling and linking the shader program:

    PositionDataID = GL.GetAttribLocation(shaderProgram, "pos_data");
    ColorDataID = GL.GetAttribLocation(shaderProgram, "col_data");

And this is my rendering function:

    void Render()
    {
        GL.UseProgram(Shaders.ChunkShaderProgram);
        Matrix4 view = Constants.Engine_Physics.Player.ViewMatrix;
        GL.UniformMatrix4(Shaders.ViewMatrixID, false, ref view);
        //GL.Enable(EnableCap.DepthTest);
        //GL.Enable(EnableCap.CullFace);
        GL.EnableClientState(ArrayCap.VertexArray);
        {
            Matrix4 world = Matrix4.CreateTranslation(offset.Position);
            GL.UniformMatrix4(Shaders.WorldMatrixID, false, ref world);
            GL.BindVertexArray(ArrayID);
            GL.BindBuffer(OpenTK.Graphics.OpenGL.BufferTarget.ArrayBuffer, ID);
            GL.DrawArrays(OpenTK.Graphics.OpenGL.BeginMode.Quads, 0, Count / 4);
        }
        //GL.Disable(EnableCap.DepthTest);
        //GL.Disable(EnableCap.CullFace);
        GL.DisableClientState(ArrayCap.VertexArray);
        GL.Flush();
    }

Could someone be so kind as to give me some pointers (no pun intended)? Am I doing this in the wrong order or are there some functions that I need to call?

I've searched all over the internet but can't find a single good tutorial or guide explaining how to implement custom vertex formats. If you need more information, just say so.

1 answer

There is not much to creating your own vertex format; it is all done in the glVertexAttribPointer calls. First of all, you pass 4 as the stride parameter, but your vertex structure is 8 bytes wide, so from the start of one vertex to the start of the next there are 8 bytes; the stride should therefore be 8 (in both calls, of course). The offsets are correct, but you should set the normalized flag to true for the colors, since you surely want them in the range [0,1] (I'm not sure whether that should also be the case for the vertex positions).

Next, when using custom vertex attributes in shaders, you don't enable the deprecated fixed-function arrays (glEnableClientState). Instead you should use

    GL.EnableVertexAttribArray(Shaders.PositionDataID);
    GL.EnableVertexAttribArray(Shaders.ColorDataID);

and the corresponding glDisableVertexAttribArray calls.

And what is Count / 4 supposed to mean in the glDrawArrays call? Keep in mind that the last parameter gives the number of vertices, not the number of primitives (quads in your case). But perhaps that was intended.

Besides these real errors, you shouldn't use such a convoluted vertex format that you then have to decode in the shader yourself; that is what the stride and offset parameters of glVertexAttribPointer are for. For example, reorder the vertex data a bit:

    public struct SmallBlockVertex
    {
        public byte PositionX;
        public byte PositionY;
        public byte PositionZ;
        public byte ColorR;
        public byte ColorG;
        public byte ColorB;
        public byte TextureX;
        public byte TextureY;
    }

and then you can just use

    GL.VertexAttribPointer(Shaders.PositionDataID, 3,
        VertexAttribPointerType.UnsignedByte, false, 8, 0);
    GL.VertexAttribPointer(Shaders.ColorDataID, 3,
        VertexAttribPointerType.UnsignedByte, true, 8, 3);
    GL.VertexAttribPointer(Shaders.TexCoordDataID, 2,
        VertexAttribPointerType.UnsignedByte, true, 8, 6);

And in the shader you have

    attribute vec3 pos_data;
    attribute vec3 col_data;
    attribute vec2 tex_data;

and you don't need to extract the texture coordinates from the position and color data yourself.

And you should really think about whether your space requirements actually demand bytes for the vertex positions, as this severely limits the precision of your position data. Perhaps shorts or half-precision floats would be a good compromise.
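If you did move to 16-bit positions, only the struct fields and the attribute pointers would change; a hypothetical sketch (this layout is an assumption, not from the original code: three shorts for position, then the color and texture bytes, padded to a 12-byte vertex):

```csharp
// Hypothetical 16-bit position variant: 3 shorts (6 bytes) for position,
// then 3 color bytes at offset 6 and 2 texture bytes at offset 9,
// with one padding byte to round the vertex up to 12 bytes.
GL.VertexAttribPointer(Shaders.PositionDataID, 3,
    VertexAttribPointerType.Short, false, 12, 0);
GL.VertexAttribPointer(Shaders.ColorDataID, 3,
    VertexAttribPointerType.UnsignedByte, true, 12, 6);
GL.VertexAttribPointer(Shaders.TexCoordDataID, 2,
    VertexAttribPointerType.UnsignedByte, true, 12, 9);
```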

It is also not necessary to call glBindBuffer in the render method, since the buffer binding is only needed by glVertexAttribPointer and is stored in the VAO that glBindVertexArray activates. You also usually shouldn't call glFlush, as that happens anyway when the buffers are swapped (assuming you use double buffering).

Last but not least, make sure your hardware supports all the features you are using (such as VBOs and VAOs).

EDIT: Actually, the enabled-attribute-array flags are also stored in the VAO, so you can call

    GL.EnableVertexAttribArray(Shaders.PositionDataID);
    GL.EnableVertexAttribArray(Shaders.ColorDataID);

in the SetData method (after creating and binding the VAO, of course), and they get enabled automatically when you bind the VAO with glBindVertexArray in the render function. Oh, and I just saw another mistake: when you bind the VAO in the render function, the enabled-attribute-array flags are overwritten by the state stored in the VAO, and since you did not enable them after creating the VAO, they are still disabled. So you will need to enable the arrays in the SetData method, as said. In fact you might get lucky in your case and the VAO still happens to be bound when you enable the arrays in the render function (since you never call glBindVertexArray(0)), but you shouldn't count on that.
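Putting these corrections together, the SetData method from the question might end up looking like this (a sketch only; it assumes the reordered vertex format and a Shaders.TexCoordDataID attribute location, and it enables the arrays while the VAO is still bound so that the enable state is captured in the VAO):

```csharp
public void SetData<VertexType>(VertexType[] vertices, int vertexSize)
    where VertexType : struct
{
    GL.GenVertexArrays(1, out ArrayID);
    GL.BindVertexArray(ArrayID);

    GL.GenBuffers(1, out ID);
    GL.BindBuffer(BufferTarget.ArrayBuffer, ID);
    GL.BufferData<VertexType>(BufferTarget.ArrayBuffer,
        (IntPtr)(vertices.Length * vertexSize), vertices,
        BufferUsageHint.StaticDraw);

    // Stride is the full vertex size (8 bytes); colors and texture
    // coordinates are normalized to [0,1] on the way into the shader.
    GL.VertexAttribPointer(Shaders.PositionDataID, 3,
        VertexAttribPointerType.UnsignedByte, false, 8, 0);
    GL.VertexAttribPointer(Shaders.ColorDataID, 3,
        VertexAttribPointerType.UnsignedByte, true, 8, 3);
    GL.VertexAttribPointer(Shaders.TexCoordDataID, 2,
        VertexAttribPointerType.UnsignedByte, true, 8, 6);

    // Enable the arrays while the VAO is bound, so the enable state is
    // saved in the VAO and restored by glBindVertexArray in Render.
    GL.EnableVertexAttribArray(Shaders.PositionDataID);
    GL.EnableVertexAttribArray(Shaders.ColorDataID);
    GL.EnableVertexAttribArray(Shaders.TexCoordDataID);
}
```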

