OpenGL Vertex Buffer doesn't draw anything in golang

I tried to follow this tutorial in Go: http://www.opengl-tutorial.org/beginners-tutorials/tutorial-2-the-first-triangle/ The Go version opens a window and turns the background blue, but does not draw a triangle. The C version does. This is the Go code:

    err := glfw.Init()
    if err != nil {
        log.Fatal("Failed to init GLFW: " + err.Error())
    }
    err = glfw.OpenWindow(1024, 768, 0, 0, 0, 0, 32, 0, glfw.Windowed)
    if err != nil {
        log.Fatal("Failed to open GLFW window: " + err.Error())
    }
    if gl.Init() != 0 {
        log.Fatal("Failed to init GL")
    }
    gl.ClearColor(0.0, 0.0, 0.3, 0.0)

    // create vertex buffer
    gVertexBufferData := []float32{-1.0, -1.0, 0.0, 1.0, -1.0, 0.0, 0.0, 1.0, 0.0}
    vertexBuffer := gl.GenBuffer()
    vertexBuffer.Bind(gl.ARRAY_BUFFER)
    gl.BufferData(gl.ARRAY_BUFFER, len(gVertexBufferData), gVertexBufferData, gl.STATIC_DRAW)

    for {
        // clear screen
        gl.Clear(gl.COLOR_BUFFER_BIT)

        // first attribute buffer: vertices
        var vertexAttrib gl.AttribLocation = 0
        vertexAttrib.EnableArray()
        vertexBuffer.Bind(gl.ARRAY_BUFFER)
        var f float32 = 0.0
        vertexAttrib.AttribPointer(
            3,     // size
            false, // normalized?
            0,     // stride
            &f)    // array buffer offset

        // draw the triangle
        gl.DrawArrays(gl.TRIANGLES, 0, 3)

        vertexAttrib.DisableArray()
        glfw.SwapBuffers()
    }

And this is the code in c that works:

    if (!glfwInit())
        return -1;
    if (!glfwOpenWindow(1024, 768, 0, 0, 0, 0, 32, 0, GLFW_WINDOW))
        return -1;
    if (glewInit() != GLEW_OK)
        return -1;

    glClearColor(0.0f, 0.0f, 0.3f, 0.0f);

    GLuint VertexArrayID;
    glGenVertexArrays(1, &VertexArrayID);
    glBindVertexArray(VertexArrayID);

    static const GLfloat g_vertex_buffer_data[] = {
        -1.0f, -1.0f, 0.0f,
         1.0f, -1.0f, 0.0f,
         0.0f,  1.0f, 0.0f,
    };

    GLuint vertexbuffer;
    glGenBuffers(1, &vertexbuffer);
    glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(g_vertex_buffer_data), g_vertex_buffer_data, GL_STATIC_DRAW);

    while (1) {
        glClear(GL_COLOR_BUFFER_BIT);

        // 1st attribute buffer: vertices
        glEnableVertexAttribArray(0);
        glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
        glVertexAttribPointer(
            0,
            3,        // size
            GL_FLOAT, // type
            GL_FALSE, // normalized?
            0,        // stride
            (void*)0  // array buffer offset
        );

        // Draw the triangle!
        glDrawArrays(GL_TRIANGLES, 0, 3); // from index 0 to 3 -> 1 triangle

        glDisableVertexAttribArray(0);

        // Swap buffers
        glfwSwapBuffers();
    }

Maybe I'm passing vertexAttrib.AttribPointer() the wrong arguments, because I'm not sure what to pass instead of (void*)0. I tried a plain 0, but that crashed the application. &gVertexBufferData[0] doesn't work either.

I am using github.com/banthar/gl as the GLEW wrapper, Go 1.0.2, and Ubuntu 12.04 amd64.

EDIT:

glGetError reports no errors.

4 answers

I had the same problem, and I managed to fix it by looking at your post, so first of all thank you very much.

I managed to render the triangle using the banthar bindings with this AttribPointer call:

    vertexAttrib.AttribPointer(
        3,        // size
        gl.FLOAT, // type
        false,    // normalized?
        0,        // stride
        nil)      // array buffer offset

and passing the size in bytes to BufferData.

    [...]
    data := []float32{0, 1, 0, -1, -1, 0, 1, -1, 0}
    [...]
    gl.BufferData(gl.ARRAY_BUFFER, len(data)*4, data, gl.STATIC_DRAW)
    [...]

There is probably a better way to pass the correct length.


I recently ran into a similar problem with the Go OpenGL bindings, and this question was one of the only references to it I could find. However, none of the existing answers solved my problem, since the bindings in 2015 are slightly different from the ones in 2012.

The part of my problem not covered by the existing answers involved the gl.BufferData() call made when the VBO is created.

The equivalent of the code in the question would look like this:

    [...]
    vertices := []float32{0, 1, 0, -1, -1, 0, 1, -1, 0}
    [...]
    gl.BufferData(
        gl.ARRAY_BUFFER,
        len(vertices)*4,
        unsafe.Pointer(&vertices),
        gl.STATIC_DRAW)
    [...]

One of the earlier answers recommends changing this code roughly as follows:

    [...]
    vertices := []float32{0, 1, 0, -1, -1, 0, 1, -1, 0}
    [...]
    gl.BufferData(
        gl.ARRAY_BUFFER,
        len(vertices)*4,
        vertices,
        gl.STATIC_DRAW)
    [...]

However, the bindings I used have a different function signature here, and that change fails with an error:

 cannot use vertices (type []float32) as type unsafe.Pointer in argument to gl.BufferData 

The solution I finally found, and wanted to record here so that nobody else has to go through the same headache tracking it down, looks like this:

    [...]
    vertices := []float32{0, 1, 0, -1, -1, 0, 1, -1, 0}
    [...]
    gl.BufferData(
        gl.ARRAY_BUFFER,
        len(vertices)*4, // or: len(vertices)*int(reflect.TypeOf(vertices).Elem().Size())
        gl.Ptr(vertices),
        gl.STATIC_DRAW)
    [...]

I also included a commented-out alternative to len(vertices)*4, which produces exactly the same result but derives the "4" from the element type of the slice (float32 in this case).

Footnote

The bindings I used:
github.com/go-gl/gl/all-core/gl
github.com/go-gl/glfw/v3.1/glfw

My OpenGL context was created with these hints:

    primaryMonitor := glfw.GetPrimaryMonitor()
    vidMode := primaryMonitor.GetVideoMode()

    glfw.WindowHint(glfw.ContextVersionMajor, 3)
    glfw.WindowHint(glfw.ContextVersionMinor, 3)
    glfw.WindowHint(glfw.OpenGLProfile, glfw.OpenGLCoreProfile)
    glfw.WindowHint(glfw.OpenGLForwardCompatible, glfw.True)
    glfw.WindowHint(glfw.RedBits, vidMode.RedBits)
    glfw.WindowHint(glfw.GreenBits, vidMode.GreenBits)
    glfw.WindowHint(glfw.BlueBits, vidMode.BlueBits)
    glfw.WindowHint(glfw.RefreshRate, vidMode.RefreshRate)
    glfw.WindowHint(glfw.Visible, glfw.False)

I had the same problem; in the end it turned out that, for some reason, calling glfw.OpenWindowHint was breaking it. It would request the correct context, my OpenGL version string would match, and I would get no errors at all, but nothing rendered. If I drop the hint, I get a 4.3 context and everything seems to work.

Even if I ask for 4.3 with the hint, it does not work. If I request something else, my OpenGL version string matches, but once again nothing renders.

Hope this helps


I don't know exactly what the OpenGL bindings for Go look like, but I can tell you this much:

The last parameter of glVertexAttribPointer should be the byte offset from the start of the buffer object, so (in your case) 0.

Note: the C type of this parameter really ought to be int, since it is a byte offset. It is void* instead for legacy reasons: before VBOs existed, it was an actual pointer into client memory.

Instead of &f, try passing either the literal 0 or, if that doesn't work, a pointer with the value 0. How to do that in Go is for you to figure out, since I don't grok Go. I've told you what OpenGL expects, and I hope that helps.


Also: while debugging, check glGetError() often.

