I am trying to load textures on a background thread to speed up my application.
The stack is C/C++ on Linux, compiled with gcc. We use OpenGL, GLUT, and GLEW, and libSOIL to load textures.
Ultimately, starting a texture load with libSOIL fails because it hits a glGetString() call, which segfaults. To narrow the problem down, I wrote a very simple OpenGL application that reproduces the behavior. The example below shouldn't do anything useful, but it shouldn't segfault either. If I knew why this happens, I could in theory rework libSOIL to behave in a pthreaded environment.
    void *glPthreadTest( void* arg )
    {
        glGetString( GL_EXTENSIONS ); // SIGSEGV
        return NULL;
    }

    int main( int argc, char **argv )
    {
        glutInit( &argc, argv );
        glutInitDisplayMode( GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH );
        glewInit();

        glGetString( GL_EXTENSIONS ); // Does not cause SIGSEGV

        pthread_t id;
        if ( pthread_create( &id, NULL, glPthreadTest, (void*)NULL ) != 0 )
            fprintf( stderr, "pthread_create glPthreadTest failed.\n" );

        glutMainLoop();
        return EXIT_SUCCESS;
    }
An example stack trace from gdb for this application follows:
    #0  0x00000038492f86e9 in glGetString () from /usr/lib64/nvidia/libGL.so.1
    No symbol table info available.
    #1  0x0000000000404425 in glPthreadTest (arg=0x0) at sf.cpp:168
    No locals.
    #2  0x0000003148e07d15 in start_thread (arg=0x7ffff7b36700) at pthread_create.c:308
            __res = <optimized out>
            pd = 0x7ffff7b36700
            now = <optimized out>
            unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140737349117696, -5802871742031723458, 1, 211665686528, 140737349117696, 0, 5802854601940796478, -5829171783283899330}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}}
            not_first_call = 0
            pagesize_m1 = <optimized out>
            sp = <optimized out>
            freesize = <optimized out>
    #3  0x00000031486f246d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:114
    No locals.
You will notice that I am using the NVIDIA libGL implementation, but the same thing happens with Mesa's libGL, which Ubuntu uses for Intel HD graphics.
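My current guess is that the current OpenGL context is per-thread state: the newly created pthread has no context bound, so the driver's glGetString() ends up dereferencing something invalid. To check that, I was going to print the worker thread's context via glXGetCurrentContext() (assuming GLX here, since both the NVIDIA and Mesa drivers on this box sit on top of it; needs <GL/glx.h>):

    #include <GL/glx.h>
    #include <cstdio>

    void *glPthreadTest( void* arg )
    {
        // GLX contexts are current per-thread; if this prints (nil),
        // this thread has no context and GL calls are not safe here.
        GLXContext ctx = glXGetCurrentContext();
        fprintf( stderr, "worker thread context: %p\n", (void*)ctx );

        if ( ctx == NULL )
            return NULL; // don't touch GL without a current context

        glGetString( GL_EXTENSIONS ); // presumably safe with a context bound
        return NULL;
    }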
Any tips on what might be going wrong, or how to investigate further to find out what is happening?
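In case it helps frame answers: I assume the eventual fix is to give the loader thread its own GLX context that shares objects with the main one, and to bind it with glXMakeCurrent() before any GL call in that thread. A rough sketch of what I have in mind (untested; the visual attributes are a hard-coded guess and would need to match whatever GLUT actually picked):

    #include <GL/glx.h>
    #include <X11/Xlib.h>

    // Hypothetical helper: create a second context that shares objects
    // with `share`, for use in the loader thread.
    static GLXContext makeWorkerContext( Display *dpy, GLXContext share )
    {
        int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, GLX_DEPTH_SIZE, 16, None };
        XVisualInfo *vi = glXChooseVisual( dpy, DefaultScreen( dpy ), attribs );
        if ( vi == NULL )
            return NULL;

        // The third argument makes the new context share objects
        // (textures, etc.) with the main thread's context.
        GLXContext ctx = glXCreateContext( dpy, vi, share, True );
        XFree( vi );
        return ctx;
    }

    // In the worker thread, before any GL call:
    //     glXMakeCurrent( dpy, drawable, workerCtx );
    // and when the thread is done with GL:
    //     glXMakeCurrent( dpy, None, NULL );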
Edit: here are the #includes and the compile line for my test example:
    #include <SOIL.h>
    #include <GL/glew.h>
    #include <GL/freeglut.h>
    #include <GL/freeglut_ext.h>
    #include <signal.h>
    #include <pthread.h>
    #include <cstdio>
    #include <cstdlib> // for EXIT_SUCCESS
g++ -Wall -pedantic -I/usr/include/SOIL -O0 -ggdb -o sf sf.cpp -lSOIL -pthread -lGL -lGLU -lGLEW -lglut -lX11
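If threaded GL turns out to be a dead end, the restructuring I am considering is to keep all GL calls on the main thread and only move the image decoding off it. As far as I can tell, SOIL_load_image() just decodes the file to a pixel buffer without touching GL, so a split along these lines should be possible (sketch only; the hand-off struct and the queueing between threads are mine, and error handling is omitted):

    #include <SOIL.h>
    #include <GL/glew.h>
    #include <pthread.h>

    // Hypothetical hand-off record from the decoder thread to the GL thread.
    struct PendingTex {
        unsigned char *pixels;
        int w, h;
    };

    // Worker thread: pure CPU work, no GL calls at all.
    void *decodeThread( void *arg )
    {
        PendingTex *p = (PendingTex*)arg;
        int channels;
        p->pixels = SOIL_load_image( "texture.png", &p->w, &p->h,
                                     &channels, SOIL_LOAD_RGBA );
        return NULL;
    }

    // Main (GL) thread: upload once the decode has finished.
    GLuint uploadTexture( PendingTex *p )
    {
        GLuint tex;
        glGenTextures( 1, &tex );
        glBindTexture( GL_TEXTURE_2D, tex );
        glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
        glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, p->w, p->h, 0,
                      GL_RGBA, GL_UNSIGNED_BYTE, p->pixels );
        SOIL_free_image_data( p->pixels );
        return tex;
    }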