glTexImage2D and NULL data

This may be a strange question, but I was wondering why glTexImage2D cares about the pixel transfer format and type when the data pointer is NULL. The signature of glTexImage2D:

void glTexImage2D( GLenum target, GLint level, GLint internalFormat, 
           GLsizei width, GLsizei height, GLint border, GLenum format, 
           GLenum type, const GLvoid * data);

The internalFormat tells the graphics driver how you want the data stored on the GPU, while format and type tell the driver what to expect from GLvoid * data. So if I do not transfer any data, passing NULL for example, why does the graphics driver care about the format and type?

The strange part is that sometimes it doesn't. I have not checked every combination, but the cases where it does seem to matter, which I ran into recently, especially when creating a depth texture, involve the integer transfer formats like GL_RED_INTEGER, GL_RG_INTEGER, etc., and/or their corresponding internal formats such as GL_R8I and GL_RGBA32UI. Meanwhile, none of the "simple" internal formats, for example GL_RGBA8 or GL_RGBA32F, seem to care which transfer format/type they are paired with. For the integer formats, the transfer format and type must match exactly, even when you do not pass any data. Why is this?
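To make the two cases concrete, here is a minimal sketch of what is being described, assuming a current OpenGL 3.x+ context (the texture sizes and names are arbitrary; results can be checked with glGetError):

```c
/* Assumes an active OpenGL 3.x+ context. */
GLuint tex[2];
glGenTextures(2, tex);

/* "Simple" normalized internal format: with NULL data the
   transfer format/type pair is largely forgiving. */
glBindTexture(GL_TEXTURE_2D, tex[0]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);          /* accepted */

/* Integer internal format: the transfer format must be an
   _INTEGER variant even though no data is transferred. */
glBindTexture(GL_TEXTURE_2D, tex[1]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32UI, 256, 256, 0,
             GL_RGBA, GL_UNSIGNED_INT, NULL);           /* GL_INVALID_OPERATION */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32UI, 256, 256, 0,
             GL_RGBA_INTEGER, GL_UNSIGNED_INT, NULL);   /* accepted */
```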

+4
2 answers

Even if you are not transferring any data, the call still allocates memory on the GPU for later use. That is why you must specify the format precisely: it tells the GPU how the texture is laid out and how much memory it should reserve.
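The classic example of this allocate-now, fill-later pattern is the depth texture the question mentions, created with NULL data so a framebuffer can render into it. Note that even here the transfer format must be compatible with the internal format (GL_DEPTH_COMPONENT, not a color format); this is a sketch assuming a GL 3.x+ context:

```c
/* Sketch: allocate a 1024x1024 depth texture with no initial data,
   e.g. as a render target for shadow mapping. */
GLuint depthTex;
glGenTextures(1, &depthTex);
glBindTexture(GL_TEXTURE_2D, depthTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, 1024, 1024, 0,
             GL_DEPTH_COMPONENT, GL_FLOAT, NULL);  /* data filled in by rendering */
```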

+2

There are two formats involved in a call to glTexImage2D (...): the internal format and the pixel transfer format. The internal format is what is used to allocate storage for the texture; the pixel transfer format/type pair is used by GL to interpret the (optional) array of pixels you pass.

Even when data is NULL, the format and type parameters are still validated against the internal format: GL performs this error checking whether or not any pixels are actually transferred. In particular, an integer internal format (e.g. GL_R8I, GL_RGBA32UI) must be paired with one of the _INTEGER pixel transfer formats, and a non-integer internal format must not be; mismatching the two generates GL_INVALID_OPERATION. That is why the "simple" normalized formats seem forgiving, while the integer ones demand an exact match.

Incidentally, glTexStorage2D (...) was designed for exactly this situation: it allocates immutable storage for every LOD of the texture up front, without the pixel transfer parameters that glTexImage2D (...) carries. After allocating with glTexStorage2D (...), you upload the actual pixel data with glTexSubImage2D (...).
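A short sketch of that two-step pattern, assuming GL 4.2+ or ARB_texture_storage and an application-provided pixel buffer:

```c
/* Sketch: immutable storage allocation, then a separate upload.
   No transfer format/type is needed at allocation time. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA32UI, 256, 256);  /* 1 mip level */

/* Later, once real data exists, the transfer format/type is
   supplied here instead (hypothetical 'pixels' buffer). */
static GLuint pixels[256 * 256 * 4];
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 256, 256,
                GL_RGBA_INTEGER, GL_UNSIGNED_INT, pixels);
```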

+2
