This may be a strange question, but I was wondering why glTexImage2D cares about the pixel format and type even when the data pointer is NULL. Signature of glTexImage2D:
void glTexImage2D( GLenum target, GLint level, GLint internalFormat,
GLsizei width, GLsizei height, GLint border, GLenum format,
GLenum type, const GLvoid * data);
The internalFormat tells the driver how you want the texture stored on the GPU, while format and type describe the layout of the client memory pointed to by GLvoid * data. So if I am not uploading any data at all, i.e. passing NULL, why should the driver care what format and type I specify?

What makes this strange is that sometimes it doesn't care. I haven't checked every combination, but the cases where it does seem to matter are depth textures and, as I recently discovered, the integer formats such as GL_RED_INTEGER, GL_RG_INTEGER, etc., together with their corresponding internalFormats such as GL_R8I or GL_RGBA32UI. With the "ordinary" internalFormats like GL_RGBA8 or GL_RGBA32F, on the other hand, the format and type don't seem to need to correspond to the internalFormat at all. For the integer and depth cases, though, format and type must match the internalFormat exactly, even when no data is passed. Why is this?
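To illustrate the behavior in question, here is a minimal sketch (it assumes a current OpenGL 3+ context and loaded function pointers, so it is not runnable on its own; the sizes and targets are arbitrary):

```c
/* "Simple" normalized internal format: the format/type pair is
 * accepted loosely when data is NULL. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);   /* OK */

/* Integer internal format: format must be an *_INTEGER variant,
 * even though no data is actually uploaded. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_R8I, 256, 256, 0,
             GL_RED, GL_BYTE, NULL);             /* GL_INVALID_OPERATION */

glTexImage2D(GL_TEXTURE_2D, 0, GL_R8I, 256, 256, 0,
             GL_RED_INTEGER, GL_BYTE, NULL);     /* OK */
```

The second call fails because the spec generates GL_INVALID_OPERATION whenever an integer internalFormat is paired with a non-integer format, regardless of whether data is NULL.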