The OpenGL specification calls it the maximum dimension of a 1D/2D image, so it really does mean an 8192x8192 image. Interpreting the value with both width and height in mind would be pointless anyway, since it would then say nothing about the actual size: an 8192x192 texture has a completely different size than a 4096x4096 texture. At most a product (width times height) would make sense, but in that case 8192 would mean roughly a 90x90 texture.
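To make the arithmetic behind that last point concrete, here is a small sketch (the GL_MAX_TEXTURE_SIZE value of 8192 is taken from the question's context; the Python code itself is just illustrative arithmetic, not OpenGL):

```python
import math

MAX_TEXTURE_SIZE = 8192  # value reported via glGetIntegerv(GL_MAX_TEXTURE_SIZE)

# Per-dimension reading (the correct one): the limit applies to width and
# height independently, so textures up to 8192x8192 are allowed.
per_dimension_max = (MAX_TEXTURE_SIZE, MAX_TEXTURE_SIZE)

# If 8192 were instead a cap on width*height, the largest square texture
# would only be about 90x90, since 90*90 = 8100 <= 8192 < 91*91.
square_side = math.isqrt(MAX_TEXTURE_SIZE)
print(square_side)  # 90
```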
But you should take these values with a grain of salt. They are really just the upper limit of what the implementation (hardware/driver) allows, so they need not correspond to your hardware's video memory. In practice there are many other things in video memory, such as framebuffers, VBOs and whatnot, so it is quite reasonable for the driver to report a conservative value. Likewise, it could also be that your driver's developers don't pay much attention to this constant (it's not ATI, right?) and just return a default value, while in fact your textures could be larger.
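Because the constant is only an upper bound, a robust program probes what actually works instead of trusting it blindly. A minimal sketch of that idea: binary-search the largest dimension a caller-supplied `fits(w, h)` predicate accepts. The function name and the predicate are hypothetical; in a real OpenGL program `fits` would issue a `glTexImage2D` call on `GL_PROXY_TEXTURE_2D` and read back `GL_TEXTURE_WIDTH` with `glGetTexLevelParameteriv`, which is the standard way to ask the driver whether a given texture size would succeed.

```python
def largest_supported_dim(fits, upper):
    """Binary-search the largest square dimension accepted by fits().

    fits(w, h) -> bool is a hypothetical probe; with OpenGL it would be
    implemented via a GL_PROXY_TEXTURE_2D allocation attempt. 'upper' is
    the value reported by glGetIntegerv(GL_MAX_TEXTURE_SIZE).
    """
    lo, hi = 1, upper
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if fits(mid, mid):
            lo = mid      # mid works, try larger
        else:
            hi = mid - 1  # mid fails, try smaller
    return lo

# Example with a fake driver that really only supports up to 4096,
# even though it reports 8192:
print(largest_supported_dim(lambda w, h: max(w, h) <= 4096, 8192))  # 4096
```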
And, as Robinson says in his comment, it can also depend on things other than raw memory, since texture memory may be a special area of video memory that offers a special kind of 2D addressing/caching mode.
Christian Rau