int is (waaay simplified here) 32 bits on a 32-bit system and 64 bits on a 64-bit system, so even plain "int" is not a universal concept. Keep in mind that your graphics hardware is different from your processor, hence the need for the new types. By using its own typedefs, OpenGL can guarantee that the right number of bits end up in the right place when sending data to your graphics card.
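For illustration, here is a minimal sketch (assuming a desktop OpenGL header is available as <GL/gl.h>; the exact include path varies by platform) that prints the size of plain int next to OpenGL's fixed-width GLint:

```c
/* Compare the platform-dependent "int" with OpenGL's fixed-width GLint.
 * Assumes <GL/gl.h> is the available OpenGL header on this platform. */
#include <stdio.h>
#include <GL/gl.h>

int main(void)
{
    /* sizeof(int) depends on the compiler and ABI... */
    printf("sizeof(int)   = %zu bytes\n", sizeof(int));
    /* ...while GLint is specified by OpenGL to be exactly 32 bits. */
    printf("sizeof(GLint) = %zu bytes\n", sizeof(GLint));
    return 0;
}
```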
You could do this with conversion functions that abstract away the mess of "different ints", but that would add a performance penalty, which is usually unacceptable when you're talking about every number that goes to and from the video card.
tl;dr: when you use "int" you're writing for your processor; when you use "GLint" you're writing for your video card's hardware.
EDIT: as pointed out in the comments, on a 64-bit processor int may well (and probably will) stay 32 bits for compatibility reasons. Historically, on 8-, 16-, and 32-bit hardware it matched the processor's native word size, but technically it is whatever size the compiler feels like using when generating machine code. Credit to @Nicol Bolas and @Mark Dickinson.
Matt