I've searched for information about OpenGL program binary formats and what they actually mean. So far, I haven't had much success. I know that I can get the set of formats as follows:
::glGetIntegerv(GL_PROGRAM_BINARY_FORMATS, &values[0]);
where values is a std::vector of integers whose size comes from:
::glGetIntegerv(GL_NUM_PROGRAM_BINARY_FORMATS, &value);
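For completeness, here is a minimal sketch of that query, assuming a current context that exposes GL 4.1 / ARB_get_program_binary (the include is whatever loader header your project already uses, e.g. glad or GLEW):

#include <vector>
// #include <glad/glad.h>   // or your project's GL loader header

std::vector<GLint> queryProgramBinaryFormats()
{
    // Ask how many binary formats this driver supports.
    GLint count = 0;
    ::glGetIntegerv(GL_NUM_PROGRAM_BINARY_FORMATS, &count);

    // Fetch the format tokens themselves.
    std::vector<GLint> formats(count);
    if (count > 0)
        ::glGetIntegerv(GL_PROGRAM_BINARY_FORMATS, formats.data());

    // Each entry is an opaque, driver-defined token; the spec assigns it
    // no portable meaning.
    return formats;
}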
For the driver on my computer, one binary format is returned. It's just a number, and on its own it's completely meaningless to me.
So my questions are these:
1. What do the format numbers/codes mean?
2. Are there standard ARB binary formats, or are they all vendor-specific?
3. Is there a common format across vendors?
4. If there are several formats, how should you choose between them?
I have a vague feeling that I must first compile the shaders on this machine and then cache the resulting binaries in order to take advantage of this. In other words, this is not like D3D, where you can run the shader compiler as a post-build step, package the compiled bytecode, and use it on any (Windows) machine with D3D.
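To illustrate that workflow, here is a hedged sketch of the cache/reload cycle using the standard glGetProgramBinary / glProgramBinary calls (GL 4.1 / ARB_get_program_binary). It assumes program is an already compiled and linked program object (ideally linked after setting GL_PROGRAM_BINARY_RETRIEVABLE_HINT via glProgramParameteri); writing the blob to disk is left out:

GLenum saveProgramBinary(GLuint program, std::vector<char>& blob)
{
    // Size of the driver-specific binary for this linked program.
    GLint length = 0;
    ::glGetProgramiv(program, GL_PROGRAM_BINARY_LENGTH, &length);

    blob.resize(length);
    GLenum binaryFormat = 0;
    ::glGetProgramBinary(program, length, nullptr, &binaryFormat, blob.data());

    // Store binaryFormat alongside the blob: it must be passed back to
    // glProgramBinary() and is only guaranteed to be understood by the same
    // driver/GPU combination that produced it.
    return binaryFormat;
}

bool loadProgramBinary(GLuint program, GLenum binaryFormat, const std::vector<char>& blob)
{
    ::glProgramBinary(program, binaryFormat, blob.data(),
                      static_cast<GLsizei>(blob.size()));

    GLint linked = GL_FALSE;
    ::glGetProgramiv(program, GL_LINK_STATUS, &linked);
    // If the driver was updated (or the cache came from another machine),
    // loading can fail; fall back to recompiling from GLSL source.
    return linked == GL_TRUE;
}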
Robinson