I have 10-bit YUV (v210) video frames coming from a capture card, and I would like to unpack this data inside a GLSL shader and ultimately convert it to RGB for display on screen. I am using a Quadro 4000 card on Linux (OpenGL 4.3).
I load the texture with the following settings:
video frame: 720x486 pixels
physically occupies 933120 bytes in 128-byte-aligned memory (row stride of 1920 bytes)
the texture is currently loaded as 480x486 texels (stride / 4 x height), since that accounts for all of the data bytes
internalFormat of GL_RGB10_A2
format GL_RGBA
type GL_UNSIGNED_INT_2_10_10_10_REV
filtering is currently set to GL_NEAREST
Here is the upload call:
int stride = ((m_videoWidth + 47) / 48) * 128;
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, stride / 4, m_videoHeight, 0, GL_RGBA, GL_UNSIGNED_INT_2_10_10_10_REV, data);  // data = pointer to the packed v210 frame
The data is packed like this:
U Y V A | Y U Y A | V Y U A | Y V Y A
This diagram from the Blackmagic SDK documentation shows the layout: http://i.imgur.com/PtXBJbS.png
Each 32-bit word holds three 10-bit components (which GL sees as "R, G, B") plus 2 unused bits, so six pixels are packed into every group of four words (128 bits / 16 bytes), and each row is padded out to a 128-byte boundary.
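To make that concrete, here is the mapping I think I am dealing with, written out as GLSL (this is my own reading of the diagram and the 2_10_10_10_REV swizzle, so treat it as an assumption rather than fact):

// Assumed mapping: 6 video pixels share a group of 4 texels, so a 720-pixel row
// is 720 / 6 = 120 groups * 16 bytes = 1920 bytes (the 128-byte-aligned stride).
//
// texel base+0 : .r = Cb0  .g = Y0   .b = Cr0
// texel base+1 : .r = Y1   .g = Cb1  .b = Y2
// texel base+2 : .r = Cr1  .g = Y3   .b = Cb2
// texel base+3 : .r = Y4   .g = Cr2  .b = Y5
int groupBase(int x)    { return (x / 6) * 4; }  // texel column of the group's first word
int indexInGroup(int x) { return x % 6; }        // which of the 6 pixels this is (0..5)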
In the shader I can read the raw values with texture2D(tex, coord).rgb, but since the component ordering changes from word to word (e.g. UYV vs YUY), I need to work out which of the four words the current fragment lands on and gather the missing components from the neighbouring texels.
Also, since the texture coordinates are normalized, I am not sure exactly how GL maps them back to individual texels, or whether min/mag filtering (even with GL_NEAREST) could blend adjacent values. My guess is that I should be using texelFetch with integer coordinates (derived from the fragment position and the video dimensions) so that I always read exactly the word I want, but I have not been able to confirm this.
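Here is roughly the fragment shader I have in mind. It is only a sketch built on the assumptions above (untested); the uniform/varying names, the video-range scaling and the Rec. 601 coefficients are my own choices, not anything taken from the capture SDK:

#version 430
uniform sampler2D tex;      // the 480x486 GL_RGB10_A2 texture, one v210 word per texel
uniform ivec2 videoSize;    // assumed uniform holding the real frame size, e.g. ivec2(720, 486)
in vec2 vTexCoord;          // assumed to run 0..1 across the output quad
out vec4 fragColor;

void main()
{
    ivec2 pix  = ivec2(vTexCoord * vec2(videoSize)); // video pixel this fragment covers
    int   idx  = pix.x % 6;                          // position within the 6-pixel group
    int   base = (pix.x / 6) * 4;                    // texel column of the group's first word

    // texelFetch with integer coordinates sidesteps normalization and filtering entirely.
    vec3 w0 = texelFetch(tex, ivec2(base + 0, pix.y), 0).rgb; // Cb0 Y0  Cr0
    vec3 w1 = texelFetch(tex, ivec2(base + 1, pix.y), 0).rgb; // Y1  Cb1 Y2
    vec3 w2 = texelFetch(tex, ivec2(base + 2, pix.y), 0).rgb; // Cr1 Y3  Cb2
    vec3 w3 = texelFetch(tex, ivec2(base + 3, pix.y), 0).rgb; // Y4  Cr2 Y5

    float luma[6]   = float[6](w0.g, w1.r, w1.b, w2.g, w3.r, w3.b);
    vec2  chroma[3] = vec2[3](vec2(w0.r, w0.b),   // Cb/Cr for pixels 0-1
                              vec2(w1.g, w2.r),   // Cb/Cr for pixels 2-3
                              vec2(w2.b, w3.g));  // Cb/Cr for pixels 4-5

    float Y  = luma[idx];
    float Cb = chroma[idx / 2].x;
    float Cr = chroma[idx / 2].y;

    // 10-bit video-range YCbCr -> RGB, Rec. 601 (assumed, since 720x486 is SD).
    float y = (Y  -  64.0 / 1023.0) * (1023.0 / 876.0);
    float u = (Cb - 512.0 / 1023.0) * (1023.0 / 896.0);
    float v = (Cr - 512.0 / 1023.0) * (1023.0 / 896.0);

    vec3 rgb = vec3(y + 1.402 * v,
                    y - 0.344136 * u - 0.714136 * v,
                    y + 1.772 * u);
    fragColor = vec4(clamp(rgb, 0.0, 1.0), 1.0);
}

My thinking is that texelFetch avoids the normalized-coordinate and min/mag filtering questions altogether, and that nearest-neighbour chroma (no interpolation between the co-sited samples) is acceptable for a first pass.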
Is this the right approach, or is there a better way to unpack v210 in the shader?