MSDN states:
When specifying an explicit RGB color, the COLORREF value has the following hexadecimal form:
0x00bbggrr
The low-order byte contains the relative intensity of red; the second byte contains the value for green; and the third byte contains the value for blue. The high-order byte must be zero. The maximum value for a single byte is 0xFF.
From wingdi.h
#define RGB(r,g,b) ((COLORREF)((BYTE)(r) | ((BYTE)(g) << 8) | ((BYTE)(b) << 16)))
#define GetRValue(rgb) ((BYTE)(rgb))
#define GetGValue(rgb) ((BYTE)((rgb) >> 8))
#define GetBValue(rgb) ((BYTE)((rgb) >> 16))
Since Windows is little-endian, a COLORREF is laid out in memory in RGBA order. Doesn't that look odd, given that it is not the color format Windows uses internally, BGR(A)?
The RGBQUAD structure is defined as

typedef struct tagRGBQUAD {
    BYTE rgbBlue;
    BYTE rgbGreen;
    BYTE rgbRed;
    BYTE rgbReserved;
} RGBQUAD;

which, unlike COLORREF, is BGRA.
Since the BitBlt function expects an array of COLORREF values, this would mean there is a conversion from RGBA to BGRA on every call, if Windows really uses BGRA as its internal format.
I don't remember exactly, but I also read somewhere that the pixel formats used in WinAPI are a strange mix.
Can someone explain?