Why does Windows GDI use RGB(A) byte order for `COLORREF` instead of BGR(A)?

MSDN states:

When specifying an explicit RGB color, the COLORREF value has the following hexadecimal form:

0x00bbggrr

The low-order byte contains the relative intensity of red; the second byte contains the value for green; and the third byte contains the value for blue. The high-order byte must be zero. The maximum value for a single byte is 0xFF.

From wingdi.h:

    #define RGB(r,g,b)      ((COLORREF)((BYTE)(r) | ((BYTE)(g) << 8) | ((BYTE)(b) << 16)))
    #define GetRValue(rgb)  ((BYTE)(rgb))
    #define GetGValue(rgb)  ((BYTE)((rgb) >> 8))
    #define GetBValue(rgb)  ((BYTE)((rgb) >> 16))
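To make the layout concrete, here is a small self-contained sketch (not from the original post; it copies the macros above so it builds without windows.h). It prints the combined COLORREF value and its byte order in memory on a little-endian machine:

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Local copies of the wingdi.h definitions so the sketch builds anywhere. */
    typedef uint8_t  BYTE;
    typedef uint32_t COLORREF;

    #define RGB(r,g,b)      ((COLORREF)((BYTE)(r) | ((BYTE)(g) << 8) | ((BYTE)(b) << 16)))
    #define GetRValue(rgb)  ((BYTE)(rgb))
    #define GetGValue(rgb)  ((BYTE)((rgb) >> 8))
    #define GetBValue(rgb)  ((BYTE)((rgb) >> 16))

    int main(void)
    {
        COLORREF c = RGB(0x11, 0x22, 0x33);   /* r = 0x11, g = 0x22, b = 0x33 */

        printf("COLORREF value: 0x%08X\n", (unsigned)c);   /* 0x00332211, i.e. 0x00bbggrr */
        printf("R=%02X G=%02X B=%02X\n",
               GetRValue(c), GetGValue(c), GetBValue(c));

        /* On a little-endian machine the low-order byte comes first in memory,
           so the byte sequence is R, G, B, 0. */
        BYTE bytes[4];
        memcpy(bytes, &c, sizeof c);
        printf("bytes in memory: %02X %02X %02X %02X\n",
               bytes[0], bytes[1], bytes[2], bytes[3]);     /* 11 22 33 00 */

        return 0;
    }

The in-memory byte sequence R, G, B, 0 is what the question below refers to as "RGBA" order.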

Since Windows is little-endian, a COLORREF is laid out in memory in RGBA byte order. Doesn't that look odd, given that it is not the color format Windows uses internally, which is BGR(A)?

The RGBQUAD structure is defined as

    typedef struct tagRGBQUAD {
        BYTE rgbBlue;
        BYTE rgbGreen;
        BYTE rgbRed;
        BYTE rgbReserved;
    } RGBQUAD;

which, unlike COLORREF, is BGRA.

Since the BitBlt function expects an array of COLORREF values, this means there would be a conversion from RGBA to BGRA on every call if Windows really uses BGRA as its internal format.
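If such a conversion does happen, it would boil down to swapping the red and blue bytes of every pixel. Below is a hypothetical sketch of that swap, going from a COLORREF (0x00bbggrr) to a BGRA RGBQUAD; the helper name is made up for illustration and is not an actual GDI function:

    #include <stdint.h>
    #include <stdio.h>

    typedef uint8_t  BYTE;
    typedef uint32_t COLORREF;

    /* Same layout as the RGBQUAD in wingdi.h: blue first, i.e. BGRA in memory. */
    typedef struct {
        BYTE rgbBlue;
        BYTE rgbGreen;
        BYTE rgbRed;
        BYTE rgbReserved;
    } RGBQUAD;

    /* Hypothetical helper: convert one 0x00bbggrr COLORREF to a BGRA RGBQUAD.
       It only illustrates the per-pixel byte swap the question describes. */
    static RGBQUAD colorref_to_rgbquad(COLORREF c)
    {
        RGBQUAD q;
        q.rgbRed      = (BYTE)(c);          /* low-order byte  */
        q.rgbGreen    = (BYTE)(c >> 8);
        q.rgbBlue     = (BYTE)(c >> 16);
        q.rgbReserved = 0;
        return q;
    }

    int main(void)
    {
        COLORREF c = 0x00332211;            /* b = 0x33, g = 0x22, r = 0x11 */
        RGBQUAD  q = colorref_to_rgbquad(c);
        printf("B=%02X G=%02X R=%02X\n", q.rgbBlue, q.rgbGreen, q.rgbRed); /* 33 22 11 */
        return 0;
    }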

I don't remember exactly, but I also read somewhere that the pixel formats used in the WinAPI are a strange mix.

Can someone explain?

1 answer

COLORREFs date back to a time when there was significantly less standardization in pixel formats. Many graphics cards still used palettes rather than full 24- or 32-bit color, so even if your adapter did require reordering the bytes, there weren't many of them to reorder. Some graphics cards even stored images in separate color planes rather than as multi-channel colors in a single plane. Back then there was no "right" answer.

RGBQUAD came from the BMP format which, as Raymond Chen said in the comments, comes from the OS/2 bitmap format.
