Does alpha in OpenGL make the texture whiter?

I am trying to load a texture with RGBA values, but the alpha values just make the texture whiter instead of adjusting its transparency. I have heard about this issue with 3D scenes, but I am only using OpenGL for 2D. Can I fix it?

I initialize OpenGL with

 glViewport(0, 0, winWidth, winHeight);
 glDisable(GL_TEXTURE_2D);
 glEnable(GL_BLEND);
 glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
 glDisable(GL_DEPTH_TEST);
 glClearColor(0, 0, 0, 0);
 glMatrixMode(GL_PROJECTION);
 glLoadIdentity();
 gluOrtho2D(0, winWidth, 0, winHeight); // set origin to bottom left corner
 glMatrixMode(GL_MODELVIEW);
 glLoadIdentity();
 glColor3f(1, 1, 1);

Screenshot: 2HnoT.png. The blurry bitmap should be translucent, and the black parts should be completely transparent. As you can see, there is an image behind it that does not show through.

The code for creating the texture is quite long, so I will describe what I did instead. The data is a 40 * 30 * 4 array of unsigned char. Every fourth char is set to 128 (that should make it 50% transparent, right?).
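For reference, a minimal sketch of what that buffer might look like; the RGB values here are placeholders, only the layout matters:

 // Hypothetical sketch of the pixel buffer described above:
 // a 40x30 RGBA image where every fourth byte (alpha) is 128.
 const int w = 40, h = 30;
 unsigned char* data = new unsigned char[w * h * 4];
 for (int i = 0; i < w * h; ++i) {
     data[i * 4 + 0] = 255; // red   (placeholder)
     data[i * 4 + 1] = 255; // green (placeholder)
     data[i * 4 + 2] = 255; // blue  (placeholder)
     data[i * 4 + 3] = 128; // alpha: ~50% transparent
 }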

Then I pass it to this function, which loads the data into the texture:

 void Texture::Load(unsigned char* data, GLenum format)
 {
     glEnable(GL_TEXTURE_2D);
     glBindTexture(GL_TEXTURE_2D, _texID);
     glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, _w, _h, format, GL_UNSIGNED_BYTE, data);
     glDisable(GL_TEXTURE_2D);
 }

And ... I think I just found the problem. I initialized the full-sized texture with this code:

 glEnable(GL_TEXTURE_2D);
 glBindTexture(GL_TEXTURE_2D, _texID);
 glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, tw, th, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);
 glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
 glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
 glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
 glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
 glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
 glDisable(GL_TEXTURE_2D);

But shouldn't glTexImage2D also use GL_RGBA? Can I not mix two different formats? Or at least not formats of different sizes (3 bytes versus 4 bytes)? GL_BGR works fine even when the texture is initialized this way ...

3 answers

For the benefit of others, I am posting my solution here.

The problem was that, although my Load function was correct,

 glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, _w, _h, GL_RGBA, GL_UNSIGNED_BYTE, data); 

I was passing GL_RGB to this function:

 glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, tw, th, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL); 

The texture must also be created with the correct number of components (four). As far as I can tell, you cannot use a different number of components for SubImage, although I think you can use a different format if it has the same number of components (i.e. mixing GL_RGB and GL_BGR is fine, but not GL_RGB and GL_RGBA).
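In other words, the fix is to create the texture storage with a four-component format, something like this (using the same variables as the question's setup code):

 glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, tw, th, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);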


Are there any overlapping primitives in your scene?

You do know that you are calling the 3-parameter version of glColor, which sets alpha to 1.0, right?

It would help if you could post a screenshot or otherwise describe what happens when, say, you draw two primitives with the same color but different alpha. In fact, any code that demonstrates the problem would help.
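For example, a minimal blending test might look like this (immediate mode, matching the question's setup; the coordinates are arbitrary):

 // Two overlapping untextured quads: one opaque red, one half-transparent blue.
 // With GL_BLEND enabled and glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA),
 // the overlap should come out purplish if alpha blending is working.
 glDisable(GL_TEXTURE_2D);
 glBegin(GL_QUADS);
     glColor4f(1, 0, 0, 1);     // opaque red
     glVertex2f(100, 100); glVertex2f(300, 100);
     glVertex2f(300, 300); glVertex2f(100, 300);

     glColor4f(0, 0, 1, 0.5f);  // 50% transparent blue
     glVertex2f(200, 200); glVertex2f(400, 200);
     glVertex2f(400, 400); glVertex2f(200, 400);
 glEnd();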

Edit:

I would guess that using TexImage with GL_RGB (as the internal format, the 3rd parameter) creates a three-component texture with no alpha, or with alpha values implicitly initialized to 1, no matter what pixel data you provide.

GL_BGR is not a valid value for that parameter; maybe it is tricking your implementation into using a full 4-byte internal format? (Or a 2-byte one, via GL_LUMINANCE_ALPHA.) Or do you mean passing GL_BGR to your Texture::Load() function, which should be no different from passing GL_RGB?


I think this should work, but it assumes the image has an alpha channel. If you try to upload an image without an alpha channel, you may get an exception or your application may crash. For images without an alpha channel, use GL_RGB instead of GL_RGBA for the format parameter (the one just before GL_UNSIGNED_BYTE).

 void Texture::Load(unsigned char* data)
 {
     glEnable(GL_TEXTURE_2D);
     glBindTexture(GL_TEXTURE_2D, _texID);
     glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, tw, th, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
     glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
     glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
     glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
     glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
     glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
     glDisable(GL_TEXTURE_2D);
 }
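If you need to handle both cases, a hypothetical variant could pick the format from a channel count (the channels parameter is an assumption, not part of the original code):

 void Texture::Load(unsigned char* data, int channels)
 {
     // Assumed helper: 3 channels -> GL_RGB, 4 channels -> GL_RGBA,
     // using the same value for both the internal format and the pixel format.
     GLenum format = (channels == 4) ? GL_RGBA : GL_RGB;
     glEnable(GL_TEXTURE_2D);
     glBindTexture(GL_TEXTURE_2D, _texID);
     glTexImage2D(GL_TEXTURE_2D, 0, format, tw, th, 0, format, GL_UNSIGNED_BYTE, data);
     glDisable(GL_TEXTURE_2D);
 }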
