How to encode emission or specular information in the alpha channel of an OpenGL texture

I have an OpenGL texture with a UV map. I read about using the alpha channel to store some other value, which saves you from loading an additional map. For example, you can store specular information (gloss) or an emission map in the alpha channel, because those only need a single float per texel and the alpha is otherwise unused.
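To show the idea, the kind of fragment shader I have in mind looks roughly like this (the varying and uniform names are just placeholders, not my actual code):

    // GLSL ES fragment shader sketch: RGB is the base colour, alpha is reused
    // as an emission mask (names are illustrative only).
    varying mediump vec2 vTexCoord;
    uniform sampler2D uBaseMap;

    void main()
    {
        mediump vec4 texel = texture2D(uBaseMap, vTexCoord);
        mediump vec3 base = texel.rgb;        // normal diffuse colour
        mediump float emission = texel.a;     // alpha repurposed as emission
        gl_FragColor = vec4(base + base * emission, 1.0);
    }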

So I tried it. Writing the shader is not a problem; I have that part working. The problem is getting all 4 channels into the texture the way I want them.

I have all the maps, so in the PSD I put the base map in RGB and the emission map in the alpha. But when I save as PNG, the alpha either doesn't get saved at all (if I add it as a new channel), or it affects the RGB, applying the transparency to the RGB (if I use the map as a mask).

PNG files seem to support transparency, but not arbitrary alpha channels as such. So there appears to be no way to control all 4 channels independently.

But I've read about people doing this. So what format can I save out of the PSD that I can load with my image loader on the iPhone?

    // Load an image from the app bundle via UIImage
    NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:type];
    NSData *texData = [[NSData alloc] initWithContentsOfFile:path];
    UIImage *image = [[UIImage alloc] initWithData:texData];
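If it helps to diagnose this, I could also dump what the decoded image actually reports, something along these lines (a diagnostic sketch only, not part of my loader):

    // Inspect the decoded image: does it report an alpha channel, and how
    // many bits per pixel did we actually get? (diagnostic only)
    CGImageRef cgImage = image.CGImage;
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(cgImage);
    size_t bpp = CGImageGetBitsPerPixel(cgImage);
    BOOL hasAlpha = (alphaInfo == kCGImageAlphaPremultipliedLast ||
                     alphaInfo == kCGImageAlphaPremultipliedFirst ||
                     alphaInfo == kCGImageAlphaLast ||
                     alphaInfo == kCGImageAlphaFirst);
    NSLog(@"bits per pixel: %zu, has alpha: %d, alphaInfo: %d", bpp, hasAlpha, (int)alphaInfo);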

Does this method accept other file formats? Like TIFF, which would let me control all 4 channels?

I could use texturetool to create a PVR... but according to the documentation, it too takes PNG as input, so the same problem applies.

EDIT:

First of all, to be clear, this is on the iPhone.

It could be a Photoshop problem. As I said, there are two ways I can find to set up the document in my version of Photoshop (CC 14.2, Mac). One is to manually add a new channel and paste the map in there. It shows up as a red overlay. The second is to add a mask, click on it, and paste the alpha in. In that case it shows the alpha as transparency, with a checkerboard in the areas where alpha is zero. When I save as PNG, the alpha option is highlighted.

And when I load the PNG back into Photoshop, it appears to be premultiplied. I can't get back to my full RGB data in Photoshop.

Is there some other tool I can use to combine the two maps into a PNG, saving it as PNG-32?

TIFF won't work because it doesn't save the alpha. Maybe I was thinking of TGA.

I also noticed this in my loader...

    // Query the source image dimensions
    GLuint width  = (GLuint)CGImageGetWidth(image.CGImage);
    GLuint height = (GLuint)CGImageGetHeight(image.CGImage);

    // Draw the image into a raw RGBA buffer via a bitmap context
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    void *imageData = malloc(height * width * 4);
    CGContextRef thisContext = CGBitmapContextCreate(imageData, width, height, 8, 4 * width,
                                                     colorSpace,
                                                     kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    if (flipImage)
    {
        // Flip vertically so the image origin matches OpenGL's
        CGContextTranslateCTM(thisContext, 0, height);
        CGContextScaleCTM(thisContext, 1.0, -1.0);
    }
    CGColorSpaceRelease(colorSpace);
    CGContextClearRect(thisContext, CGRectMake(0, 0, width, height));
    CGContextDrawImage(thisContext, CGRectMake(0, 0, width, height), image.CGImage);

    // Upload the pixel data to the GL texture
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, imageData);
    glBindTexture(GL_TEXTURE_2D, textureInfo[texIndex].texture);

Notice that when I create this context, I'm using the kCGImageAlphaPremultipliedLast option.

Maybe I need to try the GLKit loader, but it seems that my PNG is already premultiplied.
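For reference, what I mean by the GLKit loader is roughly this (the file name is just an example; as far as I can tell, GLKTextureLoaderApplyPremultiplication defaults to NO):

    // Rough sketch of loading the PNG with GLKTextureLoader instead of my
    // CoreGraphics path. Requires #import <GLKit/GLKit.h>.
    NSError *error = nil;
    NSString *pngPath = [[NSBundle mainBundle] pathForResource:@"baseWithEmission" ofType:@"png"];
    NSDictionary *options = @{ GLKTextureLoaderOriginBottomLeft : @YES,
                               GLKTextureLoaderApplyPremultiplication : @NO };
    GLKTextureInfo *info = [GLKTextureLoader textureWithContentsOfFile:pngPath
                                                               options:options
                                                                 error:&error];
    if (info) {
        glBindTexture(info.target, info.name);   // info.name is the GL texture id
    } else {
        NSLog(@"texture load failed: %@", error);
    }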

1 answer

You can create a PNG with an alpha channel, but you cannot read that PNG with the built-in iOS APIs without it being premultiplied first. The basic problem is that CoreGraphics only supports premultiplied alpha, for performance reasons. You also have to be careful to disable Xcode's PNG optimization for PNGs attached to the project file, because it performs the premultiplication at compile time. What you could do is compile and link your own copy of libpng, after disabling Xcode's PNG processing, and then read the file directly with libpng at the C level. But honestly, that is a waste of time.

Just save one image with the RGB values and a second grayscale image with your alpha values as 0-255 shades of gray. You can have those shades of gray mean anything you want, and you don't have to worry about premultiplication messing them up. Your OpenGL code just needs to read from two textures instead of one, which is no big deal.
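A rough sketch of what reading from two textures could look like in the fragment shader (the sampler and varying names are only examples):

    // GLSL ES fragment shader sketch: base colour from one texture,
    // emission/gloss mask from the red channel of a second grayscale texture.
    varying mediump vec2 vTexCoord;
    uniform sampler2D uBaseMap;      // RGB base map, bound to texture unit 0
    uniform sampler2D uEmissionMap;  // grayscale map, bound to texture unit 1

    void main()
    {
        mediump vec3 base = texture2D(uBaseMap, vTexCoord).rgb;
        mediump float emission = texture2D(uEmissionMap, vTexCoord).r;
        gl_FragColor = vec4(base + base * emission, 1.0);
    }

On the CPU side you would bind the two textures to different texture units before drawing (glActiveTexture(GL_TEXTURE0) / glBindTexture for the base map, glActiveTexture(GL_TEXTURE1) / glBindTexture for the grayscale map) and set the two sampler uniforms to 0 and 1.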
