I want to load 4-channel texture data from a file in iOS, so I view the texture as a (continuous) map
[0,1]x[0,1] -> [0,1]x[0,1]x[0,1]x[0,1]
If I use the .png file format, Xcode / iOS treats the file as an image and therefore multiplies each RGB component by the alpha value (premultiplied alpha), distorting my data. How do I solve this? Possible solutions include:
- use two textures with RGB components (3 channels each)
- post-divide by alpha
- use a different file format
Of these, I consider using a different file format to be the best solution. The compressed GL file format (PVRTC?) is not platform-independent (it is Apple-specific) and seems to have low precision (4 bits) ( link ).
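For reference, the "post-divide by alpha" option from the list above can be sketched as plain C operating on the RGBA bytes after loading. Note that 8-bit premultiplication is lossy, so the original channel values are only approximately recovered; the function name and rounding choice here are my own, not from any iOS API:

```c
#include <stdint.h>
#include <stddef.h>

/* Undo premultiplied alpha in place: for each RGBA pixel,
   divide the colour channels by alpha (when 0 < alpha < 255).
   Recovery is approximate because premultiplication of 8-bit
   data discards information. */
static void unpremultiply_rgba(uint8_t *pixels, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; ++i) {
        uint8_t a = pixels[i * 4 + 3];
        if (a == 0 || a == 255)
            continue; /* fully transparent or opaque: nothing to undo */
        for (int c = 0; c < 3; ++c) {
            unsigned v = pixels[i * 4 + c];
            unsigned r = (v * 255 + a / 2) / a; /* rounded divide */
            pixels[i * 4 + c] = (uint8_t)(r > 255 ? 255 : r);
        }
    }
}
```

The main drawback for continuous data is visible here: any channel value that was scaled down by a small alpha loses precision permanently, which is exactly the distortion the question is about.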
EDIT: As per my own answer below, it is not possible to get unmodified 4-channel png data in iOS. Since OpenGL is intended for creating images, not just for displaying them, it should be possible to somehow load raw 4-channel data. png is a file format for images (its compression covers all 4 channels, though the compression of one channel does not depend on the other channels), so it can be argued that I should use a different file format. So, which other compressed file formats are easy to read / integrate on iOS?
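The simplest "different file format" that sidesteps the image pipeline entirely is a raw dump the app reads itself. The layout below (two 32-bit little-endian dimensions followed by width*height*4 RGBA bytes) is a hypothetical format of my own, not a standard; it is uncompressed, so it trades file size for exact channel values:

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Read a hypothetical raw texture file: two 32-bit little-endian
   integers (width, height) followed by width*height*4 bytes of
   interleaved RGBA data. No image library touches the bytes, so
   no premultiplication can occur. Caller frees the result. */
static uint8_t *read_raw_rgba(const char *path, uint32_t *w, uint32_t *h)
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return NULL;
    uint32_t dims[2];
    if (fread(dims, sizeof dims, 1, f) != 1) {
        fclose(f);
        return NULL;
    }
    size_t n = (size_t)dims[0] * dims[1] * 4;
    uint8_t *data = malloc(n);
    if (!data || fread(data, 1, n, f) != n) {
        free(data);
        fclose(f);
        return NULL;
    }
    fclose(f);
    *w = dims[0];
    *h = dims[1];
    return data;
}
```

The little-endian header assumption is safe on iOS devices (ARM runs little-endian there), and the returned buffer can be handed straight to glTexImage2D as GL_RGBA / GL_UNSIGNED_BYTE.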
UPDATE: "combinatorial" mentioned a way to load 4-channel continuous textures, so I have accepted that as the correct answer. However, that solution has some limitations I did not like. My follow-up question is: "Access to raw 4-channel data from png files in iOS" :)
I think it is bad library design that reading raw 4-channel png data is not possible. I don't like systems that try to be smarter than me.