I have some code that just loads some contrived data into a texture:
glActiveTexture(GL_TEXTURE0+gl_sim_texture_active_n);
glBindTexture(GL_TEXTURE_2D, gl_sim_texture_buff_id);
for(int i = 0; i < w*h; i++) buff[i] = 0xAB;
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, w, h, 0, GL_ALPHA, GL_UNSIGNED_BYTE, buff);
and code that just displays this texture in my shaders:
uniform vec2 viewport;
uniform sampler2D sim_texture;

void main()
{
  vec2 tex_uv = vec2(gl_FragCoord.x/(viewport.x-1.), gl_FragCoord.y/(viewport.y-1.));
  gl_FragColor = texture2D(sim_texture, tex_uv).argb; // swizzle is just to put 'a' in a visibly renderable position, as "redness"
}
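(For context, the sampler is assumed to be bound to that same texture unit along these lines elsewhere in the setup code; sim_program is a hypothetical name for the linked program handle, so this is a sketch of the assumed hookup rather than the exact code:)

// Sketch of the assumed sampler binding (sim_program is a hypothetical handle)
GLint sim_texture_loc = glGetUniformLocation(sim_program, "sim_texture");
glUseProgram(sim_program);
glUniform1i(sim_texture_loc, gl_sim_texture_active_n); // sampler reads from GL_TEXTURE0+6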
On OSX and Android, this texture reads back fine in my shader (through the sampler2D - nothing unusual) and it works. On iOS, any sample from that sampler2D returns vec4(0.,0.,0.,1.), regardless of the data uploaded.
(Note that if I change GL_ALPHA to GL_RGBA, attach the texture to a framebuffer, and call glReadPixels after glTexImage2D, I get back exactly the data I uploaded on every platform, and the behavior (or lack thereof on iOS) stays the same. The switch to GL_RGBA was only needed to make the texture attachable to the framebuffer, which is required for glReadPixels, which I only care about for debugging purposes. tl;dr: I'm confident the data is getting uploaded to the texture on all platforms.)
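(For reference, that debug readback looks roughly like this; fbo_id and out_buff are hypothetical names, and the texture is assumed to have been uploaded with GL_RGBA for this test, since GL_ALPHA isn't color-renderable:)

// Debug-only check that the upload actually reached the texture
GLuint fbo_id;
glGenFramebuffers(1, &fbo_id);
glBindFramebuffer(GL_FRAMEBUFFER, fbo_id);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, gl_sim_texture_buff_id, 0);

if(glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE)
{
  unsigned char* out_buff = (unsigned char*)malloc(4*w*h); // hypothetical readback buffer
  glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, out_buff);
  // out_buff comes back matching the uploaded data on OSX, Android, and iOS
  free(out_buff);
}

glBindFramebuffer(GL_FRAMEBUFFER, 0);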
Additional Information:
gl_sim_texture_active_n is 6, and gl_sim_texture_buff_id is 14 (both obtained legitimately and without errors). Neither glGetError() nor glCheckFramebufferStatus(GL_FRAMEBUFFER) reports anything unusual, whether called before or after the upload. glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS) returns 8 (the same as on my test Android device).
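(Those checks are just the usual ones, roughly like this; LOG is a stand-in for whatever logging happens to be in place:)

// Sanity checks around the upload (sketch)
GLenum err = glGetError();
if(err != GL_NO_ERROR) LOG("GL error: 0x%x", err);

GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if(status != GL_FRAMEBUFFER_COMPLETE) LOG("FBO incomplete: 0x%x", status);

GLint max_units = 0;
glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &max_units); // returns 8 here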
I'm just completely lost as to why this would work on OSX/Android but not on iOS. Any direction on where to go from here would be helpful!