This is probably due to another unsolved mystery.
I am drawing orthographic 2D on an iPhone, on both a real device and the simulator. I am trying to color my pixels depending on how far they are from an arbitrary point "A" in pixel space that I pass in (hard-coded). I do everything at the Retina resolution of 960x640. I calculate the distance from A to gl_FragCoord and set the color by mixing between 2 colors, with a maximum distance of 300 pixels.
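In outline, the fragment shader does something like this (a sketch, not my exact code; the uniform names u_pointA, u_maxDist, u_colorNear and u_colorFar are just placeholders):

static const char *fragSrc =
    "precision mediump float;\n"
    "uniform vec2  u_pointA;    // hard-coded point A, in pixels\n"
    "uniform float u_maxDist;   // e.g. 300.0\n"
    "uniform vec4  u_colorNear;\n"
    "uniform vec4  u_colorFar;\n"
    "void main() {\n"
    "    float d = distance(gl_FragCoord.xy, u_pointA);\n"
    "    float t = clamp(d / u_maxDist, 0.0, 1.0);\n"
    "    gl_FragColor = mix(u_colorNear, u_colorFar, t);\n"
    "}\n";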
On the simulator (with the Retina display) I need to give the point an X of "460" pixels for it to sit in the middle of the screen; for Y I give 160 pixels, and I ask for a distance of 300 pixels to get the effect I want. To get the same effect on the device I need an X center of 960 and a distance of 150 (interestingly, a Y of 80 px does not give the results I want, but 160 may be overshooting the original...).
Obviously the Retina scale factor is coming into play. But where and how, and how do I find and fix it?
I use:
glViewport(0, 0, 960.0f, 640.0f);
and
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &framebufferWidth);
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &framebufferHeight);
and
[self setView:[[EAGLView alloc] initWithFrame:[UIScreen mainScreen].bounds]];
[(EAGLView *)[self view] setContentScaleFactor:2.0f];
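For debugging, logging what was actually allocated against what UIKit reports should at least show where the factor of two enters (a sketch, reusing the framebufferWidth / framebufferHeight variables queried above):

// Log the renderbuffer size actually allocated versus the UIKit view size and scale.
NSLog(@"renderbuffer: %d x %d", framebufferWidth, framebufferHeight);
NSLog(@"screen bounds: %@, scale: %f",
      NSStringFromCGRect([UIScreen mainScreen].bounds),
      [UIScreen mainScreen].scale);
// And drive the viewport from the queried size rather than hard-coding 960x640.
glViewport(0, 0, framebufferWidth, framebufferHeight);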
Nektarios