iOS 5 + GLKView: how do I access RGB pixel data for color-based picking of vertices?

I converted my own OGLES 2.0 framework to take advantage of the functionality added by the new iOS 5 GLKit.

After some nice results, I now want to implement color-based picking along the lines described here. To do this, you need to read back the buffer to get the RGBA value of the pixel under the tap, which is then used as a unique identifier for the vertex / primitive / display object. Of course, this requires temporarily rendering all vertices / primitives / display objects with unique colors.
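The unique-coloring scheme above can be sketched as a pair of helper functions: pack a 24-bit object index into an opaque RGB color for the picking pass, then recover the index from the pixel read back. This is a minimal sketch of the general technique, not code from the answer below; the function names are my own.

```c
#include <stdint.h>

/* Pack a 24-bit object ID into an RGB triplet (alpha stays 255),
 * giving up to 2^24 distinct pickable objects. */
static void id_to_rgb(uint32_t id, uint8_t *r, uint8_t *g, uint8_t *b)
{
    *r = (uint8_t)((id >> 16) & 0xFF);
    *g = (uint8_t)((id >> 8) & 0xFF);
    *b = (uint8_t)(id & 0xFF);
}

/* Recover the object ID from a pixel read back after the picking pass. */
static uint32_t rgb_to_id(uint8_t r, uint8_t g, uint8_t b)
{
    return ((uint32_t)r << 16) | ((uint32_t)g << 8) | (uint32_t)b;
}
```

The round trip is exact because each channel holds one untouched byte of the ID; this only holds if blending, dithering, and multisampling are disabled during the picking pass.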

I have two questions, and I would be very grateful for help with them:

  • I have access to the GLKViewController, GLKView, CAEAGLLayer (of the GLKView) and the EAGLContext. I also have access to all OGLES 2.0 related buffers. How do I combine these to determine the color, within the EAGLContext, of the pixel I tap on the screen?

  • Given that I use vertex buffer objects to render, is there a neat way to override the color supplied to my vertex shader that, firstly, does not involve modifying the buffered vertex (color) attributes, and secondly, does not involve adding an IF statement to the vertex shader?

I suspect the answer to (2) is no, but for reasons of performance and avoiding poorly-thought-through code modifications, I thought it would be wise to check with someone more experienced.
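For what it's worth, one commonly used answer to (2) is to blend the buffered vertex color with a uniform "picking" color using mix(), which avoids both editing the VBO and branching in the shader. This is a hedged sketch only, not from the answer below; u_pickColor and u_pickBlend are hypothetical uniform names (set u_pickBlend to 1.0 during the picking pass, 0.0 otherwise):

```glsl
attribute vec4 a_position;
attribute vec4 a_color;

uniform mat4 u_mvpMatrix;
uniform vec4 u_pickColor;   // unique picking color for this object
uniform float u_pickBlend;  // 0.0 = normal render, 1.0 = picking pass

varying vec4 v_color;

void main() {
    // mix() interpolates per-component with no branch:
    // result = a_color * (1.0 - u_pickBlend) + u_pickColor * u_pickBlend
    v_color = mix(a_color, u_pickColor, u_pickBlend);
    gl_Position = u_mvpMatrix * a_position;
}
```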

Any suggestions would be greatly appreciated. Thank you for your time.

UPDATE

Well, I now know how to read pixel data from the active frame buffer using glReadPixels. So I assume I just need to render the special "unique colors" into the back buffer, briefly switch to it and read the pixels, then switch back. This will inevitably cause a visual flicker, but I think it is the easiest way; certainly faster (and more sensible) than taking a screenshot, creating a CGImageContextRef, and analyzing it that way.
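One detail to watch with the glReadPixels route: it addresses the framebuffer from the bottom-left, while UIKit touch coordinates run from the top-left (and are in points, not pixels, on a Retina display). A hedged sketch of the conversion, with hypothetical names:

```c
/* Convert a UIKit touch location (points, top-left origin) to
 * glReadPixels coordinates (pixels, bottom-left origin).
 * viewHeightPoints is the view's bounds height; scale is the
 * view's contentScaleFactor (2.0 on Retina). */
static void touch_to_gl(float touchX, float touchY,
                        float viewHeightPoints, float scale,
                        int *glX, int *glY)
{
    *glX = (int)(touchX * scale);
    *glY = (int)((viewHeightPoints - touchY) * scale);
}
```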

However, any hints regarding the back buffer would be much appreciated.

+6
colors pixel ios5
1 answer

Well, I worked out exactly how to do this, as concisely as possible. Below I explain how to achieve it and list all the necessary code :)

To allow touch interaction for selecting a pixel, first add a UITapGestureRecognizer to your GLKViewController subclass (assuming you want tap-to-select-pixel), with the target method below defined inside that class. Your GLKViewController subclass must conform to UIGestureRecognizerDelegate:

 @interface GLViewController : GLKViewController <GLKViewDelegate, UIGestureRecognizerDelegate> 

After creating your gesture recognizer, add it to the view property (which in a GLKViewController is actually a GLKView):

 // Inside GLKViewController subclass init / awakeFromNib:
 [[self view] addGestureRecognizer:[self tapRecognizer]];
 [[self tapRecognizer] setDelegate:self];

Set the target action of the gesture recognizer; you can do this when creating it using a specific init..., but I created mine using a Storyboard (aka "the new Interface Builder in Xcode 4.2") and wired it up that way.

Anyway, here is my target action for the gesture recognizer:

 -(IBAction)onTapGesture:(UIGestureRecognizer*)recognizer {
     const CGPoint loc = [recognizer locationInView:[self view]];
     [self pickAtX:loc.x Y:loc.y];
 }

The pick method is defined inside my GLKViewController subclass:

 -(void)pickAtX:(GLuint)x Y:(GLuint)y {
     GLKView *glkView = (GLKView*)[self view];
     UIImage *snapshot = [glkView snapshot];
     [snapshot pickPixelAtX:x Y:y];
 }
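A caveat worth noting (my assumption, not something from the original answer): locationInView: returns points, but the snapshot's underlying CGImage is addressed in pixels, so on a Retina display (contentScaleFactor of 2.0) the coordinates should be scaled before indexing the bitmap. A sketch of that conversion:

```c
/* Convert view coordinates (points) to bitmap coordinates (pixels)
 * using the view's contentScaleFactor. */
static void point_to_pixel(float pointX, float pointY, float scale,
                           unsigned *pixelX, unsigned *pixelY)
{
    *pixelX = (unsigned)(pointX * scale);
    *pixelY = (unsigned)(pointY * scale);
}
```

On a non-Retina device (scale 1.0) this is a no-op, which is why the code above works unmodified there.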

It uses the convenient new snapshot method that Apple kindly included in GLKView, which creates a UIImage from the underlying EAGLContext.

It is important to note the comment in the snapshot API documentation, which states:

This method should be called whenever your application explicitly needs the contents of the view; never attempt to directly read the contents of the underlying framebuffer using OpenGL ES features.

This made me understand why my earlier attempts to call glReadPixels to access the pixel data generated an EXC_BAD_ACCESS, and pointed me in the right direction.

You will notice that the pickAtX:Y: method defined a moment ago calls pickPixelAtX:Y: on the UIImage. This is a method I added to UIImage in a custom category:

 @interface UIImage (NDBExtensions) -(void)pickPixelAtX:(NSUInteger)x Y:(NSUInteger)y; @end 

Here is the implementation; it is the last code listing required. The code came from this question and was modified according to the answer received there:

 @implementation UIImage (NDBExtensions)

 - (void)pickPixelAtX:(NSUInteger)x Y:(NSUInteger)y {

     CGImageRef cgImage = [self CGImage];
     size_t width = CGImageGetWidth(cgImage);
     size_t height = CGImageGetHeight(cgImage);

     if ((x < width) && (y < height)) {
         CGDataProviderRef provider = CGImageGetDataProvider(cgImage);
         CFDataRef bitmapData = CGDataProviderCopyData(provider);
         const UInt8 *data = CFDataGetBytePtr(bitmapData);

         size_t offset = ((width * y) + x) * 4;
         UInt8 b = data[offset+0];  // byte order here is BGRA
         UInt8 g = data[offset+1];
         UInt8 r = data[offset+2];
         UInt8 a = data[offset+3];

         CFRelease(bitmapData);
         NSLog(@"R:%i G:%i B:%i A:%i", r, g, b, a);
     }
 }

 @end
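One assumption buried in that listing (my observation, not from the original answer): the offset calculation treats rows as tightly packed, i.e. bytesPerRow == width * 4. Core Graphics is free to pad rows, so a safer version would use the image's actual row stride (CGImageGetBytesPerRow), sketched here as a plain function:

```c
#include <stddef.h>

/* Byte offset of pixel (x, y) in a 32-bit-per-pixel bitmap whose
 * rows are bytesPerRow bytes apart (which may exceed width * 4
 * due to row padding). */
static size_t pixel_offset(size_t x, size_t y, size_t bytesPerRow)
{
    return (y * bytesPerRow) + (x * 4); /* 4 bytes per BGRA pixel */
}
```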

I initially tried some code from the Apple API documentation entitled "Getting the pixel data from a CGImage object", but that requires two method definitions instead of one, needs much more code, and hands back the data as a void * that I could not work out how to interpret correctly.

That's it! Add this code to your project, then tap a pixel and it will print it in the form:

 R:24 G:46 B:244 A:255 

Of course, you should write some means of extracting those RGBA int values (which will be in the range 0 to 255) and using them however you want. One approach is to return a UIColor from the above method, created like this:

 UIColor *color = [UIColor colorWithRed:red/255.0f green:green/255.0f blue:blue/255.0f alpha:alpha/255.0f]; 
+11
