PNG iPhone touch event registration with transparency?

I have three PNGs (320 x 480 px) which I load into separate UIImageViews. The PNGs are named body, mouth, and hat. By stacking the images on top of each other, I create a character whose body parts can be swapped easily. See the photo:

http://www.1976inc.com/dev/iphone/beast.jpg

My problem is that when you touch the topmost UIImageView, the entire image, including its transparent regions, registers the touch event. What I would like is for touch events to register only on the non-transparent parts of each PNG, so that the user can interact with all three UIImageViews.

I'm sure this is simple, but I'm new to iPhone development and I can't figure it out.


Update: So I realized that the easiest way to accomplish what I want is to loop over the images, create a bitmap context for each PNG, and read the color data for the pixel where the touch event occurred. If the pixel falls in a transparent area, I move on to the next image and try again. This works, but only the first time. For example, the first time I tap the main view, I get this output:

2010-07-26 15:50:06.285 colorTest[21501:207] hat
2010-07-26 15:50:06.286 colorTest[21501:207] offset: 227024 colors: RGB A 0 0 0 0
2010-07-26 15:50:06.293 colorTest[21501:207] mouth
2010-07-26 15:50:06.293 colorTest[21501:207] offset: 227024 colors: RGB A 0 0 0 0
2010-07-26 15:50:06.298 colorTest[21501:207] body
2010-07-26 15:50:06.299 colorTest[21501:207] offset: 227024 colors: RGB A 255 255 255 255

which is what I would like to see. But if I tap the same area again, I get:

2010-07-26 15:51:21.625 colorTest[21501:207] hat
2010-07-26 15:51:21.626 colorTest[21501:207] offset: 283220 colors: RGB A 255 255 255 255
2010-07-26 15:51:21.628 colorTest[21501:207] mouth
2010-07-26 15:51:21.628 colorTest[21501:207] offset: 283220 colors: RGB A 255 255 255 255
2010-07-26 15:51:21.630 colorTest[21501:207] body
2010-07-26 15:51:21.631 colorTest[21501:207] offset: 283220 colors: RGB A 255 255 255 255
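As a sanity check, those offsets can be decoded back into pixel coordinates by inverting the offset formula used in the code below (offset = 4 * (w * y + x), with w = 320 for these images):

    // first touch:  227024 / 4 = 56756 -> y = 56756 / 320 = 177, x = 56756 % 320 = 116
    // second touch: 283220 / 4 = 70805 -> y = 70805 / 320 = 221, x = 70805 % 320 = 85

So the two touches sampled pixels (116, 177) and (85, 221): the coordinates differ between runs, not just the pixel data.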

Here is the code I'm using.

The touch handler lives in the main view of the application:

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
        NSLog(@"Touched balls");
        UITouch *touch = [touches anyObject];
        CGPoint point = [touch locationInView:self.view];
        UIColor *transparent = [UIColor colorWithRed:0 green:0 blue:0 alpha:0];
        for (viewTest *currentView in imageArray) {
            //UIColor *testColor = [self getPixelColorAtLocation:point image:currentView.image];
            [currentView getPixelColorAtLocation:point];
        }
    }
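One thing worth double-checking here (a sketch of a variant, not a confirmed fix): point is taken in self.view's coordinate space, while the image views actually live inside tempView. If those coordinate spaces ever stop coinciding, the sampled pixel will drift. Asking each view for its own local coordinates avoids that, assuming viewTest exposes the same getPixelColorAtLocation: method:

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
        UITouch *touch = [touches anyObject];
        for (viewTest *currentView in imageArray) {
            // Ask for the touch location in this view's own coordinate space.
            CGPoint localPoint = [touch locationInView:currentView];
            // Skip views the touch did not land inside at all.
            if (!CGRectContainsPoint(currentView.bounds, localPoint)) continue;
            [currentView getPixelColorAtLocation:localPoint];
        }
    }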

This calls a method on a custom class that extends UIImageView. The method returns the color of the pixel under the touch point:

    - (UIColor *)getPixelColorAtLocation:(CGPoint)point {
        UIColor *color = nil;
        CGImageRef inImage = self.image.CGImage;
        CGContextRef context = [self createARGBBitmapContextFromImage:inImage];
        if (context == NULL) return nil;

        size_t w = CGImageGetWidth(inImage);
        size_t h = CGImageGetHeight(inImage);
        CGRect rect = {{0, 0}, {w, h}};

        // Draw the image to the bitmap context. Once we draw, the memory
        // allocated for the context for rendering will then contain the
        // raw image data in the specified color space.
        CGContextDrawImage(context, rect, inImage);

        // Now we can get a pointer to the image data associated with the
        // bitmap context.
        unsigned char *data = CGBitmapContextGetData(context);
        if (data != NULL) {
            // offset locates the pixel in the data from x,y.
            // 4 for 4 bytes of data per pixel, w is width of one row of data.
            int offset = 4 * ((w * round(point.y)) + round(point.x));
            int alpha = data[offset];
            int red = data[offset + 1];
            int green = data[offset + 2];
            int blue = data[offset + 3];
            NSLog(@"%@", name);
            NSLog(@"offset: %i colors: RGB A %i %i %i %i ", offset, red, green, blue, alpha);
            color = [UIColor colorWithRed:(red / 255.0f)
                                    green:(green / 255.0f)
                                     blue:(blue / 255.0f)
                                    alpha:(alpha / 255.0f)];
        }

        // When finished, release the context
        CGContextRelease(context);
        // Free image data memory for the context
        if (data) {
            free(data);
        }

        return color;
    }

    - (CGContextRef)createARGBBitmapContextFromImage:(CGImageRef)inImage {
        CGContextRef context = NULL;
        CGColorSpaceRef colorSpace;
        void *bitmapData;
        int bitmapByteCount;
        int bitmapBytesPerRow;

        // Get image width, height. We'll use the entire image.
        size_t pixelsWide = CGImageGetWidth(inImage);
        size_t pixelsHigh = CGImageGetHeight(inImage);

        // Declare the number of bytes per row. Each pixel in the bitmap in
        // this example is represented by 4 bytes; 8 bits each of red, green,
        // blue, and alpha.
        bitmapBytesPerRow = (pixelsWide * 4);
        bitmapByteCount = (bitmapBytesPerRow * pixelsHigh);

        // Use the generic RGB color space.
        colorSpace = CGColorSpaceCreateDeviceRGB(); //CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
        if (colorSpace == NULL) {
            fprintf(stderr, "Error allocating color space\n");
            return NULL;
        }

        // Allocate memory for image data. This is the destination in memory
        // where any drawing to the bitmap context will be rendered.
        bitmapData = malloc(bitmapByteCount);
        if (bitmapData == NULL) {
            fprintf(stderr, "Memory not allocated!");
            CGColorSpaceRelease(colorSpace);
            return NULL;
        }

        // Create the bitmap context. We want pre-multiplied ARGB, 8-bits
        // per component. Regardless of what the source image format is
        // (CMYK, Grayscale, and so on) it will be converted over to the
        // format specified here by CGBitmapContextCreate.
        context = CGBitmapContextCreate(bitmapData,
                                        pixelsWide,
                                        pixelsHigh,
                                        8, // bits per component
                                        bitmapBytesPerRow,
                                        colorSpace,
                                        kCGImageAlphaPremultipliedFirst);
        if (context == NULL) {
            free(bitmapData);
            fprintf(stderr, "Context not created!");
        }

        // Make sure and release colorspace before returning
        CGColorSpaceRelease(colorSpace);

        return context;
    }
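Since getPixelColorAtLocation: already lives on the UIImageView subclass, another route to the behavior described at the top of the question is to override pointInside:withEvent: there and let UIKit's hit-testing route the touch. A minimal sketch, assuming the sampling method works as posted:

    // Let touches fall through this view wherever its image is transparent.
    // UIKit calls pointInside:withEvent: during hit-testing, so returning NO
    // hands the touch to the next sibling view underneath.
    - (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
        if (![super pointInside:point withEvent:event]) return NO;
        UIColor *color = [self getPixelColorAtLocation:point];
        if (color == nil) return YES; // sampling failed; fall back to default behavior
        return CGColorGetAlpha(color.CGColor) > 0.01; // opaque enough to count as a hit
    }

Note that this re-renders the whole bitmap on every hit-test, so beyond a few 320 x 480 images you would want to cache the pixel data rather than rebuild the context each time.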

Update 2: Thanks for the quick response. I'm not sure I follow, though. If I set hidden to YES, the whole UIImageView is hidden. What I want is for the transparent parts of the PNG not to register touch events. For example, looking at the image I included in the post: if you tap the worm, stem, or leaves (which are all part of the same PNG), the event should be handled by that UIImageView; but if you tap the circle, the event should be handled by that other UIImageView instead. BTW, here is the code I use to add them to the view:

    UIView *tempView = [[UIView alloc] init];
    [self.view addSubview:tempView];

    UIImageView *imageView1 = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"body.png"]];
    [imageView1 setUserInteractionEnabled:YES];
    UIImageView *imageView2 = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"mouth.png"]];
    [imageView2 setUserInteractionEnabled:YES];
    UIImageView *imageView3 = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"hat.png"]];
    [imageView3 setUserInteractionEnabled:YES];

    [tempView addSubview:imageView1];
    [tempView addSubview:imageView2];
    [tempView addSubview:imageView3];
    [self.view addSubview:tempView];
1 answer

First of all:

You could work with the transparency directly, but hiding the image will likely fit your needs.

You can hide the image with [myImage setHidden:YES]; or myImage.hidden = YES;

    if (CGRectContainsPoint(myImage.frame, touchPosition) && myImage.hidden == NO) {
        // handle the touch here
    }

This ensures the image you hit is actually visible when tapped, because the myImage.hidden == NO condition checks whether the image is hidden.
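If the per-pixel behavior from the question is still needed, this rectangle test combines naturally with the asker's getPixelColorAtLocation: method. A sketch, assuming myImage is an instance of the viewTest subclass above and touchPosition is in the superview's coordinate space:

    if (CGRectContainsPoint(myImage.frame, touchPosition) && myImage.hidden == NO) {
        // The cheap rectangle test passed; now check the actual pixel.
        CGPoint local = [myImage convertPoint:touchPosition fromView:myImage.superview];
        UIColor *color = [myImage getPixelColorAtLocation:local];
        if (color != nil && CGColorGetAlpha(color.CGColor) > 0.01) {
            // The touch landed on a visible, non-transparent pixel.
        }
    }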
