How do I use a CIFilter on a UIView whose layerClass is a custom CALayer subclass?

My UIView uses an instance of TBPaperLayer for its layer.

    + (Class)layerClass { return [TBPaperLayer class]; }

I would like to create a CIFilter to change the look of this layer, in particular to apply a blur filter to it. How can I use the following code to blur part of this layer? (Code from: Blur CALayer's Superlayer)

    CALayer *blurLayer = [CALayer layer];
    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setDefaults];
    blurLayer.backgroundFilters = [NSArray arrayWithObject:blur];
    [self.superlayer addSublayer:blurLayer];

There is no superlayer yet in -init.
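For what it's worth, the superlayer only exists once the view has been added to a view hierarchy, so one place the snippet above could run is didMoveToSuperview rather than -init. Here is a minimal sketch of that idea; note that, as the answer below explains, backgroundFilters has no effect on iOS, so this only shows where the layer hierarchy becomes available:

    - (void)didMoveToSuperview
    {
        [super didMoveToSuperview];
        if (self.layer.superlayer == nil) {
            return;   // the view was removed from its superview
        }

        CALayer *blurLayer = [CALayer layer];
        blurLayer.frame = self.layer.frame;

        CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
        [blur setDefaults];
        blurLayer.backgroundFilters = [NSArray arrayWithObject:blur];

        // Works on OS X, where backgroundFilters is honored; on iOS the
        // property is silently ignored (see the answer below).
        [self.layer.superlayer addSublayer:blurLayer];
    }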

2 answers

This is not possible on iOS. From the CALayer class reference:

Special considerations

This property is not supported on layers in iOS.

Presumably, Apple does not consider the current generation of iOS hardware powerful enough to support real-time filtering.


For iOS 6.1, I needed a view that encapsulates the shrink-inward-and-fade effect used in some motion graphics for title sequences. My final code (not all of it is shown here) steadily reduces both the stretch factor (the extra horizontal scaling applied to the text) and the amount of blur. The text the effect is applied to is rendered as a UIBezierPath and stored in self.myPath. A timer fires a method that decrements the two values and calls setNeedsDisplay; a sketch of that part follows, ahead of the displayLayer: code.
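The timer-driven part is not shown in the answer; here is a minimal sketch of what it might look like. The properties self.stretchFactor and self.blurFactor appear in the displayLayer: code below, but the animationTimer property, the interval, and the step sizes are assumptions made for illustration.

    // Hypothetical driver for the effect (not part of the original answer).
    - (void)startTitleEffect
    {
        self.stretchFactor = 1.0;    // extra horizontal scale, eased toward 0
        self.blurFactor = 20.0;      // CIGaussianBlur inputRadius, eased toward 0
        self.animationTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 / 30.0
                                                               target:self
                                                             selector:@selector(stepTitleEffect:)
                                                             userInfo:nil
                                                              repeats:YES];
    }

    - (void)stepTitleEffect:(NSTimer *)timer
    {
        self.stretchFactor = MAX(self.stretchFactor - 0.02, 0.0);
        self.blurFactor = MAX(self.blurFactor - 0.4, 0.0);

        // Triggers a redraw; because the view implements displayLayer:,
        // CALayer calls that method rather than going through drawRect:.
        [self setNeedsDisplay];

        if (self.stretchFactor == 0.0 && self.blurFactor == 0.0) {
            [timer invalidate];
            self.animationTimer = nil;
        }
    }

The displayLayer: method from the answer then does the actual rendering: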

    - (void)displayLayer:(CALayer *)layer
    {
        UIGraphicsBeginImageContext(self.bounds.size);
        CGContextRef ctx = UIGraphicsGetCurrentContext();

        // Stretch the text path horizontally by the current stretch factor.
        CGAffineTransform stretch = CGAffineTransformMakeScale(self.stretchFactor + 1.0, 1.0);
        CGPathRef stretchedPath = CGPathCreateCopyByTransformingPath([self.myPath CGPath], &stretch);

        // Re-center the stretched path in the view's bounds.
        CGRect newBox = CGPathGetBoundingBox(stretchedPath);
        float deltaX = CGRectGetMidX(self.bounds) - CGRectGetMidX(newBox);
        float deltaY = CGRectGetMidY(self.bounds) - CGRectGetMidY(newBox);
        CGAffineTransform slide = CGAffineTransformMakeTranslation(deltaX, deltaY);
        CGPathRef centeredPath = CGPathCreateCopyByTransformingPath(stretchedPath, &slide);
        CGPathRelease(stretchedPath);

        // Fill the path in black and grab the result as a UIImage.
        CGContextAddPath(ctx, centeredPath);
        CGPathRelease(centeredPath);
        CGContextSetFillColorWithColor(ctx, [[UIColor blackColor] CGColor]);
        CGContextFillPath(ctx);
        UIImage *tmpImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        // Blur the rendered text with Core Image.
        CIImage *inputImage = [CIImage imageWithCGImage:[tmpImage CGImage]];
        CIFilter *gBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"
                                           keysAndValues:@"inputRadius", [NSNumber numberWithFloat:self.blurFactor],
                                                         @"inputImage", inputImage, nil];
        CIImage *blurredImage = [gBlurFilter outputImage];

        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef cgimg = [context createCGImage:blurredImage fromRect:[blurredImage extent]];
        [layer setContents:(__bridge id)cgimg];
        CGImageRelease(cgimg);
    }

    - (void)drawRect:(CGRect)rect
    {
        // empty drawRect: to get the attention of UIKit
    }

I have not yet checked this code for leaks, so consider it "pseudo-code" :-) As shown, this could be done in drawRect: without using layers at all (a sketch of that variant follows), but there are other things going on in this view that are not shown in this abridged version.
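To illustrate the drawRect: remark, here is a minimal sketch of that variant under the same assumptions (self.myPath and self.blurFactor as above); the stretching and re-centering steps are left out for brevity:

    - (void)drawRect:(CGRect)rect
    {
        // Render the text path into an offscreen image.
        UIGraphicsBeginImageContext(self.bounds.size);
        CGContextRef ctx = UIGraphicsGetCurrentContext();
        CGContextAddPath(ctx, [self.myPath CGPath]);
        CGContextSetFillColorWithColor(ctx, [[UIColor blackColor] CGColor]);
        CGContextFillPath(ctx);
        UIImage *tmpImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        // Blur it with Core Image, as in displayLayer:.
        CIImage *inputImage = [CIImage imageWithCGImage:[tmpImage CGImage]];
        CIFilter *gBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"
                                           keysAndValues:@"inputRadius", [NSNumber numberWithFloat:self.blurFactor],
                                                         @"inputImage", inputImage, nil];
        CIImage *blurredImage = [gBlurFilter outputImage];
        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef cgimg = [context createCGImage:blurredImage fromRect:[blurredImage extent]];

        // Draw the blurred result straight into the view's context instead of
        // assigning it to the layer's contents.
        [[UIImage imageWithCGImage:cgimg] drawInRect:self.bounds];
        CGImageRelease(cgimg);
    }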

But since CIGaussianBlur takes a noticeable amount of time, I am looking at doing the image processing with the Accelerate framework to see how to make my version smoother; one possible direction is sketched below.
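For reference, here is a minimal sketch of that Accelerate direction: replacing the CIGaussianBlur step with a vImage box convolution on the rendered bitmap (a box convolution, or several passes of it, is a common fast approximation of a gaussian blur). The helper name blurWithVImage:radius: and the kernel-size calculation are assumptions, and error handling is omitted:

    #import <Accelerate/Accelerate.h>

    // Hypothetical replacement for the CIGaussianBlur step in displayLayer:.
    - (UIImage *)blurWithVImage:(UIImage *)image radius:(CGFloat)radius
    {
        CGImageRef cgImage = [image CGImage];
        size_t width = CGImageGetWidth(cgImage);
        size_t height = CGImageGetHeight(cgImage);

        // Draw the source into bitmap contexts with a known ARGB8888 layout.
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef srcCtx = CGBitmapContextCreate(NULL, width, height, 8, width * 4, colorSpace,
                                                    kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Host);
        CGContextDrawImage(srcCtx, CGRectMake(0, 0, width, height), cgImage);
        CGContextRef dstCtx = CGBitmapContextCreate(NULL, width, height, 8, width * 4, colorSpace,
                                                    kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Host);

        vImage_Buffer src = { CGBitmapContextGetData(srcCtx), height, width, CGBitmapContextGetBytesPerRow(srcCtx) };
        vImage_Buffer dst = { CGBitmapContextGetData(dstCtx), height, width, CGBitmapContextGetBytesPerRow(dstCtx) };

        // The box kernel must have odd dimensions; scale it from the radius.
        uint32_t kernel = ((uint32_t)(radius * 2.0)) | 1;
        vImageBoxConvolve_ARGB8888(&src, &dst, NULL, 0, 0, kernel, kernel, NULL, kvImageEdgeExtend);

        CGImageRef blurredRef = CGBitmapContextCreateImage(dstCtx);
        UIImage *blurred = [UIImage imageWithCGImage:blurredRef];

        CGImageRelease(blurredRef);
        CGContextRelease(srcCtx);
        CGContextRelease(dstCtx);
        CGColorSpaceRelease(colorSpace);
        return blurred;
    }

In displayLayer: this would replace the Core Image block: tmpImage would be passed to blurWithVImage:radius: with self.blurFactor, and the result's CGImage assigned as the layer's contents.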

