GPUImageMovie with multiple images as texture sources

I am trying to use multiple GPUImagePicture instances as texture sources, along with a fragment shader, to filter the video being played.

I can process still images this way, but I cannot figure out what I am missing to get it to work with GPUImageMovie. I would appreciate any help offered.

    @property (nonatomic, strong) GPUImageView *gpuPlayerView;
    @property (nonatomic, strong) GPUImageMovie *gpuMovie;

    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:self.video];
    self.player = [AVPlayer playerWithPlayerItem:playerItem];
    self.player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
    [self.player play];

    self.gpuMovie = [[GPUImageMovie alloc] initWithPlayerItem:playerItem];
    self.gpuMovie.playAtActualSpeed = YES;

    GPUImagePicture *sourcePicture1 = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"FilterBG"]];
    GPUImagePicture *sourcePicture2 = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"FilterOverlay"]];
    GPUImagePicture *sourcePicture3 = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"Filter1Map"]];

    GPUImageFilter *filter = [[GPUImageFourInputFilter alloc] initWithFragmentShaderFromString:kFilter1ShaderString];

    [self.gpuMovie addTarget:filter atTextureLocation:0];
    if (sourcePicture1) {
        [sourcePicture1 addTarget:filter atTextureLocation:1];
    }
    if (sourcePicture2) {
        [sourcePicture2 addTarget:filter atTextureLocation:2];
    }
    if (sourcePicture3) {
        [sourcePicture3 addTarget:filter atTextureLocation:3];
    }
    [filter addTarget:self.gpuPlayerView];
    [self.gpuMovie startProcessing];
ios objective-c gpuimage avplayer
3 answers

There is a method you can use to achieve this:

CVOpenGLESTextureCacheCreateTextureFromImage

which lets you share a buffer between a GL texture and the movie. This Core Video OpenGLES texture cache handles the caching and lifetime management of CVOpenGLESTextureRef textures. The texture caches give you the ability to directly read and write buffers with various pixel formats, such as 420v or BGRA, from GLES.

    // Mapping a BGRA buffer as a source texture:
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL,
        GL_TEXTURE_2D, GL_RGBA, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0, &outTexture);

    // Mapping a BGRA buffer as a renderbuffer:
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL,
        GL_RENDERBUFFER, GL_RGBA8_OES, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0, &outTexture);

    // Mapping the luma plane of a 420v buffer as a source texture:
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL,
        GL_TEXTURE_2D, GL_LUMINANCE, width, height, GL_LUMINANCE, GL_UNSIGNED_BYTE, 0, &outTexture);

    // Mapping the chroma plane of a 420v buffer as a source texture:
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL,
        GL_TEXTURE_2D, GL_LUMINANCE_ALPHA, width / 2, height / 2, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, 1, &outTexture);

    // Mapping a yuvs buffer as a source texture
    // (note: yuvs/f and 2vuy are unpacked and resampled -- not colorspace converted):
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL,
        GL_TEXTURE_2D, GL_RGB_422_APPLE, width, height, GL_RGB_422_APPLE, GL_UNSIGNED_SHORT_8_8_APPLE, 1, &outTexture);

    CVReturn CVOpenGLESTextureCacheCreateTextureFromImage(
        CFAllocatorRef __nullable allocator,
        CVOpenGLESTextureCacheRef __nonnull textureCache,
        CVImageBufferRef __nonnull sourceImage,
        CFDictionaryRef __nullable textureAttributes,
        GLenum target,
        GLint internalFormat,
        GLsizei width,
        GLsizei height,
        GLenum format,
        GLenum type,
        size_t planeIndex,
        CVOpenGLESTextureRef __nullable * __nonnull textureOut
    );

This function either creates a new texture object or returns a cached CVOpenGLESTextureRef for the given CVImageBufferRef and parameters. The operation creates a live binding between the image buffer and the underlying texture object.
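To make the flow concrete, here is a minimal sketch of creating a texture cache and mapping one frame's pixel buffer; eaglContext and pixelBuffer are assumed to already exist, and the mapping parameters mirror the first BGRA example above:

    // Create the cache once per EAGLContext.
    CVOpenGLESTextureCacheRef textureCache = NULL;
    CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                                eaglContext, NULL, &textureCache);

    // Map one frame's CVPixelBufferRef into a GL texture (no copy).
    CVOpenGLESTextureRef outTexture = NULL;
    GLsizei width = (GLsizei)CVPixelBufferGetWidth(pixelBuffer);
    GLsizei height = (GLsizei)CVPixelBufferGetHeight(pixelBuffer);
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
        pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGBA, width, height,
        GL_RGBA, GL_UNSIGNED_BYTE, 0, &outTexture);

    if (err == kCVReturnSuccess) {
        // Bind the mapped texture before drawing with it.
        glBindTexture(CVOpenGLESTextureGetTarget(outTexture),
                      CVOpenGLESTextureGetName(outTexture));
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    }

    // When the frame is done: release the texture and flush the cache.
    CFRelease(outTexture);
    CVOpenGLESTextureCacheFlush(textureCache, 0);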

I hope this helps you build what you need :)


I tried the GPUImage sample code (SimpleVideoFileFilter) to achieve your result. I created a GPUImageFilterGroup and added filters to it. To add an image as a texture, I used the lookup approach from GPUImageMissEtikateFilter. Here is my code:

    NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"sample_iPod" withExtension:@"m4v"];

    movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
    movieFile.runBenchmark = YES;
    movieFile.playAtActualSpeed = YES;

    // filter = [[GPUImageFilter alloc] initWithFragmentShaderFromFile:@"CustomFilter"];
    filter = [[GPUImageFilterGroup alloc] init];

    GPUImageSaturationFilter *saturationFilter = [[GPUImageSaturationFilter alloc] init];
    [(GPUImageSaturationFilter *)saturationFilter setSaturation:0.0];
    [(GPUImageFilterGroup *)filter addFilter:saturationFilter];

    GPUImagePicture *lookupImageSource = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"lookup_miss_etikate"]];
    GPUImageLookupFilter *lookupFilter = [[GPUImageLookupFilter alloc] init];
    [(GPUImageFilterGroup *)filter addFilter:lookupFilter];
    [lookupImageSource addTarget:lookupFilter atTextureLocation:1];
    [lookupImageSource processImage];

    [saturationFilter addTarget:lookupFilter];
    [(GPUImageFilterGroup *)filter setInitialFilters:@[saturationFilter]];
    [(GPUImageFilterGroup *)filter setTerminalFilter:lookupFilter];

    [movieFile addTarget:filter];

    // Only rotate the video for display, leave orientation the same for recording
    GPUImageView *filterView = (GPUImageView *)self.view;
    [filter addTarget:filterView];

    [movieFile startProcessing];

Hope this helps.


After initializing your GPUImagePicture:

 GPUImagePicture *sourcePicture1 = [[GPUImagePicture alloc]initWithImage:[UIImage imageNamed:@"FilterBG"]]; 

you should call:

 [sourcePicture1 processImage]; 

to actually load the image and make it available to the GPUImage pipeline.

Do the same for sourcePicture2 and sourcePicture3; a sketch of the full setup follows.
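Applied to the code from the question, the setup would look roughly like this. It is only a sketch: the strong properties self.sourcePicture1 through self.sourcePicture3 are my own addition for illustration, since GPUImage does not retain its sources and the pictures must stay alive while the movie is processing.

    // Keep the pictures in strong properties (hypothetical names) so they
    // are not deallocated while the movie is still being filtered.
    self.sourcePicture1 = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"FilterBG"]];
    self.sourcePicture2 = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"FilterOverlay"]];
    self.sourcePicture3 = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"Filter1Map"]];

    [self.gpuMovie addTarget:filter atTextureLocation:0];
    [self.sourcePicture1 addTarget:filter atTextureLocation:1];
    [self.sourcePicture2 addTarget:filter atTextureLocation:2];
    [self.sourcePicture3 addTarget:filter atTextureLocation:3];

    // Load each still image into its texture before starting the movie.
    [self.sourcePicture1 processImage];
    [self.sourcePicture2 processImage];
    [self.sourcePicture3 processImage];

    [filter addTarget:self.gpuPlayerView];
    [self.gpuMovie startProcessing];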

Update:

What I had to do a few years ago, when I used it with movies and photos, was to add a notification for when the movie is loaded and ready to play (in GPUImageMovie.m, in the processAsset method, just before self.decodeIsAllowed = YES;). Only once the movie was ready did I set the targets of its GPUImage pipeline.
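A minimal sketch of that approach, assuming a made-up notification name (GPUImage has no built-in notification for this, so the one-line post is a manual edit to GPUImageMovie.m):

    // In GPUImageMovie.m, inside processAsset, just before self.decodeIsAllowed = YES;
    [[NSNotificationCenter defaultCenter] postNotificationName:@"GPUImageMovieReadyToPlay"
                                                        object:self];

    // In your own controller: attach the pipeline targets only once the movie is ready.
    // filter is the GPUImageFourInputFilter from the question.
    [[NSNotificationCenter defaultCenter] addObserverForName:@"GPUImageMovieReadyToPlay"
                                                      object:self.gpuMovie
                                                       queue:[NSOperationQueue mainQueue]
                                                  usingBlock:^(NSNotification *note) {
        [self.gpuMovie addTarget:filter atTextureLocation:0];
        [filter addTarget:self.gpuPlayerView];
    }];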

