I'm trying to play a video (MP4 / H.263) on iOS, but I'm getting really fuzzy results. Here is the code that initializes the asset reading:
    mTextureHandle = [self createTexture:CGSizeMake(400, 400)];

    NSURL *url = [NSURL fileURLWithPath:file];
    mAsset = [[AVURLAsset alloc] initWithURL:url options:nil];

    NSArray *tracks = [mAsset tracksWithMediaType:AVMediaTypeVideo];
    mTrack = [tracks objectAtIndex:0];
    NSLog(@"Tracks: %lu", (unsigned long)[tracks count]);

    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *settings = [[NSDictionary alloc] initWithObjectsAndKeys:value, key, nil];

    mOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:mTrack outputSettings:settings];

    mReader = [[AVAssetReader alloc] initWithAsset:mAsset error:nil];
    [mReader addOutput:mOutput];
    // [mReader startReading] is called elsewhere, before the first copyNextSampleBuffer
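For reference, a sanity check I could add at this point (a minimal sketch, assuming mTrack is the video track obtained above) to see the real frame dimensions instead of trusting the hardcoded sizes:

    // Log the track's reported frame size; the texture upload below
    // should probably use these values rather than fixed numbers.
    CGSize naturalSize = mTrack.naturalSize;
    NSLog(@"Natural size: %.0f x %.0f", naturalSize.width, naturalSize.height);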
So much for the reader init; now the actual texturing:
    CMSampleBufferRef sampleBuffer = [mOutput copyNextSampleBuffer];
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    glBindTexture(GL_TEXTURE_2D, mTextureHandle);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 600, 400, 0,
                 GL_BGRA_EXT, GL_UNSIGNED_BYTE,
                 CVPixelBufferGetBaseAddress(pixelBuffer));

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    CFRelease(sampleBuffer);
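One suspicion I have is that the hardcoded 600 x 400 doesn't match the buffer's real geometry, and that Core Video may pad each row. A minimal sketch of the check I mean (all standard CVPixelBuffer getters, called while the base address is locked):

    size_t width       = CVPixelBufferGetWidth(pixelBuffer);
    size_t height      = CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    NSLog(@"Buffer: %zu x %zu, bytes per row: %zu (tightly packed would be %zu)",
          width, height, bytesPerRow, width * 4);
    // If bytesPerRow != width * 4, every row carries padding bytes, and
    // uploading the base address as a tightly packed image shifts each
    // successive row sideways, which could look sliced and skewed.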
Everything works well ... except that the rendered image comes out sliced and skewed.

I even tried looking at the AVAssetTrack's preferredTransform matrix, but to no avail, since it always returns CGAffineTransformIdentity.
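For completeness, this is roughly how I check it (a short sketch; if the transform were ever non-identity, I'd apply the rotation/flip when drawing the quad):

    CGAffineTransform t = mTrack.preferredTransform;
    if (!CGAffineTransformIsIdentity(t)) {
        // Never reached in my case: the track reports the identity matrix.
        NSLog(@"Preferred transform: [%f %f %f %f %f %f]",
              t.a, t.b, t.c, t.d, t.tx, t.ty);
    }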
Side note: if I switch the source to the camera, the image turns out fine. Am I missing some decompression step, or should the asset reader already be handling that?
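If row padding is indeed the culprit, I assume the "missing step" would be a per-row repack before glTexImage2D, something like this (an untested sketch, to be run while the base address is still locked; packed is a scratch buffer I'd introduce):

    // Hypothetical fix: copy each row tightly so the upload sees exactly
    // width * 4 bytes per row, with the padding stripped out.
    size_t w      = CVPixelBufferGetWidth(pixelBuffer);
    size_t h      = CVPixelBufferGetHeight(pixelBuffer);
    size_t stride = CVPixelBufferGetBytesPerRow(pixelBuffer);
    uint8_t *src    = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
    uint8_t *packed = (uint8_t *)malloc(w * h * 4);
    for (size_t y = 0; y < h; ++y) {
        memcpy(packed + y * w * 4, src + y * stride, w * 4);
    }
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)w, (GLsizei)h, 0,
                 GL_BGRA_EXT, GL_UNSIGNED_BYTE, packed);
    free(packed);

Is something like this what I'm missing, or should the asset reader hand me tightly packed buffers already?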
Thanks!
Code: https://github.com/shaded-enmity/objcpp-opengl-video