AVFoundation: reading raw single frames from a .mov file (v210, uncompressed 10-bit 4:2:2)

To extract the raw image buffer from uncompressed v210 footage (pixel format kCVPixelFormatType_422YpCbCr10), I tried to follow this excellent post: Reading samples through AVAssetReader

The problem is that when I call startReading on my reader, I get AVAssetReaderStatusFailed (with the output settings dictionary's kCVPixelBufferPixelFormatTypeKey set to kCVPixelFormatType_422YpCbCr10). If I pass nil for outputSettings instead, every frame is parsed, but the CMSampleBufferRef buffers are empty.
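One way to see the underlying reason is to log the reader's error property when startReading returns NO; a minimal sketch, using the assetReader from the code below:

/* sketch: when startReading fails, AVAssetReader's error property
   usually explains why (e.g. unsupported output settings) */
if([assetReader startReading]==NO){
  NSLog(@"startReading failed: %@", [assetReader error]);
}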

I tried many pixel formats, including:

  • kCVPixelFormatType_422YpCbCr10
  • kCVPixelFormatType_422YpCbCr8
  • kCVPixelFormatType_422YpCbCr16
  • kCVPixelFormatType_422YpCbCr8_yuvs
  • kCVPixelFormatType_422YpCbCr8FullRange
  • kCVPixelFormatType_422YpCbCr_4A_8BiPlanar

but none of them work.
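For what it's worth, the following sketch lists every pixel format type that Core Video registers on the system. It does not say which formats AVAssetReaderTrackOutput will actually accept, but it can rule out a typo in the constants:

#import <CoreVideo/CoreVideo.h>

/* sketch: enumerate all pixel format types known to Core Video */
CFArrayRef formats=
    CVPixelFormatDescriptionArrayCreateWithAllPixelFormatTypes(kCFAllocatorDefault);
for(CFIndex idx=0; idx<CFArrayGetCount(formats); idx++){
  CFNumberRef value=CFArrayGetValueAtIndex(formats, idx);
  SInt32 fourCC=0;
  CFNumberGetValue(value, kCFNumberSInt32Type, &fourCC);
  NSLog(@"registered pixel format: 0x%08x", (unsigned int)fourCC);
}
CFRelease(formats);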

What am I doing wrong?

Any comments are welcome ...

Here is my code:

/* construct an AVAssetReader based on an file based URL */
NSError *error=nil;
NSString *filePathString=@"/Users/johann/Raw12bit.mov";

NSURL *movieUrl=[[NSURL alloc] initFileURLWithPath:filePathString];
AVURLAsset *movieAsset=[[AVURLAsset alloc] initWithURL:movieUrl options:nil]; 

/* determine image dimensions of images stored in movie asset */
CGSize size=[movieAsset naturalSize];  
NSLog(@"movie asset natual size: size.width=%f size.height=%f", 
      size.width, size.height);

/* allocate assetReader */
AVAssetReader *assetReader=[[AVAssetReader alloc] initWithAsset:movieAsset
                                                          error:&error];

/* get video track(s) from movie asset */
NSArray *videoTracks=[movieAsset tracksWithMediaType:AVMediaTypeVideo];

/* get first video track, if there is any */
AVAssetTrack *videoTrack0=[videoTracks objectAtIndex:0];

/* set the desired video frame format into attribute dictionary */
NSDictionary* dictionary=[NSDictionary dictionaryWithObjectsAndKeys:
  [NSNumber numberWithInt:kCVPixelFormatType_422YpCbCr10], 
  (NSString*)kCVPixelBufferPixelFormatTypeKey,
  nil];

/* construct the actual track output and add it to the asset reader */
AVAssetReaderTrackOutput* assetReaderOutput=[[AVAssetReaderTrackOutput alloc] 
                                             initWithTrack:videoTrack0 
                                             outputSettings:dictionary]; //nil or dictionary
/* main parser loop */
NSInteger i=0;
if([assetReader canAddOutput:assetReaderOutput]){
  [assetReader addOutput:assetReaderOutput];

  NSLog(@"asset added to output.");

  /* start asset reader */
  if([assetReader startReading]==YES){
    /* read off the samples */
    CMSampleBufferRef buffer;
    while([assetReader status]==AVAssetReaderStatusReading){
      buffer=[assetReaderOutput copyNextSampleBuffer];
      if(buffer==NULL) break; /* no more samples, or the reader failed */
      i++;
      NSLog(@"decoding frame #%ld done.", (long)i);
      CFRelease(buffer); /* copyNextSampleBuffer returns a retained buffer */
    }
  }
  else {
    NSLog(@"could not start reading asset.");
    NSLog(@"reader status: %ld", [assetReader status]);
  }
}
else {
  NSLog(@"could not add asset to output.");
}
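For completeness, this is roughly what I intend to do with each CMSampleBufferRef once reading works. A sketch, assuming the reader vends CVPixelBuffer-backed samples (buffer is the sample buffer from the loop above):

/* sketch: pull the raw pixel data out of a decoded sample buffer */
CVImageBufferRef imageBuffer=CMSampleBufferGetImageBuffer(buffer);
if(imageBuffer!=NULL){
  CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

  void   *baseAddress=CVPixelBufferGetBaseAddress(imageBuffer);
  size_t  bytesPerRow=CVPixelBufferGetBytesPerRow(imageBuffer);
  size_t  height     =CVPixelBufferGetHeight(imageBuffer);
  NSLog(@"frame data at %p, %zu bytes", baseAddress, bytesPerRow*height);

  CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
}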

Regards, Johann

+5
2 answers

Try kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange; it works great for me.
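Applied to the code in the question, that would mean changing the output settings dictionary along these lines (just a sketch):

NSDictionary *dictionary=[NSDictionary dictionaryWithObjectsAndKeys:
  [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange],
  (NSString*)kCVPixelBufferPixelFormatTypeKey,
  nil];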

+2
Only these pixel formats are supported:

  • kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
  • kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
  • kCVPixelFormatType_32BGRA
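With one of the biplanar formats, the luma and chroma data end up in separate planes. A rough sketch of reading them (assuming pixelBuffer was obtained from the sample buffer via CMSampleBufferGetImageBuffer):

/* sketch: read the planes of a biplanar (NV12-style) pixel buffer */
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

void  *lumaBase  =CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0); /* Y plane */
void  *chromaBase=CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1); /* interleaved CbCr plane */
size_t lumaRowBytes  =CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
size_t chromaRowBytes=CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);

NSLog(@"Y plane %p (%zu bytes/row), CbCr plane %p (%zu bytes/row)",
      lumaBase, lumaRowBytes, chromaBase, chromaRowBytes);

CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);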

0
