How can I get raw data from a CVImageBuffer object

I am trying to use Cocoa to capture images from a webcam. I can get an RGBA image by using QTKit capture and the didOutputVideoFrame delegate method, converting the CVImageBuffer to a CIImage and then to an NSBitmapImageRep.
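
For reference, this is roughly what that conversion path looks like. This is only a minimal sketch, assuming a QTCaptureDecompressedVideoOutput delegate; the variable names are illustrative, and imageWithCVImageBuffer: is the Core Image factory method used here for the wrap.

    #import <Cocoa/Cocoa.h>
    #import <QTKit/QTKit.h>
    #import <QuartzCore/QuartzCore.h>

    // Delegate callback of a QTCaptureDecompressedVideoOutput; the captured
    // video frame arrives as a CVImageBufferRef.
    - (void)captureOutput:(QTCaptureOutput *)captureOutput
      didOutputVideoFrame:(CVImageBufferRef)videoFrame
         withSampleBuffer:(QTSampleBuffer *)sampleBuffer
           fromConnection:(QTCaptureConnection *)connection
    {
        // Wrap the buffer in a CIImage, then build an RGBA bitmap rep from it.
        CIImage *ciImage = [CIImage imageWithCVImageBuffer:videoFrame];
        NSBitmapImageRep *rep =
            [[[NSBitmapImageRep alloc] initWithCIImage:ciImage] autorelease];
        // ... hand 'rep' off for display ...
    }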

I know that my camera captures natively in YUV. What I want is to get the YUV data directly from the CVImageBuffer and process the YUV frame before displaying it.

My question is: how can I get YUV data from CVImageBuffer?

Thanks.

2 answers

Since you are already creating a CIImage from the buffer with +[CIImage imageWithCVBuffer:], you can render that CIImage into a CGBitmapContext whose pixel format you control and then read the raw data out of the context's backing buffer.
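
A rough sketch of that suggestion, assuming you already have the CIImage. The helper name, buffer layout, and pixel format below are illustrative assumptions, and note that this path gives you RGBA bytes, not the camera's native YUV.

    #import <Cocoa/Cocoa.h>
    #import <QuartzCore/QuartzCore.h>

    // Render a CIImage into a caller-owned CGBitmapContext so the raw pixel
    // bytes can be inspected.
    static void CopyImageToRGBABuffer(CIImage *ciImage)
    {
        CGRect extent = [ciImage extent];
        size_t width  = (size_t)extent.size.width;
        size_t height = (size_t)extent.size.height;
        size_t bytesPerRow = width * 4;               // 8-bit RGBA
        void *pixels = calloc(height, bytesPerRow);   // buffer you own

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef cgContext = CGBitmapContextCreate(pixels, width, height, 8,
                                                       bytesPerRow, colorSpace,
                                                       kCGImageAlphaPremultipliedLast);
        CIContext *ciContext = [CIContext contextWithCGContext:cgContext
                                                       options:nil];

        // After this call, 'pixels' holds the rendered raw RGBA data.
        [ciContext drawImage:ciImage atPoint:CGPointZero fromRect:extent];

        // ... read the bytes in 'pixels' here ...

        CGContextRelease(cgContext);
        CGColorSpaceRelease(colorSpace);
        free(pixels);
    }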


Thanks, but as I said in my question, what I want to know is: how can I get the YUV data out of the CVImageBuffer?
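
Not from the answers above, but for completeness: if the CVImageBuffer delivered by QTKit is in fact a CVPixelBuffer (the usual case for decompressed video frames), CoreVideo's base-address functions give direct access to whatever pixel format the buffer holds, YUV included. A sketch under that assumption, with an illustrative function name:

    #import <CoreVideo/CoreVideo.h>

    // Read the raw bytes of a (non-planar) pixel buffer in place.
    // For planar formats, use CVPixelBufferGetBaseAddressOfPlane instead.
    static void InspectRawBytes(CVImageBufferRef imageBuffer)
    {
        CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)imageBuffer;

        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        void  *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
        size_t height      = CVPixelBufferGetHeight(pixelBuffer);
        OSType format      = CVPixelBufferGetPixelFormatType(pixelBuffer);
        // e.g. '2vuy' (kCVPixelFormatType_422YpCbCr8) for 4:2:2 YUV

        // ... read height * bytesPerRow bytes from baseAddress here,
        //     interpreting them according to 'format' ...

        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    }

Whether the buffer is still YUV at this point may depend on the pixel buffer attributes requested from the capture output; if nothing forces a conversion, the native format should come through.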
