I am trying to use Cocoa to capture images from a webcam. I can get an RGBA image by using QTKit with the didOutputVideoFrame delegate callback, converting the CVImageBuffer to a CIImage and then to an NSBitmapImageRep.
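For context, this is roughly the path I'm using right now (a minimal sketch; the delegate method belongs to QTCaptureDecompressedVideoOutput, and everything outside the callback is omitted):

    #import <QTKit/QTKit.h>
    #import <QuartzCore/QuartzCore.h>

    // Delegate callback of QTCaptureDecompressedVideoOutput: wrap the
    // CVImageBuffer in a CIImage, then build an RGBA NSBitmapImageRep from it.
    - (void)captureOutput:(QTCaptureOutput *)captureOutput
      didOutputVideoFrame:(CVImageBufferRef)videoFrame
         withSampleBuffer:(QTSampleBuffer *)sampleBuffer
           fromConnection:(QTCaptureConnection *)connection
    {
        CIImage *ciImage = [CIImage imageWithCVImageBuffer:videoFrame];
        NSBitmapImageRep *rep =
            [[[NSBitmapImageRep alloc] initWithCIImage:ciImage] autorelease];
        // ... hand rep off to the UI for display ...
    }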
I know that my camera captures natively in YUV; what I want is to get the YUV data directly from the CVImageBuffer and process the YUV frame before displaying it.
My question is: how can I get YUV data from CVImageBuffer?
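For reference, this is the kind of access I'm hoping to end up with. It is only a sketch of what I imagine, not verified code: it assumes the output can be asked for 2vuy (kCVPixelFormatType_422YpCbCr8) via pixelBufferAttributes, that the delivered frame really is a CVPixelBuffer, and that `decompressedVideoOutput` is my QTCaptureDecompressedVideoOutput instance:

    // Assumption: ask the decompressed-video output for a YUV pixel format
    // (2vuy / kCVPixelFormatType_422YpCbCr8), hoping it matches the camera's
    // native format so no RGBA conversion happens.
    [decompressedVideoOutput setPixelBufferAttributes:
        [NSDictionary dictionaryWithObject:
            [NSNumber numberWithUnsignedInt:kCVPixelFormatType_422YpCbCr8]
                                     forKey:(id)kCVPixelBufferPixelFormatTypeKey]];

    // Later, inside the didOutputVideoFrame callback, read the raw YUV bytes
    // (assuming videoFrame is in fact a CVPixelBuffer):
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)videoFrame;
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    uint8_t *yuvData   = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t height      = CVPixelBufferGetHeight(pixelBuffer);
    // ... process height * bytesPerRow bytes of interleaved Cb Y'0 Cr Y'1 ...
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

Is this the right direction, or is there a better way to get at the YUV data?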
Thanks.