I'm trying to read pixels from a screen buffer. I create a CGImageRef with CGDisplayCreateImage, but the values for CGImageGetWidth and CGImageGetBytesPerRow don't make sense together: dividing the bytes per row by the bytes per pixel gives me 1376 pixels per line, but the image width is 1366.
What's going on here? Is there some padding in the image? How do I read the data I get from it safely and with correct results?
Edit: the minimal code required to reproduce this is as follows:
#import <Foundation/Foundation.h>
#import <ApplicationServices/ApplicationServices.h>

int main(int argc, const char * argv[])
{
    @autoreleasepool {
        CGImageRef image = CGDisplayCreateImage(CGMainDisplayID());
        size_t width = CGImageGetWidth(image);
        size_t bpr = CGImageGetBytesPerRow(image);
        size_t bpp = CGImageGetBitsPerPixel(image);
        size_t bpc = CGImageGetBitsPerComponent(image);
        size_t bytes_per_pixel = bpp / bpc;
        NSLog(@"%li %li", bpr / bytes_per_pixel, width);
        CGImageRelease(image);
    }
    return 0;
}
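For reference, this is roughly how I would expect to walk the pixels once I understand the layout. It's only a sketch, assuming 8 bits per component and a 32-bit pixel format (the actual component order would come from CGImageGetBitmapInfo()), and it advances by the reported bytes per row rather than width * bytes_per_pixel:

CGImageRef image = CGDisplayCreateImage(CGMainDisplayID());
CGDataProviderRef provider = CGImageGetDataProvider(image);
CFDataRef data = CGDataProviderCopyData(provider);
const uint8_t *bytes = CFDataGetBytePtr(data);

size_t width = CGImageGetWidth(image);
size_t height = CGImageGetHeight(image);
size_t bpr = CGImageGetBytesPerRow(image);
size_t bytes_per_pixel = CGImageGetBitsPerPixel(image) / 8;

for (size_t y = 0; y < height; y++) {
    // Step rows by the stride (bytes per row), which may include padding
    const uint8_t *row = bytes + y * bpr;
    for (size_t x = 0; x < width; x++) {
        // Each pixel is bytes_per_pixel wide; only the first `width` pixels
        // of each row are real image data, the rest is padding
        const uint8_t *pixel = row + x * bytes_per_pixel;
        (void)pixel; // read components here, in the order CGImageGetBitmapInfo() describes
    }
}

CFRelease(data);
CGImageRelease(image);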