Creating NSImage from CGImageRef Doubles the Image Pixel Size

This CGImageRef, capture, is returned from a call to CGWindowListCreateImage(). When I turn it into an NSImage directly via initWithCGImage:size:, its pixel size mysteriously doubles. If I instead manually create an NSBitmapImageRep from capture and add it to an empty NSImage, everything works fine.

My hardware setup is a Retina MBP plus an external non-Retina display. The capture takes place on the non-Retina screen.

NSLog(@"capture image size: %zu %zu", CGImageGetWidth(capture), CGImageGetHeight(capture));
NSLog(@"logical image size: %f %f", viewRect.size.width, viewRect.size.height);

NSBitmapImageRep *debugRep;
NSImage *image;

//
// Create NSImage directly

image = [[NSImage alloc] initWithCGImage:capture size:NSSizeFromCGSize(viewRect.size)];

debugRep = (NSBitmapImageRep *)[[image representations] objectAtIndex:0];
NSLog(@"pixel size, NSImage direct: %ld %ld", (long)debugRep.pixelsWide, (long)debugRep.pixelsHigh);

//
// Create representation manually

NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithCGImage:capture];
image = [[NSImage alloc] initWithSize:NSSizeFromCGSize(viewRect.size)];
[image addRepresentation:imageRep];
[imageRep release];

debugRep = (NSBitmapImageRep *)[[image representations] objectAtIndex:0];
NSLog(@"pixel size, NSImage + manual representation: %ld %ld", (long)debugRep.pixelsWide, (long)debugRep.pixelsHigh);

Log output:

capture image size: 356 262
logical image size: 356.000000 262.000000
pixel size, NSImage direct: 712 524
pixel size, NSImage + manual representation: 356 262

Is this the expected behavior?

1 answer

The documentation for initWithCGImage:size: states:

    You should not assume anything about the image, other than that drawing it is equivalent to drawing the CGImage.

So AppKit is free to build whatever representations it considers appropriate, including one at the backing scale factor of a Retina display, which is why the pixel dimensions can come out doubled. If you need a representation with exact pixel dimensions, create the NSBitmapImageRep from the CGImage yourself, as in your second snippet.
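A minimal sketch of that second approach wrapped as a reusable helper (the function name is my own invention; it assumes manual reference counting, matching the release calls in the question):

```objc
#import <Cocoa/Cocoa.h>

// Hypothetical helper: wrap a CGImageRef in an NSImage whose single
// representation keeps exactly the CGImage's pixel dimensions, while the
// logical (point) size is whatever the caller supplies.
static NSImage *ImageFromCGImageWithExactPixels(CGImageRef cgImage, NSSize logicalSize) {
    NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithCGImage:cgImage];
    NSImage *image = [[NSImage alloc] initWithSize:logicalSize];
    [image addRepresentation:rep];
    [rep release]; // MRC; remove this line (and autorelease below) under ARC
    return [image autorelease];
}
```

Because the NSBitmapImageRep is built directly from the CGImage, pixelsWide and pixelsHigh will match CGImageGetWidth()/CGImageGetHeight() regardless of which screen the capture came from.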

