I am implementing an application that does real-time image processing on live camera frames. It worked, with restrictions, using the now-deprecated android.hardware.Camera; to increase flexibility and performance I would like to move to the new android.hardware.camera2 API. However, I am having trouble getting the raw image data. This is on a Samsung Galaxy S5. (Unfortunately, I don't have another Lollipop device at hand to test on other hardware.)
I have the general structure in place (with inspiration from the "HdrViewfinder" and "Camera2Basic" samples), and a live image is drawn on the screen via SurfaceTexture and a GLSurfaceView. However, I also need to access the image data (grayscale is fine, at least for now) for custom image processing. According to the documentation of StreamConfigurationMap.isOutputSupportedFor(Class), the recommended surface for obtaining image data directly is ImageReader (correct?).
So, I configured my capture requests as:
mSurfaceTexture.setDefaultBufferSize(640, 480);
mSurface = new Surface(mSurfaceTexture);
...
mImageReader = ImageReader.newInstance(640, 480, format, 2);
...
List<Surface> surfaces = new ArrayList<Surface>();
surfaces.add(mSurface);
surfaces.add(mImageReader.getSurface());
...
mCameraDevice.createCaptureSession(surfaces, mCameraSessionListener, mCameraHandler);
and in the onImageAvailable callback for ImageReader, I access the data as follows:
Image img = reader.acquireLatestImage();
ByteBuffer grayscalePixelsDirectByteBuffer = img.getPlanes()[0].getBuffer();
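In the callback I then copy the buffer contents into a plain byte[] for processing and close the Image afterwards (with maxImages = 2, a forgotten close() would stall the reader). A plain-Java sketch of just that copy step, with the Android Image plane stood in for by a raw ByteBuffer so it runs anywhere:

```java
import java.nio.ByteBuffer;

public class BufferCopyDemo {

    // Copy the remaining bytes of a (direct) buffer into a tight array,
    // using a duplicate so the source buffer's position is left untouched.
    static byte[] copyPlane(ByteBuffer buffer) {
        ByteBuffer dup = buffer.duplicate(); // independent position/limit
        byte[] out = new byte[dup.remaining()];
        dup.get(out);
        return out;
    }

    public static void main(String[] args) {
        // Stand-in for the 640x480 Y plane (direct, like Image planes are).
        ByteBuffer plane = ByteBuffer.allocateDirect(640 * 480);
        for (int i = 0; i < plane.capacity(); i++) {
            plane.put(i, (byte) (i % 251));
        }
        byte[] gray = copyPlane(plane);
        System.out.println(gray.length);       // 307200
        System.out.println(gray[1000] & 0xFF); // 247 (= 1000 % 251)
    }
}
```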
... but while viewing the live image works (as said), something is wrong with the data I get here (or with the way I get it). According to
mCameraInfo.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputFormats();
... the following ImageFormats should be supported: NV21, JPEG, YV12, YUV_420_888. I tried all of them (plugged in as the "format" above); all of them support the specified resolution according to getOutputSizes(format), but none of them gives the desired result:
- NV21: ImageReader.newInstance throws java.lang.IllegalArgumentException: NV21 format is not supported
- JPEG: This actually works, but it seems to make little sense for a real-time application to go through JPEG encoding and decoding for every frame ...
- YV12 and YUV_420_888: this is the strangest result: I do get a grayscale image, but it is flipped vertically (yes, flipped, not rotated!) and significantly squashed (scaled considerably in the horizontal direction, but not vertically).
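One hypothesis I started checking: if the plane's row stride were larger than the image width, naively interpreting the buffer as width x height bytes would produce exactly this kind of horizontal distortion. A small arithmetic sketch of the buffer sizes I would expect in that case (the 768 stride is a made-up example; the real value would come from Image.Plane.getRowStride() on the device):

```java
public class PlaneSizeCheck {

    // Tight size of a grayscale (Y) plane: one byte per pixel, no padding.
    static int tightSize(int width, int height) {
        return width * height;
    }

    // Size of a padded plane where each row occupies rowStride bytes;
    // the last row need not be padded out to the full stride.
    static int paddedSize(int rowStride, int width, int height) {
        return rowStride * (height - 1) + width;
    }

    public static void main(String[] args) {
        int width = 640, height = 480;
        int rowStride = 768; // hypothetical; query Plane.getRowStride() on device
        System.out.println(tightSize(width, height));             // 307200
        System.out.println(paddedSize(rowStride, width, height)); // 368512
    }
}
```

If the buffer I receive matched the padded size rather than the tight one, that would at least explain the horizontal scaling (though not the vertical flip).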
What am I missing here? What makes the image flip and squash? How can I get a geometrically correct grayscale buffer? Do I need to use a different type of surface (instead of ImageReader)?
Any hints appreciated.