I am writing a small Android application where the user can position an image over a camera preview and take a picture. The application then combines the two images; all of this works fine.
I understand that you can get/set the preview size via Camera.getParameters(), and I assume this relates to the size of the live camera feed.
However, my SurfaceView, where the camera preview is displayed, has a different size than any of the supported (and used) preview sizes. For example, in the emulator my available SurfaceView is 360x215, while the preview size is 320x240. Yet the entire SurfaceView is filled with the preview.
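To make the mismatch concrete, here is a minimal sketch of the kind of selection I expected to need: picking, from the sizes that getSupportedPreviewSizes() reports, the one whose aspect ratio is closest to my SurfaceView's. The size list and the `bestPreviewSize` helper are my own illustration (plain int pairs instead of Camera.Size, so the logic runs outside Android); the listed sizes are just example values, not what any real device reports.

```java
import java.util.Arrays;
import java.util.List;

public class PreviewSizePicker {
    // Each entry is {width, height}. Returns the supported size whose
    // aspect ratio is closest to the view's aspect ratio.
    static int[] bestPreviewSize(List<int[]> supported, int viewW, int viewH) {
        double target = (double) viewW / viewH;
        int[] best = supported.get(0);
        double bestDiff = Double.MAX_VALUE;
        for (int[] s : supported) {
            double diff = Math.abs((double) s[0] / s[1] - target);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = s;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Illustrative supported sizes; a real device reports its own list.
        List<int[]> sizes = Arrays.asList(
            new int[]{320, 240}, new int[]{640, 480}, new int[]{800, 480});
        int[] chosen = bestPreviewSize(sizes, 360, 215);
        // 360/215 ~ 1.674, so 800x480 (~1.667) is the closest ratio here.
        System.out.println(chosen[0] + "x" + chosen[1]);
    }
}
```

In my case, though, 320x240 is apparently chosen even though its 4:3 ratio does not match the 360x215 view, which is exactly what prompts my question.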
But the image that is generated at the end is (also?) 320x240. How does Android compensate for these differences in size and aspect ratio? Is the image cropped?
Or do I just misunderstand what PreviewSize is: does it determine the size of the generated images, or does it relate to the live preview that is projected onto the SurfaceView? Are there any non-trivial camera examples that cover this?
I need to know how this conversion happens so that I can ultimately copy/scale the overlay image onto the photo correctly; hence these questions.
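For reference, this is the coordinate mapping I would do if the answer turns out to be "the preview is simply stretched to fill the view" (my assumption, not confirmed behaviour): each axis scales independently from view space to picture space, so an overlay point keeps its relative position. The `viewToPicture` helper is hypothetical, just to show the arithmetic.

```java
public class CoordMapper {
    // Maps a point from SurfaceView coordinates to picture coordinates,
    // assuming the preview is stretched to fill the view, so the x and y
    // axes scale independently (aspect ratio is not preserved).
    static int[] viewToPicture(int x, int y,
                               int viewW, int viewH,
                               int picW, int picH) {
        return new int[]{
            Math.round((float) x * picW / viewW),
            Math.round((float) y * picH / viewH)
        };
    }

    public static void main(String[] args) {
        // Centre of a 360x215 view maps to the centre of a 320x240 picture.
        int[] p = viewToPicture(180, 107, 360, 215, 320, 240);
        System.out.println(p[0] + "," + p[1]);
    }
}
```

If Android instead crops or letterboxes the preview, the mapping would need an offset and a single uniform scale factor, which is exactly why I am asking what actually happens.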
Tags: android, image, camera, preview, surfaceview
Ivo van der Wijk