How to access image pixel data in React Native

I have an image downloaded from the camera roll or obtained from some other source, usually stored locally.

How can I access its pixel data to perform some calculations or measurements?

javascript react-native
1 answer

There is a way to get this information using native modules, but at the moment I only have an Android implementation. Tested with RN 0.42.3. First of all, you need to create a native module in your application. Assuming the application is initialized with the name SampleApp, create a new directory android/app/src/main/java/com/sampleapp/bitmap in the React Native project with two files in it:

android/app/src/main/java/com/sampleapp/bitmap/BitmapReactPackage.java

    package com.sampleapp.bitmap;

    import com.facebook.react.ReactPackage;
    import com.facebook.react.bridge.JavaScriptModule;
    import com.facebook.react.bridge.NativeModule;
    import com.facebook.react.bridge.ReactApplicationContext;
    import com.facebook.react.uimanager.ViewManager;

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    public class BitmapReactPackage implements ReactPackage {

      @Override
      public List<Class<? extends JavaScriptModule>> createJSModules() {
        return Collections.emptyList();
      }

      @Override
      public List<ViewManager> createViewManagers(ReactApplicationContext reactContext) {
        return Collections.emptyList();
      }

      @Override
      public List<NativeModule> createNativeModules(ReactApplicationContext reactContext) {
        List<NativeModule> modules = new ArrayList<>();
        modules.add(new BitmapModule(reactContext));
        return modules;
      }
    }

android/app/src/main/java/com/sampleapp/bitmap/BitmapModule.java

    package com.sampleapp.bitmap;

    import com.facebook.react.bridge.Promise;
    import com.facebook.react.bridge.ReactApplicationContext;
    import com.facebook.react.bridge.ReactContextBaseJavaModule;
    import com.facebook.react.bridge.ReactMethod;
    import com.facebook.react.bridge.WritableNativeArray;
    import com.facebook.react.bridge.WritableNativeMap;

    import android.graphics.Bitmap;
    import android.graphics.BitmapFactory;

    public class BitmapModule extends ReactContextBaseJavaModule {

      public BitmapModule(ReactApplicationContext reactContext) {
        super(reactContext);
      }

      @Override
      public String getName() {
        return "Bitmap";
      }

      @ReactMethod
      public void getPixels(String filePath, final Promise promise) {
        try {
          WritableNativeMap result = new WritableNativeMap();
          WritableNativeArray pixels = new WritableNativeArray();

          Bitmap bitmap = BitmapFactory.decodeFile(filePath);
          if (bitmap == null) {
            promise.reject("Failed to decode. Path is incorrect or image is corrupted");
            return;
          }

          int width = bitmap.getWidth();
          int height = bitmap.getHeight();
          boolean hasAlpha = bitmap.hasAlpha();

          // Read pixels row by row, so the pixel at (x, y) ends up at index y * width + x.
          for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
              int color = bitmap.getPixel(x, y);
              // Zero-pad to a fixed 8-character AARRGGBB hex string.
              pixels.pushString(String.format("%08x", color));
            }
          }

          result.putInt("width", width);
          result.putInt("height", height);
          result.putBoolean("hasAlpha", hasAlpha);
          result.putArray("pixels", pixels);

          promise.resolve(result);
        } catch (Exception e) {
          promise.reject(e);
        }
      }
    }

As you can see in the second file, there is a getPixels method that will be available from JS as part of the native Bitmap module. It takes the path to an image file and decodes it into an Android Bitmap, which allows the pixels to be read. All pixels are read one after another and stored in an array as hexadecimal strings (the bridge serializes everything to JSON, so raw binary data cannot be passed directly). Each string has 8 characters, 2 per ARGB channel: the first two are the alpha value, the next two the red channel, the next two the green channel, and the last two the blue channel. For example, ffffffff is white and ff0000ff is blue. Pixels are stored row by row, so the pixel at (x, y) is at index y * width + x. For convenience, the image width, height and a hasAlpha flag are returned along with the pixel array. The method returns a promise that resolves with an object like:

    {
      width: 1200,
      height: 800,
      hasAlpha: false,
      pixels: ['ffffffff', 'ff00ffff', 'ffff00ff', ...]
    }
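
If you need the individual channel values rather than the raw hex strings, a small helper on the JS side can decode them. This is a minimal sketch; the parseArgbHex name is my own illustration, not part of the module:

    // Decode an 8-character AARRGGBB hex string into numeric channel values.
    function parseArgbHex(hex) {
      const value = parseInt(hex, 16);
      return {
        a: (value >>> 24) & 0xff,
        r: (value >>> 16) & 0xff,
        g: (value >>> 8) & 0xff,
        b: value & 0xff,
      };
    }

    // Example: parseArgbHex('ff0000ff') -> { a: 255, r: 0, g: 0, b: 255 }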

The native module must also be registered in the application. Edit android/app/src/main/java/com/sampleapp/MainApplication.java and add the new package there (it also needs import com.sampleapp.bitmap.BitmapReactPackage; at the top of the file):

    @Override
    protected List<ReactPackage> getPackages() {
      return Arrays.<ReactPackage>asList(
          new MainReactPackage(),
          new BitmapReactPackage() // <--- register the Bitmap module
      );
    }

How to use it from JS:

    import { NativeModules } from 'react-native';

    const imagePath = '/storage/emulated/0/Pictures/blob.png';

    NativeModules.Bitmap.getPixels(imagePath)
      .then((image) => {
        console.log(image.width);
        console.log(image.height);
        console.log(image.hasAlpha);

        for (let y = 0; y < image.height; y++) {
          for (let x = 0; x < image.width; x++) {
            // Pixels are stored row by row: the pixel at (x, y) is at y * width + x.
            const offset = image.width * y + x;
            const pixel = image.pixels[offset]; // e.g. 'ff0000ff'
          }
        }
      })
      .catch((err) => {
        console.error(err);
      });
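
Since the question is about performing calculations or measurements on the pixel data, here is a short sketch building on the snippet above (it reuses NativeModules and imagePath) that computes a rough average brightness from the returned pixels. The averageBrightness helper is my own illustration, not part of the module:

    // Rough average brightness (0-255) computed from the ARGB hex strings.
    function averageBrightness(image) {
      let total = 0;
      for (const hex of image.pixels) {
        const value = parseInt(hex, 16);
        const r = (value >>> 16) & 0xff;
        const g = (value >>> 8) & 0xff;
        const b = value & 0xff;
        // Simple luminance approximation.
        total += 0.299 * r + 0.587 * g + 0.114 * b;
      }
      return total / image.pixels.length;
    }

    NativeModules.Bitmap.getPixels(imagePath)
      .then((image) => console.log('average brightness:', averageBrightness(image)))
      .catch((err) => console.error(err));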

Note that it works quite slowly, most likely due to the transfer of a huge array across the bridge.
