I am trying to get images from my test device's camera roll and render them as thumbnails. I successfully fetched the images from the camera roll and displayed them in a list of Image components, but they take a very long time to load. In addition, I read in the React Native docs that the Image component will select the correct image size for the space in which it will be displayed.
This is from the docs:

iOS saves multiple sizes for the same image in your Camera Roll, and it is very important to pick the one that's as close as possible for performance reasons. You wouldn't want to use the full-quality 3264x2448 image as a source when displaying a 200x200 thumbnail. If there's an exact match, React Native will pick it; otherwise it's going to use the first one that's at least 50% bigger to avoid blur when resizing from a close size. All of this is done by default, so you don't have to worry about writing the tedious (and error-prone) code to do it yourself. https://facebook.imtqy.com/react-native/docs/image.html#best-camera-roll-image
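To make sure I'm reading that right, here is a minimal sketch of how I understand it (the uri is just a placeholder, and I'm assuming the selection is driven by the dimensions the image is rendered at):

import React from 'react';
import { Image } from 'react-native';

// Placeholder camera roll uri, not a real asset id.
var exampleUri = 'assets-library://asset/asset.JPG?id=...&ext=JPG';

// My understanding of the quoted docs: even if the underlying asset is
// 3264x2448, rendering it into a 200x200 box should make React Native
// pick the closest stored size instead of decoding the full image.
var Thumbnail = () => (
  <Image
    source={{ uri: exampleUri }}
    style={{ width: 200, height: 200 }}
    resizeMode="cover"
  />
);

If that's right, my 93.5-point thumbnails below should never need the full-resolution assets, which is why the slow loading confuses me.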
The code I use to read images is very simple.
CameraRoll.getPhotos(
  { first: 21, assetType: 'Photos' },
  (data) => {
    console.log(data);
    // Map each camera roll asset to a plain { uri } object for the list.
    var images = data.edges.map((asset) => {
      return { uri: asset.node.image.uri };
    });
    this.setState({ images: this.state.images.cloneWithRows(images) });
  },
  () => {
    this.setState({ retrievePhotoError: messages.errors.retrievePhotos });
  }
);
And then, to render them, I have these functions.
renderImage(image) {
  return (
    <Image
      resizeMode="cover"
      source={{ uri: image.uri }}
      style={[
        {
          height: imageDimensions, // imageDimensions == 93.5
          width: imageDimensions
        },
        componentStyles.thumbnails
      ]}
    />
  );
},

render() {
  return (
    <ListView
      automaticallyAdjustContentInsets={false}
      contentContainerStyle={componentStyles.row}
      dataSource={this.state.images}
      renderRow={this.renderImage}
    />
  );
}
What am I missing here? I'm going crazy!