Java memory-mapping large files

Java's MappedByteBuffer is limited to 2 GB (Integer.MAX_VALUE bytes), which makes it harder to use for displaying large files. The usual recommendation is to use an array of MappedByteBuffers and index into it:

    private static final long PAGE_SIZE = Integer.MAX_VALUE;
    private MappedByteBuffer[] buffers;

    // Which mapped buffer holds the byte at this absolute offset.
    private int getPage(long offset) {
        return (int) (offset / PAGE_SIZE);
    }

    // Position of that byte within its buffer.
    private int getIndex(long offset) {
        return (int) (offset % PAGE_SIZE);
    }

    public byte get(long offset) {
        return buffers[getPage(offset)].get(getIndex(offset));
    }
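For completeness, here is one hedged sketch of how such an array might be populated; the method name and the read-only mode are illustrative assumptions, not from the original post:

    import java.io.IOException;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;

    // Sketch: map a file into PAGE_SIZE-sized read-only chunks.
    static MappedByteBuffer[] mapPages(FileChannel channel) throws IOException {
        long size = channel.size();
        int pages = (int) ((size + PAGE_SIZE - 1) / PAGE_SIZE);
        MappedByteBuffer[] buffers = new MappedByteBuffer[pages];
        for (int i = 0; i < pages; i++) {
            long start = i * PAGE_SIZE;
            // Each individual mapping is at most Integer.MAX_VALUE bytes long.
            long length = Math.min(PAGE_SIZE, size - start);
            buffers[i] = channel.map(FileChannel.MapMode.READ_ONLY, start, length);
        }
        return buffers;
    }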

This may work for single bytes, but it requires rewriting a lot of code if you want to handle larger read/write operations that cross buffer boundaries (getLong() or get(byte[])).
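To illustrate what the boundary handling looks like, here is a hedged sketch of a boundary-safe getLong(long) built on the PAGE_SIZE, getPage, getIndex, and get(long) members from the snippet above; it is hypothetical, not from the original post, and assumes the default big-endian byte order of ByteBuffer:

    // Sketch: read eight bytes starting at an absolute offset, even when
    // they straddle two mapped buffers.
    public long getLong(long offset) {
        if (getIndex(offset) <= PAGE_SIZE - Long.BYTES) {
            // Fast path: all eight bytes sit inside one buffer.
            return buffers[getPage(offset)].getLong(getIndex(offset));
        }
        // Slow path: assemble the value byte by byte across the border,
        // most significant byte first (big-endian).
        long result = 0;
        for (int i = 0; i < Long.BYTES; i++) {
            result = (result << 8) | (get(offset + i) & 0xFF);
        }
        return result;
    }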

Question: what is your best practice for such scenarios? Do you know of any working solution or code that can be reused without reinventing the wheel?

+4
1 answer

Have you checked dsiutils' ByteBufferInputStream?

Javadoc

The main usefulness of this class is to create input streams that are really based on a MappedByteBuffer.

In particular, the factory method map(FileChannel, FileChannel.MapMode) maps an entire file into an array of ByteBuffers and exposes that array as a single ByteBufferInputStream. This makes it easy to access files larger than 2 GB. The stream also exposes long-based positioning methods; a short usage sketch follows the list:

  • long length()
  • long position()
  • void position(long newPosition)
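A minimal usage sketch, assuming dsiutils is on the classpath and that the class lives in the it.unimi.dsi.io package as in current dsiutils releases; the file name is hypothetical:

    import it.unimi.dsi.io.ByteBufferInputStream;

    import java.nio.channels.FileChannel;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;

    public class MapLargeFile {
        public static void main(String[] args) throws Exception {
            try (FileChannel channel = FileChannel.open(Paths.get("huge.bin"),
                    StandardOpenOption.READ)) {
                // Maps the whole file as an array of ByteBuffers behind one stream.
                ByteBufferInputStream in =
                        ByteBufferInputStream.map(channel, FileChannel.MapMode.READ_ONLY);
                in.position(3_000_000_000L); // seek beyond the 2 GB mark
                int firstByte = in.read();   // read a single byte there
                System.out.println("length=" + in.length() + " byte=" + firstByte);
            }
        }
    }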

Is that what you were thinking of? It is LGPL-licensed, too.

+5
