No, if your file is too large to fit in memory, it is too large to fit in memory.
A better approach is to process the file as a stream, rather than loading all of it into memory. Without knowing what processing you are trying to do, we can't say whether that is actually possible in your case.
For example, if you are just trying to compute a secure hash of the file, you can do that without ever holding more than a small buffer of data in memory at a time; but if your processing needs random access to the data, you may need a RandomAccessFile instead. Rough sketches of both are shown below.
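Here is a minimal sketch of the streaming-hash idea, assuming a hypothetical file name "big-file.bin" and SHA-256 as the digest; only one small buffer is ever held in memory, regardless of the file's size:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class StreamingHash {
    public static void main(String[] args) throws IOException, NoSuchAlgorithmException {
        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        // Read the file in small chunks; the whole file is never in memory at once.
        try (InputStream in = Files.newInputStream(Paths.get("big-file.bin"))) { // hypothetical file
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                digest.update(buffer, 0, read);
            }
        }
        // Render the finished digest as a hex string.
        StringBuilder hex = new StringBuilder();
        for (byte b : digest.digest()) {
            hex.append(String.format("%02x", b));
        }
        System.out.println(hex);
    }
}
```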
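And if the processing really does need random access, a RandomAccessFile lets you seek to an arbitrary offset and read only the bytes you need. Again a sketch, with the same hypothetical file name and an arbitrary example offset:

```java
import java.io.IOException;
import java.io.RandomAccessFile;

public class RandomAccessExample {
    public static void main(String[] args) throws IOException {
        // Open the (hypothetical) large file read-only and jump to an offset
        // instead of reading everything that precedes it.
        try (RandomAccessFile raf = new RandomAccessFile("big-file.bin", "r")) {
            long offset = raf.length() / 2;   // e.g. start of the second half
            raf.seek(offset);
            byte[] chunk = new byte[4096];
            int read = raf.read(chunk);
            System.out.println("Read " + read + " bytes starting at offset " + offset);
        }
    }
}
```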