The fastest way to map a large number of longs

I am writing a Java application that converts numbers (longs) into a small set of result objects. This mapping is critical to the application's performance, because it is performed very often.

public static Object computeResult(long input) {
    Object result;
    // ... calculate
    return result;
}

There are about 150 million different keys and about 3,000 distinct values. My algorithm can compute the conversion from the input number (long) to the output (an immutable object) at about 4,000,000 conversions per second (using 4 threads).

I would like to cache the mapping for all 150M possible inputs to make the translation even faster, but I ran into some difficulties creating such a cache:

public class Cache {
    private static long[] sortedInputs; // 150M entries, sorted ascending
    private static Object[] results;    // 150M entries, results[i] belongs to sortedInputs[i]

    public static Object lookupCachedResult(long input) {
        int index = Arrays.binarySearch(sortedInputs, input);
        // assumes every possible input is present; a negative index
        // (key not found) would need handling otherwise
        return results[index];
    }
}

Storing 150 million keys as a sorted long[] takes about 1.2 GB. Since there are only about 3,000 distinct values, the results array consists almost entirely of repeated references to the same objects.

Memory is not even the main obstacle; fitting the arrays is doable. The real issue turned out to be lookup speed.

When I benchmarked it, the cached lookup via binary search was actually slower than computing the result directly: about 1.5 million lookups per second (using 4 threads).

Is there a faster way to do this lookup?

To be clear: the cache only makes sense if it beats the current 4,000,000 conversions per second; otherwise it is pointless.

+4

First of all, don't reach for a HashMap: boxing every long into a Long and chasing object references costs a lot, and even with the JIT warmed up you can do better than a generic HashMap here:

Look for a modulus n > 3000 (a prime, say) and compute index = key % n. Since there are only 3,000 distinct values, if you can find an n for which all keys sharing the same remainder also share the same value, the lookup degenerates into a single array access, i.e. O(1).
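A minimal sketch of that idea, assuming you can enumerate every key together with the index of its value (the names buildTable, valueIndex and the search bound are made up for illustration):

```java
import java.util.Arrays;

public class ModulusTable {
    // Tries to find a modulus n such that key % n uniquely determines the
    // value index; returns the lookup table, or null if no such n exists
    // in the searched range. Once built, lookups are a single array access.
    static short[] buildTable(long[] keys, short[] valueIndex, int maxN) {
        for (int n = 3001; n <= maxN; n++) {
            short[] table = new short[n];
            Arrays.fill(table, (short) -1); // -1 marks an unused slot
            boolean ok = true;
            for (int i = 0; i < keys.length && ok; i++) {
                int slot = (int) Long.remainderUnsigned(keys[i], n);
                if (table[slot] == -1) {
                    table[slot] = valueIndex[i];
                } else if (table[slot] != valueIndex[i]) {
                    ok = false; // two keys with different values collide: try next n
                }
            }
            if (ok) return table;
        }
        return null;
    }

    static short lookup(short[] table, long key) {
        return table[(int) Long.remainderUnsigned(key, table.length)];
    }
}
```

Whether such an n exists at all depends entirely on the structure of your key-to-value mapping; the scan above is a brute-force check, not a guarantee.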

Secondly, Java is not the best tool for heavy numeric work. Libraries such as LAPACK and BLAS are heavily optimized and written in C/Fortran; if the computation itself is the bottleneck, consider calling into them (e.g. via JNI).

+1

You have 150,000,000 keys mapped to only 3,000 values.

The values repeat massively, so make sure equal values are shared rather than duplicated (Guava's Interner does exactly that).
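Sharing the values also shrinks the cache itself: instead of an Object[] with 150M references, you can keep a 2-byte index per key into a small table of the ~3,000 distinct objects. A sketch of that layout (class and field names are made up for illustration):

```java
import java.util.Arrays;

public class IndexedCache {
    private final long[] sortedKeys;   // 150M keys, sorted: ~1.2 GB
    private final short[] valueIndex;  // 2 bytes per key instead of a 4-8 byte reference
    private final Object[] values;     // only the ~3,000 distinct objects

    IndexedCache(long[] sortedKeys, short[] valueIndex, Object[] values) {
        this.sortedKeys = sortedKeys;
        this.valueIndex = valueIndex;
        this.values = values;
    }

    Object lookup(long key) {
        int i = Arrays.binarySearch(sortedKeys, key);
        if (i < 0) throw new IllegalArgumentException("unknown key: " + key);
        return values[valueIndex[i]];
    }
}
```

At 150M entries this cuts the value side from roughly 600-1200 MB of references down to about 300 MB of shorts, though it does not make the binary search itself any faster.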

A HashMap or TreeMap with 150 million boxed entries will almost certainly end in OutOfMemoryError.

Forget about TreeMap here: it adds a node object and pointer chasing for every entry on top of the boxing.

If the data has no structure you can exploit, build a hash table keyed by primitive longs; search for a primitive-collections library rather than writing one from scratch. A good hash table gives O(1) lookups (well, amortized, assuming the hash spreads the keys?) at a cost of roughly (*) 20 ns each. Binary search needs log2(150e6), i.e. about 27 comparisons, and on a 1.2 GB array almost every one is a cache miss, which is why it loses to direct computation. With a hash table you mostly pay for one or two memory accesses per lookup.

If you size the table as a power of two (which makes the modulo a cheap mask), 1 << 28 = 268,435,456 slots would hold the 150M keys at a load factor of about 0.56, keeping probe sequences short.


(*) a ballpark figure; the real cost depends on your hardware and how often you miss the cache.
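A minimal open-addressing sketch along those lines (power-of-two capacity, linear probing; the 64-bit mixing function is a common finalizer-style hash step, not anything from the original answer, and Long.MIN_VALUE is assumed never to occur as a key):

```java
public class LongHashTable {
    private static final long EMPTY = Long.MIN_VALUE; // assumed unused as a key

    private final long[] keys;   // EMPTY marks a free slot
    private final short[] vals;  // index of the value object for each key
    private final int mask;      // capacity - 1, capacity is a power of two

    LongHashTable(int capacityPow2) {
        keys = new long[capacityPow2];
        vals = new short[capacityPow2];
        java.util.Arrays.fill(keys, EMPTY);
        mask = capacityPow2 - 1;
    }

    // Cheap 64-bit mixer so that nearby keys spread across the table.
    private static int mix(long k) {
        k ^= k >>> 33;
        k *= 0xff51afd7ed558ccdL;
        k ^= k >>> 33;
        return (int) k;
    }

    void put(long key, short val) {
        int i = mix(key) & mask;
        while (keys[i] != EMPTY && keys[i] != key) i = (i + 1) & mask; // linear probing
        keys[i] = key;
        vals[i] = val;
    }

    short get(long key) {
        int i = mix(key) & mask;
        while (keys[i] != key) {
            if (keys[i] == EMPTY) return -1; // key not present
            i = (i + 1) & mask;
        }
        return vals[i];
    }
}
```

With a load factor around 0.56, linear probing averages well under two probes per hit, so most lookups touch one or two cache lines instead of the ~27 of a binary search.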

+1
