Store a large number of geolocation records in a cached ArrayList, or always query them from MongoDB?

I am working on a geolocation application. It has about 500K entries in MongoDB, correctly indexed, and each record has its own latitude and longitude values. The client needs to retrieve the 200 nearest points from those 500K rows, and I am worried about performance. At first I thought of storing all the entries (lat/lng info) in a cache manager or an in-memory database, so that an incoming point (lat/lng) could be compared against the cached values. Now I am having doubts about that.

Would it be a good idea to keep all these records in an ArrayList in the cache manager, and then compare the records' geolocations with the incoming geolocation to calculate the distances?

With this approach I would avoid a huge number of queries to MongoDB. On the other hand, storing about 500K records (geolocations) in an ArrayList and then scanning that list to find the 200 closest points may not be a good idea either; if I'm not mistaken, that is also a performance penalty.
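Here is roughly what I have in mind; this is only a sketch, and the GeoPoint record, the field names, and the haversine helper are illustrative, not my actual code:

```java
import java.util.Comparator;
import java.util.List;

public class BruteForceNearest {

    // Illustrative record type for one cached entry.
    record GeoPoint(String id, double lat, double lng) {}

    // Haversine distance in kilometers between two lat/lng pairs.
    static double distanceKm(double lat1, double lng1, double lat2, double lng2) {
        double earthRadiusKm = 6371.0;
        double dLat = Math.toRadians(lat2 - lat1);
        double dLng = Math.toRadians(lng2 - lng1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                   * Math.sin(dLng / 2) * Math.sin(dLng / 2);
        return 2 * earthRadiusKm * Math.asin(Math.sqrt(a));
    }

    // Scans the whole cached list and returns the k closest points;
    // every call touches all ~500K entries.
    static List<GeoPoint> nearest(List<GeoPoint> cache, double lat, double lng, int k) {
        return cache.stream()
                .sorted(Comparator.comparingDouble(
                        (GeoPoint p) -> distanceKm(lat, lng, p.lat(), p.lng())))
                .limit(k)
                .toList();
    }
}
```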

How can I solve this problem?

Thanks in advance.

+4
2 answers

No. Holding 500k entries in an ArrayList and scanning it for the 200 nearest points means computing the distance to all 500k records on every single request. That is expensive. Don't reinvent this in application code; let MongoDB do it.

As noted, MongoDB has geospatial indexing built for exactly this kind of query: create the index and let the database find the nearest points. Such spatial indexes are typically backed by a structure like an R-Tree. An R-Tree lookup costs on the order of log n rather than n, which is a significant difference with 500K records. Use the index instead of scanning everything yourself.
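For example, with a 2dsphere index and the sync Java driver it looks roughly like this. The database, collection, and field names are placeholders, and it assumes the location is stored as a GeoJSON Point rather than as separate lat/lng fields:

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;
import com.mongodb.client.model.Indexes;
import com.mongodb.client.model.geojson.Point;
import com.mongodb.client.model.geojson.Position;
import org.bson.Document;

public class NearestWithIndex {
    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            // Placeholder database/collection names.
            MongoCollection<Document> places =
                    client.getDatabase("geo").getCollection("places");

            // One-time setup: 2dsphere index on the GeoJSON "location" field.
            places.createIndex(Indexes.geo2dsphere("location"));

            // Ask MongoDB for the 200 documents nearest to the given point.
            // $near returns results sorted by distance; maxDistance is in meters.
            Point origin = new Point(new Position(-73.9667, 40.78)); // longitude, latitude
            for (Document doc : places
                    .find(Filters.near("location", origin, 5000.0, 0.0))
                    .limit(200)) {
                System.out.println(doc.toJson());
            }
        }
    }
}
```

If the records currently store latitude and longitude as separate fields, they can be combined into a GeoJSON Point (or a legacy coordinate pair with a 2d index) so the index can be built on them; either way the 200-nearest query stays inside the database.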

+4

