List vs. Dictionary (maximum size, number of elements)

I am trying to determine the maximum size (in RAM) of a List and a Dictionary. I am also curious about the maximum number of elements/records each can hold, and the memory footprint per record.

My reasons are simple: like most programmers, I am a bit lazy (which is a virtue). When I write a program, I like to write it once and future-proof it as much as possible. I am currently writing a program that uses lists, but I noticed that the indexer takes an integer. Since my program's capabilities are limited only by available memory and coding style, I would like to write it to use Int64s or possibly BigIntegers as indices. I saw IEnumerable as one option here, but I would like to know whether I can simply use Int64 as the key type of a Dictionary instead of rewriting everything. If I can, I would like to know how that cost compares to rewriting it.
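For what it's worth, a `Dictionary` with `Int64` keys compiles and works today; here is a minimal sketch (the variable names and values are mine, not from the question):

```csharp
using System;
using System.Collections.Generic;

class Program
{
    static void Main()
    {
        // Dictionary<TKey, TValue> accepts any key type that provides
        // GetHashCode/Equals, so Int64 (long) keys work out of the box.
        var records = new Dictionary<long, string>();

        // A key well beyond Int32.MaxValue is perfectly legal...
        records[3_000_000_000L] = "beyond Int32.MaxValue";

        Console.WriteLine(records[3_000_000_000L]);
        // ...but note the *count* of entries is still bounded internally,
        // even though the key range itself is 64-bit.
    }
}
```

The 64-bit key range does not lift the limit on how many entries the dictionary can actually hold, which is what the answers below address.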

My hope is that if my program proves useful, I will only need to recompile it in five years to take advantage of the increased memory.

+3
c# list dictionary
3 answers

Is it specified in the documentation for the class? No? Then it is unspecified.

As to the current implementations, the classes themselves have no maximum size in RAM; if you create a value type that is 2 MB in size, push a few thousand of them into a list, and get an out-of-memory exception, that has nothing to do with List&lt;T&gt;.

Internally, a List&lt;T&gt;'s workings would prevent it from having more than 2 billion items. It is harder to give a quick answer for Dictionary&lt;TKey, TValue&gt;, since the way items are positioned within it is more complicated, but really, if I were looking at handling a billion items (a 32-bit value, for example, would then take 4 GB), I would be storing them in a database and retrieving them with data-access code.
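The 2-billion ceiling is visible in the public API itself; a minimal sketch (the commented-out line is illustrative only):

```csharp
using System;
using System.Collections.Generic;

class Program
{
    static void Main()
    {
        var list = new List<byte> { 1, 2, 3 };

        // Both Count and the indexer parameter are typed as Int32, so a
        // List<T> can never address more than int.MaxValue
        // (2,147,483,647) items, no matter how much RAM is available.
        int count = list.Count;   // Int32, not Int64
        byte first = list[0];     // indexer parameter is Int32
        // byte bad = list[3_000_000_000L]; // would not compile: no long indexer

        Console.WriteLine($"{count} {first}");
    }
}
```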

At the very least, once you are dealing with a single data structure that is 4 GB in size, rolling your own custom collection class no longer counts as reinventing the wheel.

+6

I use a ConcurrentDictionary to rank 3x3 patterns in half a million games. Obviously there are many possible patterns. With C# 4.0, the ConcurrentDictionary runs out of memory at around 120 million objects. It is using 8 GB at that point (on a 32 GB machine) but wants to grow far beyond that (I believe ConcurrentDictionary allocates its internal storage in large chunks). Using a database would, I think, slow me down at least a hundredfold, and the process already takes 10 hours.

My solution was a multiphase approach that makes multiple passes, one for each subset of the patterns: for example, one pass for odd patterns and one for even patterns. Once holding more objects no longer fails, I can reduce the number of passes.

.NET 4.5 adds support for larger arrays in 64-bit processes, using 32-bit unsigned indices for arrays (the stated limit rises from 2 billion to 4 billion elements). See http://msdn.microsoft.com/en-us/library/hh285054(v=vs.110).aspx . I am not sure which collection types will take advantage of this; List&lt;T&gt; might.
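That feature is opt-in, enabled through the application's configuration file; a sketch of the relevant fragment (effective only for 64-bit processes on .NET 4.5 or later):

```xml
<configuration>
  <runtime>
    <!-- Allows arrays whose total size exceeds 2 GB on 64-bit targets.
         The maximum element count is still bounded: UInt32.MaxValue for
         byte arrays and about 2.1 billion for other element types. -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```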

+3

I think you will run into more serious problems long before the question of whether a Dictionary with an Int64 key is still enough in 5 or 10 years.

Having a List or Dictionary of 2e+10 elements ( Int32 range) in memory does not seem like a good idea, let alone 9e+18 elements ( Int64 range). The framework will never let you create a monster of that size anyway (not even close), and probably never will. (Keep in mind that a simple int[int.MaxValue] array already far exceeds the default per-object memory-allocation limit.)
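To put a number on that, here is a back-of-the-envelope sketch of why `int[int.MaxValue]` cannot fit under the default 2 GB per-object limit:

```csharp
using System;

class Program
{
    static void Main()
    {
        // int.MaxValue elements, 4 bytes each:
        long bytes = (long)int.MaxValue * sizeof(int);

        // 8,589,934,588 bytes, roughly 8 GB -- about four times the
        // default 2 GB per-object allocation limit.
        Console.WriteLine(bytes);
    }
}
```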

And the question remains: why would you want your application to hold that many elements in memory at all? If you need to manage that amount of information, you are better off with a specialized storage backend, i.e. a database.

+2
