Does the cost of iterating over a HashSet also depend on the capacity of the backing map?

From the HashSet JavaDoc:

This class offers constant time performance for the basic operations (add, remove, contains and size), assuming the hash function disperses the elements properly among the buckets. Iterating over this set requires time proportional to the sum of the HashSet instance's size (the number of elements) plus the "capacity" of the backing HashMap instance (the number of buckets). Thus, it's very important not to set the initial capacity too high (or the load factor too low) if iteration performance is important.

Why does iteration take time proportional to the sum (number of elements in the set + capacity of the backing map), and not just to the number of elements in the set itself?


Tags: java, hashtable, hashmap, algorithm, time-complexity
4 answers

A HashSet is implemented on top of a HashMap, with the set's elements stored as the map's keys. Since the map has a fixed number of buckets, each of which can hold zero or more entries, iteration has to check every bucket to see whether it contains entries or not.
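To make the mechanics concrete, here is a minimal sketch of how any iterator over a bucket array must behave: it steps past empty buckets one at a time, which is where the extra capacity term in the running time comes from. The BucketIterator class is hypothetical, not the actual JDK source (the real iterator additionally walks collision chains within each bucket).

```java
// A minimal sketch of iterating over a bucket array. Simplified to one
// element per bucket; the point is that every bucket is visited,
// occupied or not, making the total work O(capacity + size).
import java.util.Iterator;
import java.util.NoSuchElementException;

class BucketIterator<E> implements Iterator<E> {
    private final E[] buckets; // one slot per bucket; null means empty
    private int index = 0;

    BucketIterator(E[] buckets) {
        this.buckets = buckets;
    }

    @Override
    public boolean hasNext() {
        // Skipping empty buckets is where the "capacity" term comes from.
        while (index < buckets.length && buckets[index] == null) {
            index++;
        }
        return index < buckets.length;
    }

    @Override
    public E next() {
        if (!hasNext()) {
            throw new NoSuchElementException();
        }
        return buckets[index++];
    }
}
```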


Iteration over a LinkedHashSet follows a linked list of entries instead, so the number of empty buckets doesn't matter. Usually you will not have a HashSet whose capacity is much more than double its actual size. Even if you do, scanning a million entries, mostly null, does not take much time (milliseconds). A rough comparison is sketched below.
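The comparison uses a deliberately oversized initial capacity so the effect is visible. The IterationCost class is a throwaway demo, not a rigorous benchmark; without JIT warm-up the absolute numbers will vary, but the gap between the two loops should be apparent.

```java
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.Set;

public class IterationCost {
    public static void main(String[] args) {
        // Both sets hold the same 10 elements, but are created with a
        // deliberately oversized initial capacity of ~1,000,000 buckets.
        Set<Integer> hashSet = new HashSet<>(1_000_000);
        Set<Integer> linkedSet = new LinkedHashSet<>(1_000_000);
        for (int i = 0; i < 10; i++) {
            hashSet.add(i);
            linkedSet.add(i);
        }

        long t0 = System.nanoTime();
        for (int x : hashSet) { /* scans ~1,000,000 mostly-empty buckets */ }
        long t1 = System.nanoTime();
        for (int x : linkedSet) { /* follows the 10-entry linked list */ }
        long t2 = System.nanoTime();

        System.out.printf("HashSet iteration:       %,d ns%n", t1 - t0);
        System.out.printf("LinkedHashSet iteration: %,d ns%n", t2 - t1);
    }
}
```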


Why does iteration take time proportional to the sum (number of elements in the set + capacity of the backing map), and not just to the number of elements in the set itself?

The elements are distributed among the buckets of the underlying HashMap, which is backed by an array. The map does not know which buckets are occupied (only how many elements it holds in total), so to iterate over all the elements it has to check every bucket. This is also why the JavaDoc warns against an oversized initial capacity; see the sizing sketch below.
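A sketch of that sizing idiom, assuming the default load factor of 0.75 (the CapacitySizing class name is just for illustration): reserve only as many buckets as the expected element count requires, so the bucket array stays proportional to the set itself.

```java
import java.util.HashSet;
import java.util.Set;

public class CapacitySizing {
    public static void main(String[] args) {
        int expectedSize = 10_000;
        // Reserve just enough buckets for expectedSize elements at the
        // default load factor of 0.75. This avoids rehashing while the
        // set is filled, yet keeps the bucket array close in size to the
        // set itself, so iteration stays close to O(n). (HashMap rounds
        // the capacity up to the next power of two internally.)
        Set<String> set = new HashSet<>((int) (expectedSize / 0.75f) + 1);
        for (int i = 0; i < expectedSize; i++) {
            set.add("element-" + i);
        }
        System.out.println("size: " + set.size());
    }
}
```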


If your concern is the time it takes to iterate over the set, and you are using Java 6 or later, take a look at this beauty:

ConcurrentSkipListSet
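A short usage sketch (the SkipListDemo class is illustrative): ConcurrentSkipListSet keeps its elements in a sorted skip list, so iteration visits only actual elements and there is no bucket array to scan. The trade-off is that add, remove and contains cost O(log n) rather than HashSet's expected O(1), and iteration yields elements in sorted order.

```java
import java.util.concurrent.ConcurrentSkipListSet;

public class SkipListDemo {
    public static void main(String[] args) {
        // Elements live in a sorted skip list: iteration touches only
        // actual elements, with no bucket array to scan.
        ConcurrentSkipListSet<Integer> set = new ConcurrentSkipListSet<>();
        set.add(3);
        set.add(1);
        set.add(2);
        for (int x : set) {
            System.out.println(x); // prints 1, 2, 3 (sorted order)
        }
    }
}
```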

