People say that insertion into a hash table is amortized O(1). Therefore, inserting n elements must be O(n). However, this isn't obviously true for large n because, as an answerer put it: "All you need to satisfy expected amortized O(1) is to expand the table and rehash everything with a new random hash function any time there is a collision."
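To make the amortized-O(1) claim concrete, here is a minimal sketch (my own illustration, not from the quoted answer) that counts the element writes performed by a table that doubles its capacity and rehashes everything whenever it fills up. The names `insert_n` and `moves` are hypothetical; the point is that the total work stays within a constant factor of n:

```python
def insert_n(n):
    """Count element writes for n inserts into a doubling table (sketch)."""
    capacity = 1
    size = 0
    moves = 0                      # total element writes (inserts + rehash copies)
    for _ in range(n):
        if size == capacity:       # table full: double and rehash everything
            capacity *= 2
            moves += size          # every existing element is rewritten
        size += 1
        moves += 1                 # the new insert itself
    return moves

for n in (10, 1000, 100000):
    print(n, insert_n(n), insert_n(n) / n)   # ratio stays below 3
```

Since the rehash copies form a geometric series 1 + 2 + 4 + … < 2n, the total is under 3n writes, i.e. amortized O(1) per insert.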
So: what is the expected running time of inserting n elements into a hash table? I realize this probably depends on the implementation, so please state which kind of implementation you are talking about.
For example, if there are O(log n) evenly spaced collisions, and each collision takes O(k) to resolve, where k is the current size of the hash table, then you'd have the recurrence:
T(n) = T(n/2) + n/2 + n/2
(i.e., you spend time inserting n/2 elements, then you hit a collision that takes n/2 to resolve, then you do the remaining n/2 inserts without a collision). This still comes out to O(n), so yay. But is that reasonable?
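The recurrence above simplifies to T(n) = T(n/2) + n, which unrolls to n + n/2 + n/4 + … ≤ 2n. As a quick sanity check, here is a direct numerical evaluation (my own sketch; the function name `T` just mirrors the recurrence):

```python
def T(n):
    """Evaluate T(n) = T(n/2) + n/2 + n/2, i.e. T(n) = T(n/2) + n."""
    if n <= 1:
        return 1
    return T(n // 2) + n   # n/2 for the collision + n/2 for the inserts

for n in (2**10, 2**16, 2**20):
    print(n, T(n), T(n) / n)   # ratio approaches 2, so T(n) = Θ(n)
```

For powers of two, T(2^k) = 2^(k+1) − 1, confirming the closed form T(n) ≈ 2n = O(n).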