I am not an expert, so I hope this is not too stupid a question, but wouldn't a vector combined with lower_bound work well?
If you use lower_bound to find the right place to insert each new value, your vector stays sorted as it is built, so no explicit sorting is ever required. And on a sorted vector, isn't lower_bound just a binary search, i.e. logarithmic?
Since the vector is sorted, finding the minimum (or maximum) value is trivial.
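A minimal sketch of what I mean, assuming plain int keys (the helper names are mine, just for illustration):

```cpp
#include <algorithm>
#include <vector>

// Keep the vector sorted at all times: lower_bound finds the insertion
// point with a binary search (O(log n)); vector::insert then shifts the
// elements behind that point, but no explicit sort is ever needed.
void sorted_insert(std::vector<int>& v, int key) {
    v.insert(std::lower_bound(v.begin(), v.end(), key), key);
}

// Because the vector stays sorted, the extremes sit at the ends.
int min_key(const std::vector<int>& v) { return v.front(); }  // O(1)
int max_key(const std::vector<int>& v) { return v.back(); }   // O(1)
```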
To decrease a key, you do a lower_bound search, erase the element, and then another lower_bound to insert the decreased key, which makes two logarithmic searches. Still not bad.
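Something like this for the erase-and-reinsert variant (again just a sketch with int keys; decrease_key is an illustrative name, not a standard function):

```cpp
#include <algorithm>
#include <vector>

// Decrease a key: one lower_bound to locate the old key, an erase,
// then another lower_bound to reinsert the smaller key -- two
// logarithmic searches (the erase/insert themselves shift elements).
void decrease_key(std::vector<int>& v, int old_key, int new_key) {
    auto it = std::lower_bound(v.begin(), v.end(), old_key);
    if (it != v.end() && *it == old_key) {
        v.erase(it);
        v.insert(std::lower_bound(v.begin(), v.end(), new_key), new_key);
    }
}
```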
Alternatively, you can update the key in place and re-sort the vector. I would suppose that with random-access iterators the sort is still in the O(n log n) class, but I don't know exactly what the STL does here.
With a sorted vector, if you know the new key is smaller than the one already there, perhaps you could re-sort only the part of the vector holding the smaller values, for even better performance.
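That partial re-sort could look roughly like this (assuming the caller already knows the element's position and that the new key really is smaller):

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Decrease the key at position `pos` in place. Only the prefix up to and
// including `pos` can become unordered (the decreased key may now be
// smaller than some of its predecessors), so re-sorting that prefix is
// enough; the tail of the vector is untouched and still sorted.
void decrease_in_place(std::vector<int>& v, std::size_t pos, int new_key) {
    v[pos] = new_key;                           // assumes new_key <= old v[pos]
    std::sort(v.begin(), v.begin() + pos + 1);  // re-sort the smaller-value part
}
```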
Another consideration: I think std::set / std::map have a bit more memory overhead than a vector?