Do you mind spending O(2n) memory? You can use a Queue<T> in combination with a Dictionary<T, bool>. The queue handles the enqueue and dequeue operations, and the dictionary ensures unique entries. A simple wrapper class can combine the two, and this will give you O(1) enqueue and dequeue time.
Example:
    public class SetQueue<T>
    {
        private readonly Dictionary<T, bool> duplicates = new Dictionary<T, bool>();
        private readonly Queue<T> queue = new Queue<T>();

        public bool Enqueue(T item)
        {
            if (!duplicates.ContainsKey(item))
            {
                duplicates[item] = true;
                queue.Enqueue(item);
                return true;
            }
            return false;
        }

        public T Dequeue()
        {
            if (queue.Count > 0)
            {
                var item = queue.Dequeue();
                if (!duplicates.ContainsKey(item))
                    throw new InvalidOperationException("The dictionary should have contained an item");
                else
                    duplicates.Remove(item);
                return item;
            }
            throw new InvalidOperationException("Can't dequeue on an empty queue.");
        }
    }
Inserting into this custom data structure first checks whether the dictionary already contains the element. This check uses the ContainsKey method, which approaches O(1). If the element is already contained, the method returns false. If it is not, the element is enqueued, which is a constant-time O(1) operation, and it is also added to the dictionary; as long as the dictionary's count is less than its capacity, that addition also approaches O(1). The total enqueue time is therefore O(1).
The same goes for the dequeuing method.
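For illustration, here is a short usage sketch showing the duplicate-rejection behavior (the driver code is my own addition; the SetQueue class is repeated verbatim so the snippet compiles on its own):

```csharp
using System;
using System.Collections.Generic;

var q = new SetQueue<string>();
Console.WriteLine(q.Enqueue("a")); // True  - first time "a" is seen
Console.WriteLine(q.Enqueue("b")); // True
Console.WriteLine(q.Enqueue("a")); // False - duplicate is rejected
Console.WriteLine(q.Dequeue());    // a     - FIFO order is preserved
Console.WriteLine(q.Enqueue("a")); // True  - "a" may be re-queued once dequeued

public class SetQueue<T>
{
    private readonly Dictionary<T, bool> duplicates = new Dictionary<T, bool>();
    private readonly Queue<T> queue = new Queue<T>();

    public bool Enqueue(T item)
    {
        if (!duplicates.ContainsKey(item))
        {
            duplicates[item] = true;
            queue.Enqueue(item);
            return true;
        }
        return false;
    }

    public T Dequeue()
    {
        if (queue.Count > 0)
        {
            var item = queue.Dequeue();
            if (!duplicates.ContainsKey(item))
                throw new InvalidOperationException("The dictionary should have contained an item");
            else
                duplicates.Remove(item);
            return item;
        }
        throw new InvalidOperationException("Can't dequeue on an empty queue.");
    }
}
```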
This solution is basically the same as the built-in OrderedDictionary data structure; however, since this solution uses generics, it does not have the overhead of boxing/unboxing operations, which makes it much faster.
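As a side note, a HashSet<T> could stand in for the Dictionary<T, bool>, since the bool values are never used; this is my own variant sketch (including the HashSetQueue name), not part of the original answer:

```csharp
using System;
using System.Collections.Generic;

var q = new HashSetQueue<int>();
Console.WriteLine(q.Enqueue(42)); // True
Console.WriteLine(q.Enqueue(42)); // False - duplicate is rejected

// Variant sketch: HashSet<T> stores only the keys, avoiding the
// unused bool values that Dictionary<T, bool> carries along.
public class HashSetQueue<T>
{
    private readonly HashSet<T> set = new HashSet<T>();
    private readonly Queue<T> queue = new Queue<T>();

    public bool Enqueue(T item)
    {
        // HashSet<T>.Add returns false when the item is already present,
        // combining the membership test and the insert in a single O(1) call.
        if (!set.Add(item))
            return false;
        queue.Enqueue(item);
        return true;
    }

    public T Dequeue()
    {
        // Queue<T>.Dequeue itself throws InvalidOperationException when empty.
        T item = queue.Dequeue();
        set.Remove(item);
        return item;
    }
}
```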
Kasper Holdum