Just because order matters doesn't mean you can't sort a copy of the list (or of both lists). It does mean you will have to copy before you can sort anything. Of course, copying costs extra memory and sorting costs extra processing time, but any solution better than O(n^2) will need extra memory and processing time anyway. That is also true of the proposed HashSet solutions: adding all values to a HashSet costs both extra memory and extra processing time.
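As a sketch of the HashSet approach mentioned above (method and class names are my own, and I assume lists of `Integer` here): build a set from one list, then probe it while walking the other, which runs in O(n + m) time at the cost of the set's memory.

```java
import java.util.*;

public class HashSetIntersect {
    // Collect the values that appear in both lists.
    // Time: O(n + m); extra memory: O(n) for the set.
    static <T> List<T> common(List<T> a, List<T> b) {
        Set<T> seen = new HashSet<>(a);
        List<T> result = new ArrayList<>();
        for (T item : b) {
            if (seen.contains(item)) {
                result.add(item);
                seen.remove(item); // report each common value only once
            }
        }
        return result;
    }

    public static void main(String[] args) {
        List<Integer> a = Arrays.asList(3, 1, 4, 1, 5);
        List<Integer> b = Arrays.asList(5, 9, 2, 6, 3);
        System.out.println(common(a, b)); // prints [5, 3]
    }
}
```

Note that neither input list is modified, so their original order is preserved.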
Sorting both lists is possible in O(n log n) time; finding the common items in the sorted lists is then possible in O(n) time. Whether this is faster than your O(n^2) approach depends on the size of the lists. In the end, only testing the different approaches can tell you which one is fastest, and those tests should use realistic list sizes, as expected in your final code.
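A minimal sketch of that sort-then-scan idea (names are my own, and I assume `Integer` lists for simplicity): sort copies of both lists, then advance two indices in a merge-style pass.

```java
import java.util.*;

public class SortedIntersect {
    // Copy both lists so the originals keep their order, sort the
    // copies in O(n log n), then scan both in O(n) with two indices.
    static List<Integer> common(List<Integer> a, List<Integer> b) {
        List<Integer> x = new ArrayList<>(a);
        List<Integer> y = new ArrayList<>(b);
        Collections.sort(x);
        Collections.sort(y);
        List<Integer> result = new ArrayList<>();
        int i = 0, j = 0;
        while (i < x.size() && j < y.size()) {
            int cmp = x.get(i).compareTo(y.get(j));
            if (cmp < 0) {
                i++;              // x's value is smaller: skip it
            } else if (cmp > 0) {
                j++;              // y's value is smaller: skip it
            } else {
                result.add(x.get(i)); // match found in both lists
                i++;
                j++;
            }
        }
        return result;
    }

    public static void main(String[] args) {
        List<Integer> a = Arrays.asList(3, 1, 4, 1, 5);
        List<Integer> b = Arrays.asList(5, 9, 2, 6, 3);
        System.out.println(common(a, b)); // prints [3, 5]
    }
}
```

The extra memory here is the two copies; unlike the HashSet variant, the result comes out in sorted order rather than in either list's original order.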
Big-O notation tells you nothing about absolute speed, only about relative growth. For instance, if you have two algorithms that compute a value from a set of input elements, one O(1) and the other O(n), that does not mean the O(1) solution is always faster. This is a big misconception about Big-O notation! It means that if the number of input elements doubles, the O(1) solution takes approximately the same amount of time as before, while the O(n) solution takes approximately twice as long. So as you keep increasing the number of input elements, there must be a point beyond which the O(1) solution becomes faster than the O(n) one, but for a very small set of elements the O(1) solution can actually be slower than the O(n) solution.
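A quick way to see this for yourself is a tiny measurement sketch like the one below (entirely my own illustration, not from the answer): an O(n) linear scan over three elements versus an O(1) hash lookup. The hash lookup pays constant overhead for hashing and indirection, so on such a small collection the linear scan is often competitive or faster. The printed timings vary by machine and JIT warm-up, so treat this as a starting point, not a proper benchmark.

```java
import java.util.*;

public class SmallLookupDemo {
    public static void main(String[] args) {
        List<Integer> list = Arrays.asList(7, 42, 13);
        Set<Integer> set = new HashSet<>(list);

        long t0 = System.nanoTime();
        boolean inList = list.contains(13); // O(n) scan, here n = 3
        long t1 = System.nanoTime();
        boolean inSet = set.contains(13);   // O(1) amortized hash lookup
        long t2 = System.nanoTime();

        // Both find the value; which is faster depends on constants,
        // not on Big-O, at this size.
        System.out.println("linear scan: " + (t1 - t0) + " ns, found=" + inList);
        System.out.println("hash lookup: " + (t2 - t1) + " ns, found=" + inSet);
    }
}
```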
— Mecki