Algorithm Complexity in Online Testing

I had to take an online test for an internship, and one question was about complexity analysis. My answer was marked as incorrect, and I would like to understand why so I can improve.

Question:

State the complexity of the following algorithm when reverseOrder is true and when it is false:

List<int> stackToList(Stack<int> stack, bool reverseOrder)
{
    List<int> items = new List<int>();
    while (stack.Count > 0)
    {
        int item = stack.Pop();
        if (reverseOrder)
        {
            items.Insert(0, item);
        }
        else
        {
            items.Add(item);
        }
    }
    return items;
}

EDIT: It was multiple choice, and the possible answers were:

  • O(1)
  • O(n log n)
  • O(n)
  • O(n^2)

You choose one of them for when reverseOrder is true and one for when it is false.

My answer:

  • When reverseOrder is true: O(n^2)
  • When reverseOrder is false: O(n^2)

I came to this answer as follows:

  • The while loop repeats n times, because it continues until no elements are left on the stack.
  • Pop() is O(1).
  • In the reverseOrder == true case, Insert puts the item at the front of the list. Since List<T> is backed by an array, every existing element is shifted one position before the item is written to index 0. According to https://msdn.microsoft.com/en-us/library/sey5k5z4(v=vs.110).aspx:

    This method is an O(n) operation, where n is Count.

  • In the reverseOrder == false case, Add appends the item to the end of the list. Since items is created with no initial capacity, Count will repeatedly reach Capacity, which triggers a resize (see the sketch after this list). According to https://msdn.microsoft.com/en-us/library/3wcytfd1(v=vs.110).aspx:

    If Count is less than Capacity, this method is an O(1) operation. If the capacity needs to be increased to accommodate the new element, this method becomes an O(n) operation, where n is Count.
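To illustrate that last point, here is a minimal sketch (my own addition, not part of the test): if the list is given an initial capacity up front, Count never reaches Capacity inside the loop, so Add never resizes and every append is a plain O(1) write.

using System;
using System.Collections.Generic;

class PresizedAdd
{
    static void Main()
    {
        var stack = new Stack<int>();
        for (int i = 0; i < 1000; i++) stack.Push(i);

        // Pre-sizing to stack.Count means Count never reaches Capacity,
        // so no Add call ever triggers the O(n) resize path.
        var items = new List<int>(stack.Count);
        int capacityBefore = items.Capacity;

        while (stack.Count > 0) items.Add(stack.Pop());

        // Capacity is unchanged: every Add was a constant-time append.
        Console.WriteLine($"Capacity before: {capacityBefore}, after: {items.Capacity}");
    }
}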

I feel rather discouraged at the moment, since getting this one wrong made my score plummet, even though I answered all the other questions on the test correctly.

Tags: c#, asymptotic-complexity
4 answers

You would need to ask the people who wrote the test. Any answer here will be strictly opinion-based, since we do not have the full context, i.e. what would make the test author describe the complexity of the algorithm differently than you did.

However, I agree with the author of the test on the reverseOrder == false case. While it is true that a resize can happen during an Add() call, the resize happens at worst log N times, since the new size doubles with each resize.

You do not say what the correct answer should be, but I would give it as O (N log N).
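To see how rare the resizes actually are, here is a minimal sketch of my own (it relies only on the public Capacity property, not on List<T> internals): it counts capacity changes while adding one million items, and with doubling growth you should see roughly log2(1,000,000) ≈ 20 of them.

using System;
using System.Collections.Generic;

class ResizeCounter
{
    static void Main()
    {
        var items = new List<int>();            // starts with Capacity 0
        int resizes = 0, lastCapacity = items.Capacity;

        for (int i = 0; i < 1_000_000; i++)
        {
            items.Add(i);
            if (items.Capacity != lastCapacity) // the backing array was reallocated
            {
                resizes++;
                lastCapacity = items.Capacity;
            }
        }

        // With doubling growth this prints about 20 resizes for a million Adds.
        Console.WriteLine($"Adds: {items.Count}, resizes: {resizes}");
    }
}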


Your assumption is that Capacity increases every time you add an item, and that is not true. The documentation does not state the exact growth algorithm, but I believe it doubles each time the capacity is exceeded. You can see this in the dinosaur example in the documentation you cited: they add 5 dinosaurs, and Capacity reports 8.
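You can check this with a few lines of code (a sketch of my own that mirrors the dinosaur example from the linked page): print Count and Capacity after each Add and watch the capacity jump 0 → 4 → 8 instead of growing by one.

using System;
using System.Collections.Generic;

class CapacityGrowth
{
    static void Main()
    {
        var dinosaurs = new List<string>();
        Console.WriteLine($"Count: {dinosaurs.Count}, Capacity: {dinosaurs.Capacity}");

        foreach (var name in new[] { "Tyrannosaurus", "Amargasaurus",
                                     "Mamenchisaurus", "Deinonychus", "Compsognathus" })
        {
            dinosaurs.Add(name);
            Console.WriteLine($"Count: {dinosaurs.Count}, Capacity: {dinosaurs.Capacity}");
        }
        // Ends with Count: 5, Capacity: 8 — the capacity doubled from 4 to 8
        // on the fifth Add rather than increasing one element at a time.
    }
}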


Looking at line 6669 of

http://referencesource.microsoft.com/#mscorlib/system/collections/generic/list.cs,cf7f4095e4de7646

it is clear that an insert at the beginning of the list copies every existing element one slot down. Each insert therefore costs up to N moves, so O(N^2) looks right to me.
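A short sketch of my own (it counts the shifts from the outside rather than instrumenting the source) makes the cost concrete: each Insert(0, ...) has to move every element already in the list, so the total is 0 + 1 + ... + (n-1) = n(n-1)/2 moves.

using System;
using System.Collections.Generic;

class InsertShiftCount
{
    static void Main()
    {
        const int n = 10_000;
        var items = new List<int>();
        long shifts = 0;

        for (int i = 0; i < n; i++)
        {
            shifts += items.Count;  // Insert(0, x) moves every existing element
            items.Insert(0, i);
        }

        // shifts == n(n-1)/2, the quadratic total behind the O(N^2) answer.
        Console.WriteLine($"n = {n}, shifts = {shifts}, n(n-1)/2 = {(long)n * (n - 1) / 2}");
    }
}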


stackToList(stack, false) = O(n). Most of the Add calls are O(1). However, when Add needs to grow past the current capacity, the backing array is copied into a new array with twice the previous capacity, iterating over the existing elements to carry them across. So the cost of the n-th Add can be modeled in Excel as something like IF(LOG(n,2)/INT(LOG(n,2))=1,n,1): it costs n when n is a power of two (a resize) and 1 otherwise. Assuming no resource bottlenecks, if this algorithm completes in 10 seconds with 10 million items, you would expect filling 100 million items to take about 10 × 10 = 100 seconds. In practice we know it will be slightly worse than Big O notation predicts, because each O(n) resize takes many O(1) adds to amortize.

Zoomed in, you can see how the array copy affects the cumulative operation count. [Chart: cumulative number of operations over the first 65 n]

Zoomed out, you can see that it does not change the growth rate compared with a plain O(n) operation. [Chart: cumulative number of operations over the first 650 n]

stackToList(stack, true) = O(n^2). The list's Insert performs an array copy that has to move every element already in the list. As the loop works through the source stack, the number of moves per insert starts at 0 and grows until it reaches n. On average that is n/2 moves per insert; the constant 1/2 is dropped in Big O notation, leaving n per insert and n^2 overall. Assuming no resource bottlenecks, if this algorithm completes in 10 seconds with 10 million items, you would expect filling 100 million items to take about 10 × 10^2 = 1000 seconds. In practice we know this case scales slightly better than Big O notation predicts, because the true total is n(n-1)/2 moves, but it will never do better than O(n log n), the next step down. [Chart: zoomed-out comparison of the cumulative operations in the two cases]

Breakdown of operations:

List<int> stackToList(Stack<int> stack, bool reverseOrder)
{
    List<int> items = new List<int>(); // O(1)
    while (stack.Count > 0) // runs once per int in the supplied stack: O(n)
    {
        int item = stack.Pop(); // O(1): https://referencesource.microsoft.com/#System/compmod/system/collections/generic/stack.cs,222
        if (reverseOrder) // O(1)
        {
            items.Insert(0, item); // O(n) per call, O(n^2) across the loop: https://referencesource.microsoft.com/#mscorlib/system/collections/generic/list.cs,669
        }
        else
        {
            items.Add(item); // amortized O(1), slightly more in practice: https://referencesource.microsoft.com/#mscorlib/system/collections/generic/list.cs,220
        }
    }
    return items;
}
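If it helps to see both cases empirically, here is a hypothetical timing harness (my addition, not part of the answer): each doubling of n should roughly quadruple the reverseOrder: true time while the false time only about doubles.

using System;
using System.Collections.Generic;
using System.Diagnostics;

class StackToListTiming
{
    static List<int> StackToList(Stack<int> stack, bool reverseOrder)
    {
        var items = new List<int>();
        while (stack.Count > 0)
        {
            int item = stack.Pop();
            if (reverseOrder) items.Insert(0, item); // shifts the whole list
            else items.Add(item);                    // amortized O(1) append
        }
        return items;
    }

    static void Main()
    {
        foreach (int n in new[] { 10_000, 20_000, 40_000 })
        {
            var data = new Stack<int>();
            for (int i = 0; i < n; i++) data.Push(i);

            var sw = Stopwatch.StartNew();
            StackToList(new Stack<int>(data), reverseOrder: true);
            Console.WriteLine($"n = {n,6}  reverseOrder = true:  {sw.ElapsedMilliseconds} ms");

            sw.Restart();
            StackToList(new Stack<int>(data), reverseOrder: false);
            Console.WriteLine($"n = {n,6}  reverseOrder = false: {sw.ElapsedMilliseconds} ms");
        }
    }
}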
