How many consecutive preceding elements are less than or equal to each element of an array

Given an array of N elements (N < 10 000), for each position i in the array, find (in the most efficient way) how many consecutive preceding elements (i.e., from position i-1 down to 0) are less than or equal to array[i].

Here is an example:

 Array: 4 3 6 1 1 2 8 5 9
 Res:   0 0 2 0 1 2 6 0 8

(pos 0 (element 4) -> 0 consecutive elements before it;
 pos 1 (element 3) -> 0 consecutive elements before it are smaller, since 4 > 3;
 pos 2 (element 6) -> 2 consecutive elements before it are smaller than 6 (4, 3);
 and so on.)

I suspect this is a dynamic programming problem, because the statement asks for "the most efficient way" and the solution is said to be O(n).

The O(n^2) solution is simple: two nested loops counting the elements.
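For reference, that quadratic brute force might look like this (a sketch in Java; the question's snippet appears to be C#, but the logic carries over unchanged):

```java
import java.util.Arrays;

class BruteForce {
    // O(n^2): for each i, walk left from i-1 and count while elements stay <= array[i].
    static int[] countLessBefore(int[] array) {
        int[] res = new int[array.length];
        for (int i = 0; i < array.length; i++) {
            int count = 0;
            // Stop at the first preceding element greater than array[i].
            for (int j = i - 1; j >= 0 && array[j] <= array[i]; j--) {
                count++;
            }
            res[i] = count;
        }
        return res;
    }

    public static void main(String[] args) {
        int[] res = countLessBefore(new int[]{4, 3, 6, 1, 1, 2, 8, 5, 9});
        System.out.println(Arrays.toString(res)); // [0, 0, 2, 0, 1, 2, 6, 0, 8]
    }
}
```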

Here is my thinking on an O(n) solution. One could assume:

 for (int i = 1; i < array.Length; i++)
 {
     if (array[i - 1] > array[i])
     {
         c[i] = 0;
     }
     else
     {
         c[i] = c[i - 1] + MAGIC_FORMULA;
     }
 }

Obviously, if an element is larger than the one after it, then that next element's result is clearly 0 (the number to its left is not less than or equal to it). But what does the previous result tell me, so that I can use dynamic programming? I cannot find a recurrence for this case. Also, the formula would need to be computed in O(1) so that the whole solution is O(n), right? I thought of using a hash, but could not work anything out. I also thought about using some modified version of Kadane's algorithm, but no luck.

I am dying to understand the O(n) solution. I have been thinking about how to solve this in O(n) all day, and I am really stuck.

I am not a native speaker, so any help making this question better / clearer will be really appreciated.

2 answers

There is a linear solution, but it does not use dynamic programming; it is a simple loop plus a stack. First, make the following observation: computing "the number of consecutive preceding elements less than or equal to c[i]" is almost the same task as finding "the largest index j < i such that c[j] > c[i]".

The idea is this: for each i (sweeping from the left, i = 0, to the right, i = n - 1) we maintain the set of all indices j such that c[j] > c[k] for all j < k < i. This set can be stored on a stack, with the smallest values at the top. When you read c[i], you pop elements until you reach an index j such that c[j] > c[i]. This is the desired index. Then you push i onto the stack.

Example: s is the stack. Here ans[i] will be max{j < i | c[j] > c[i]}, or -1 if that set is empty.

 i    0 1 2 3 4 5 6 7 8
 c[i] 4 3 6 1 1 2 8 5 9
 ------------------------
 i = 0: - s = []: ans[0] = -1
        - push i: s = [0]
 i = 1: - s = [0]: c[1] < c[0] -> ans[1] = 0
        - push i: s = [0, 1]
 i = 2: - s = [0, 1]: c[2] >= c[1] -> pop
          s = [0]:    c[2] >= c[0] -> pop
          s = []:     ans[2] = -1
        - push i: s = [2]
 i = 3: - s = [2]: c[3] < c[2] -> ans[3] = 2
        - push i: s = [2, 3]
 i = 4: - s = [2, 3]: c[4] >= c[3] -> pop
          s = [2]:    c[4] < c[2] -> ans[4] = 2
        - push i: s = [2, 4]
 i = 5: - s = [2, 4]: c[5] >= c[4] -> pop
          s = [2]:    c[5] < c[2] -> ans[5] = 2
        - push i: s = [2, 5]
 i = 6: - s = [2, 5]: c[6] >= c[5] -> pop
          s = [2]:    c[6] >= c[2] -> pop
          s = []:     ans[6] = -1
        - push i: s = [6]
 i = 7: - s = [6]: c[7] < c[6] -> ans[7] = 6
        - push i: s = [6, 7]
 i = 8: - s = [6, 7]: c[8] >= c[7] -> pop
          s = [6]:    c[8] >= c[6] -> pop
          s = []:     ans[8] = -1
        - push i: s = [8]
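Turning the trace above into code (a hedged sketch in Java, matching the C#-flavored snippet in the question): once j is the largest index with c[j] > c[i], the count the question asks for is i - j - 1, and the ans[i] = -1 case corresponds to popping the whole stack.

```java
import java.util.ArrayDeque;
import java.util.Deque;

class StackSolution {
    // O(n) with a monotonic stack of indices whose values strictly decrease
    // from bottom to top. Each index is pushed and popped at most once, so the
    // total work over the whole loop is linear.
    static int[] countLessBefore(int[] c) {
        int[] res = new int[c.length];
        Deque<Integer> s = new ArrayDeque<>();
        for (int i = 0; i < c.length; i++) {
            // Pop until the top index j satisfies c[j] > c[i].
            while (!s.isEmpty() && c[s.peek()] <= c[i]) {
                s.pop();
            }
            int j = s.isEmpty() ? -1 : s.peek(); // largest j < i with c[j] > c[i]
            res[i] = i - j - 1;                  // consecutive elements <= c[i]
            s.push(i);
        }
        return res;
    }

    public static void main(String[] args) {
        int[] res = countLessBefore(new int[]{4, 3, 6, 1, 1, 2, 8, 5, 9});
        System.out.println(java.util.Arrays.toString(res)); // [0, 0, 2, 0, 1, 2, 6, 0, 8]
    }
}
```

Note that the pop condition uses <=, which is what makes equal elements count (e.g. the second 1 in the example gets the answer 1).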


Stack operations

In our first example of aggregate analysis, we analyze stacks that have been augmented with a new operation. Section 10.1 presented two fundamental stack operations, each of which takes O(1) time:

PUSH(S, x) pushes object x onto stack S.

POP(S) pops the top of stack S and returns the popped object.

Since each of these operations runs in O(1) time, let us consider the cost of each to be 1. The total cost of a sequence of n PUSH and POP operations is therefore n, and the actual running time for n operations is therefore Θ(n).

Now we add the stack operation MULTIPOP(S, k), which removes the k top objects of stack S, popping the entire stack if it contains fewer than k objects. In the pseudocode, the operation STACK-EMPTY returns TRUE if there are no objects currently on the stack, and FALSE otherwise.
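The pseudocode the excerpt refers to is not reproduced here; a minimal sketch of MULTIPOP, assuming the standard one-POP-per-iteration loop the text describes (the POP call below is what the text calls "line 2"):

```java
import java.util.ArrayDeque;
import java.util.Deque;

class MultipopDemo {
    // MULTIPOP(S, k): pops min(s, k) objects, where s is the current stack size.
    static <T> void multipop(Deque<T> s, int k) {
        while (!s.isEmpty() && k > 0) { // the STACK-EMPTY check
            s.pop();                    // the POP call ("line 2")
            k--;
        }
    }

    public static void main(String[] args) {
        Deque<Integer> s = new ArrayDeque<>();
        for (int x = 1; x <= 5; x++) s.push(x);
        multipop(s, 3);  // pops 3 of the 5 objects
        System.out.println(s.size()); // 2
        multipop(s, 10); // fewer than k objects left: pops the entire stack
        System.out.println(s.size()); // 0
    }
}
```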

What is the running time of MULTIPOP(S, k) on a stack of s objects? The actual running time is linear in the number of POP operations actually executed, and so it suffices to analyze MULTIPOP in terms of the abstract costs of 1 each for PUSH and POP. The number of iterations of the while loop is min(s, k), the number of objects popped off the stack. Each iteration of the loop makes one call to POP on line 2. Thus, the total cost of MULTIPOP is min(s, k), and the actual running time is a linear function of this cost.

Let us analyze a sequence of n PUSH, POP, and MULTIPOP operations on an initially empty stack. The worst-case cost of a MULTIPOP operation in the sequence is O(n), since the stack size is at most n. The worst-case cost of any stack operation is therefore O(n), and hence a sequence of n operations costs O(n^2), since we may have O(n) MULTIPOP operations costing O(n) each. Although this analysis is correct, the O(n^2) result, obtained by considering the worst-case cost of each operation individually, is not tight.

Using aggregate analysis, we can obtain a better upper bound that considers the entire sequence of n operations as a whole. In fact, although a single MULTIPOP operation can be expensive, any sequence of n PUSH, POP, and MULTIPOP operations on an initially empty stack costs at most O(n). Why? Each object can be popped at most once for each time it is pushed. Therefore, the number of times that POP can be called on a nonempty stack, including calls within MULTIPOP, is at most the number of PUSH operations, which is at most n. For any value of n, any sequence of n PUSH, POP, and MULTIPOP operations takes a total of O(n) time. The average cost of an operation is O(n)/n = O(1). In aggregate analysis, we assign the amortized cost of each operation to be the average cost. In this example, therefore, all three stack operations have an amortized cost of O(1).
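The counting argument can be checked mechanically: instrument the stack so that every pop, including those inside MULTIPOP, increments a counter, and observe that the pop count never exceeds the push count. This is a small illustrative sketch, not part of the quoted text:

```java
import java.util.ArrayDeque;
import java.util.Deque;

class AggregateDemo {
    static int pushes = 0, pops = 0;      // aggregate operation counters
    static Deque<Integer> s = new ArrayDeque<>();

    static void push(int x) { s.push(x); pushes++; }

    static void multipop(int k) {
        while (!s.isEmpty() && k-- > 0) { s.pop(); pops++; }
    }

    public static void main(String[] args) {
        for (int x = 0; x < 6; x++) push(x); // 6 pushes
        multipop(2);
        multipop(10); // expensive individually, but bounded by earlier pushes
        multipop(3);  // stack already empty: costs nothing
        // Every pop is paid for by an earlier push, so pops <= pushes <= n.
        System.out.println(pushes + " " + pops); // 6 6
    }
}
```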

