Reduce the sequence in the optimal way.

We are given a sequence a of n numbers. A reduction of the sequence a is defined as replacing the elements a[i] and a[i+1] with max(a[i], a[i+1]).

Each reduction operation has a cost equal to max(a[i], a[i+1]). After n-1 reductions, a sequence of length 1 is obtained.

Our goal is to print the minimum total cost of reducing the given sequence a to a single element.

For example:

 1 2 3    Output: 5

An O(N^2) solution is trivial. Any ideas?

EDIT1: People asked about my approach, so here it is: walk through the sequence pair by pair, compute the cost of each adjacent pair, and finally reduce the pair with the lowest cost.

 1 2 3
  2 3     <=== costs of the adjacent pairs; the minimum is 2

So reduce the sequence above to

 2 3 

Now traverse the sequence again; we get a cost of 3:

 2 3
  3       <=== Cost is 3

Thus, the total cost is 2 + 3 = 5

The above algorithm is O(N^2), which is why I am asking for a more optimized idea.
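For reference, a minimal Java sketch of the pairwise idea above (the class and method names are just for illustration):

import java.util.ArrayList;
import java.util.List;

public class NaiveReduction {
    // Repeatedly find the adjacent pair with the smallest max(a[i], a[i+1]),
    // pay that cost, and replace the pair by its maximum: O(N) work per
    // reduction, N-1 reductions, hence O(N^2) overall.
    static int reduce(int[] a) {
        List<Integer> seq = new ArrayList<>();
        for (int x : a) seq.add(x);
        int total = 0;
        while (seq.size() > 1) {
            int best = 0;                       // index of the cheapest adjacent pair
            for (int i = 1; i + 1 < seq.size(); i++) {
                if (Math.max(seq.get(i), seq.get(i + 1)) <
                    Math.max(seq.get(best), seq.get(best + 1))) {
                    best = i;
                }
            }
            int merged = Math.max(seq.get(best), seq.get(best + 1));
            total += merged;                    // cost of this reduction
            seq.set(best, merged);              // replace the pair by its maximum
            seq.remove(best + 1);
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(reduce(new int[]{1, 2, 3})); // 5
    }
}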

4 answers

O(n) solution:

High level:

The basic idea is to merge any element e that is smaller than both of its neighbors ns and nl with its smaller neighbor ns. This yields the minimum cost, since both the cost and the result of a merge are max(a[i], a[i+1]): no merge can make an element smaller than it currently is, so the cheapest possible merge for e is with ns, and this merge cannot increase the cost of any other possible merge.

This can be done with a one-pass algorithm that keeps a stack of elements from our array in descending order. We compare the current element with its neighbors (one of which is the top of the stack) and perform the appropriate merges until we are done.

Pseudo Code:

stack = empty
for pos = 0 to length
    // stack.top > arr[pos] is implicitly true because of the previous iteration of the loop
    if stack.top > arr[pos] > arr[pos+1]
        stack.push(arr[pos])
    else if stack.top > arr[pos+1] > arr[pos]
        merge(arr[pos], arr[pos+1])
    else
        while arr[pos+1] > stack.top > arr[pos]
            merge(arr[pos], stack.pop)

Java code:

Stack<Integer> stack = new Stack<Integer>(); // java.util.Stack
int cost = 0;
int[] arr = {10, 1, 2, 3, 4, 5};
for (int pos = 0; pos < arr.length; pos++) {
    if (pos < arr.length - 1 && (stack.empty() || stack.peek() >= arr[pos + 1])) {
        if (arr[pos] > arr[pos + 1])
            stack.push(arr[pos]);
        else
            cost += arr[pos + 1]; // merge pos and pos+1
    } else {
        int last = Integer.MAX_VALUE; // required, otherwise a merge may be missed
        while (!stack.empty() && (pos == arr.length - 1 || stack.peek() < arr[pos + 1])) {
            last = stack.peek();
            cost += stack.pop(); // merge stack.pop() and pos, or the last popped item
        }
        if (last != Integer.MAX_VALUE) {
            int costTemp = Integer.MAX_VALUE;
            if (!stack.empty())
                costTemp = stack.peek();
            if (pos != arr.length - 1)
                costTemp = Math.min(arr[pos + 1], costTemp);
            if (costTemp != Integer.MAX_VALUE) // nothing left to merge with at the very end
                cost += costTemp;
        }
    }
}
System.out.println(cost); // prints 24 for the array above

I am confused about whether by "cost" you mean computational cost, i.e. an operation that takes time max(a[i], a[i+1]), or simply the quantity you want to compute. If it is the latter, then the following algorithm is better than O(n^2):

  • Sort the list, or more precisely compute b[i] such that a[b[i]] is the sorted list: O(n) if you can use radix sort, otherwise O(n log n).
  • Starting from the second-smallest element i in the sorted list: if the elements to its left/right are below i, merge them; O(1) for each element, updating the list from step 1, O(n) overall.

I have no idea whether this is the optimal solution, but it is O(n) for integers and O(n log n) otherwise.

edit: realized that removing the precalculation step made it a lot easier
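The second step is left abstract above, so here is one possible reading as a sketch: process the elements in increasing order of value and merge each one with its smaller surviving neighbor, tracking survivors with a doubly linked list over positions (the class and method names are illustrative):

import java.util.Arrays;

public class SortedGreedy {
    // Sketch of one reading of the answer above: process elements in
    // increasing order of value and merge each one with its smaller
    // surviving neighbor; a doubly linked list over positions makes
    // each merge O(1), so sorting dominates the running time.
    static long reduce(int[] a) {
        int n = a.length;
        if (n <= 1) return 0;
        Integer[] order = new Integer[n];
        for (int i = 0; i < n; i++) order[i] = i;
        // O(n log n) comparison sort; radix sort would give O(n) for integers
        Arrays.sort(order, (x, y) -> Integer.compare(a[x], a[y]));

        int[] left = new int[n], right = new int[n];
        for (int i = 0; i < n; i++) { left[i] = i - 1; right[i] = i + 1; }

        long cost = 0;
        int merged = 0;
        for (int idx : order) {
            if (merged == n - 1) break;          // only the maximum is left
            int l = left[idx], r = right[idx];
            int nb = (l < 0) ? r : (r >= n) ? l : (a[l] <= a[r] ? l : r);
            cost += a[nb];                       // merge idx into its smaller neighbor
            if (l >= 0) right[l] = r;            // unlink idx; nb keeps the larger value
            if (r < n) left[r] = l;
            merged++;
        }
        return cost;
    }

    public static void main(String[] args) {
        System.out.println(reduce(new int[]{1, 2, 3})); // 5
    }
}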


Indeed, the greedy approach works.

You can always reduce the smallest number together with its smaller neighbor.

Proof: we need to reduce the smallest number at some point. Any reduction involving a neighbor keeps that neighbor's value at least as large (possibly larger), so an operation that reduces the minimum element a[i] will always have cost c >= min(a[i-1], a[i+1]).

Now we need to:

  • quickly find / delete the smallest number
  • find its neighbors

I would go with two RMQs for this. Perform operation 2 as a binary search, which gives us O(N * log^2(N)).

EDIT: the first RMQ is over the values. The second RMQ stores presence: 0 or 1 (the value exists / does not exist); update it when you delete an item. To find, for example, the left neighbor of [i], you need to find the largest l for which sum[l, i-1] = 1.
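A hedged sketch of this plan, with stand-in structures: a heap plays the role of the value structure ("find/delete the smallest number"), and a Fenwick tree of presence flags plus binary search plays the role of the second structure from the EDIT, so each neighbor lookup costs O(log^2 N). Names are illustrative:

import java.util.PriorityQueue;

public class RmqSketch {
    // Presence flags in a Fenwick tree: 1 while a position still exists,
    // 0 once it has been merged away.
    static int[] bit;
    static int n;

    static void update(int i, int delta) {
        for (i++; i <= n; i += i & -i) bit[i] += delta;
    }

    static int prefix(int i) {              // presence count on positions 0..i
        int s = 0;
        for (i++; i > 0; i -= i & -i) s += bit[i];
        return s;
    }

    static int count(int l, int r) {        // presence count on [l, r]
        if (r < l) return 0;
        return prefix(r) - (l > 0 ? prefix(l - 1) : 0);
    }

    // Largest surviving position l < i: the largest l with count(l, i-1) >= 1.
    static int leftNeighbor(int i) {
        if (count(0, i - 1) == 0) return -1;
        int lo = 0, hi = i - 1;
        while (lo < hi) {
            int mid = (lo + hi + 1) / 2;
            if (count(mid, i - 1) >= 1) lo = mid; else hi = mid - 1;
        }
        return lo;
    }

    // Smallest surviving position r > i, symmetric to leftNeighbor.
    static int rightNeighbor(int i) {
        if (count(i + 1, n - 1) == 0) return -1;
        int lo = i + 1, hi = n - 1;
        while (lo < hi) {
            int mid = (lo + hi) / 2;
            if (count(i + 1, mid) >= 1) hi = mid; else lo = mid + 1;
        }
        return lo;
    }

    static long reduce(int[] a) {
        n = a.length;
        bit = new int[n + 1];
        for (int i = 0; i < n; i++) update(i, 1);
        // The heap yields survivors in increasing order of value.
        PriorityQueue<int[]> pq = new PriorityQueue<>((x, y) -> Integer.compare(x[0], y[0]));
        for (int i = 0; i < n; i++) pq.add(new int[]{a[i], i});

        long cost = 0;
        int merges = 0;
        while (merges < n - 1 && !pq.isEmpty()) {
            int i = pq.poll()[1];                            // smallest surviving element
            int l = leftNeighbor(i), r = rightNeighbor(i);
            if (l < 0 && r < 0) break;                       // only one element left
            int nb = (l < 0) ? r : (r < 0) ? l : (a[l] <= a[r] ? l : r);
            cost += a[nb];                                   // merge i into its smaller neighbor
            update(i, -1);                                   // i disappears; nb keeps the larger value
            merges++;
        }
        return cost;
    }

    public static void main(String[] args) {
        System.out.println(reduce(new int[]{1, 2, 3})); // 5
    }
}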


If you are allowed to sort the list, do so in O(n log n) time and then repeatedly merge the first two entries. The total cost in this case will be the sum of the entries minus the smallest entry. This is optimal since

  • the cost will be the sum of n-1 entries (repetitions allowed)
  • the i-th smallest entry can appear in the cost at most i-1 times
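A minimal sketch of the sorted case following this observation (assuming the total fits in a long; the helper name is illustrative):

// Sorted (or sortable) case: merging the two smallest entries repeatedly
// costs the sum of all entries minus the smallest one.
static long reduceSortable(int[] a) {
    if (a.length <= 1) return 0;
    long sum = 0;
    int min = Integer.MAX_VALUE;
    for (int x : a) {
        sum += x;
        min = Math.min(min, x);
    }
    return sum - min;
}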

The same basic idea works even if the list is not sorted. The best move is to merge the smallest element with its smaller neighbor. To see that this is optimal, note that

  • the cost will be the sum of n-1 entries (repetitions allowed)
  • entry a_i can appear in the cost at most j-1 times, where j is the length of the longest contiguous subsequence containing a_i in which a_i is the maximum element

In the worst case the sequence is decreasing, and the running time is O(n^2).

