Suppose we are given an input array of integers. How can we find the longest convex subsequence, i.e. one satisfying the following condition:
c[i] < (c[i-1] + c[i+1]) / 2
where c[i-1], c[i], and c[i+1] are any three consecutive elements of the subsequence.
For example, if the input array is { 1, 2, -1, 0, 3, 8, 5 }, the longest convex subsequence is { 1, -1, 0, 3, 8 } or { 2, -1, 0, 3, 8 }.
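To make the condition concrete, here is a minimal check of both example answers (a sketch of my own; `is_convex` is a name I made up, and I use the equivalent integer form 2*c[i] < c[i-1] + c[i+1] to avoid division):

```python
def is_convex(s):
    # every middle element must lie strictly below the average
    # of its two neighbours in the subsequence
    return all(2 * s[i] < s[i - 1] + s[i + 1] for i in range(1, len(s) - 1))

print(is_convex([1, -1, 0, 3, 8]))  # True
print(is_convex([2, -1, 0, 3, 8]))  # True
```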
I tried to solve this problem with the same dynamic-programming idea as in the "Longest Increasing Subsequence" (LIS) problem. But since each element of the subsequence depends on the previous two elements, it seems impossible to get an O(n^2) solution. Thanks for the help.
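For reference, the straightforward extension of the LIS idea I have in mind keys the DP on the last *pair* of elements rather than the last element: dp[j][i] is the length of the longest convex subsequence ending with a[j] followed by a[i]. That gives O(n^2) states with an O(n) transition, hence O(n^3) overall. A minimal sketch under those assumptions (function and variable names are my own, not from any reference solution):

```python
def longest_convex_subsequence(a):
    # dp[j][i]: length of the longest convex subsequence that ends
    # with the pair a[j], a[i] (j < i); any pair alone is valid
    n = len(a)
    if n <= 2:
        return n  # a sequence of length <= 2 has no triple to violate
    best = 2
    dp = [[2] * n for _ in range(n)]
    for i in range(n):
        for j in range(i):
            # try extending a subsequence ending ... a[k], a[j] with a[i];
            # the triple a[k], a[j], a[i] must satisfy 2*a[j] < a[k] + a[i]
            for k in range(j):
                if 2 * a[j] < a[k] + a[i]:
                    dp[j][i] = max(dp[j][i], dp[k][j] + 1)
            best = max(best, dp[j][i])
    return best

print(longest_convex_subsequence([1, 2, -1, 0, 3, 8, 5]))  # 5
```

My question is whether the O(n) inner loop over k can be avoided, bringing this down to O(n^2) or better.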
algorithm
user3116259