Our professor gave us the following task:
A “correct” series is one in which the sum of its members equals the index of its first member.
The program is supposed to find the length of the LONGEST "correct" series in an array of n numbers.
For example, if the input array is arr[4] = {1, 1, 0, 0},
the output (the length of the longest "correct" series) will be 3:
arr[0] = 1 and 1 != 0, so the longest correct series starting at index 0 has length 0 (the sums of the series starting there are 1, 2, 2, 2, and none of them equals 0).
arr[1] = 1 and 1 == 1, so a correct series of length 1 starts at index 1. The following members keep the sum at 1 as well: arr[1] + arr[2] + arr[3] = 1 + 0 + 0 = 1, so the longest correct series starting here has length 3.
The result in this example is therefore 3.
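To make sure I understand the definition, I wrote a brute-force checker that I use only for verifying expected outputs (loops are fine here because this is not the submitted solution; longestCorrectBrute is just my own helper name, and this is only a sketch of my reading of the definition):

int longestCorrectBrute(int arr[], int n)
{
    int best = 0;
    for (int i = 0; i < n; i++) {                  /* start index of a series */
        int sum = 0;
        for (int len = 1; i + len <= n; len++) {   /* series arr[i..i+len-1] */
            sum += arr[i + len - 1];
            if (sum == i && len > best)            /* "correct": sum == start index */
                best = len;
        }
    }
    return best;
}

For arr[4] = {1, 1, 0, 0} this returns 3, which matches the example above.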
This is what I have so far:
int solve(int arr[], int index, int length, int sum_so_far)
{
    int maxwith, maxwithout;

    if (index == length)
        return 0;

    /* Option 1: count arr[index] as part of the running series. */
    maxwith = 1 + solve(arr, index + 1, length, sum_so_far + arr[index]);
    /* Option 2: restart the series at the next element (seeding sum_so_far with arr[index + 1]). */
    maxwithout = solve(arr, index + 1, length, arr[index + 1]);

    if (sum_so_far + arr[index] == index)
        if (maxwith > maxwithout)
            return maxwith;
    return maxwithout;
    return 0;
}
int longestIndex(int arr[], int index, int length)
{
    return solve(arr, 0, length, 0);
}
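I test it with this minimal main (compiled together with the two functions above; for the example input the expected output is 3):

#include <stdio.h>

int main(void)
{
    int arr[4] = {1, 1, 0, 0};
    /* Expected result: 3 (arr[1] + arr[2] + arr[3] = 1, the start index). */
    printf("%d\n", longestIndex(arr, 0, 4));
    return 0;
}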
What am I doing wrong here?
We are not supposed to use loops for this assignment.