How do I work out Big-O notation for a prime-checking function?

I am trying to understand Big-O notation. Sorry if I am asking something obvious, but I can't seem to wrap my head around this.

I have the following C# function that I am trying to work out the Big-O notation for.

 for (i = 2; i < 100; i++)
 {
     for (j = 2; j <= (i / j); j++)
         if ((i % j) == 0)
             break;                      // if factor found, not prime
     if (j > (i / j))
         Console.WriteLine("{0} is prime", i);
 }

From what I have gathered so far, I believe that both if statements are considered constant time, O(1), and are not taken into account when working out this algorithm's complexity. Also, if I understand correctly, a single for loop

 for(i = 0; i < 100; i++) 

is a linear function and therefore O(n), and a nested loop whose inner loop is independent of the outer loop's variable

 for(i = 0; i < 100; i++)
     for(j = 0; j < 100; j++)

is O(n^2)? But how do I work out the complexity of a function like the one at the top, where the second loop depends on the first loop and produces a non-linear iteration count?

Picture of my data points
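
To show what I mean by data points: below is a minimal counting sketch (not my original program; CountInnerIterations is just an illustrative name) that tallies how many times the inner loop body executes for a given upper limit.

 using System;

 class IterationCounter
 {
     static long CountInnerIterations(int limit)
     {
         long count = 0;
         for (int i = 2; i < limit; i++)
         {
             for (int j = 2; j <= (i / j); j++)
             {
                 count++;                // one trip through the inner loop body
                 if ((i % j) == 0)
                     break;              // factor found, not prime
             }
         }
         return count;
     }

     static void Main()
     {
         // Count the inner-loop work for a few doubling limits
         foreach (int limit in new[] { 100, 200, 400, 800, 1600 })
             Console.WriteLine("{0,5}: {1}", limit, CountInnerIterations(limit));
     }
 }

When the limit doubles, the count more than doubles, which is what led me to the definition below.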

I found a definition for linearithmic that says

A linearithmic algorithm scales to handle huge problems. Whenever N doubles, the running time more than doubles, but not by much.

Although this seems like a good description of how this piece of code behaves, would that mean it is O(N log N), and if so, how would I work that out?

c# algorithm big-o
3 answers

@Jon is close, but his analysis is slightly off; the real complexity of your algorithm is O(n*sqrt(n)).

This is based on the fact that for each number i, the expected amount of work done in the inner loop is:

 1/2 + 2/3 + 3/4 + ... + (sqrt(i)-1)/sqrt(i)
   = (1 - 1/2) + (1 - 1/3) + ... + (1 - 1/sqrt(i))
   = (sqrt(i) - 1) - (1/2 + 1/3 + ... + 1/sqrt(i))
   = sqrt(i) - H_sqrt(i)

Since H_sqrt(i) (a harmonic number) is in O(log(sqrt(i))) = O(1/2 * log(i)), we can conclude that the complexity of checking a single number i is O(sqrt(i) - log(i)) = O(sqrt(i)).

Since this is repeated for each i, the total complexity of the problem is O(sqrt(2) + sqrt(3) + ... + sqrt(n)). According to this forum topic, the sum of square roots is in O(n*sqrt(n)), which is worse than O(n log n).
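
If you want to sanity-check that last bound numerically, here is a small throwaway sketch (mine, not part of the question's code) that compares sqrt(2) + sqrt(3) + ... + sqrt(n) against n*sqrt(n):

 using System;

 class SumOfRootsCheck
 {
     static void Main()
     {
         foreach (int n in new[] { 100, 1000, 10000, 100000 })
         {
             double sum = 0;
             for (int i = 2; i <= n; i++)
                 sum += Math.Sqrt(i);

             // The ratio settles near a constant (roughly 2/3), i.e. the sum
             // grows like n*sqrt(n) rather than like n*log(n).
             Console.WriteLine("n = {0,6}: sum / (n * sqrt(n)) = {1:F3}",
                               n, sum / (n * Math.Sqrt(n)));
         }
     }
 }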

Remarks:

  • The sum runs up to sqrt(i), since that is the point where j > (i / j).
  • The sum has a (j-1)/j term for each j, because on average one out of every j numbers is divisible by j (1/3 of numbers are divisible by 3, 1/4 by 4, ...); that leaves the fraction (j-1)/j that are not, and this is the expected work at that step.
  • The equality O(log(sqrt(n))) = O(1/2 * log(n)) comes from O(log(n^k)) = O(k * log(n)) = O(log(n)) for any constant k (in your case k = 1/2).

Analyzing your algorithm, I came up with the following:

  • The inner loop does not iterate at all when i is in the interval [2, 3].
  • The inner loop iterates once when i is in the interval [4, 8].
  • The inner loop iterates twice when i is in the interval [9, 15].
  • The inner loop iterates three times when i is in the interval [16, 24].
  • The inner loop iterates four times when i is in the interval [25, 35].
  • The inner loop iterates five times when i is in the interval [36, 48].
  • The inner loop iterates six times when i is in the interval [49, 63].
  • The inner loop iterates seven times when i is in the interval [64, 80].
  • The inner loop iterates eight times when i is in the interval [81, 99]. (I had to go past 100 to verify the next entry.)
  • The inner loop iterates nine times when i is in the interval [100, 120].

These intervals can be expressed in terms of the worst-case iteration count: when the inner loop iterates k - 1 times, i lies in the interval

 [k^2, k * (k + 2)]
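
As a cross-check, a quick sketch like the one below (my own throwaway code; it counts the worst-case trips through the inner loop, i.e. assuming no factor is found early) reproduces exactly those intervals:

 using System;

 class IntervalCheck
 {
     // Worst-case number of inner-loop iterations for a given i:
     // how far j can advance before j > (i / j), ignoring the early break.
     static int MaxInnerIterations(int i)
     {
         int count = 0;
         for (int j = 2; j <= (i / j); j++)
             count++;
         return count;
     }

     static void Main()
     {
         int start = 2;
         int current = MaxInnerIterations(2);
         for (int i = 3; i <= 121; i++)
         {
             int c = MaxInnerIterations(i);
             if (c != current)
             {
                 Console.WriteLine("{0} iterations for i in [{1}, {2}]", current, start, i - 1);
                 start = i;
                 current = c;
             }
         }
     }
 }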

Therefore, we can do this:

[image: the total work written as a sum of floor(sqrt(i)) - 1 over i from 2 to 99]

Empirical verification:

[image: empirical iteration counts]

With a helpful WolframAlpha link:

 http://www.wolframalpha.com/input/?i=sum[+floor%28+i^%281%2F2%29%29+-+1+]+with+i+from+2+to+99. 
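
The same sum is easy to reproduce locally; here is a tiny sketch (again, illustrative only) that should print the same value as the WolframAlpha query:

 using System;

 class SumCheck
 {
     static void Main()
     {
         long total = 0;
         for (int i = 2; i <= 99; i++)
             total += (long)Math.Floor(Math.Sqrt(i)) - 1;   // worst-case inner iterations for this i

         Console.WriteLine(total);
     }
 }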

Formally, we can state the following:

[image: formal statement of the bound]


I looked at your code, and there is no n in it. The code does not depend on any n; it always runs in exactly the same amount of time. You can calculate how much time it takes, but it is always the same, constant, amount. As written, your code starting with for (i = 2; i < 100; ++i) runs in O(1).

So change the first line to

 for (i = 2; i < n; ++i) 

and now we have code that actually depends on n. The inner loop runs at most sqrt(i) iterations, which is less than sqrt(n). The outer loop runs n iterations. Thus, the running time is at most O(n * sqrt(n)) = O(n^1.5).
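
In other words, the suggested change looks roughly like this (a sketch; the method name PrintPrimes is mine, not something from the question):

 using System;

 class Primes
 {
     // The original loop with the hard-coded 100 replaced by a parameter n,
     // so the running time is now a function of n.
     static void PrintPrimes(int n)
     {
         for (int i = 2; i < n; ++i)
         {
             int j;
             for (j = 2; j <= (i / j); j++)
                 if ((i % j) == 0)
                     break;              // factor found, not prime
             if (j > (i / j))
                 Console.WriteLine("{0} is prime", i);
         }
     }

     static void Main()
     {
         PrintPrimes(100);   // behaves like the original code
     }
 }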

In practice it runs faster than that. The inner loop usually does not go all the way to sqrt(i), but only until it finds a divisor of i. Half of the numbers are divisible by 2 (one iteration). A third of the rest are divisible by 3 (two iterations). A fifth of the rest are divisible by 5 (four iterations). About n / ln n numbers are primes and take sqrt(n) iterations. About n / ln n more numbers are the product of two primes greater than n^(1/3) and take up to sqrt(n) iterations. The rest take fewer than n^(1/3) iterations.

So the code really runs in O(n^1.5 / ln n). You can improve on this by trial-dividing only by the primes up to sqrt(n) (a precomputed prime table), which should bring it down to about O(n^1.5 / ln^2 n).
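
A rough sketch of that prime-table idea (my own illustration, using a simple sieve to build the table; not the only way to do it):

 using System;
 using System.Collections.Generic;

 class PrimesWithTable
 {
     static void PrintPrimes(int n)
     {
         // Build a table of primes up to sqrt(n) with a basic sieve.
         int limit = (int)Math.Sqrt(n) + 1;
         var isComposite = new bool[limit + 1];
         var smallPrimes = new List<int>();
         for (int p = 2; p <= limit; p++)
         {
             if (isComposite[p]) continue;
             smallPrimes.Add(p);
             for (int m = p * p; m <= limit; m += p)
                 isComposite[m] = true;
         }

         // Trial-divide each i only by the primes up to sqrt(i).
         for (int i = 2; i < n; i++)
         {
             bool isPrime = true;
             foreach (int p in smallPrimes)
             {
                 if (p > i / p) break;            // p > sqrt(i): no factor found
                 if (i % p == 0) { isPrime = false; break; }
             }
             if (isPrime)
                 Console.WriteLine("{0} is prime", i);
         }
     }

     static void Main()
     {
         PrintPrimes(100);
     }
 }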

But in practice, you can bet that Console.WriteLine() takes far more time than checking whether a number is prime. If you insist on printing every prime, the O(n / ln n) output, with its very large constant factor, will dominate the running time until n becomes really large.

