Time complexity of the Babylonian method

What is the time complexity of the Babylonian method? Is it O(log n), where n is the number whose square root we want to find? If so, why?
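For reference, here is a minimal sketch of the method being asked about (my own illustration, not code from the question; n > 0 is assumed, and the tolerance parameter plays the role of the constant m used in the answers below):

```python
def babylonian_sqrt(n, tol=1e-12):
    """Approximate sqrt(n) by the Babylonian (Heron's) method.

    Assumes n > 0; tol is a relative tolerance on x**2.
    """
    x = float(n) if n >= 1 else 1.0  # any positive initial guess works
    while abs(x * x - n) > tol * n:
        x = (x + n / x) / 2  # average x with n/x (one Newton step)
    return x
```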

performance algorithm big-o time-complexity
2 answers

Looking at the Wikipedia section on the Babylonian method, we can see that the relative error at step k, e_k, satisfies e_k < 2^(-f(k)), where f(k) is defined recursively as follows:

f(1) = 2 and f(k + 1) = 2 * f(k) + 1 for k >= 1

By induction, f(k) = 3 * 2^(k - 1) - 1.
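The closed form can be checked against the recurrence numerically (a quick sanity check, not a substitute for the induction):

```python
def f_rec(k):
    # the recurrence from the answer: f(1) = 2, f(k + 1) = 2 * f(k) + 1
    return 2 if k == 1 else 2 * f_rec(k - 1) + 1

def f_closed(k):
    # the claimed closed form: f(k) = 3 * 2^(k - 1) - 1
    return 3 * 2 ** (k - 1) - 1

# the two agree for the first several values of k
for k in range(1, 12):
    assert f_rec(k) == f_closed(k)
```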

Let n be the input to our algorithm, which stops once we are sure that the total error is less than a constant m.

The error at step k, E_k, satisfies E_k = e_k * n.

Therefore, the algorithm terminates once e_k * n < m.

This happens when 2^f(k) > n / m, which is equivalent to f(k) > log_2(n / m).

That inequality holds when 2^(k - 1) > (log_2(n / m) + 1) / 3, which is true when k > log_2((log_2(n / m) + 1) / 3) + 1.

Therefore, the algorithm completes in O(log(log(n / m) + 1)) steps.

Using standard logarithm identities, you can show that log(log(x) + c) = O(log(log(x))).
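You can also see this numerically: the ratio of log(log(x) + c) to log(log(x)) tends to 1 as x grows (c = 1 in this sketch):

```python
import math

# Illustrate that log(log(x) + c) = O(log(log(x))):
# the ratio of the two expressions approaches 1 as x grows.
for x in (1e3, 1e6, 1e12, 1e100):
    ratio = math.log(math.log(x) + 1) / math.log(math.log(x))
    print(f"x = {x:g}: ratio = {ratio:.4f}")
```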

Therefore, the Babylonian method takes O(log(log(n / m))) steps.
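The doubly-logarithmic growth is easy to observe empirically. The sketch below counts iterations for increasing n; note the seed (a power of two near sqrt(n)) is a hypothetical choice of mine that keeps the initial relative error bounded, as the analysis above assumes — with a poor seed such as x0 = n, extra O(log n) halving steps appear first:

```python
def iterations(n, m=1e-12):
    """Count Babylonian iterations until the relative error is below m.

    n is assumed to be a positive integer (for bit_length).
    The power-of-two seed near sqrt(n) bounds the initial relative error.
    """
    x = 2.0 ** ((n.bit_length() + 1) // 2)
    k = 0
    while abs(x * x - n) > m * n:
        x = (x + n / x) / 2
        k += 1
    return k

# iteration counts grow extremely slowly with n (log log behaviour)
for n in (10, 10 ** 6, 10 ** 12):
    print(n, iterations(n))
```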


I think the number of steps will be O(log_2(log_E0(m / n))), where E0 is the error at iteration zero (it depends on the initial guess), m is the error allowed in the answer, and n is the input to the algorithm. You can refer to this for a full explanation of the recursive error function: http://www.math.harvard.edu/library/sternberg/slides/lec1.pdf

