Looking at the Wikipedia section on the Babylonian method, we can see that the relative error at step k, e_k, satisfies e_k < 2^(-f(k)), where f(k) is defined recursively as follows:
f(1) = 2 and f(k+1) = 2*f(k) + 1 for k >= 1
By induction, f(k) = 3*2^(k-1) - 1.
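The induction step is a one-line check (the base case f(1) = 3*2^0 - 1 = 2 is given above):

```latex
f(k+1) = 2f(k) + 1 = 2\bigl(3 \cdot 2^{k-1} - 1\bigr) + 1 = 3 \cdot 2^{k} - 1
```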
Let n be the input to our algorithm, which stops once we are sure that the total error is less than a constant m.
The absolute error at step k, E_k, satisfies E_k = e_k * n.
Therefore, the algorithm completes once e_k * n < m.
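For concreteness, here is a minimal Python sketch of the iteration with this kind of stopping rule. The function name, the seed x = n, and the use of the change between successive iterates as a computable stand-in for the error test e_k * n < m are my own choices for illustration, not part of the analysis above:

```python
def babylonian_sqrt(n, m):
    """Approximate sqrt(n) with the Babylonian iteration x <- (x + n/x) / 2.

    Stops when successive iterates differ by less than m, a computable
    stand-in for the absolute-error test E_k = e_k * n < m described above.
    Assumes n > 0 and m > 0.
    """
    x = float(n)  # crude initial guess; a better seed only reduces the step count
    steps = 0
    while True:
        next_x = (x + n / x) / 2.0
        steps += 1
        if abs(next_x - x) < m:
            return next_x, steps
        x = next_x

root, steps = babylonian_sqrt(2, 1e-10)
print(root, steps)  # ~1.4142135623..., reached in a handful of iterations
```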
Since e_k < 2^(-f(k)), the condition e_k * n < m is guaranteed once 2^(-f(k)) * n < m, i.e. once 2^(f(k)) > n/m, which is equivalent to f(k) > log2(n/m).
Since f(k) = 3*2^(k-1) - 1, this holds when 2^(k-1) > (log2(n/m) + 1)/3, which is true when k > log2((log2(n/m) + 1)/3) + 1.
Therefore, the algorithm completes in O(log(log(n/m) + 1)) steps.
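To get a feel for how slowly that bound grows, here is a quick evaluation of the step estimate for a few sample inputs (the function name and the sample values of n and m are arbitrary illustrations):

```python
from math import ceil, log2

def predicted_steps(n, m):
    """Estimate the smallest k with k > log2((log2(n/m) + 1) / 3) + 1.

    Written as log2(n) - log2(m) so that very large n does not overflow a float.
    """
    return ceil(log2((log2(n) - log2(m) + 1) / 3) + 1)

for n in (10**6, 10**60, 10**300):
    print(n, predicted_steps(n, 1e-10))
# About 6, 8, and 10 steps respectively: doubly logarithmic growth in n/m.
```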
Using this logarithmic summation formula, you can show that log(log(x) + c) = O(log(log(x))).
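That simplification is just the observation that, once x is large enough, the additive constant inside the outer log is absorbed:

```latex
\log(\log x + c) \le \log(2\log x) = \log 2 + \log\log x = O(\log\log x)
\qquad \text{whenever } \log x \ge c .
```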
Therefore, the Babylonian method takes O(log(log(n/m))) steps.