Swift LCM Algorithm for Doubles

I need to turn an array of Doubles into Ints while keeping their ratios the same and as simple as possible. For example, [0.7, 0, -0.7] should become [1, 0, -1], and [24, 12, 0] should become [2, 1, 0]. I'm not sure whether this comes down to finding the least common multiple of the doubles, and if so, how that would be done.

1 answer

First of all, for floating point numbers, there is no GCD or LCM. You first need to convert the input to rational numbers.

It is not as simple as it seems, because a decimal fraction such as 0.7 cannot be represented exactly as a binary floating-point number and is actually stored as 0.69999999999999996 in a Double. So it is not obvious how to get from there to 7/10.
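
You can see this yourself by printing the stored value with more digits than the default description shows (a small illustrative check, assuming Foundation is available for String(format:)):

import Foundation

print(String(format: "%.17f", 0.7))   // 0.69999999999999996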

Therefore, a precision has to be specified. Then you can use continued fractions to efficiently create a (finite or infinite) sequence of fractions h_n/k_n that are arbitrarily good approximations to a given real number x.

Here is a translation of this JavaScript implementation into Swift:

import Foundation

typealias Rational = (num: Int, den: Int)

func rationalApproximationOf(_ x0: Double, withPrecision eps: Double = 1.0E-6) -> Rational {
    var x = x0
    var a = floor(x)
    // (h, k) is the current convergent h/k, (h1, k1) the previous one:
    var (h1, k1, h, k) = (1, 0, Int(a), 1)

    // Iterate until the remaining fractional part is small enough
    // for the requested precision:
    while x - a > eps * Double(k) * Double(k) {
        x = 1.0/(x - a)
        a = floor(x)
        (h1, k1, h, k) = (h, k, h1 + Int(a) * h, k1 + Int(a) * k)
    }
    return (h, k)
}

Examples:

rationalApproximationOf(0.7)      // (7, 10) i.e. 7/10
rationalApproximationOf(0.142857) // (1, 7)  i.e. 1/7

I set the default precision to 1.0E-6, but you can customize this to your needs.
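
For illustration, the precision determines which convergent of the continued fraction is returned; with π, the classic approximations 22/7 and 355/113 should come out (my own example, not from the linked implementation):

rationalApproximationOf(3.141592653589793, withPrecision: 1.0E-2)  // (22, 7)
rationalApproximationOf(3.141592653589793)                         // (355, 113)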

Next you need the GCD (greatest common divisor) and LCM (least common multiple) functions. In Swift:

// GCD of two numbers:
func gcd(_ a: Int, _ b: Int) -> Int {
    var (a, b) = (a, b)
    while b != 0 {
        (a, b) = (b, a % b)
    }
    return abs(a)
}

// GCD of a vector of numbers:
func gcd(_ vector: [Int]) -> Int {
    return vector.reduce(0) { gcd($0, $1) }
}

// LCM of two numbers:
func lcm(_ a: Int, _ b: Int) -> Int {
    return (a / gcd(a, b)) * b
}

// LCM of a vector of numbers:
func lcm(_ vector: [Int]) -> Int {
    return vector.reduce(1) { lcm($0, $1) }
}
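
A quick sanity check of the vector versions (values chosen here just for illustration):

gcd([12, 18, 24])   // 6
lcm([4, 6, 10])     // 60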

func simplifyRatios(_ numbers: [Double]) -> [Int] {
    // Normalize the input vector so that the maximum absolute value is 1.0,
    // and compute rational approximations of all components:
    let maximum = numbers.map { abs($0) }.max()!
    let rats = numbers.map { rationalApproximationOf($0/maximum) }

    // Multiply all rational numbers by the LCM of the denominators:
    let commonDenominator = lcm(rats.map { $0.den })
    let numerators = rats.map { $0.num * commonDenominator / $0.den }

    // Divide the numerators by the GCD of all numerators:
    let commonNumerator = gcd(numerators)
    return numerators.map { $0 / commonNumerator }
}

Examples:

simplifyRatios([0.7, 0, -0.7])   // [1, 0, -1]
simplifyRatios([24, 12, 0])      // [2, 1, 0]
simplifyRatios([1.3, 0.26, 0.9]) // [65, 13, 45]
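
For the last example, note that 1.3 : 0.26 : 0.9 = 130 : 26 : 90, and dividing by the common factor 2 gives 65 : 13 : 45.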