Roots of a polynomial mod a prime

I am looking for a fast algorithm for finding the roots of a univariate polynomial over a finite field.

That is, if f = a_0 + a_1 x + a_2 x^2 + ... + a_n x^n with n > 0, then an algorithm that finds all r < p satisfying f(r) = 0 mod p, for a given prime p.

I found the Chien search algorithm (https://en.wikipedia.org/wiki/Chien_search), but I can't imagine it being fast for primes over 20 bits. Does anyone have experience with the Chien search algorithm, or know a faster way? Is there a sympy module for this?

This is pretty well studied, as mcdowella's comment points out. Here is how the randomized Cantor-Zassenhaus algorithm works, specialized to the case where you want to find the roots of a polynomial rather than a more general factorization.

Note that in the ring of polynomials with coefficients mod p, the product x(x-1)(x-2)...(x-p+1) has all possible roots and equals x^p - x, by Fermat's little theorem and unique factorization in this ring.
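
As a quick sanity check (a small Python sketch, not part of the answer), Fermat's little theorem indeed makes every residue a root of x^p - x:

```python
# Every r in 0..p-1 is a root of x^p - x mod p, so over (Z/p)[x]
# x^p - x factors as x(x-1)(x-2)...(x-(p-1)).
p = 11  # any prime

def is_root_of_xp_minus_x(r, p):
    # evaluate r^p - r mod p
    return (pow(r, p, p) - r) % p == 0

print(all(is_root_of_xp_minus_x(r, p) for r in range(p)))  # True
```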

Set g = GCD(f, x^p - x). Using the Euclidean algorithm to compute the GCD of two polynomials is fast in general, taking a number of division steps at most the larger degree. It does not require factoring the polynomials. g has the same roots as f in the field, with no repeated factors.
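
The GCD step can be sketched as follows (pure Python with hypothetical helper names, not the answer's code; polynomials are coefficient lists, lowest degree first):

```python
def poly_trim(a):
    # drop zero coefficients at the high-degree end, in place
    while a and a[-1] == 0:
        a.pop()
    return a

def poly_rem(a, b, p):
    # remainder of a divided by b, coefficients mod p; b must be nonzero
    a = [c % p for c in a]
    inv = pow(b[-1], -1, p)  # inverse of b's leading coefficient (Python 3.8+)
    while len(a) >= len(b):
        q = a[-1] * inv % p
        shift = len(a) - len(b)
        for i, c in enumerate(b):
            a[i + shift] = (a[i + shift] - q * c) % p
        poly_trim(a)
    return a

def poly_gcd(a, b, p):
    # Euclidean algorithm; the result is normalized to be monic
    while b:
        a, b = b, poly_rem(a, b, p)
    inv = pow(a[-1], -1, p)
    return [c * inv % p for c in a]

# gcd(x^2 - 1, x - 1) over GF(7); -1 is written as 6
print(poly_gcd([6, 0, 1], [6, 1], 7))  # [6, 1], i.e. x - 1
```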

Because of the special form of x^p - x, with only two nonzero terms, the first step of the Euclidean algorithm can be done by repeated squaring: about 2 log_2(p) multiplications involving only polynomials of degree at most twice the degree of f, with coefficients mod p. We can compute x mod f, x^2 mod f, x^4 mod f, etc., and then multiply together the powers corresponding to the nonzero places in the binary expansion of p to compute x^p mod f, and finally subtract x.
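
The square-and-multiply computation of x^p mod f might look like this (again a pure-Python sketch with hypothetical names, repeating the remainder helper so the block is self-contained):

```python
def poly_trim(a):
    # drop zero coefficients at the high-degree end, in place
    while a and a[-1] == 0:
        a.pop()
    return a

def poly_rem(a, b, p):
    # remainder of a divided by b, coefficients mod p; b must be nonzero
    a = [c % p for c in a]
    inv = pow(b[-1], -1, p)
    while len(a) >= len(b):
        q = a[-1] * inv % p
        shift = len(a) - len(b)
        for i, c in enumerate(b):
            a[i + shift] = (a[i + shift] - q * c) % p
        poly_trim(a)
    return a

def poly_mulmod(a, b, f, p):
    # (a * b) mod f with coefficients mod p; lists are lowest degree first
    r = [0] * max(len(a) + len(b) - 1, 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            r[i + j] = (r[i + j] + ai * bj) % p
    return poly_rem(r, f, p)

def x_pow_p_mod(f, p):
    # x^p mod f: square x, x^2, x^4, ... and multiply together the powers
    # at the nonzero bits of p -- about 2*log2(p) products in total
    result, base, e = [1], [0, 1], p
    while e:
        if e & 1:
            result = poly_mulmod(result, base, f, p)
        base = poly_mulmod(base, base, f, p)
        e >>= 1
    return result

# x^5 mod (x^2 + 1) over GF(5): x^2 = -1, x^4 = 1, so x^5 = x
print(x_pow_p_mod([1, 0, 1], 5))  # [0, 1], i.e. x
```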

Then repeat the following: choose a random d in Z/p. Compute the GCD of g with r_d = (x+d)^((p-1)/2) - 1, which we can again compute quickly using the Euclidean algorithm, with repeated squaring for the first step. If the degree of this GCD is strictly between 0 and the degree of g, we have found a nontrivial factor of g, and we can recurse until we find the linear factors, hence the roots, of g and therefore of f.

How often does this work? r_d has as its roots the numbers r such that r + d is a nonzero square mod p. Consider two distinct roots of g, a and b, so that (x-a) and (x-b) are factors of g. If a + d is a nonzero square but b + d is not, then (x-a) is a common factor of g and r_d while (x-b) is not, which means GCD(g, r_d) is a nontrivial factor of g. Similarly, if b + d is a nonzero square but a + d is not, then (x-b) is a common factor of g and r_d and (x-a) is not. By some number theory, one case or the other happens for close to half of the possible choices of d, which means that on average it takes a constant number of choices of d before finding a nontrivial factor of g, indeed one separating (x-a) from (x-b).
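
Putting the pieces together, the whole procedure might look like this (a self-contained pure-Python sketch with hypothetical names, for odd primes p; not the answer's own code):

```python
import random

def poly_trim(a):
    while a and a[-1] == 0:
        a.pop()
    return a

def poly_divmod(a, b, p):
    # quotient and remainder of a / b, coefficients mod p; b nonzero
    a = [c % p for c in a]
    q = [0] * max(len(a) - len(b) + 1, 1)
    inv = pow(b[-1], -1, p)
    while len(a) >= len(b):
        c = a[-1] * inv % p
        shift = len(a) - len(b)
        q[shift] = c
        for i, bi in enumerate(b):
            a[i + shift] = (a[i + shift] - c * bi) % p
        poly_trim(a)
    return poly_trim(q), a

def poly_gcd(a, b, p):
    while b:
        a, b = b, poly_divmod(a, b, p)[1]
    inv = pow(a[-1], -1, p)
    return [c * inv % p for c in a]

def poly_mulmod(a, b, f, p):
    r = [0] * max(len(a) + len(b) - 1, 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            r[i + j] = (r[i + j] + ai * bj) % p
    return poly_divmod(r, f, p)[1]

def poly_powmod(a, e, f, p):
    result = [1]
    while e:
        if e & 1:
            result = poly_mulmod(result, a, f, p)
        a = poly_mulmod(a, a, f, p)
        e >>= 1
    return result

def roots_mod_p(f, p):
    # all r in 0..p-1 with f(r) = 0 mod p; f is a coefficient list,
    # lowest degree first, of degree >= 1; p an odd prime
    # step 1: g = gcd(f, x^p - x) keeps exactly the distinct roots
    t = poly_powmod([0, 1], p, f, p)      # x^p mod f
    t = t + [0] * (2 - len(t))
    t[1] = (t[1] - 1) % p                 # subtract x
    g = poly_gcd(f, poly_trim(t), p)
    # step 2: split g into linear factors with random gcds
    roots, stack = set(), [g]
    while stack:
        h = stack.pop()
        if len(h) <= 1:                   # constant: no roots
            continue
        if len(h) == 2:                   # monic linear x + c: root is -c
            roots.add(-h[0] % p)
            continue
        while True:                       # about 2 tries per split on average
            d = random.randrange(p)
            r = poly_powmod([d, 1], (p - 1) // 2, h, p)
            r = r + [0] * (1 - len(r))
            r[0] = (r[0] - 1) % p         # r = (x+d)^((p-1)/2) - 1 mod h
            k = poly_gcd(h, poly_trim(r), p)
            if 1 < len(k) < len(h):       # nontrivial factor found
                break
        stack.append(k)
        stack.append(poly_divmod(h, k, p)[0])
    return roots

print(sorted(roots_mod_p([6, 0, 1], 7)))  # roots of x^2 - 1 mod 7 -> [1, 6]
```

For large p one would also replace the schoolbook polynomial multiplication and GCD with asymptotically faster versions, but the structure of the algorithm stays the same.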


The answers above are good, but I think I found a nice method for finding roots modulo any number: it is based on lattices. Let r ≤ R be a root of f mod p. We need to find another polynomial h(x) whose coefficients are small and which also has r as a root. The lattice method finds such an h: first we build a lattice basis from the polynomial, then with the LLL algorithm we find a "shortest vector", which corresponds to a polynomial having r as a root over the integers, without the modulus p. In effect, this removes the modulo p.

For a more detailed explanation, see Coppersmith, D., "Finding Small Solutions to Small Degree Polynomials", in Cryptography and Lattices (CaLC 2001).
