A fast primality test with 100% certainty?

I use GMP (with MPIR) for arbitrary-size integer types. I also use its primality check function, which is based on the Miller-Rabin method, but it is only probabilistic, not exact. This is what I want to fix.
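For reference, this is roughly how I call it (a minimal sketch, not my exact code; mpz_probab_prime_p is GMP's Miller-Rabin-based test):

    #include <stdio.h>
    #include <gmp.h>

    int main(void) {
        mpz_t n;
        mpz_init_set_str(n, "18446744073709551253", 10);

        /* Returns 2 = definitely prime, 1 = probably prime, 0 = composite.
           The "probably" answer is exactly the uncertainty I want to eliminate. */
        int r = mpz_probab_prime_p(n, 25);
        printf("mpz_probab_prime_p: %d\n", r);

        mpz_clear(n);
        return 0;
    }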

I was able to confirm that the number 18446744073709551253 is prime by brute force, using trial division up to sqrt(n).
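For illustration, that brute-force check looks roughly like this (a simplified sketch):

    #include <stdbool.h>
    #include <stdint.h>

    /* Deterministic trial division: always correct, but O(sqrt(n)) time. */
    bool is_prime_trial(uint64_t n) {
        if (n < 2) return false;
        if (n % 2 == 0) return n == 2;
        /* d <= n / d is the overflow-safe form of d * d <= n */
        for (uint64_t d = 3; d <= n / d; d += 2)
            if (n % d == 0) return false;
        return true;
    }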

Is there a way to verify with 100% accuracy whether a large number is prime or not?

  • It must not use too much memory / storage space; several megabytes are permissible.

  • It should be faster than the sqrt method I use.

  • It should work for numbers of at least 64 bits.

  • Lastly, it should be 100% accurate, with no guesswork!

What are my options?

I could live with brute force for 64-bit numbers, but out of interest I would like something faster that also handles larger numbers. Besides, checking a single 64-bit number was quite slow: a full 43 seconds!

+4

4 answers

For very large numbers, the AKS primality test is a deterministic primality test that runs in time O(log^7.5 n · log log n), where n is the number of interest. This is exponentially faster than the O(√n) algorithm. However, the algorithm has large constant factors, so it is not practical until your numbers get quite large.

Hope this helps!

+5

As a general point, 100% certainty is impossible on a physical computer, since there is a small but nonzero probability that some component has failed invisibly and that the answer produced at the end is incorrect. Given that, you can run enough probabilistic Miller-Rabin tests that the probability of the number being composite is far smaller than the probability of your hardware failing. It is easy to test to a confidence level of 1 in 2^256:

    boolean isPrime(num)
        limit <- 256
        certainty <- 0
        while (certainty < limit)
            if (millerRabin(num) returns notPrime)
                return false
            else
                certainty <- certainty + 2
            endif
        endwhile
        return true
    end isPrime

This tests that the number is prime with certainty of 1 in 2^256. Each MR test adds a factor of four to the confidence. I have seen primes obtained this way called "industrial-grade primes": good enough for all practical purposes, but not quite up to theoretical mathematical certainty.
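With GMP the whole loop above collapses into a single call, assuming its mpz_probab_prime_p is the Miller-Rabin routine in play (each round has error probability at most 1/4, so 128 rounds give the 1-in-2^256 bound):

    #include <stdbool.h>
    #include <gmp.h>

    /* Probably prime to within 1 in 2^256: 128 Miller-Rabin rounds,
       each contributing an error factor of at most 1/4. */
    bool is_prime_high_confidence(const mpz_t n) {
        return mpz_probab_prime_p(n, 128) > 0;   /* 0 means composite */
    }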

+3

For small n, trial division works; the limit there is probably somewhere around 10^12. For somewhat larger n there are various studies (see the papers by Gerhard Jaeschke and Zhenxiang Zhang) that compute the smallest pseudoprimes for various collections of Miller-Rabin bases; those will take you to about 10^25. After that, things get difficult.
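One well-known consequence of those tables: using the first 12 primes as Miller-Rabin bases is deterministic for everything below roughly 3 * 10^23 (Sorenson and Webster), which covers all 64-bit numbers with a wide margin. A sketch, assuming a compiler with __uint128_t such as GCC or Clang:

    #include <stdbool.h>
    #include <stdint.h>

    /* (a * b) % m without overflow, via a 128-bit intermediate. */
    static uint64_t mulmod(uint64_t a, uint64_t b, uint64_t m) {
        return (uint64_t)((__uint128_t)a * b % m);
    }

    static uint64_t powmod(uint64_t b, uint64_t e, uint64_t m) {
        uint64_t r = 1;
        b %= m;
        while (e > 0) {
            if (e & 1) r = mulmod(r, b, m);
            b = mulmod(b, b, m);
            e >>= 1;
        }
        return r;
    }

    /* One Miller-Rabin round; false means n is definitely composite. */
    static bool mr_round(uint64_t n, uint64_t a, uint64_t d, unsigned s) {
        uint64_t x = powmod(a, d, n);
        if (x == 1 || x == n - 1) return true;
        for (unsigned i = 1; i < s; i++) {
            x = mulmod(x, x, n);
            if (x == n - 1) return true;
        }
        return false;
    }

    /* Deterministic for all 64-bit n: no strong pseudoprime to all of
       the first 12 prime bases exists below ~3 * 10^23. */
    bool is_prime_u64(uint64_t n) {
        static const uint64_t bases[] = {2,3,5,7,11,13,17,19,23,29,31,37};
        if (n < 2) return false;
        for (int i = 0; i < 12; i++)
            if (n % bases[i] == 0) return n == bases[i];
        uint64_t d = n - 1;
        unsigned s = 0;
        while ((d & 1) == 0) { d >>= 1; s++; }
        for (int i = 0; i < 12; i++)
            if (!mr_round(n, bases[i], d, s)) return false;
        return true;
    }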

The "big guns" of proving primitiveness are the APRCL method (it can be called Jacobi sums or Gaussian sums) and the ECPP method (based on elliptic curves). Both are complex, so you'll want to find an implementation, don't write your own. These methods can handle several hundred digits.

The AKS method runs in provably polynomial time and is easy to implement, but the constant of proportionality is very high, so it is not useful in practice.

If you can factor n-1, or even partially factor it, Pocklington's method can determine the primality of n. Pocklington's method itself is quick, but the factoring may not be.
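For a flavor of it, here is a sketch of the Pocklington criterion with GMP (a hypothetical helper, not the blog code): if a fully factored F divides n-1 with F > sqrt(n), and one witness a satisfies a^(n-1) = 1 (mod n) and gcd(a^((n-1)/q) - 1, n) = 1 for every prime q dividing F, then n is prime.

    #include <stdbool.h>
    #include <stddef.h>
    #include <gmp.h>

    /* Pocklington sketch: q[0..nq-1] are the distinct prime factors of a
       fully factored divisor F of n-1, where the caller has ensured
       F > sqrt(n). Returns true if witness a proves n prime; false only
       means "no proof with this witness", not "composite". */
    bool pocklington(const mpz_t n, const mpz_t *q, size_t nq, unsigned long a) {
        mpz_t n1, e, t, g;
        bool ok = true;
        mpz_inits(n1, e, t, g, NULL);
        mpz_sub_ui(n1, n, 1);

        mpz_set_ui(t, a);
        mpz_powm(t, t, n1, n);                     /* a^(n-1) mod n */
        if (mpz_cmp_ui(t, 1) != 0) ok = false;

        for (size_t i = 0; ok && i < nq; i++) {
            mpz_divexact(e, n1, q[i]);             /* (n-1)/q */
            mpz_set_ui(t, a);
            mpz_powm(t, t, e, n);                  /* a^((n-1)/q) mod n */
            mpz_sub_ui(t, t, 1);
            mpz_gcd(g, t, n);
            if (mpz_cmp_ui(g, 1) != 0) ok = false; /* gcd must be 1 */
        }

        mpz_clears(n1, e, t, g, NULL);
        return ok;
    }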

In all these cases, you want to be reasonably sure that a number is prime before trying to prove it. If your number is not prime, all of these methods will correctly determine that, but they will first waste a lot of time trying to prove that a composite number is prime.

I have implementations of AKS and Pocklington on my blog.

+1

The method of proof depends on the type of prime you are trying to prove (for example, Mersenne primes have special methods for proving primality that work only on them) and the size in decimal digits. If you are looking at hundreds of digits, then there is only one general solution, albeit an inadequate one: the AKS algorithm. It is provably faster than other primality-proving algorithms for sufficiently large primes, but by the time it becomes useful, it takes so long that it really is not worth the trouble.
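For instance, the Mersenne-specific method is the Lucas-Lehmer test, which is short enough to sketch with GMP (assuming p is an odd prime; an illustration, not a tuned implementation):

    #include <stdbool.h>
    #include <gmp.h>

    /* Lucas-Lehmer: for an odd prime p, M_p = 2^p - 1 is prime exactly
       when s(p-2) == 0 (mod M_p), with s(0) = 4, s(k+1) = s(k)^2 - 2. */
    bool mersenne_is_prime(unsigned long p) {
        mpz_t m, s;
        bool result;
        mpz_init(m);
        mpz_init_set_ui(s, 4);
        mpz_ui_pow_ui(m, 2, p);          /* m = 2^p     */
        mpz_sub_ui(m, m, 1);             /* m = 2^p - 1 */
        for (unsigned long i = 0; i < p - 2; i++) {
            mpz_mul(s, s, s);
            mpz_sub_ui(s, s, 2);
            mpz_mod(s, s, m);
        }
        result = (mpz_cmp_ui(s, 0) == 0);
        mpz_clear(m);
        mpz_clear(s);
        return result;
    }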

Primality proving for large numbers is still a problem that is not yet adequately solved. If it were, the EFF prizes would all be claimed, and cryptography would have some problems: not with its lists of primes, but with the methods used to find them.

I believe that in the near future there will be a new primality-proving algorithm that does not depend on a pre-generated list of primes up to the square root of n, and that does not brute-force through all the primes (and many non-primes) below the square root as witnesses to n's primality. This new algorithm will likely depend on mathematical concepts much simpler than those used in analytic number theory. There are patterns in the primes, that's for sure. Identifying those patterns is another matter entirely.

0
