The fastest numerical solution of a real cubic polynomial?

R question: I'm looking for the fastest way to numerically solve a bunch of arbitrary cubics which are known to have real coefficients and three real roots. The polyroot function in R is reported to use the Jenkins-Traub algorithm (TOMS Algorithm 419) for complex polynomials, but for real polynomials the authors refer to their earlier work. What are the faster options for real cubics, or more generally for real polynomials?
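For concreteness, the baseline I am comparing against is base R's polyroot; a minimal call on a single cubic looks like this (the coefficients are just an example):

 # roots of p(t) = -6 + 11*t - 6*t^2 + t^3, coefficients in increasing degree
 r <- polyroot(c(-6, 11, -6, 1))
 Re(r)[abs(Im(r)) < 1e-8]   # keep the numerically real roots: 1, 2, 3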

+2
7 answers

Expanding on Arietta's answer above:

> a <- c(1, 3, -4)
> m <- matrix(c(0, 0, -a[1], 1, 0, -a[2], 0, 1, -a[3]), byrow = TRUE, nrow = 3)
> roots <- eigen(m, symmetric = FALSE, only.values = TRUE)$values

Whether this is faster or slower than using the cubic solver in the GSL package (as suggested above) is a matter of benchmarking on your system.
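For example, a rough base-R timing sketch along those lines, with polyroot as the reference solver and randomly generated coefficients standing in for real data (the random cubics need not all have three real roots; that does not affect the timing):

 set.seed(1)
 coefs <- matrix(rnorm(3e4), ncol = 3)   # 10,000 monic cubics; columns are c0, c1, c2
 eig_roots <- function(a)
   eigen(matrix(c(0, 0, -a[1], 1, 0, -a[2], 0, 1, -a[3]), byrow = TRUE, nrow = 3),
         symmetric = FALSE, only.values = TRUE)$values
 system.time(apply(coefs, 1, eig_roots))
 system.time(apply(coefs, 1, function(a) polyroot(c(a, 1))))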

+2

The standard way to solve this numerically, many times, in a reliable and stable manner is to: (1) form the companion matrix, and (2) find the eigenvalues of that companion matrix.

You might think this makes the problem harder than the original one, but this is how the solution is implemented in most production code (for example, MATLAB).

For the polynomial:

 p(t) = c0 + c1 * t + c2 * t^2 + t^3 

the companion matrix is:

 [[0 0 -c0],[1 0 -c1],[0 1 -c2]] 

Find the eigenvalues of this matrix; they correspond to the roots of the original polynomial.

To do this very quickly, download the eigenvalue routines from LAPACK, compile them, and link them to your code. Do this in parallel if you have very many (say, on the order of a million) sets of coefficients.

Note that the coefficient of t^3 is one; if this is not the case for your polynomials, you will need to divide everything through by that coefficient first and then proceed.
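Putting the recipe above into R, with the leading-coefficient division included (cubic_roots is just an illustrative helper name):

 # roots of a0 + a1*t + a2*t^2 + a3*t^3 via the companion matrix
 cubic_roots <- function(a0, a1, a2, a3) {
   c0 <- a0 / a3; c1 <- a1 / a3; c2 <- a2 / a3   # divide through so the cubic is monic
   m <- matrix(c(0, 0, -c0,
                 1, 0, -c1,
                 0, 1, -c2), byrow = TRUE, nrow = 3)
   eigen(m, symmetric = FALSE, only.values = TRUE)$values
 }
 cubic_roots(6, -11, 6, -1)   # 6 - 11t + 6t^2 - t^3 has roots 1, 2, 3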

Good luck.

Edit: Numpy and Octave also rely on this methodology for computing the roots of polynomials. See, for example, this link.

+6

The fastest way (that I know of) to find real solutions to a system of arbitrary polynomials in n variables is polyhedral homotopy. A detailed explanation is probably beyond the scope of a StackOverflow answer, but essentially it is a path-following algorithm that exploits the structure of each equation using toric geometry. Googling it will turn up a number of papers.

Perhaps this question is better suited for MathOverflow?

+4

Do you need all three roots, or just one? If only one, I would have thought Newton's method would work fine. If all three, it can be problematic in cases where two of them are close to each other.
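For the one-root case, a bare-bones Newton iteration in R might look like the sketch below (the starting point and tolerance are arbitrary, and there is no safeguard against a vanishing derivative):

 # one root of p(t) = c0 + c1*t + c2*t^2 + t^3 by Newton's method
 newton_cubic <- function(c0, c1, c2, x = 0, tol = 1e-12, maxit = 100) {
   for (i in seq_len(maxit)) {
     p  <- c0 + x * (c1 + x * (c2 + x))   # Horner evaluation of p(x)
     dp <- c1 + x * (2 * c2 + 3 * x)      # p'(x)
     step <- p / dp
     x <- x - step
     if (abs(step) < tol) break
   }
   x
 }
 newton_cubic(-6, 11, -6, x = 0.5)   # converges to the root at 1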

+1

1) Solve for the roots of the derivative P' to separate your three roots (it is just a quadratic; look up how to do this robustly). Call these roots a and b (with a < b).

2) For the middle root, use a few bisection steps between a and b, and when you are close enough, finish with Newton's method.

3) For the smallest and largest roots, "hunt" for a bracket outward (an R sketch follows the list). For the largest root:

  • Start with x0 = b, x1 = b + (b - a) * lambda, where lambda is a moderate factor (say 1.6)
  • Iterate x_n = b + (x_{n-1} - a) * lambda until P(x_n) and P(b) have opposite signs
  • Then run bisection + Newton between x_{n-1} and x_n
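A rough R sketch of the hunting step for the largest root, where base R's uniroot stands in for the final bisection-plus-Newton polish and lambda = 1.6 as suggested (max_root is a made-up helper name):

 # p(t) = c0 + c1*t + c2*t^2 + t^3 with three real roots; a < b are the roots of p'
 p <- function(x, c0, c1, c2) c0 + x * (c1 + x * (c2 + x))
 max_root <- function(c0, c1, c2, a, b, lambda = 1.6) {
   x <- b + (b - a) * lambda
   while (sign(p(x, c0, c1, c2)) == sign(p(b, c0, c1, c2)))
     x <- b + (x - a) * lambda                 # widen the bracket geometrically
   uniroot(p, c(b, x), c0 = c0, c1 = c1, c2 = c2)$root
 }
 # (t-1)(t-2)(t-3): c0 = -6, c1 = 11, c2 = -6, critical points near 1.42 and 2.58
 max_root(-6, 11, -6, a = 1.42, b = 2.58)   # ~3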
+1

General methods are available: Newton's method, the bisection method, the secant method, fixed-point iteration, etc. Google any of them.

If, on the other hand, you have a nonlinear system (for example, a system of N polynomial equations in N unknowns), you can use a method such as the higher-dimensional Newton's method.
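As an illustration, a minimal Newton iteration for a square nonlinear system in R, using a forward-difference Jacobian (newton_system and the example system are made up for this sketch):

 # solve fn(x) = 0 for a square system by Newton's method with a numerical Jacobian
 newton_system <- function(fn, x, tol = 1e-10, maxit = 50, h = 1e-7) {
   for (i in seq_len(maxit)) {
     fx <- fn(x)
     if (max(abs(fx)) < tol) break
     J <- sapply(seq_along(x), function(j) {
       xh <- x; xh[j] <- xh[j] + h
       (fn(xh) - fx) / h                 # forward-difference column j of the Jacobian
     })
     x <- x - solve(J, fx)
   }
   x
 }
 # example: x^2 + y^2 = 1 and x = y  ->  (1/sqrt(2), 1/sqrt(2))
 newton_system(function(v) c(v[1]^2 + v[2]^2 - 1, v[1] - v[2]), c(1, 0.5))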

0

Have you tried looking at the gsl package? http://cran.r-project.org/web/packages/gsl/index.html

0

Source: https://habr.com/ru/post/1311876/

