Fast, approximate solution to linear equations?

I need to solve a system of N linear equations as an intermediate step in a numerical optimizer. AFAIK reasonably simple algorithms for doing so exactly are O(N^3) (though I saw a horribly complicated one in some math paper, likely Strassen's, that could do it in something like O(N^2.8) with a huge constant). In some cases N is huge, e.g. several thousand.

Is there any good way to get a decent approximate solution to a system of linear equations in less than O(N^3)?

Edit:

Here are some more details if it helps at all.

  1. My matrix is symmetric and not sparse.

  2. It's the Hessian (second-derivative matrix) from Newton-Raphson; I'm trying to optimize something in a 2000-dimensional space. A sketch of the step in question follows below.
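
For concreteness, a minimal NumPy sketch of the bottleneck (the hess and grad callables are hypothetical stand-ins for my actual derivative code):

    import numpy as np

    def newton_step(x, hess, grad):
        # hess(x) returns the N x N symmetric Hessian (N ~ 2000 here),
        # grad(x) the gradient; both are hypothetical stand-ins.
        H = hess(x)
        g = grad(x)
        # Exact dense solve: O(N^3). This is the part I'd like to approximate.
        return x - np.linalg.solve(H, g)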

There are 3 answers below.

Answer from pnt:

Yes, if the matrix you get from the coefficients is sparse. There's the "sweep" method (a rough translation of its Bulgarian name; in English it's usually called the Thomas algorithm) if you have a tridiagonal matrix, for instance, which runs in O(N); see the sketch below. There are other algorithms that are still O(N^3) but achieve excellent results if the matrix satisfies some invariant they require (sparse, diagonally dominant, triangular, etc.).
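
A minimal NumPy sketch of the Thomas algorithm, assuming the sub-, main, and super-diagonals arrive as separate arrays a, b, c (lengths N-1, N, N-1) with right-hand side d:

    import numpy as np

    def thomas(a, b, c, d):
        # Forward sweep: eliminate the sub-diagonal.
        n = len(d)
        cp = np.empty(n - 1)
        dp = np.empty(n)
        cp[0] = c[0] / b[0]
        dp[0] = d[0] / b[0]
        for i in range(1, n):
            denom = b[i] - a[i - 1] * cp[i - 1]
            if i < n - 1:
                cp[i] = c[i] / denom
            dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / denom
        # Back-substitution.
        x = np.empty(n)
        x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

Both sweeps are O(N), versus O(N^3) for a dense solve.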

If you're stuck with a specific method dictated by your invariant, the only way to speed things up further is to go multi-threaded.

Answer from toochin:

There are iterative methods like Jacobi, Gauss-Seidel, conjugate gradient (CG), GMRES, etc.
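
For illustration, a minimal Jacobi iteration in NumPy (a sketch, assuming a dense A that satisfies a convergence condition such as strict diagonal dominance):

    import numpy as np

    def jacobi(A, b, tol=1e-8, max_iter=1000):
        # Split A into its diagonal D and off-diagonal remainder R.
        D = np.diag(A)
        R = A - np.diagflat(D)
        x = np.zeros_like(b, dtype=float)
        for _ in range(max_iter):
            # Each sweep costs one matrix-vector product: O(N^2), not O(N^3).
            x_new = (b - R @ x) / D
            if np.linalg.norm(x_new - x) < tol:
                return x_new
            x = x_new
        return x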

Answer from celion:

For a symmetric (and positive definite) matrix, the conjugate gradient method is simple to implement and will beat most other iterative methods (e.g. Gauss-Seidel, SOR). The main loop consists of a matrix-vector multiply and a few other vector operations.
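
A sketch of that loop in NumPy (assuming A is symmetric positive definite; the function and parameter names are my own, not from any particular library):

    import numpy as np

    def conjugate_gradient(A, b, tol=1e-8, max_iter=None):
        n = len(b)
        max_iter = max_iter or n
        x = np.zeros(n)
        r = b - A @ x          # residual
        p = r.copy()           # search direction
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p         # the one matrix-vector multiply per iteration
            alpha = rs_old / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs_old) * p
            rs_old = rs_new
        return x

In exact arithmetic CG converges in at most N iterations, but in practice you stop much earlier, once the residual is small enough.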

Once you've got it working, you can use preconditioning to improve the convergence even more.
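
For example, SciPy's built-in CG accepts a preconditioner; a simple Jacobi (diagonal) one, as a sketch assuming A is a dense array or SciPy sparse matrix:

    import numpy as np
    from scipy.sparse.linalg import cg, LinearOperator

    def solve_pcg(A, b):
        # Jacobi preconditioner: M approximates inv(A) using only its diagonal.
        d = np.asarray(A.diagonal()).ravel()   # works for dense and sparse A
        M = LinearOperator((len(b), len(b)), matvec=lambda v: v / d)
        x, info = cg(A, b, M=M)                # info == 0 means it converged
        return x

For a badly conditioned Hessian you'd want something stronger (e.g. incomplete Cholesky), but even diagonal scaling often cuts the iteration count noticeably.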