I need to solve a system of N linear equations as an intermediate step in a numerical optimizer. AFAIK reasonably simple algorithms for doing so exactly are O(N^3) (though I saw a horribly complicated one in some math paper that could do it in something like O(N^2.8), with a huge constant). In some cases N is huge, on the order of several thousand.
Is there any good way to get a decent approximate solution to a system of linear equations in less than O(N^3)?
Edit:
Here are some more details if it helps at all.
My matrix is symmetric and not sparse.
It's the second derivative matrix from Newton-Raphson. I'm trying to optimize something in a 2000-dimensional space.
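For reference, what I'm doing now is just a plain dense direct solve of the Newton step. A minimal sketch of that step (the names `hessian` and `gradient` are only illustrative):

```python
import numpy as np

# Illustrative Newton-Raphson step: solve H * delta = -g for the update delta.
# np.linalg.solve does a dense factorization, which is the O(N^3) cost
# I'm trying to avoid when N is in the thousands.
def newton_step(hessian: np.ndarray, gradient: np.ndarray) -> np.ndarray:
    return np.linalg.solve(hessian, -gradient)

# Stand-in data with N = 2000: a random symmetric matrix and gradient.
N = 2000
A = np.random.rand(N, N)
hessian = (A + A.T) / 2
gradient = np.random.rand(N)
delta = newton_step(hessian, gradient)
```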
Yes, if the coefficient matrix is sparse. For instance, for a tridiagonal matrix there's a method that runs in O(N) (I know it by a Bulgarian name I'm not sure how to translate; in English it's usually called the Thomas algorithm). There are other algorithms that are still O(N^3) but perform incredibly well if the matrix satisfies some invariant they require (sparse, diagonally dominant, triangular, etc.).
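To illustrate the tridiagonal case, here's a minimal sketch of that O(N) forward-sweep/back-substitution solve (the Thomas algorithm). The diagonals are plain Python lists, and it assumes the matrix is well-behaved enough that no pivoting is needed:

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system Ax = d in O(N).

    a: sub-diagonal    (length n, a[0] unused)
    b: main diagonal   (length n)
    c: super-diagonal  (length n, c[n-1] unused)
    d: right-hand side (length n)
    No pivoting, so the matrix should be diagonally dominant (or otherwise
    well-behaved) for this to be stable.
    """
    n = len(b)
    cp = [0.0] * n   # modified super-diagonal coefficients
    dp = [0.0] * n   # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # Forward sweep: eliminate the sub-diagonal in a single pass.
    for i in range(1, n):
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    # Back substitution.
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Tiny usage example: a 4x4 tridiagonal system whose solution is [1, 1, 1, 1].
a = [0.0, 1.0, 1.0, 1.0]   # sub-diagonal
b = [4.0, 4.0, 4.0, 4.0]   # main diagonal
c = [1.0, 1.0, 1.0, 0.0]   # super-diagonal
d = [5.0, 6.0, 6.0, 5.0]   # right-hand side
print(thomas_solve(a, b, c, d))
```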
If you're stuck with a specific method because of your matrix's invariant, the only way to speed things up further is to go multi-threaded.
Try this search.