I need to solve a system of N linear equations as an intermediate step in a numerical optimizer. AFAIK reasonably simple algorithms for solving such a system exactly (e.g. Gaussian elimination) are O(N^3) (though I saw a horribly complicated one in some math paper that could do it in something like O(N^2.8) with a huge constant). In some cases N is huge, e.g. several thousand.
Is there any good way to get a decent approximate solution to a system of linear equations in less than O(N^3)?
Edit:
Here are some more details if it helps at all.
My matrix is symmetric and dense (not sparse).
It's the Hessian (the matrix of second derivatives) from a Newton-Raphson step. I'm trying to optimize something in a 2000-dimensional space.
For a symmetric positive-definite matrix (which a Newton-Raphson Hessian is near a minimum), the conjugate gradient method is simple to implement and will beat most other iterative methods (e.g. Gauss-Seidel, SOR). The main loop consists of one matrix-vector multiply plus a few vector operations, so each iteration costs O(N^2) for a dense matrix; if you stop as soon as the residual is small enough, k iterations cost O(kN^2), which is well under O(N^3) whenever k << N.
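Here's a minimal sketch of that loop in Python/NumPy (the function name and tolerance default are mine, and it assumes `A` is symmetric positive definite):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=None):
    """Approximately solve A x = b for symmetric positive-definite A.

    Each iteration does one matrix-vector product (O(N^2) for dense A)
    plus a few O(N) vector operations.
    """
    n = b.shape[0]
    if max_iter is None:
        max_iter = n                 # CG converges in at most n steps in exact arithmetic
    x = np.zeros(n)
    r = b - A @ x                    # residual
    p = r.copy()                     # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)    # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:    # stop early for an approximate solution
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x
```

In practice you'd stop far earlier than n iterations; the whole point is that a decent approximate solution often arrives after a few dozen matrix-vector multiplies.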
Once you've got it working, you can add a preconditioner to improve convergence further.
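For instance, a simple Jacobi (diagonal) preconditioner is one line to build, and SciPy's `scipy.sparse.linalg.cg` accepts it via the `M` argument. A sketch (the random SPD test matrix is purely illustrative):

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

# Hypothetical dense SPD system, just for illustration.
rng = np.random.default_rng(0)
N = 2000
Q = rng.standard_normal((N, N))
A = Q @ Q.T + N * np.eye(N)      # symmetric positive definite
b = rng.standard_normal(N)

# Jacobi preconditioner: approximate A^{-1} by 1 / diag(A).
d_inv = 1.0 / np.diag(A)
M = LinearOperator((N, N), matvec=lambda v: d_inv * v)

x, info = cg(A, b, M=M)          # info == 0 means it converged
```

Jacobi helps most when the diagonal dominates; for harder Hessians you'd look at incomplete Cholesky or a problem-specific approximation instead.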