I. Basic math.
 II. Pricing and Hedging.
 III. Explicit techniques.
 IV. Data Analysis.
 V. Implementation tools.
 VI. Basic Math II.
 VII. Implementation tools II.
 1 Calculational Linear Algebra.
 B. Method of steepest descent.
 C. Method of conjugate directions.
 E. Convergence analysis of conjugate gradient method.
 F. Preconditioning.
 G. Recursive calculation.
 H. Parallel subspace preconditioner.
 2 Wavelet Analysis.
 3 Finite element method.
 4 Construction of approximation spaces.
 5 Time discretization.
 6 Variational inequalities.
 VIII. Bibliography

## Preconditioning.

According to the proposition ( Convergence of conjugate gradient method ), the procedure of the summary ( Conjugate gradients ) converges significantly better when $\lambda_{\min}(A)$ and $\lambda_{\max}(A)$ are close. For this reason one may attempt to consider $HA$ instead of $A$ for some matrix $H$ that almost inverts $A$. The matrix $HA$ does not have to be symmetric or positive definite in $(\cdot,\cdot)$. However, for a symmetric positive definite $H$ it is self-adjoint and positive definite with respect to $(\cdot,\cdot)_{H^{-1}}\equiv\left(H^{-1}\cdot,\cdot\right)$:
$$\left(HAx,y\right)_{H^{-1}}=\left(Ax,y\right)=\left(x,Ay\right)=\left(x,HAy\right)_{H^{-1}},\quad\left(HAx,x\right)_{H^{-1}}=\left(Ax,x\right)>0,\ x\neq0.$$
Therefore, it has all the necessary spectral properties and we can still apply the procedure ( Conjugate gradients ), taking all inner products with respect to $(\cdot,\cdot)_{H^{-1}}$.
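A minimal numerical sketch of the resulting preconditioned conjugate gradient recursion, assuming NumPy; the function name `pcg` and the defaults are illustrative assumptions, not part of the text:

```python
import numpy as np

def pcg(A, b, H, tol=1e-10, max_iter=None):
    """Conjugate gradients for A x = b, preconditioned by H ~ A^{-1}.

    A and H are assumed symmetric positive definite; this is plain CG
    applied to HA with inner products taken in the H^{-1}-weighted sense.
    """
    n = len(b)
    max_iter = max_iter or 10 * n
    x = np.zeros(n)
    r = b - A @ x              # residual of the original system
    z = H @ r                  # preconditioned residual
    p = z.copy()               # search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)  # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = H @ r
        rz, rz_old = r @ z, rz
        p = z + (rz / rz_old) * p  # next H^{-1}-conjugate direction
    return x
```

For example, the Jacobi choice `H = np.diag(1.0 / np.diag(A))` reduces the routine to diagonally scaled conjugate gradients.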

Another possibility is to try the factorization $$H=EE^{T}.$$ Such a decomposition always exists (but is not unique) if $H$ is symmetric positive definite. We have $$Ax=b.$$ We make the change $x=E\tilde{x}$, multiply by $E^{T}$ and arrive at $$E^{T}AE\,\tilde{x}=E^{T}b.$$ Note that (by a similar calculation) $E^{T}AE$ has the same spectrum as $HA$: $$E^{-1}\left(HA\right)E=E^{T}AE.$$ The matrix $E^{T}AE$ is symmetric positive-definite in $(\cdot,\cdot)$. The procedure ( Conjugate gradients ) can then be adapted for the equation $E^{T}AE\,\tilde{x}=E^{T}b$ with the usual tricks to keep down the number of matrix multiplications.
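A sketch of this factorization route, assuming NumPy and taking the decomposition of the symmetric positive definite preconditioner to be a Cholesky factorization $H=EE^{T}$; `solve_transformed` and the helper `cg` are illustrative names:

```python
import numpy as np

def cg(B, c, tol=1e-12, max_iter=None):
    """Plain conjugate gradients for a symmetric positive definite B."""
    n = len(c)
    max_iter = max_iter or 10 * n
    x = np.zeros(n)
    r = c - B @ x
    p = r.copy()
    rr = r @ r
    for _ in range(max_iter):
        Bp = B @ p
        alpha = rr / (p @ Bp)
        x += alpha * p
        r -= alpha * Bp
        rr, rr_old = r @ r, rr
        if np.sqrt(rr) < tol:
            break
        p = r + (rr / rr_old) * p
    return x

def solve_transformed(A, b, H):
    """Solve A x = b via the change of variables x = E xt, where H = E E^T."""
    E = np.linalg.cholesky(H)       # exists since H is SPD (not unique)
    xt = cg(E.T @ A @ E, E.T @ b)   # transformed system is symmetric PD
    return E @ xt                   # undo the change of variables
```

Since $E^{-1}(HA)E=E^{T}AE$, the transformed matrix has the same (real, positive) spectrum as $HA$, which can be confirmed numerically with `np.linalg.eigvals`.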
