
## Method of conjugate gradients.

Keeping all the directions $d_0,\dots,d_{n-1}$ (see the section ( Method of conjugate directions )) is memory consuming, and the procedure for calculating such vectors is expensive. According to the formula ( Orthogonality of residues 2 ), the residues $r_k$ are mutually orthogonal, hence the vectors $\{r_k\}$ are linearly independent while they are non-zero. We take $\{r_k\}$ to be the initial vectors of the Gram-Schmidt $A$-orthogonalization leading to $\{d_k\}$. According to the section ( Gram-Schmidt orthogonalization ),

$$d_k = r_k + \sum_{i=0}^{k-1} \beta_{ki} d_i, \quad \beta_{ki} = -\frac{r_k^T A d_i}{d_i^T A d_i},$$

and according to the summary ( Conjugate directions ),

$$r_{k+1} = r_k - \alpha_k A d_k, \quad \alpha_k = \frac{d_k^T r_k}{d_k^T A d_k}.$$

Thus the coefficients $\beta_{ki}$ may be evaluated along the iteration. We continue with the following proposition.
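The cost of keeping all directions can be seen in a short numerical sketch (Python with numpy; the test matrix and all names are our illustration, not part of the text): every new direction is $A$-orthogonalized against the full list of stored predecessors, yet the resulting directions are pairwise $A$-orthogonal and the iteration reaches the minimum in $n$ steps.

```python
# Illustration: conjugate directions built by explicit Gram-Schmidt
# A-orthogonalization of the residues.  Storing every direction d_i
# is the memory cost the conjugate gradient method removes.
import numpy as np

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # symmetric positive definite
b = rng.standard_normal(n)

x = np.zeros(n)
r = b - A @ x                         # initial residue
ds = []                               # all stored directions
for k in range(n):
    # A-orthogonalize the new residue against every stored direction
    d = r - sum((r @ (A @ di)) / (di @ (A @ di)) * di for di in ds)
    ds.append(d)
    alpha = (d @ r) / (d @ (A @ d))   # exact line minimization
    x = x + alpha * d
    r = r - alpha * (A @ d)

# largest violation of pairwise A-orthogonality among the directions
off = max(abs(ds[i] @ (A @ ds[j])) for i in range(n) for j in range(i))
```

After $n$ steps `x` agrees with the exact solution of $Ax = b$ and `off` is at round-off level.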

(Conjugate gradient residue selection) The Gram-Schmidt $A$-orthogonalization of the residues $\{r_k\}$ reduces to a single correction term:

$$d_k = r_k + \beta_k d_{k-1}, \quad \beta_k = \frac{r_k^T r_k}{r_{k-1}^T r_{k-1}}.$$
According to the formula ( Orthogonality of residues 2 ),

$$r_i^T r_j = 0, \quad i \neq j.$$

From $r_{i+1} = r_i - \alpha_i A d_i$ we conclude

$$r_k^T A d_i = \frac{r_k^T r_i - r_k^T r_{i+1}}{\alpha_i} = 0, \quad i < k-1.$$

Therefore, when we conduct the $k$-th step of the Gram-Schmidt $A$-orthogonalization

$$d_k = r_k + \sum_{i=0}^{k-1} \beta_{ki} d_i, \quad \beta_{ki} = -\frac{r_k^T A d_i}{d_i^T A d_i},$$

only one term is non-zero in the sum:

$$d_k = r_k + \beta_{k,k-1} d_{k-1}.$$

We would like to remove the matrix multiplication from the coefficient $\beta_k \equiv \beta_{k,k-1}$. We set $i = k-1$ in the equation $r_{i+1} = r_i - \alpha_i A d_i$:

$$r_k = r_{k-1} - \alpha_{k-1} A d_{k-1}$$

and apply the operation $r_k^T \cdot$. Then

$$r_k^T r_k = r_k^T r_{k-1} - \alpha_{k-1} \, r_k^T A d_{k-1} = -\alpha_{k-1} \, r_k^T A d_{k-1},$$

hence

$$r_k^T A d_{k-1} = -\frac{r_k^T r_k}{\alpha_{k-1}}.$$

According to the summary ( Conjugate directions ),

$$\alpha_{k-1} = \frac{d_{k-1}^T r_{k-1}}{d_{k-1}^T A d_{k-1}}.$$

We combine the last two relationships:

$$r_k^T A d_{k-1} = -\frac{r_k^T r_k \; d_{k-1}^T A d_{k-1}}{d_{k-1}^T r_{k-1}}.$$

We substitute it into the expression for $\beta_k$:

$$\beta_k = -\frac{r_k^T A d_{k-1}}{d_{k-1}^T A d_{k-1}} = \frac{r_k^T r_k}{d_{k-1}^T r_{k-1}}.$$

By orthogonality of $r_{k-1}$ and $d_{k-2}$ we also have

$$d_{k-1}^T r_{k-1} = \left( r_{k-1} + \beta_{k-1} d_{k-2} \right)^T r_{k-1} = r_{k-1}^T r_{k-1},$$

thus

$$\beta_k = \frac{r_k^T r_k}{r_{k-1}^T r_{k-1}}.$$

We collect the results.
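The reduction of $\beta_k$ to a ratio of squared residue norms can be verified numerically. The following sketch (our illustration, assuming numpy) runs the iteration and compares the Gram-Schmidt coefficient $-r_k^T A d_{k-1} / d_{k-1}^T A d_{k-1}$ against $r_k^T r_k / r_{k-1}^T r_{k-1}$ at every step.

```python
# Check: the single surviving Gram-Schmidt coefficient equals the
# ratio of squared residue norms, with no matrix multiplication.
import numpy as np

rng = np.random.default_rng(1)
n = 6
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # symmetric positive definite
b = rng.standard_normal(n)

x = np.zeros(n)
r = b - A @ x
d = r.copy()
max_gap = 0.0
for k in range(1, n):
    alpha = (r @ r) / (d @ (A @ d))
    x = x + alpha * d
    r_new = r - alpha * (A @ d)
    # Gram-Schmidt coefficient against the last direction ...
    beta_gs = -(r_new @ (A @ d)) / (d @ (A @ d))
    # ... versus the matrix-free ratio of squared residue norms
    beta_cg = (r_new @ r_new) / (r @ r)
    max_gap = max(max_gap, abs(beta_gs - beta_cg))
    d = r_new + beta_cg * d
    r = r_new
```

The two expressions agree to round-off at every step of the run.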

Algorithm

(Conjugate gradients) Start from any $x_0$. Set

$$d_0 = r_0 = b - A x_0.$$

For $k = 0, 1, 2, \dots$ do

$$\alpha_k = \frac{r_k^T r_k}{d_k^T A d_k},$$
$$x_{k+1} = x_k + \alpha_k d_k,$$
$$r_{k+1} = r_k - \alpha_k A d_k,$$
$$\beta_{k+1} = \frac{r_{k+1}^T r_{k+1}}{r_k^T r_k},$$
$$d_{k+1} = r_{k+1} + \beta_{k+1} d_k.$$

To avoid accumulation of round-off errors, occasionally restart the procedure, using the last $x_k$ as the new $x_0$. Violation of $A$-orthogonality of the directions $d_k$ is the criterion of error accumulation. Use a condition of the type

$$\| r_k \| \leq \varepsilon \| r_0 \|$$

to stop.
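A minimal implementation sketch of the algorithm above (Python with numpy; the function name, restart period, and tolerance are our choices, not fixed by the text):

```python
# Conjugate gradient iteration with periodic restart and a relative
# residue-norm stopping condition, for symmetric positive definite A.
import numpy as np

def conjugate_gradients(A, b, x0=None, eps=1e-10, restart=50, max_iter=1000):
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x
    d = r.copy()
    r0_norm = np.linalg.norm(r)
    for i in range(1, max_iter + 1):
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)
        x = x + alpha * d
        r_new = r - alpha * Ad
        if np.linalg.norm(r_new) <= eps * r0_norm:   # stopping condition
            return x
        if i % restart == 0:
            d = r_new.copy()      # restart: drop the accumulated direction
        else:
            beta = (r_new @ r_new) / (r @ r)
            d = r_new + beta * d
        r = r_new
    return x

rng = np.random.default_rng(2)
n = 40
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # symmetric positive definite
b = rng.standard_normal(n)
x = conjugate_gradients(A, b)
```

Each iteration costs one matrix-vector product `A @ d` plus a few vector operations; only the current $x$, $r$, and $d$ are stored, which is the point of the residue selection above.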
