The content of the present website is being moved to www.lukoe.com/finance. Registration of www.opentradingsystem.com will be discontinued on 2020-08-14.

## Recursive calculation.

Proposition

(Recursive calculation) Suppose $\| B \| < 1$. Then for any $f$ and any starting point $x_0$ the sequence $x_{n+1} = B x_n + f$ converges to the solution $x$ of the equation

$$x = B x + f \quad \text{(Recursive equation)}$$

Proof

We subtract the relationships $x_{n+1} = B x_n + f$ and $x = B x + f$ and obtain $x_{n+1} - x = B ( x_n - x )$. Hence $\| x_{n+1} - x \| \leq \| B \|^{n+1} \| x_0 - x \| \to 0$. There are (at least) two ways one can arrive at the equation (Recursive equation) starting from the equation (Linear equation) $A x = f$.
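The convergence claim can be checked numerically. Below is a minimal sketch in Python with NumPy; the matrix $B$ and vector $f$ are illustrative choices (not from the text), picked so that $\| B \| < 1$:

```python
import numpy as np

# Illustrative data: a matrix B with spectral radius < 1 and some f.
B = np.array([[0.5, 0.1],
              [0.0, 0.4]])
f = np.array([1.0, 2.0])

x = np.zeros(2)                      # any starting point x_0 works
for _ in range(200):
    x = B @ x + f                    # the recursion x_{n+1} = B x_n + f

# The limit solves x = B x + f, i.e. (I - B) x = f.
x_exact = np.linalg.solve(np.eye(2) - B, f)
err = np.linalg.norm(x - x_exact)
```

The error contracts by roughly $\| B \|$ per step, so after a few hundred iterations it is at machine precision.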

Proposition

(Reduction scheme 1) Suppose a non-singular matrix $A$ has the decomposition $A = M - N$ with an invertible matrix $M$. Then the solution $x$ of the equation (Linear equation) $A x = f$ also satisfies the equation $x = M^{-1} N x + M^{-1} f$.

Proof

We calculate $A x = f \Rightarrow ( M - N ) x = f \Rightarrow M x = N x + f \Rightarrow x = M^{-1} N x + M^{-1} f$.

Proposition
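A classical instance of (Reduction scheme 1) is the Jacobi splitting $M = \operatorname{diag}(A)$, $N = M - A$. A hedged sketch, with an illustrative diagonally dominant matrix $A$ chosen so that the resulting recursion converges:

```python
import numpy as np

# Illustrative data: diagonally dominant A, so ||M^{-1} N|| < 1 here.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
f = np.array([1.0, 2.0])

M = np.diag(np.diag(A))              # invertible part of the splitting A = M - N
N = M - A                            # remainder
Minv = np.linalg.inv(M)

x = np.zeros(2)
for _ in range(200):
    x = Minv @ (N @ x + f)           # x_{n+1} = M^{-1} N x_n + M^{-1} f

err = np.linalg.norm(A @ x - f)      # residual of the original (Linear equation)
```

The fixed point of the recursion is exactly the solution of $A x = f$, as the proof above shows.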

(Reduction scheme 2) Let $A$ and $B$ be two non-singular matrices. Then the solution of the equation (Linear equation) $A x = f$ also satisfies the equation $x = ( I - B A ) x + B f$.

Proof

We calculate $x = x - B ( A x - f ) = ( I - B A ) x + B f$. For both propositions (Reduction scheme 1) and (Reduction scheme 2) we want to almost invert the matrix $A$ of the original problem (Linear equation). The same is true for the preconditioner of the section (Preconditioning). It appears that the preconditioner of the section (Preconditioning) may be used as the matrix $B$ of the proposition (Reduction scheme 2), provided that the condition $\| I - B A \| < 1$ holds. The reverse statement is also true. However, the preconditioned conjugate gradient technique converges faster, as the following proposition shows.
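Before the proposition, a small numerical illustration of (Reduction scheme 2): any cheap approximate inverse $B$ with $\| I - B A \| < 1$ yields a convergent recursion. The matrices below are illustrative choices; $B$ is taken to be the inverse of the diagonal of $A$:

```python
import numpy as np

# Illustrative data (not from the text).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
f = np.array([1.0, 2.0])

B = np.diag(1.0 / np.diag(A))        # cheap approximate inverse of A
rho = np.linalg.norm(np.eye(2) - B @ A, 2)   # must be < 1 for convergence

x = np.zeros(2)
for _ in range(200):
    x = x - B @ (A @ x - f)          # same as x_{n+1} = (I - B A) x_n + B f

err = np.linalg.norm(A @ x - f)
```

The update is written in residual form $x - B ( A x - f )$, which is how the proof above derives the recursion.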

Proposition

Assume that $A$ is symmetric, $B$ is symmetric positive definite and $\| I - B A \|_{B^{-1}} \leq \rho < 1$. Then $B A$ is symmetric positive definite with respect to the scalar product $( x , y )_{B^{-1}} = x^T B^{-1} y$,

$$\kappa ( B A ) \leq \frac{1 + \rho}{1 - \rho}$$

(see the formula (Condition number) for the definition of $\kappa$) and

$$\frac{\sqrt{\kappa ( B A )} - 1}{\sqrt{\kappa ( B A )} + 1} \leq \frac{1 - \sqrt{1 - \rho^2}}{\rho} \leq \rho.$$

(Compare with the proposition (Convergence of conjugate gradient method).)

Proof

First, we prove that $B A$ is positive definite. Note that $( B A x , y )_{B^{-1}} = x^T A y$, so $B A$ is symmetric with respect to $( \cdot , \cdot )_{B^{-1}}$ and its eigenvalues are real. Assume the contrary: there exists an eigenvector $v$ of $B A$ with non-positive eigenvalue $\lambda$. Then $\rho \| v \|_{B^{-1}} \geq \| ( I - B A ) v \|_{B^{-1}} = | 1 - \lambda | \, \| v \|_{B^{-1}} \geq \| v \|_{B^{-1}}$. The term $\| v \|_{B^{-1}}$ is positive by assumption of the proposition ($B$ is positive definite and $v \neq 0$). Then we cannot have $\rho < 1$. We arrive at a contradiction.

Next, we estimate $\kappa ( B A )$. By $\| I - B A \|_{B^{-1}} \leq \rho$ we have $| 1 - \lambda | \leq \rho$ for every eigenvalue $\lambda$ of $B A$, hence $1 - \rho \leq \lambda \leq 1 + \rho$. Hence $\kappa ( B A ) = \lambda_{\max} / \lambda_{\min} \leq \frac{1 + \rho}{1 - \rho}$. Finally, we estimate $\frac{\sqrt{\kappa} - 1}{\sqrt{\kappa} + 1}$. The function $g ( \kappa ) = \frac{\sqrt{\kappa} - 1}{\sqrt{\kappa} + 1}$ is increasing for $\kappa \geq 1$. Indeed, $g' ( \kappa ) = \frac{1}{\sqrt{\kappa} \, ( \sqrt{\kappa} + 1 )^2} > 0$. Hence,

$$\frac{\sqrt{\kappa ( B A )} - 1}{\sqrt{\kappa ( B A )} + 1} \leq \frac{\sqrt{\frac{1 + \rho}{1 - \rho}} - 1}{\sqrt{\frac{1 + \rho}{1 - \rho}} + 1} = \frac{1 - \sqrt{1 - \rho^2}}{\rho} \leq \rho.$$
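The condition number bound can be verified numerically. Since $B$ is symmetric positive definite, $B^{1/2} A B^{1/2}$ is symmetric and similar to $B A$, so they share eigenvalues; the sketch below uses this and illustrative matrices $A$, $B$ (a diagonal preconditioner, not from the text):

```python
import numpy as np

# Illustrative symmetric positive definite A and preconditioner B.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
B = np.diag(1.0 / np.diag(A))        # symmetric positive definite, diagonal

Bh = np.sqrt(B)                      # B^{1/2} of a diagonal matrix
S = Bh @ A @ Bh                      # symmetric and similar to B A
lam = np.linalg.eigvalsh(S)          # real eigenvalues of B A

rho = np.max(np.abs(1.0 - lam))      # = || I - B^{1/2} A B^{1/2} ||_2
kappa = lam.max() / lam.min()        # condition number of B A
bound = (1.0 + rho) / (1.0 - rho)    # the bound from the proposition
```

For this example `rho < 1` and `kappa <= bound`, in agreement with the estimate above.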
 Copyright 2007