I. Basic math.
 1 Conditional probability.
 2 Normal distribution.
 3 Brownian motion.
 4 Poisson process.
 5 Ito integral.
 6 Ito calculus.
 7 Change of measure.
 8 Girsanov's theorem.
 9 Forward Kolmogorov's equation.
 10 Backward Kolmogorov's equation.
 11 Optimal control, Bellman equation, Dynamic programming.
 A. Deterministic optimal control problem.
 B. Stochastic optimal control problem.
 C. Optimal stopping time problem. Free boundary problem.
 II. Pricing and Hedging.
 III. Explicit techniques.
 IV. Data Analysis.
 V. Implementation tools.
 VI. Basic Math II.
 VII. Implementation tools II.
 VIII. Bibliography
 Notation. Index. Contents.

## Deterministic optimal control problem.

We assume everywhere that all functions are regular enough to differentiate and that solutions of the ODEs exist.

Proposition

Assume existence of a solution $x_s = x(s; t, y)$, $s \in [t, T]$, of the ODE
$$\frac{dx_s}{ds} = a(s, x_s), \qquad x_t = y,$$
for some function $a(s, x)$ and a family of functions $x(s; t, y)$, $t \in [0, T]$. The function
$$u(t, y) := h(x(T; t, y))$$
satisfies
$$u_t(t, y) + a(t, y)\, u_y(t, y) = 0, \qquad u(T, y) = h(y).$$

Proof

We use the short notation $x_s = x(s; t, y)$, $u = u(t, y)$. By definition of $u$, the value $u(s, x_s)$ does not depend on $s$:
$$u(s, x_s) = h(x(T; s, x_s)) = h(x(T; t, y)) = u(t, y), \qquad s \in [t, T],$$
because the trajectory started from $(s, x_s)$ coincides with the trajectory started from $(t, y)$. Note that $\frac{dx_s}{ds} = a(s, x_s)$. Hence,
$$0 = \frac{d}{ds}\, u(s, x_s) = u_t(s, x_s) + u_y(s, x_s)\, a(s, x_s),$$
and evaluation at $s = t$ gives $u_t(t, y) + a(t, y)\, u_y(t, y) = 0$.
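As a quick sanity check (not part of the source, and using $a$ for the drift and $h$ for the terminal function as above), take the linear dynamics $a(s, x) = x$:
$$\frac{dx_s}{ds} = x_s, \quad x_t = y \;\Longrightarrow\; x(s; t, y) = y\, e^{s - t},$$
so that $u(t, y) = h\!\left(y\, e^{T - t}\right)$ and
$$u_t = -y\, e^{T - t}\, h'\!\left(y\, e^{T - t}\right), \qquad u_y = e^{T - t}\, h'\!\left(y\, e^{T - t}\right).$$
Hence $u_t + a(t, y)\, u_y = u_t + y\, u_y = 0$, as the proposition predicts.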

Proposition

Assume that the function $u(t, y)$ satisfies the relationships
$$u_t(t, y) + a(t, y)\, u_y(t, y) = 0, \qquad u(T, y) = h(y),$$
and the function $\phi(s)$ is defined by
$$\phi(s) := u(s, x(s; t, y))$$
for a fixed $(t, y)$.

Then $\phi(s) = u(t, y)$ for any $s \in [t, T]$; in particular, $u(t, y) = h(x(T; t, y))$.

Proof

We calculate
$$\frac{d\phi}{ds}(s) = u_t(s, x_s) + u_y(s, x_s)\, \frac{dx_s}{ds} = u_t(s, x_s) + u_y(s, x_s)\, a(s, x_s) = 0.$$
Hence, $\phi$ is constant, or $\phi(s) = \phi(t) = u(t, y)$ for all $s \in [t, T]$. Note that $\phi(T) = u(T, x_T) = h(x_T)$. Hence,
$$u(t, y) = h(x(T; t, y)).$$
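The two propositions can be checked numerically. The sketch below (an illustration, not from the source; the drift $a$, terminal function $h$, and horizon `T` are chosen for the example) takes the linear drift $a(s, x) = x$, for which the flow $x(s; t, y) = y\, e^{s - t}$ is known in closed form, builds $u(t, y) = h(x(T; t, y))$ with $h(y) = y^2$, and verifies the transport relation $u_t + a\, u_y = 0$ by finite differences:

```python
# A sketch verifying u_t + a(t, y) * u_y = 0 for the candidate value
# function u(t, y) = h(x(T; t, y)).  Assumptions of this example:
# drift a(s, x) = x, terminal function h(y) = y**2, horizon T = 1.
import math

T = 1.0

def u(t, y):
    # u(t, y) = h(x(T; t, y)) with x(T; t, y) = y * exp(T - t), h(y) = y**2
    return (y * math.exp(T - t)) ** 2

def transport_residual(t, y, eps=1e-5):
    # central finite differences for u_t and u_y
    u_t = (u(t + eps, y) - u(t - eps, y)) / (2 * eps)
    u_y = (u(t, y + eps) - u(t, y - eps)) / (2 * eps)
    return u_t + y * u_y  # a(t, y) = y for this example

print(abs(transport_residual(0.3, 0.7)))  # small: zero up to discretization error
```

The residual vanishes only up to the finite-difference step `eps`; the same check can be repeated for any drift whose flow is available, replacing the closed-form exponential with a numerical ODE solver.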

 Copyright 2007