I. Basic math.
1. Conditional probability.
A. Definition of conditional probability.
B. A bomb on a plane.
C. Dealing a pair in "hold 'em" poker.
D. Monty Hall problem.
E. Two-headed coin drawn from a bin of fair coins.
F. Randomly unfair coin.
G. Recursive Bayesian calculation.
H. Birthday problem.
I. Backward induction.
J. Conditional expectation. Filtration. Flow of information. Stopping time.
2. Normal distribution.
3. Brownian motion.
4. Poisson process.
5. Ito integral.
6. Ito calculus.
7. Change of measure.
8. Girsanov's theorem.
9. Forward Kolmogorov's equation.
10. Backward Kolmogorov's equation.
11. Optimal control, Bellman equation, Dynamic programming.
II. Pricing and Hedging.
III. Explicit techniques.
IV. Data Analysis.
V. Implementation tools.
VI. Basic Math II.
VII. Implementation tools II.
VIII. Bibliography
Notation. Index. Contents.

Conditional expectation. Filtration. Flow of information. Stopping time.

We are considering the development of a market model over a time interval $[0,T]$. At time $t=0$ we know nothing about the future, and we represent this fact with the trivial algebra $\left\{ \emptyset,\Omega\right\}$. Here $\Omega$ is the event space: the full description of everything that may happen in the model. By a time moment $t_{1}>0$ the random variables of the model have attained certain realizations. One may make particular statements about such realizations that are perfectly verifiable from the point of view of the information available at time $t_{1}$. Such statements are represented by subsets of $\Omega$ and constitute an algebra $\mathcal{F}_{t_{1}}$ (see the section ( Operations on sets ) for an explanation of the term "algebra" in this context). Similarly, for a time moment $t_{2}>t_{1}$ we form an algebra $\mathcal{F}_{t_{2}}$. Since market participants do not forget information, $\mathcal{F}_{t_{1}}$ is a subset of $\mathcal{F}_{t_{2}}$. A family of such algebras $\left\{ \mathcal{F}_{t}\right\} _{t\in\left[0,T\right]}$ is called a "filtration" or "flow of information".


Consider a discrete random walk $X_{t}$ described by the picture. The process $X_{t}$ starts at point $A$ at time $t_{0}$. By time $t_{1}$ the process may be observed at point $B_{1}$ or $B_{2}$. The outcome is uncertain, and these are the only possibilities at time $t_{1}$. Similarly, if the process is at point $B_{2}$, then it may jump up to $C_{3}$ or down to $C_{2}$.


We introduce the notation MATH. The highlighted path is then described by the elementary random event MATH. At the initial time moment $t_{0}$ our knowledge is given by the trivial algebra $\left\{ \emptyset,\Omega\right\}$, with $\Omega$ being the enumeration of everything that may happen: MATH, and $\emptyset$ being the event that never happens. At time $t_{1}$ we know where the process went at $t_{0}$. Hence, MATH, where $\sigma\left(\cdot\right)$ means "take all intersections, unions and complements of the arguments and put them together into the algebra $\mathcal{F}_{t_{1}}$". For example, $\mathcal{F}_{t_{1}}$ contains the set MATH. The information at $t_{2}$ is represented by the algebra $\mathcal{F}_{t_{2}}$. For example, $\mathcal{F}_{t_{2}}$ contains the set MATH. The information at $t_{3}$ is represented by the algebra $\mathcal{F}_{t_{3}}$.
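The growth of these algebras can be made concrete with a minimal sketch, assuming (as an illustration, not part of the original text) that the three up/down moves are encoded as characters 'u' and 'd'. Each elementary event is then a path of three moves, and the algebra at time $t_{k}$ is generated by the partition of $\Omega$ into groups of paths sharing the same first $k$ moves:

```python
from itertools import product

# Omega for the 3-step up/down walk: all 2**3 = 8 elementary events,
# each a path such as ('u', 'd', 'u').
omega = list(product("ud", repeat=3))

def partition(k):
    """Cells of the partition generating F_{t_k}:
    paths with the same first k moves are indistinguishable at t_k."""
    cells = {}
    for path in omega:
        cells.setdefault(path[:k], []).append(path)
    return list(cells.values())

for k in range(4):
    print(f"t_{k}: {len(partition(k))} cell(s)")
# The partitions refine as k grows: 1, 2, 4, 8 cells.
```

The refinement of the partitions (1, 2, 4, 8 cells) is exactly the statement that market participants do not forget information: every cell at $t_{k}$ splits into cells at $t_{k+1}$.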

By definition, a process $Y_{t}$ is adapted to a filtration $\left\{ \mathcal{F}_{t}\right\}$ if for any particular $t$ the variable $Y_{t}$ is $\mathcal{F}_{t}$-measurable. Equivalently, $Y_{t}$ is adapted to $\left\{ \mathcal{F}_{t}\right\}$ if for any $t$ the algebra $\mathcal{F}_{t}$ contains the full description of the path $\left\{ Y_{s},\ s\leq t\right\}$ for all times up to $t$. For a particular process $Y_{t}$ one may form a family $\left\{ \mathcal{F}_{t}\right\}$ such that for any $t$ the algebra $\mathcal{F}_{t}$ is the minimal algebra that makes the random variable $Y_{t}$ $\mathcal{F}_{t}$-measurable (for any $t$ the $\mathcal{F}_{t}$ is the minimal description of all possible realizations of $\left\{ Y_{s},\ s\leq t\right\}$). Such a family is called "the filtration generated by $Y_{t}$". The notation $\mathcal{F}_{t}^{Y}=\sigma\left(Y_{s},\ s\leq t\right)$ is commonly used for the generated filtration.

Suppose $Y_{t}$ is adapted to $\left\{ \mathcal{F}_{t}\right\}$ and $s<t$. Then $Y_{s}$ is $\mathcal{F}_{t}$-measurable (because $\mathcal{F}_{t}$ is sufficient to describe $Y_{t}$ and, hence, contains the description of $Y_{s}$ as well). Generally (unless $Y_{t}$ is deterministic during $(s,t]$), $Y_{t}$ is not $\mathcal{F}_{s}$-measurable because $\mathcal{F}_{s}$ lacks the structure to describe $Y_{t}$. However, we may create a crude adjustment of $Y_{t}$ to $\mathcal{F}_{s}$ by taking the conditional expectation $E\left[Y_{t}|A\right]$ for every set $A$ from $\mathcal{F}_{s}$. This creates a mapping from $\mathcal{F}_{s}$ to the range of $Y_{t}$. A proper restriction of such a mapping is an $\mathcal{F}_{s}$-measurable random variable that we denote $E\left[Y_{t}|\mathcal{F}_{s}\right]$. The same (in the "almost sure" sense) object $E\left[Y_{t}|\mathcal{F}_{s}\right]$ could be introduced by requiring that the variable $E\left[Y_{t}|\mathcal{F}_{s}\right]$, by definition, be $\mathcal{F}_{s}$-measurable and satisfy $E\left[E\left[Y_{t}|\mathcal{F}_{s}\right]1_{A}\right]=E\left[Y_{t}1_{A}\right]$ for any set $A$ from $\mathcal{F}_{s}$.
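On the discrete walk this "crude adjustment" is just an average over each cell of the conditioning algebra. A minimal sketch, assuming (for illustration only) a fair coin, so every path has probability $1/8$, and steps of $\pm1$:

```python
from itertools import product

omega = list(product("ud", repeat=3))   # all 8 equally likely paths

def X(path, k):
    """Position of the walk after the first k steps of the path."""
    return sum(1 if move == "u" else -1 for move in path[:k])

def cond_exp(k_target, k_given):
    """E[X_{k_target} | F_{k_given}]: average X_{k_target} over each
    cell of the partition generating F_{k_given}, i.e. over all paths
    sharing the same first k_given moves."""
    groups = {}
    for path in omega:
        groups.setdefault(path[:k_given], []).append(X(path, k_target))
    return {prefix: sum(v) / len(v) for prefix, v in groups.items()}

print(cond_exp(3, 1))   # {('u',): 1.0, ('d',): -1.0}
```

The result is a function of the first move only, i.e. an $\mathcal{F}_{t_{1}}$-measurable random variable: given a first step up, the walk sits at $+1$ and the two remaining fair steps average to zero.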

The condition $E\left[E\left[Y_{t}|\mathcal{F}_{s}\right]1_{A}\right]=E\left[Y_{t}1_{A}\right]$, written in the form

$E\left[E\left[Y_{t}|\mathcal{F}_{u}\right]|\mathcal{F}_{s}\right]=E\left[Y_{t}|\mathcal{F}_{s}\right],\quad s\leq u\leq t,$ (Chain rule)
is called "the chain rule".
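The chain rule can be checked numerically on the discrete walk. The sketch below again assumes (for illustration) a fair coin and $\pm1$ steps; conditioning first on the richer algebra $\mathcal{F}_{t_{2}}$ and then on $\mathcal{F}_{t_{1}}$ gives the same result as conditioning on $\mathcal{F}_{t_{1}}$ directly:

```python
from itertools import product

omega = list(product("ud", repeat=3))   # all 8 equally likely paths

def X(path, k):
    """Position of the walk after the first k steps."""
    return sum(1 if m == "u" else -1 for m in path[:k])

def average_over_cells(values, k):
    """Average a path-indexed map over the cells generating F_{t_k},
    returning again a path-indexed map (constant on each cell)."""
    cells = {}
    for path, v in values.items():
        cells.setdefault(path[:k], []).append(v)
    means = {prefix: sum(v) / len(v) for prefix, v in cells.items()}
    return {path: means[path[:k]] for path in values}

X3 = {path: X(path, 3) for path in omega}
inner = average_over_cells(X3, 2)         # E[X_3 | F_2]
two_step = average_over_cells(inner, 1)   # E[E[X_3 | F_2] | F_1]
one_step = average_over_cells(X3, 1)      # E[X_3 | F_1]
assert two_step == one_step
```

The equality holds because averaging within the cells of $\mathcal{F}_{t_{2}}$ and then grouping those cells by first move merely regroups the same summation terms.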


In the setting of the previous example, $E\left[X_{t_{2}}|\mathcal{F}_{t_{1}}\right]$ is the random variable taking the value $E\left[X_{t_{2}}|d_{0}\right]$ if $d_{0}$ happened and the value $E\left[X_{t_{2}}|u_{0}\right]$ if $u_{0}$ happened. We would, of course, have to assign probabilities to the events $\left\{ u_{0}\right\}$ and $\left\{ d_{0}\right\}$ to actually calculate these numbers. Observe that in such a situation the chain rule ( Chain_rule ) is simply a regrouping of summation terms.

We will say that a random variable $Y$ is independent from $\mathcal{F}_{t}$ if for any smooth function $f$ we have $E\left[f\left(Y\right)|\mathcal{F}_{t}\right]=E\left[f\left(Y\right)\right]$.
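A minimal sketch of this definition on the fair three-step walk (the fair coin and the choice of test function are illustrative assumptions): the third step is independent of the algebra generated by the first move, so conditioning on the first move changes nothing.

```python
import math
from itertools import product

omega = list(product("ud", repeat=3))   # all 8 equally likely paths

def f(x):
    return math.exp(x)                  # an arbitrary smooth test function

# The third step S_3 takes values +1/-1 regardless of the first move.
S3 = {p: (1 if p[2] == "u" else -1) for p in omega}

unconditional = sum(f(S3[p]) for p in omega) / len(omega)
for first in ("u", "d"):
    cell = [p for p in omega if p[0] == first]   # a cell of F_{t_1}
    conditional = sum(f(S3[p]) for p in cell) / len(cell)
    assert abs(conditional - unconditional) < 1e-12
```

Within each cell the distribution of the third step is the same as its unconditional distribution, which is exactly what independence from $\mathcal{F}_{t_{1}}$ asserts.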

A random variable $\tau$ is called an $\mathcal{F}_{t}$-stopping time if for any $t$ the random event $\left\{ \tau\leq t\right\}$ belongs to $\mathcal{F}_{t}$. In other words, $\tau$ is a stopping time if at any moment $t$ we are able to tell with certainty which of the statements $\tau\leq t$ and $\tau>t$ is true.


For the process $X_{t}$ of our examples, the time moment $\tau_{1}=\min\left\{ t:X_{t}>L\right\}$, the first time when $X_{t}$ is above a level $L$, is a stopping time. The time moment $\tau_{2}$, the time when $X_{t}$ reaches its maximum over $\left[0,T\right]$, is not a stopping time.
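The contrast can be sketched on a single sample path (encoding and $\pm1$ steps are, as before, illustrative assumptions): the hitting time is decided step by step from the past alone, while the argmax time requires inspecting the entire path.

```python
def hitting_time(path, L):
    """First k with X_k > L; decidable at time k from the past alone,
    so it is a stopping time."""
    x = 0
    for k, move in enumerate(path, start=1):
        x += 1 if move == "u" else -1
        if x > L:
            return k
    return None   # level never reached on this path

def argmax_time(path):
    """Time at which the walk attains its maximum; computable only
    after the whole path is known, so it is not a stopping time."""
    x, levels = 0, [0]
    for move in path:
        x += 1 if move == "u" else -1
        levels.append(x)
    return levels.index(max(levels))

path = ("u", "u", "d")
print(hitting_time(path, 1))   # 2: decided the moment the walk exceeds L = 1
print(argmax_time(path))       # 2: but known only after the final step is seen
```

On this path both times equal 2, yet only the first is announced at time 2 itself; the second is announced at time 3, once the last step has failed to produce a new maximum.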


Copyright 2007