
## Optional random variable. Stopping time.

Definition

(Optional random variable) A r.v. $\alpha$ is called "optional" relative to the stochastic process $\{X_n,\ n\in\mathbb{N}\}$ if it takes values in $\overline{\mathbb{N}}=\{1,2,\dots\}\cup\{\infty\}$ and satisfies the condition $$\forall n\in\mathbb{N}:\quad \{\omega:\ \alpha(\omega)=n\}\in\mathcal{F}_n,$$ where the $\mathcal{F}_n$ is the $\sigma$-algebra generated by the $\{X_k,\ k\leq n\}$.

The $\alpha$ points out some time index and does so based on the information about the path of the process up to that time index. For a particular path $\omega$ one can have $\alpha(\omega)=n$ for at most one index $n$: the events $\{\alpha=n\}$, $n\in\mathbb{N}$, are disjoint. For these reasons the $\alpha$ is sometimes called a "stopping time".
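As a concrete sketch (not from the text): for a $\pm1$ walk, the first passage time $\alpha=\min\{n\geq1:\ S_n>0\}$ into the positive half-line is optional, because the loop below decides $\{\alpha=n\}$ from the first $n$ steps alone. The helper name `first_positive_time` is ours.

```python
# Illustration (our notation): alpha = min{n >= 1 : S_n > 0} is optional --
# whether alpha = n is decided by X_1, ..., X_n only.
import random

def first_positive_time(steps):
    """Return min{n >= 1 : S_n > 0} (1-based), or None if the walk never turns positive."""
    s = 0
    for n, x in enumerate(steps, start=1):
        s += x            # S_n = S_{n-1} + X_n
        if s > 0:
            return n      # decided by X_1, ..., X_n only
    return None           # alpha = infinity on this finite path

random.seed(7)
path = [random.choice([-1, 1]) for _ in range(50)]
alpha = first_positive_time(path)
```

The returned index, when finite, is the first strictly positive partial sum; all earlier partial sums are nonpositive.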

Definition

(Pre-$\alpha$ field) The pre-$\alpha$ field $\mathcal{F}_\alpha$, $\alpha$ optional, is the collection $$\mathcal{F}_\alpha=\left\{\Lambda\in\mathcal{F}:\ \Lambda\cap\{\alpha=n\}\in\mathcal{F}_n\ \text{for every}\ n\in\mathbb{N}\right\}.$$

The pre-$\alpha$ field is a collection of scenarios that lead to the optional event $\{\alpha=n\}$. In particular, $$\{\alpha=n\}\in\mathcal{F}_\alpha,\quad n\in\mathbb{N},$$ so $\alpha$ is $\mathcal{F}_\alpha$-measurable.

Definition

(Post-$\alpha$ process) The post-$\alpha$ process $\{X'_n\}$ is defined on the event $\{\alpha<\infty\}$ using the relationship $$X'_n(\omega)=X_{\alpha(\omega)+n}(\omega),\quad n\in\mathbb{N}.$$

The post-$\alpha$ field $\mathcal{F}'_\alpha$ is the field generated by the post-$\alpha$ process $\{X'_n\}$.

Thus, for every $\omega\in\{\alpha<\infty\}$ the time $\alpha(\omega)$ is defined and then $X'_n(\omega)=X_{\alpha(\omega)+n}(\omega)$ for $n=1,2,\dots$. In other words, $\{X'_n\}$ is the part of the path after the stopping time with the pre-$\alpha$ part shifted forward and away: $$\left(X'_1,X'_2,X'_3,\dots\right)=\left(X_{\alpha+1},X_{\alpha+2},X_{\alpha+3},\dots\right).$$
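On a finite sample path the shift above is just a slice. A minimal sketch (helper names are ours, not the text's), using $\alpha=\min\{n:\ S_n>0\}$ as the optional variable:

```python
# Sketch: the post-alpha path X'_n = X_{alpha+n} drops the first alpha
# entries of the original path (alpha is 1-based).
def first_positive_time(steps):
    s = 0
    for n, x in enumerate(steps, start=1):
        s += x
        if s > 0:
            return n
    return None  # alpha = infinity on this finite path

def post_alpha_path(steps, alpha):
    """X'_n = X_{alpha+n}: the path with the pre-alpha part shifted away."""
    return steps[alpha:]

path = [-1, -1, 1, 1, 1, -1, 1]    # partial sums: -1, -2, -1, 0, 1, 0, 1
alpha = first_positive_time(path)  # first strictly positive sum is at n = 5
shifted = post_alpha_path(path, alpha)
```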

Proposition

(Independence of pre-alpha and post-alpha fields) For a stationary independent process $\{X_n\}$ and an almost everywhere finite optional r.v. $\alpha$ related to it, the pre-$\alpha$ and post-$\alpha$ fields are independent. Furthermore, the post-$\alpha$ process $\{X'_n\}$ is a stationary independent process with the same common distribution as the original one.

Proof

We seek to prove that for any $\Lambda\in\mathcal{F}_\alpha$, $k\in\mathbb{N}$ and Borel sets $B_1,\dots,B_k$ $$P\left(\Lambda\cap\{X'_1\in B_1,\dots,X'_k\in B_k\}\right)=P(\Lambda)\,P\left(X_1\in B_1,\dots,X_k\in B_k\right).$$ The statement assumes that $\alpha$ is almost everywhere finite: $P(\alpha<\infty)=1$. Hence, we calculate $$P\left(\Lambda\cap\{X'_1\in B_1,\dots,X'_k\in B_k\}\right)=\sum_{n=1}^{\infty}P\left(\Lambda\cap\{\alpha=n\}\cap\{X_{n+1}\in B_1,\dots,X_{n+k}\in B_k\}\right).$$ It follows from the definition of $\mathcal{F}_\alpha$ that $\Lambda\cap\{\alpha=n\}\in\mathcal{F}_n$. Therefore, we continue. We note that the events $\Lambda\cap\{\alpha=n\}$ and $\{X_{n+1}\in B_1,\dots,X_{n+k}\in B_k\}$ belong to the fields $\mathcal{F}_n$ and $\sigma\left(X_{n+1},X_{n+2},\dots\right)$ on the opposite sides of the time index $n$. We use the independence and stationarity of $\{X_n\}$: $$\sum_{n=1}^{\infty}P\left(\Lambda\cap\{\alpha=n\}\right)P\left(X_{n+1}\in B_1,\dots,X_{n+k}\in B_k\right)=P\left(X_1\in B_1,\dots,X_k\in B_k\right)\sum_{n=1}^{\infty}P\left(\Lambda\cap\{\alpha=n\}\right)$$ $$=P(\Lambda)\,P\left(X_1\in B_1,\dots,X_k\in B_k\right),$$ where the last step uses $P(\alpha<\infty)=1$. We can use the same trick to show that the events $\{X'_{n_1}\in B_1\},\dots,\{X'_{n_k}\in B_k\}$ are independent for distinct indices $n_1<\dots<n_k$. Hence, the post-$\alpha$ process is a stationary independent process with the same common distribution.
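The proposition can be probed by Monte Carlo. The sketch below is our construction (the up-probability $0.6$ is an arbitrary choice): for a $\pm1$ walk with upward drift, $\alpha=\min\{n:\ S_n>0\}$ is a.s. finite, the first post-$\alpha$ step $X'_1=X_{\alpha+1}$ should carry the common distribution $P(X'_1=1)\approx0.6$, and it should be independent of the pre-$\alpha$ event $\{\alpha>1\}$.

```python
# Monte Carlo check: distribution of X'_1 = X_{alpha+1} and its
# independence from the pre-alpha event {alpha > 1} = {X_1 = -1}.
import random

random.seed(0)
N, L = 20000, 200              # trials and a path-length cap
used = post_plus = pre = post_plus_and_pre = 0
for _ in range(N):
    steps = [1 if random.random() < 0.6 else -1 for _ in range(L)]
    s, alpha = 0, None
    for n, x in enumerate(steps, start=1):
        s += x
        if s > 0:
            alpha = n          # first passage into the positive half-line
            break
    if alpha is None or alpha >= L:
        continue               # alpha did not occur early enough; skip path
    used += 1
    x_post = steps[alpha]      # X'_1 = X_{alpha+1} (0-based index alpha)
    if x_post == 1:
        post_plus += 1
    if alpha > 1:              # a pre-alpha event
        pre += 1
        if x_post == 1:
            post_plus_and_pre += 1

p_post = post_plus / used
p_pre = pre / used
p_joint = post_plus_and_pre / used
```

Empirically `p_joint` should match the product `p_post * p_pre`, as the proposition predicts.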

Definition

(AlphaK and BetaK 1) We introduce the following random variables $\alpha_k$ and $\beta_k$, $k\geq1$: $$\beta_0=0,\quad \alpha_k(\omega)=\alpha\left(\theta_{\beta_{k-1}(\omega)}\omega\right),\quad \beta_k=\beta_{k-1}+\alpha_k,$$ where $\theta_m$ is the forward shift, $\theta_m\left(x_1,x_2,\dots\right)=\left(x_{m+1},x_{m+2},\dots\right)$.

Assuming the representation of the formula ( Random walk space ), $\omega=\left(x_1,x_2,\dots\right)$, we have $$\alpha_1(\omega)=\alpha\left(x_1,x_2,\dots\right),\quad \alpha_2(\omega)=\alpha\left(x_{\beta_1(\omega)+1},x_{\beta_1(\omega)+2},\dots\right).$$ In other words, $\alpha_2$ is the stopping time calculated from the path obtained by shifting forward the original path by $\beta_1$ calculated on $\omega$: $\theta_{\beta_1(\omega)}\omega$, and we continue by induction: $$\alpha_k(\omega)=\alpha\left(x_{\beta_{k-1}(\omega)+1},x_{\beta_{k-1}(\omega)+2},\dots\right).$$ The $\theta_{\beta_{k-1}(\omega)}\omega$ is the path obtained by shifting forward by the $\beta_{k-1}$ calculated on $\omega$. We call it the post-$\beta_{k-1}$ path. Hence, we obtain another way to understand $\alpha_k$ and $\beta_k$.
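The iteration can be sketched directly (helper names ours): each $\beta_k$ is obtained by applying the stopping rule $\alpha$ to the path shifted forward by the previous $\beta_{k-1}$ and accumulating.

```python
# Sketch of the construction beta_0 = 0, alpha_k = alpha(shifted path),
# beta_k = beta_{k-1} + alpha_k, with alpha = min{n : S_n > 0}.
def first_positive_time(steps):
    s = 0
    for n, x in enumerate(steps, start=1):
        s += x
        if s > 0:
            return n
    return None

def betas(steps, k_max):
    """Return [beta_1, ..., beta_m], m <= k_max, stopping when alpha_k = infinity."""
    out, beta = [], 0
    for _ in range(k_max):
        a = first_positive_time(steps[beta:])  # alpha_k on the post-beta_{k-1} path
        if a is None:
            break
        beta += a                              # beta_k = beta_{k-1} + alpha_k
        out.append(beta)
    return out

path = [-1, 1, 1, -1, -1, 1, 1, 1]
bs = betas(path, 5)
```

On this path the partial sums first turn positive at $n=3$, and the shifted path first turns positive five steps later, so the ladder epochs are $3$ and $8$.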

Definition

(AlphaK and BetaK 2) We introduce the random variables $\alpha_k,\beta_k$ according to the rules $$\beta_0=0,\quad \beta_k=\beta_{k-1}+\alpha_k.$$ The variable $\alpha_k$ is defined by the property that it is the r.v. $\alpha$ evaluated on the post-$\beta_{k-1}$ process $\left\{X^{(k)}_n\right\}$, $$X^{(k)}_n=X_{\beta_{k-1}+n},$$ in other words, the data at the first position in $\left\{X^{(k)}_n\right\}$ is the data at the $\left(\beta_{k-1}+1\right)$-th position in the original process $\{X_n\}$.

Proposition

(BetaK separation of random walk). Let $\{X_n\}$ be a stationary independent process. Then

1. The random vectors $V_k=\left(\alpha_k,X_{\beta_{k-1}+1},\dots,X_{\beta_k}\right)$, $k\geq1$, are iid.

2. For a Borel measurable function $\varphi$, the r.v. $\varphi\left(V_k\right)$, $k\geq1$, are iid.

Proof

Assuming the representation of the formula ( Random walk space ), $\omega=\left(x_1,x_2,\dots\right)$ and $$V_k(\omega)=V_1\left(\theta_{\beta_{k-1}(\omega)}\omega\right).$$

Note that $\beta_{k-1}$ points at the slot $x_{\beta_{k-1}}$ of $\omega$. Hence, $V_k$ depends only on the part of the path between the $\beta_{k-1}$-th position and the $\beta_k$-th position and the manner of the dependency is the same for all $k$. The claim follows by repeated application of the preceding proposition on independence of pre-$\alpha$ and post-$\alpha$ fields.
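A Monte Carlo sketch of part 1 (our code; the up-probability $0.6$ is an arbitrary choice): with $\varphi(V_k)=\alpha_k$ the proposition predicts that $\alpha_1$ and $\alpha_2$ are identically distributed, so their sample means should agree; for the $\pm1$ walk with up-probability $p=0.6$ one also has $E\alpha_1=1/(2p-1)=5$ by Wald's identity.

```python
# Check that alpha_1 and alpha_2 (the first two ladder waiting times of a
# drifting +/-1 walk) have matching sample means.
import random

random.seed(3)
N, CAP = 20000, 500     # trials; CAP bounds the path length per trial
a1_sum = a2_sum = count = 0
for _ in range(N):
    s, n = 0, 0
    a1 = a2 = None
    while n < CAP and a2 is None:
        n += 1
        s += 1 if random.random() < 0.6 else -1
        if a1 is None and s > 0:
            a1 = n          # beta_1 = alpha_1; here S_{beta_1} = 1 exactly
        elif a1 is not None and s > 1:
            a2 = n - a1     # alpha_2, measured on the post-beta_1 path
    if a2 is None:
        continue            # both epochs must occur before the cap
    count += 1
    a1_sum += a1
    a2_sum += a2

mean_a1 = a1_sum / count
mean_a2 = a2_sum / count
```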

Proposition

(Maximum of random walk) Let $\{X_n\}$ be a stationary independent process, $S_n=\sum_{k=1}^{n}X_k$ is the associated random walk and the r.v. $\alpha,M$ are defined by $$\alpha=\min\{n\geq1:\ S_n>0\},\quad M=\sup_{n\geq0}S_n.$$ Then the statements A,B,C are equivalent and the statements a,b,c are equivalent.

A. $P(\alpha<\infty)=1$.

B. $P\left(\limsup_n S_n=+\infty\right)=1$.

C. $P(M=+\infty)=1$.

a. $P(\alpha<\infty)<1$.

b. $P\left(\limsup_n S_n<+\infty\right)=1$.

c. $P(M<+\infty)=1$.

Proof

Note that the events $\left\{\limsup_n S_n=+\infty\right\}$ and $\{M=+\infty\}$ are permutable events. Therefore, according to the proposition ( Hewitt and Savage zero-or-one law ) the values 1,0 are the only possible values for $P\left(\limsup_n S_n=+\infty\right)$ and $P(M=+\infty)$. Hence, if we prove equivalence of A,B,C then we also obtain equivalence of a,b,c.

We prove A $\Rightarrow$ B as follows. The statement A guarantees that every $\beta_k$ is a.s. finite. According to the proposition ( BetaK separation of random walk ), the variables $S_{\beta_k}-S_{\beta_{k-1}}$ are iid. Hence, the proposition ( Strong law of large numbers for iid r.v. ) applies and we derive $$\frac{1}{k}S_{\beta_k}=\frac{1}{k}\sum_{j=1}^{k}\left(S_{\beta_j}-S_{\beta_{j-1}}\right)\xrightarrow[k\to\infty]{\text{a.s.}}\mu,$$ where $\mu=E S_{\beta_1}$. We also have, by the definition of $\alpha$, $$S_{\beta_j}-S_{\beta_{j-1}}>0,\quad j\geq1.$$ Hence, $\mu>0$ (possibly $+\infty$). Also, $S_{\beta_k}$ is strictly increasing in $k$, thus $$\lim_k S_{\beta_k}=+\infty\quad\text{a.s.}$$ This implies B.

By definition of $M$, $M\geq\limsup_n S_n$, hence, B implies C. The C implies A by definitions of $\alpha$ and $M$: on the event $\{\alpha=\infty\}$ we have $S_n\leq0$ for all $n$ and, consequently, $M\leq0<+\infty$; hence $P(M=+\infty)=1$ forces $P(\alpha<\infty)=1$.
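The A $\Rightarrow$ B step can be seen numerically: for a $\pm1$ walk the $\beta_k$ are exactly the strict ascending ladder epochs and each increment $S_{\beta_k}-S_{\beta_{k-1}}$ equals $1$, so $S_{\beta_k}=k\to+\infty$. A sketch (our code; the drift $0.6$ is an arbitrary choice making $\alpha<\infty$ a.s.):

```python
# Ladder epochs of a +/-1 walk: times of strict new maxima. For unit
# steps every new maximum exceeds the previous record by exactly 1, so
# the ladder heights are 1, 2, 3, ... and S_{beta_k} -> +infinity.
import random

def ladder_epochs(steps):
    """Return [(beta_k, S_{beta_k})] for k = 1, 2, ...: strict new maxima of the walk."""
    out, s, record = [], 0, 0
    for n, x in enumerate(steps, start=1):
        s += x
        if s > record:          # strict new maximum: a ladder epoch
            record = s
            out.append((n, s))
    return out

random.seed(1)
path = [1 if random.random() < 0.6 else -1 for _ in range(2000)]
heights = [h for _, h in ladder_epochs(path)]
```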

Proposition

(Eventuality of random walk) Let $\{X_n\}$ be a stationary independent process and $S_n$ is the associated random walk. There are only four mutually exclusive possibilities, each taking place a.s.

1. $S_n=0$ for every $n$,

2. $\lim_n S_n=+\infty$,

3. $\lim_n S_n=-\infty$,

4. $\liminf_n S_n=-\infty,\ \limsup_n S_n=+\infty$.

Proof

If $X_1=0$ a.s. then (1) takes place. We exclude such possibility from further consideration. The $\limsup_n S_n$ is a permutable r.v., hence, by the proposition ( Hewitt and Savage zero-or-one law ) it is a constant $c\in[-\infty,+\infty]$ almost surely. Note that $$\limsup_n S_{n+1}=X_1+\limsup_n\left(S_{n+1}-X_1\right)$$ and $\left\{S_{n+1}-X_1\right\}_{n\geq1}$ is a random walk with the same distribution as $\{S_n\}$, thus $c=X_1+c$ a.s. Therefore, since $X_1$ is not a.s. zero, either $c=+\infty$ or $c=-\infty$. The same argument applies to $\liminf_n S_n$, and the possible combinations of the two constants deliver the cases (2)-(4).
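A numerical sketch of the trichotomy beyond the degenerate case (our code; the drift values are arbitrary): positive drift sends $S_n\to+\infty$, negative drift sends $S_n\to-\infty$, while for zero drift the displacement stays sublinear (a finite simulation can only hint at the oscillation $\liminf=-\infty$, $\limsup=+\infty$).

```python
# Simulate +/-1 walks with up-probability p and compare terminal values.
import random

def walk(p, n, seed):
    """Return the partial sums S_1, ..., S_n of a +/-1 walk with up-probability p."""
    random.seed(seed)
    s, path = 0, []
    for _ in range(n):
        s += 1 if random.random() < p else -1
        path.append(s)
    return path

up = walk(0.6, 5000, 2)     # drift +0.2 per step: S_n -> +infinity
down = walk(0.4, 5000, 2)   # drift -0.2 per step: S_n -> -infinity
flat = walk(0.5, 5000, 2)   # zero drift: |S_n| grows only like sqrt(n)
```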
