I. Basic math.
 II. Pricing and Hedging.
 III. Explicit techniques.
 IV. Data Analysis.
 1 Time Series.
 2 Classical statistics.
 A. Basic concepts and common notation of classical statistics.
 B. Chi squared distribution.
 C. Student's t-distribution.
 D. Classical estimation theory.
 a. Sufficient statistics.
 b. Sufficient statistic for normal sample.
 c. Maximum likelihood estimation (MLE).
 d. Asymptotic consistency of MLE. Fisher's information number.
 e. Asymptotic efficiency of the MLE. Cramer-Rao lower bound.
 E. Pattern recognition.
 3 Bayesian statistics.
 V. Implementation tools.
 VI. Basic Math II.
 VII. Implementation tools II.
 VIII. Bibliography
 Notation. Index. Contents.

## Asymptotic consistency of MLE. Fisher's information number.

Proposition

Let $X_1,\dots,X_n$ be an iid sample from the population with the distribution $f\left(x,\theta\right)$. Assume that the distribution is a smooth function of $\theta$ for all possible $x$, and let $\hat{\theta}_n$ be the MLE of $\theta$; then
$$\sqrt{n}\left(\hat{\theta}_n-\theta\right)\rightarrow\frac{\xi}{\sqrt{I\left(\theta\right)}}$$
in distribution as the sample size $n$ approaches infinity. Here $\xi$ is a standard normal variable and $I\left(\theta\right)$ is the Fisher's information number
$$I\left(\theta\right)=E\left(\partial_{\theta}\log f\left(X,\theta\right)\right)^{2}.$$
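As a short worked example (not part of the original text), take the exponential density $f\left(x,\theta\right)=\theta e^{-\theta x}$, $x>0$. Then
$$\log f\left(x,\theta\right)=\log\theta-\theta x,\qquad \partial_{\theta}\log f\left(x,\theta\right)=\frac{1}{\theta}-x,$$
and, since $EX=1/\theta$ for this distribution,
$$I\left(\theta\right)=E\left(\frac{1}{\theta}-X\right)^{2}=\operatorname{Var}\left(X\right)=\frac{1}{\theta^{2}}.$$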

Proof

We introduce the log-likelihood function
$$L_{n}\left(\theta\right)=\sum_{k=1}^{n}\log f\left(X_{k},\theta\right)$$
and consider the Taylor expansion
$$\partial_{\theta}L_{n}\left(\hat{\theta}_n\right)=\partial_{\theta}L_{n}\left(\theta\right)+\partial_{\theta}^{2}L_{n}\left(\theta^{*}\right)\left(\hat{\theta}_n-\theta\right),$$
where $\theta^{*}$ lies between $\theta$ and $\hat{\theta}_n$. Since $\hat{\theta}_n$ is the MLE we have $\partial_{\theta}L_{n}\left(\hat{\theta}_n\right)=0$, hence
$$\hat{\theta}_n-\theta=-\frac{\partial_{\theta}L_{n}\left(\theta\right)}{\partial_{\theta}^{2}L_{n}\left(\theta^{*}\right)}.$$
The $\partial_{\theta}L_{n}\left(\theta\right)=\sum_{k=1}^{n}\partial_{\theta}\log f\left(X_{k},\theta\right)$ is a sum of iid random variables. The mean of each is zero. Indeed,
$$E\,\partial_{\theta}\log f\left(X,\theta\right)=\int\frac{\partial_{\theta}f\left(x,\theta\right)}{f\left(x,\theta\right)}f\left(x,\theta\right)dx=\partial_{\theta}\int f\left(x,\theta\right)dx=\partial_{\theta}1=0.$$
Hence, according to the CLT (Central Limit Theorem),
$$\frac{1}{\sqrt{n}}\,\partial_{\theta}L_{n}\left(\theta\right)\rightarrow\sqrt{I\left(\theta\right)}\,\xi$$
in distribution. Similarly, by the Law of Large Numbers,
$$\frac{1}{n}\,\partial_{\theta}^{2}L_{n}\left(\theta^{*}\right)\rightarrow c$$
with some number $c=E\,\partial_{\theta}^{2}\log f\left(X,\theta\right)$. Since the expression $c$ is not zero, we obtain
$$\sqrt{n}\left(\hat{\theta}_n-\theta\right)=-\frac{n^{-1/2}\,\partial_{\theta}L_{n}\left(\theta\right)}{n^{-1}\,\partial_{\theta}^{2}L_{n}\left(\theta^{*}\right)}\rightarrow-\frac{\sqrt{I\left(\theta\right)}\,\xi}{c}.$$
Observe that
$$c=E\,\partial_{\theta}^{2}\log f\left(X,\theta\right)=-I\left(\theta\right).$$
Indeed,
$$\partial_{\theta}^{2}\log f=\frac{\partial_{\theta}^{2}f}{f}-\left(\frac{\partial_{\theta}f}{f}\right)^{2},\qquad E\,\frac{\partial_{\theta}^{2}f\left(X,\theta\right)}{f\left(X,\theta\right)}=\int\partial_{\theta}^{2}f\left(x,\theta\right)dx=\partial_{\theta}^{2}1=0.$$
Hence,
$$\sqrt{n}\left(\hat{\theta}_n-\theta\right)\rightarrow\frac{\sqrt{I\left(\theta\right)}\,\xi}{I\left(\theta\right)}=\frac{\xi}{\sqrt{I\left(\theta\right)}}.$$
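The proposition can be checked numerically. Below is a minimal Monte Carlo sketch (not from the original text), assuming the exponential family $f\left(x,\theta\right)=\theta e^{-\theta x}$, for which the MLE is $\hat{\theta}_n=1/\bar{X}$ and $I\left(\theta\right)=1/\theta^{2}$; the standardized quantity $\sqrt{nI\left(\theta\right)}\left(\hat{\theta}_n-\theta\right)$ should look approximately standard normal.

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 2.0   # true rate parameter of the exponential density f(x) = theta * exp(-theta * x)
n = 2000      # sample size per replication
reps = 5000   # number of Monte Carlo replications

# Fisher information for this family: I(theta) = 1 / theta^2
fisher = 1.0 / theta**2

# Draw reps independent samples of size n; the MLE of the rate is 1 / sample mean
samples = rng.exponential(scale=1.0 / theta, size=(reps, n))
mle = 1.0 / samples.mean(axis=1)

# Standardize: sqrt(n * I(theta)) * (mle - theta) should be approximately N(0, 1)
z = np.sqrt(n * fisher) * (mle - theta)
print(z.mean(), z.std())  # expected to be close to 0 and 1 respectively
```

The sample mean and standard deviation of `z` come out near 0 and 1, as the proposition predicts; decreasing `n` makes the finite-sample bias of the MLE visible in the mean.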
