I. Basic math.
II. Pricing and Hedging.
III. Explicit techniques.
IV. Data Analysis.
V. Implementation tools.
1. Finite differences.
2. Gauss-Hermite Integration.
3. Asymptotic expansions.
4. Monte-Carlo.
5. Convex Analysis.
A. Basic concepts of convex analysis.
B. Caratheodory's theorem.
C. Relative interior.
D. Recession cone.
E. Intersection of nested convex sets.
F. Preservation of closeness under linear transformation.
G. Weierstrass Theorem.
H. Local minima of convex function.
I. Projection on convex set.
J. Existence of solution of convex optimization problem.
K. Partial minimization of convex functions.
L. Hyperplanes and separation.
M. Nonvertical separation.
N. Minimal common and maximal crossing points.
O. Minimax theory.
P. Saddle point theory.
Q. Polar cones.
R. Polyhedral cones.
S. Extreme points.
T. Directional derivative and subdifferential.
U. Feasible direction cone, tangent cone and normal cone.
V. Optimality conditions.
W. Lagrange multipliers for equality constraints.
X. Fritz John optimality conditions.
Y. Pseudonormality.
Z. Lagrangian duality.
AA. Conjugate duality.
VI. Basic Math II.
VII. Implementation tools II.
VIII. Bibliography
Notation. Index. Contents.

Optimality conditions.


(Minimum of a smooth function). Let $f:\mathcal{R}^{n}\rightarrow\mathcal{R}$ be a smooth function and let $x^{\ast}$ be a minimum of $f$ over the subset $X$ of $\mathcal{R}^{n}$ . Then $$\nabla f\left( x^{\ast}\right) ^{T}y\geq 0,\quad\forall y\in T_{X}\left( x^{\ast}\right) .$$ Equivalently, $$-\nabla f\left( x^{\ast}\right) \in T_{X}\left( x^{\ast}\right) ^{\ast}.$$ If $X$ is convex then $$\nabla f\left( x^{\ast}\right) ^{T}\left( x-x^{\ast}\right) \geq 0,\quad\forall x\in X.$$ If $X=\mathcal{R}^{n}$ then $\nabla f\left( x^{\ast}\right) =0$ .


Let $y\in T_{X}\left( x^{\ast}\right) $ , $y\not =0$ . Then there exists a sequence $\left\{ x_{k}\right\} \subset X$ , $x_{k}\rightarrow x^{\ast}$ , such that $$\frac{x_{k}-x^{\ast}}{\left\Vert x_{k}-x^{\ast}\right\Vert }\rightarrow\frac{y}{\left\Vert y\right\Vert }.$$ By smoothness of $f$ we have $$f\left( x_{k}\right) =f\left( x^{\ast}\right) +\nabla f\left( x^{\ast}\right) ^{T}\left( x_{k}-x^{\ast}\right) +o\left( \left\Vert x_{k}-x^{\ast}\right\Vert \right) .$$ Hence, since $x^{\ast}$ is a minimum of $f$ over $X$ and $x_{k}\in X$ , $$0\leq\frac{f\left( x_{k}\right) -f\left( x^{\ast}\right) }{\left\Vert x_{k}-x^{\ast}\right\Vert }=\nabla f\left( x^{\ast}\right) ^{T}\frac{x_{k}-x^{\ast}}{\left\Vert x_{k}-x^{\ast}\right\Vert }+\frac{o\left( \left\Vert x_{k}-x^{\ast}\right\Vert \right) }{\left\Vert x_{k}-x^{\ast}\right\Vert }.$$ We pass the above to the limit and obtain $$\nabla f\left( x^{\ast}\right) ^{T}\frac{y}{\left\Vert y\right\Vert }\geq 0.$$ The rest of the proposition follows from the proposition ( Tangent cone 4 )-1.
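The condition $\nabla f\left( x^{\ast}\right) ^{T}\left( x-x^{\ast}\right) \geq 0$ for convex $X$ can be checked numerically. The following sketch uses an illustrative example of my own choosing (not from the text): $f\left( x,y\right) =\left( x-2\right) ^{2}+\left( y-2\right) ^{2}$ minimized over the box $X=\left[ 0,1\right] ^{2}$ , with constrained minimum $x^{\ast}=\left( 1,1\right) $ ; the gradient is nonzero there, yet the first-order condition holds over all of $X$ .

```python
# Numerical check of the first-order condition grad f(x*)^T (x - x*) >= 0
# over a convex constraint set.  Illustrative example (not from the text):
# f(x, y) = (x - 2)^2 + (y - 2)^2 over the box X = [0, 1]^2.
# The minimum is attained at x* = (1, 1), on the boundary of X.

import itertools

def grad_f(p):
    """Gradient of f(x, y) = (x - 2)^2 + (y - 2)^2."""
    x, y = p
    return (2.0 * (x - 2.0), 2.0 * (y - 2.0))

x_star = (1.0, 1.0)
g = grad_f(x_star)          # (-2, -2): nonzero because the minimum is constrained

# Sample X on a grid and check grad f(x*)^T (x - x*) >= 0 for every sampled x.
grid = [i / 10.0 for i in range(11)]
ok = all(
    g[0] * (x - x_star[0]) + g[1] * (y - x_star[1]) >= 0.0
    for x, y in itertools.product(grid, grid)
)
print(ok)  # True: -grad f(x*) lies in the polar of the tangent cone at x*
```

For an interior minimum the same check would force $\nabla f\left( x^{\ast}\right) =0$ , since $x-x^{\ast}$ then ranges over all directions.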


(Minimum of a convex function). Let $f:\mathcal{R}^{n}\rightarrow\mathcal{R}$ be a convex function and let $X$ be a convex subset of $\mathcal{R}^{n}$ . Then $x^{\ast}$ minimizes $f$ over $X$ if and only if there exists $g\in\partial f\left( x^{\ast}\right) $ such that $$g^{T}\left( x-x^{\ast}\right) \geq 0,\quad\forall x\in X.$$

Equivalently, $$0\in\partial f\left( x^{\ast}\right) +T_{X}\left( x^{\ast}\right) ^{\ast}.$$


Assume $g\in\partial f\left( x^{\ast}\right) $ and $g^{T}\left( x-x^{\ast}\right) \geq 0$ for any $x\in X$ . Then by the definition ( Subgradient and subdifferential ) $$f\left( x\right) \geq f\left( x^{\ast}\right) +g^{T}\left( x-x^{\ast}\right) \geq f\left( x^{\ast}\right) ,\quad\forall x\in X,$$ and thus $x^{\ast}$ minimizes $f$ over $X$ .

Conversely, let $x^{\ast}$ be a minimum of $f$ over $X$ . Then $$f^{\prime}\left( x^{\ast};x-x^{\ast}\right) \geq 0,\quad\forall x\in X.$$ According to the proposition ( Properties of subgradient )-1 $$f^{\prime}\left( x^{\ast};d\right) =\max_{g\in\partial f\left( x^{\ast}\right) }g^{T}d.$$ According to the proposition ( Existence of subdifferential ) the $\max$ is taken over a compact set. Also, the $g^{T}d$ is a continuous function of $d$ . Hence, the $\min_{d}\max_{g\in\partial f\left( x^{\ast}\right) }g^{T}d$ over the directions $d=x-x^{\ast}$ , $x\in X$ , $\left\Vert d\right\Vert \leq 1$ , is achieved at some $d^{\ast}$ and, by the saddle point theory, at a saddle point $\left( d^{\ast},g^{\ast}\right) $ . Such $g^{\ast}$ has the property $$\left( g^{\ast}\right) ^{T}\left( x-x^{\ast}\right) \geq 0,\quad\forall x\in X.$$

The second part of the proposition, $0\in\partial f\left( x^{\ast}\right) +T_{X}\left( x^{\ast}\right) ^{\ast}$ , is evident because the statement $$\exists g\in\partial f\left( x^{\ast}\right) :\quad g^{T}\left( x-x^{\ast}\right) \geq 0,\quad\forall x\in X,$$ may be rewritten as $-g\in T_{X}\left( x^{\ast}\right) ^{\ast}$ according to the definition ( Polar cone definition ), so that $0=g+\left( -g\right) \in\partial f\left( x^{\ast}\right) +T_{X}\left( x^{\ast}\right) ^{\ast}$ .
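The subgradient condition can be made concrete on a one-dimensional example of my own (not from the text): $f\left( x\right) =\left\vert x\right\vert $ over $X=\left[ -1,2\right] $ . The minimum $x^{\ast}=0$ is a kink, $\partial f\left( 0\right) =\left[ -1,1\right] $ , and only the particular choice $g=0$ certifies optimality, since $x-x^{\ast}$ changes sign over $X$ :

```python
# Subgradient optimality condition for a nonsmooth convex function:
# f(x) = |x| minimized over the convex set X = [-1, 2].  The minimum is
# x* = 0, where the subdifferential is the interval [-1, 1].  The condition
# asks for SOME g in that interval with g * (x - x*) >= 0 for all x in X.

def subdifferential_abs(x):
    """Subdifferential of |x| as an interval (lo, hi)."""
    if x > 0:
        return (1.0, 1.0)
    if x < 0:
        return (-1.0, -1.0)
    return (-1.0, 1.0)        # at 0: the whole interval [-1, 1]

x_star = 0.0
lo, hi = subdifferential_abs(x_star)
X = [-1.0 + 3.0 * i / 100.0 for i in range(101)]      # grid over [-1, 2]

def works(g):
    """Does g certify optimality, i.e. g * (x - x*) >= 0 on all of X?"""
    return all(g * (x - x_star) >= 0.0 for x in X)

# Scan candidate subgradients in [-1, 1]; collect the ones that certify.
candidates = [lo + (hi - lo) * i / 20.0 for i in range(21)]
certifying = [g for g in candidates if works(g)]
print(certifying)  # [0.0]
```

This shows why the proposition asserts only the existence of some $g\in\partial f\left( x^{\ast}\right) $ : most subgradients at the kink do not satisfy the inequality.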


(Local minimum of a sum). Let $f_{1}:\mathcal{R}^{n}\rightarrow\mathcal{R}$ be a convex function, $f_{2}:\mathcal{R}^{n}\rightarrow\mathcal{R}$ be a smooth function, $X$ be a subset of $\mathcal{R}^{n}$ , $x^{\ast}$ be a local minimum of $f=f_{1}+f_{2}$ over $X$ and let $T_{X}\left( x^{\ast}\right) $ be convex. Then $$-\nabla f_{2}\left( x^{\ast}\right) \in\partial f_{1}\left( x^{\ast}\right) +T_{X}\left( x^{\ast}\right) ^{\ast}.$$


The proof is a repetition of the proofs for the propositions ( Minimum of a smooth function ) and ( Minimum of a convex function ).
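A familiar instance of the sum condition, chosen here for illustration (it is not an example from the text), is $f_{1}\left( x\right) =\lambda\left\vert x\right\vert $ , $f_{2}\left( x\right) =\frac{1}{2}\left( x-a\right) ^{2}$ , $X=\mathcal{R}$ . The condition $-f_{2}^{\prime}\left( x^{\ast}\right) \in\lambda\partial\left\vert x^{\ast}\right\vert $ solves in closed form to the soft-thresholding rule $x^{\ast}=\mathrm{sign}\left( a\right) \max\left( \left\vert a\right\vert -\lambda,0\right) $ , which the sketch below verifies:

```python
# Optimality for f1 + f2 with f1(x) = lam * |x| (convex, nonsmooth) and
# f2(x) = 0.5 * (x - a)^2 (smooth), over X = R (so the polar cone term
# vanishes).  The condition  -f2'(x*) = a - x*  in  lam * d|x*|  yields
# the soft-thresholding formula.  Illustrative example, not from the text.

def soft_threshold(a, lam):
    """Closed-form minimizer of lam*|x| + 0.5*(x - a)^2."""
    if a > lam:
        return a - lam
    if a < -lam:
        return a + lam
    return 0.0

def in_subdiff(v, x_star, lam):
    """Is v an element of lam * d|x*|?"""
    if x_star > 0:
        return abs(v - lam) < 1e-12
    if x_star < 0:
        return abs(v + lam) < 1e-12
    return -lam <= v <= lam       # at 0: the interval [-lam, lam]

# Verify  a - x*  lies in  lam * d|x*|  for several values of a.
for a in (-3.0, -0.4, 0.0, 0.7, 2.5):
    lam = 1.0
    xs = soft_threshold(a, lam)
    assert in_subdiff(a - xs, xs, lam)
print("optimality condition verified")
```

When $X$ is a proper subset of $\mathcal{R}$ the membership test would also have to account for the polar cone term $T_{X}\left( x^{\ast}\right) ^{\ast}$ .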

Optimality for smooth function figure 1

The figure ( Optimality for smooth function figure 1 ) illustrates the condition $-\nabla f\left( x^{\ast}\right) \in T_{X}\left( x^{\ast}\right) ^{\ast}$ . The shaded triangle is the constraint set $X$ . The ellipses are the level curves of a function $f\left( x\right) $ , the innermost ellipse being the level curve with the smallest value. The slightly transparent triangle is the set $x^{\ast}+T_{X}\left( x^{\ast}\right) ^{\ast}$ . The arrow is the vector $-\nabla f\left( x^{\ast}\right) $ . The $\nabla f\left( x^{\ast}\right) $ is orthogonal to the level curve that passes through $x^{\ast}$ and points in the direction of increase of $f$ . The $-\nabla f\left( x^{\ast}\right) $ points in the direction of decrease. Because the $-\nabla f\left( x^{\ast}\right) $ lies within the $T_{X}\left( x^{\ast}\right) ^{\ast}$ , the point $x^{\ast}$ minimizes $f$ over $X$ . The alternative situation is presented in the figure ( Optimality for smooth function figure 2 ). Here, $-\nabla f\left( x^{\ast}\right) $ lies outside of the $T_{X}\left( x^{\ast}\right) ^{\ast}$ . In addition, the $\nabla f\left( x^{\ast}\right) $ must be orthogonal to the level curve. Therefore, the level curve must cross into $X$ , thus preventing $x^{\ast}$ from being the minimum.

Optimality for smooth function figure 2


Copyright 2007