Almost Sure

6 December 09

Martingales and Elementary Integrals

A martingale is a stochastic process which stays the same, on average. That is, the expected future value conditional on the present is equal to the current value. Examples include the wealth of a gambler as a function of time, assuming that he is playing a fair game. The canonical example of a continuous time martingale is Brownian motion and, in discrete time, a symmetric random walk is a martingale. As always, we work with respect to a filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge 0},{\mathbb P})}. A process {X} is said to be integrable if the random variables {X_t} are integrable, so that {{\mathbb E}[\vert X_t\vert]<\infty}.
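As a quick sanity check of this definition, here is a minimal simulation sketch (illustrative only, not from the original post; all names are my own) using a symmetric random walk: conditional on the value at time s, the average value at a later time t should equal that value.

    import numpy as np

    # Minimal sketch (illustrative): check the martingale property of a
    # symmetric random walk empirically.  Conditional on the value at
    # time s, the mean value at a later time t should equal it.
    rng = np.random.default_rng(0)
    n_paths, s, t = 100_000, 5, 10

    paths = rng.choice([-1, 1], size=(n_paths, t)).cumsum(axis=1)
    X_s, X_t = paths[:, s - 1], paths[:, t - 1]

    for x in np.unique(X_s):
        print(x, X_t[X_s == x].mean())    # approximately equal to x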

Definition 1 A martingale, {X}, is an integrable process satisfying

\displaystyle  X_s={\mathbb E}[X_t\mid\mathcal{F}_s]

for all {s<t\in{\mathbb R}_+}.

A closely related concept is that of a submartingale, which is a process which increases on average. This could represent the wealth of a gambler who is involved in games where the odds are in his favour. Similarly, a supermartingale decreases on average.

Definition 2 A process {X} is a

  • submartingale if it is integrable and

    \displaystyle  X_s\le{\mathbb E}[X_t\mid\mathcal{F}_s]

    for all {s<t\in{\mathbb R}_+}.

  • supermartingale if it is integrable and

    \displaystyle  X_s\ge{\mathbb E}[X_t\mid\mathcal{F}_s]

    for all {s<t\in{\mathbb R}_+}.

This terminology can seem a bit confusing, but it is related to the result that if {B} is an {n}-dimensional Brownian motion and {f\colon{\mathbb R}^n\rightarrow{\mathbb R}} then {f(B)} is a submartingale or supermartingale whenever {f} is a subharmonic or, respectively, a superharmonic function.

Clearly, {X} is a submartingale if and only if {-X} is a supermartingale, and is a martingale if and only if it is both a submartingale and a supermartingale.

Lemma 3 If {X} is a martingale and {f\colon{\mathbb R}\rightarrow{\mathbb R}} is a convex function such that {f(X)} is integrable, then {f(X)} is a submartingale.

Proof: This is a direct consequence of Jensen’s inequality,

\displaystyle  {\mathbb E}[f(X_t)\mid\mathcal{F}_s]\ge f({\mathbb E}[X_t\mid\mathcal{F}_s])=f(X_s). ⬜

In particular, if {X} is any {L^p}-integrable martingale for {p\ge 1} then {\vert X\vert^p} is a submartingale. A similar result holds for submartingales, although the additional hypothesis that the function is increasing is required.
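For example, if {B} is a standard Brownian motion then, by Lemma 3, {B^2} is a submartingale. This can also be verified directly: as {B_t-B_s} is independent of {\mathcal{F}_s} with zero mean and variance {t-s},

\displaystyle  {\mathbb E}[B_t^2\mid\mathcal{F}_s]=B_s^2+2B_s{\mathbb E}[B_t-B_s\mid\mathcal{F}_s]+{\mathbb E}[(B_t-B_s)^2\mid\mathcal{F}_s]=B_s^2+t-s\ge B_s^2.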

Lemma 4 Suppose that {X} is a submartingale and {f\colon{\mathbb R}\rightarrow{\mathbb R}} is an increasing convex function such that {f(X)} is integrable. Then, {f(X)} is a submartingale.

Proof: Again, Jensen’s inequality gives the result,

\displaystyle  {\mathbb E}[f(X_t)\mid\mathcal{F}_s]\ge f({\mathbb E}[X_t\mid\mathcal{F}_s])\ge f(X_s),

where the final inequality follows from the submartingale property of {X} together with the monotonicity of {f}. ⬜

So, the positive part {X^+=\max(X,0)} of a submartingale {X} is itself a submartingale, since {x\mapsto x^+} is increasing and convex.

Elementary integrals

Martingales and submartingales are especially well behaved under stochastic integration. An elementary or elementary predictable process is of the form

\displaystyle  \xi_t = Z_01_{\{t=0\}}+\sum_{k=1}^n Z_k1_{\{s_k<t\le t_k\}} (1)

for {n\ge 0}, times {s_k<t_k}, an {\mathcal{F}_0}-measurable random variable {Z_0} and {\mathcal{F}_{s_k}}-measurable random variables {Z_k}. Alternatively, {\xi} is elementary if it is left-continuous and adapted, and there are times {0=t_0<t_1<\cdots<t_n} such that {\xi} is constant on each of the intervals {(t_{k-1},t_k)} and zero on {(t_n,\infty)}. In particular, these are predictable processes and, in fact, generate the predictable sigma-algebra. It is also clear that the set of elementary processes is closed under linear combinations and products, and {f(\xi)} is elementary for any elementary {\xi} and measurable function {f} with {f(0)=0}.

Stochastic integrals of such elementary processes can be written out explicitly. The integral of the process given by (1) with respect to a stochastic process {X} is

\displaystyle  \int_0^\infty \xi\,dX = \sum_{k=1}^nZ_k(X_{t_k}-X_{s_k}).

The integral over a finite range {[0,t]} is,

\displaystyle  \int_0^t\xi\,dX=\int_0^\infty 1_{\{s\le t\}}\xi_s\,dX_s=\sum_{k=1}^nZ_k(X_{t_k\wedge t}-X_{s_k\wedge t}).

Note that, for this expression to make sense, it is only strictly necessary for {1_{\{s\le t\}}\xi_s} to be elementary for each {t\in{\mathbb R}_+}. Equivalently, {\xi} is given by expression (1) for {n=\infty} and {s_k\rightarrow\infty}. Alternatively, {\xi} is left-continuous and adapted, and there is a sequence of times {t_k\uparrow\infty} such that it is constant over each of the intervals {(t_{k-1},t_k)}.

Letting {t} run over the domain {{\mathbb R}_+}, the integral {\int_0^t\xi\,dX} defines a new process. I shall often write {Y=\int\xi\,dX}, dropping the limits, to express the stochastic integral as a process. This can equivalently be written in the differential form {dY=\xi\,dX}, which is just a shorthand for the integral expression.
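To make the formula concrete, here is a minimal sketch (illustrative, not from the post) computing {\int_0^t\xi\,dX} along a single sampled path, with time discretized to an integer grid so that the path {X} is just an array indexed by time.

    import numpy as np

    # Hedged sketch (illustrative): the elementary process
    # xi = Z_0 1_{t=0} + sum_k Z_k 1_{(s_k, t_k]} is represented by the
    # arrays Z, s_, t_; the Z_0 term never contributes to the integral.
    # X is a single path sampled on integer times, so X[u] = X_u.
    def elementary_integral(Z, s_, t_, X, t):
        """Compute int_0^t xi dX = sum_k Z_k (X_{t_k ^ t} - X_{s_k ^ t})."""
        return sum(Zk * (X[min(tk, t)] - X[min(sk, t)])
                   for Zk, sk, tk in zip(Z, s_, t_))

    # Letting t run over the grid recovers the process Y = int xi dX.
    X = np.concatenate([[0.0], np.random.default_rng(0).choice([-1, 1], 20).cumsum()])
    Y = [elementary_integral([2.0, -1.0], [0, 5], [5, 10], X, t) for t in range(21)]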

These elementary integrals satisfy some basic properties which follow directly from the definitions. Linearity in both the integrand {\xi} and integrator {X} is clear. Furthermore, associativity holds. If {\alpha,\beta} are elementary and {Y=\int\alpha\,dX} then

\displaystyle  \int\beta\,dY=\int\beta\alpha\,dX.

In differential notation, this is simply {\beta(\alpha\,dX)=(\beta\alpha)\,dX}. Stopping an integral process at a random time {\tau} is the same as stopping the integrator,

\displaystyle  \left(\int\xi\,dX\right)^\tau = \int\xi\,dX^\tau. (2)

The full theory of stochastic calculus extends these elementary integrals to arbitrary predictable integrands. However, just the elementary case defined above is enough to get going with. First, considering expectations of stochastic integrals leads to the following alternative definition of martingales and submartingales.

Theorem 5 An adapted integrable process {X} is

  • a martingale if and only if

    \displaystyle  {\mathbb E}\left[\int_0^\infty \xi\,dX\right]=0

    for all bounded elementary processes {\xi}.

  • a submartingale if and only if

    \displaystyle  {\mathbb E}\left[\int_0^\infty\xi\,dX\right]\ge0 (3)

    for all nonnegative bounded elementary processes {\xi}.

Proof: It is enough to prove the second statement, because the first one follows immediately from applying it to both {X} and {-X}. So, suppose that {X} is a submartingale. A nonnegative elementary process {\xi} can be written in the form (1) with the {Z_k} nonnegative. Then,

\displaystyle  {\mathbb E}\left[\int_0^\infty\xi\,dX\right] = \sum_{k=1}^n{\mathbb E}\left[Z_k(X_{t_k}-X_{s_k})\right] =\sum_{k=1}^n{\mathbb E}\left[Z_k({\mathbb E}[X_{t_k}\mid\mathcal{F}_{s_k}]-X_{s_k})\right]\ge 0.

Conversely, suppose that inequality (3) holds. Choose any {s<t\in{\mathbb R}_+} and {A\in\mathcal{F}_s}. The process {\xi_u=1_A1_{\{s<u\le t\}}} is elementary and

\displaystyle  {\mathbb E}\left[1_A(X_t-X_s)\right]={\mathbb E}\left[\int_0^\infty\xi\,dX\right]\ge 0.

Choosing {A=\{{\mathbb E}[X_t\mid\mathcal{F}_s]<X_s\}},

\displaystyle  0\ge{\mathbb E}\left[\min({\mathbb E}[X_t\mid\mathcal{F}_s]-X_s,0)\right]={\mathbb E}[1_A(X_t-X_s)]\ge 0.

Any non-positive random variable whose expectation is zero must itself be equal to zero, almost surely. So, {{\mathbb E}[X_t\mid\mathcal{F}_s]\ge X_s} as required. ⬜

Elementary integrals preserve the martingale property.

Lemma 6 Let {X} be a process and {\xi} be a bounded elementary process. Define {Y_t=\int_0^t\xi\,dX}. Then

  • If {X} is a martingale then so is {Y}.
  • If {X} is a submartingale and {\xi} is nonnegative then {Y} is a submartingale.

Proof: The first statement follows from the second applied to both {X} and {-X}. So, it is enough to consider the case where {X} is a submartingale. If {\alpha} is a nonnegative bounded elementary process then associativity of the integral gives

\displaystyle  {\mathbb E}\left[\int_0^\infty\alpha\,dY\right]={\mathbb E}\left[\int_0^\infty\alpha\xi\,dX\right]\ge 0.

This inequality follows from Theorem 5 and, again by Theorem 5, it shows that {Y} is a submartingale. ⬜
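A hedged numerical illustration of this lemma (illustrative code, not from the post): betting on a fair coin with a bounded predictable stake produces a gains process with zero mean, however the stakes are chosen.

    import numpy as np

    # Hedged sketch (illustrative): a predictable "bet" process applied to
    # a fair coin-tossing martingale.  The stake at each step depends only
    # on the path so far, so the gains process Y is again a martingale and
    # E[Y] should be approximately zero.
    rng = np.random.default_rng(1)
    n_paths, n_steps = 100_000, 20

    steps = rng.choice([-1, 1], size=(n_paths, n_steps))
    Y = np.zeros(n_paths)
    bet = np.ones(n_paths)
    for k in range(n_steps):
        Y += bet * steps[:, k]            # Z_k (X_{t_k} - X_{s_k})
        # Double the stake after a loss (capped at 8), reset after a win.
        bet = np.minimum(2 * bet * (steps[:, k] < 0), 8.0) + (steps[:, k] > 0)

    print(Y.mean())                       # close to 0, as Lemma 6 predicts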

Finally, optional stopping of martingales follows from the properties of elementary integrals. This extends the martingale property to random stopping times. The value {X_\tau} of an arbitrary process at the random time {\tau} need not be well behaved, or even measurable, unless we restrict to nice versions of processes. So, for this post, we shall call a stopping time simple if it only takes finitely many values in {\bar{\mathbb R}_+}. If {\tau} is a bounded simple stopping time taking values in {\{t_1<\cdots<t_n\}} then, setting {t_0=0}, the process

\displaystyle  1_{\{t\le\tau\}}=1_{\{t=0\}}+\sum_{k=1}^n1_{\{\tau> t_{k-1}\}}1_{\{t_{k-1}<t\le t_k\}}

is elementary and, furthermore, equation (2) can be extended to give

\displaystyle  \left(\int\xi\,dX\right)^\tau = \int\xi\,dX^\tau=\int1_{\{t\le\tau\}}\xi\,dX.

The optional stopping theorem states that the class of martingales is closed under stopping at arbitrary stopping times; here, we prove the special case of simple stopping times.

Lemma 7 Let {X} be a martingale (resp. submartingale, supermartingale) and {\tau} be a simple stopping time. Then, the stopped process {X^\tau} is also a martingale (resp. submartingale, supermartingale).

Proof: As the integrand {1_{\{t\le\tau\}}} is bounded, elementary and nonnegative, this follows from applying Lemma 6 to the identity

\displaystyle  X^\tau = X_0 + \int 1_{\{t\le\tau\}}\,dX. ⬜
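A hedged simulation sketch of this result (illustrative, not from the post): stopping a symmetric random walk at a bounded simple stopping time leaves the mean unchanged.

    import numpy as np

    # Hedged sketch (illustrative): stop a symmetric random walk at the
    # first time it hits +3, capped at time 10, so tau is a bounded simple
    # stopping time.  Optional stopping gives E[X_tau] = X_0 = 0.
    rng = np.random.default_rng(2)
    n_paths, t_max, level = 100_000, 10, 3

    X = rng.choice([-1, 1], size=(n_paths, t_max)).cumsum(axis=1)
    hit = X >= level
    tau = np.where(hit.any(axis=1), hit.argmax(axis=1), t_max - 1)
    print(X[np.arange(n_paths), tau].mean())   # approximately 0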
