Almost Sure

3 May 20

Semimartingale Local Times

Filed under: Local Times — George Lowther @ 7:42 PM

Figure 1: Brownian motion B with local time L and auxiliary Brownian motion W

For a stochastic process X taking values in a state space E, its local time at a point {x\in E} is a measure of the time spent at x. For a continuous time stochastic process, we could try simply computing the Lebesgue measure of the time spent at the level,

\displaystyle  L^x_t=\int_0^t1_{\{X_s=x\}}ds. (1)

For processes which hit the level {x} and stick there for some time, this makes some sense. However, if X is a standard Brownian motion, it will always give zero, so is not helpful. Even though X will hit every real value infinitely often, continuity of the normal distribution gives {{\mathbb P}(X_s=x)=0} at each positive time, so that {L^x_t} defined by (1) will have zero expectation.
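This degeneracy is easy to see in simulation. In the short sketch below (the seed, step count and level x are arbitrary choices of mine), the Lebesgue measure in (1) is approximated on a discrete time grid; since the simulated path never exactly equals x, the estimate is identically zero.

```python
import numpy as np

rng = np.random.default_rng(42)
n, T = 100_000, 1.0
dt = T / n
B = np.cumsum(rng.normal(0.0, np.sqrt(dt), n))  # Brownian path on the grid

# Approximate Leb{s <= T : B_s = x} by counting grid times with B_s == x.
# The path (almost surely, and certainly in floating point) never hits x exactly.
x = 0.5
occupation = (B == x).sum() * dt
print(occupation)  # 0.0
```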

Rather than the indicator function of {\{X=x\}} as in (1), an alternative is to use the Dirac delta function,

\displaystyle  L^x_t=\int_0^t\delta(X_s-x)\,ds. (2)

Unfortunately, the Dirac delta is not a true function but a distribution, so (2) is not a well-defined expression. However, if it can be made rigorous, then it does seem to have some of the properties we would want. For example, the expectation {{\mathbb E}[\delta(X_s-x)]} can be interpreted as the probability density of {X_s} evaluated at {x}, which has a positive and finite value, so it should lead to positive and finite local times. Equation (2) still relies on the Lebesgue measure over the time index, so will not behave as we may expect under time changes, and will not make sense for processes without a continuous probability density. A better approach is to integrate with respect to the quadratic variation,

\displaystyle  L^x_t=\int_0^t\delta(X_s-x)d[X]_s (3)

which, for Brownian motion, amounts to the same thing. Although (3) is still not a well-defined expression, since it still involves the Dirac delta, the idea is to come up with a definition which amounts to the same thing in spirit. Important properties that it should satisfy are that it is an adapted, continuous and increasing process with increments supported on the set {\{X=x\}},

\displaystyle  L^x_t=\int_0^t1_{\{X_s=x\}}dL^x_s.
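For Brownian motion, where {d[X]_s=ds}, the heuristic (3) can at least be probed numerically by replacing the delta function with the narrow kernel {\frac1{2\epsilon}1_{(-\epsilon,\epsilon)}}. The sketch below (the seed, step count and the two values of {\epsilon} are arbitrary choices of mine) checks that the resulting estimates of {L^0_1} roughly stabilise as {\epsilon} shrinks, which is what a sensible definition of local time should deliver.

```python
import numpy as np

rng = np.random.default_rng(1)
n, T = 1_000_000, 1.0
dt = T / n
B = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n))])

def occupation_density(path, eps, dt):
    """(1/2eps) * Leb{s <= T : |B_s| < eps}: the delta in (3) replaced by a narrow kernel."""
    return (np.abs(path) < eps).sum() * dt / (2.0 * eps)

L_coarse = occupation_density(B, 0.05, dt)
L_fine = occupation_density(B, 0.01, dt)
print(L_coarse, L_fine)  # both approximate the local time at zero over [0, 1]
```

The grid spacing must be much finer than {\epsilon} for the counting estimate to make sense; here {\sqrt{dt}=0.001} against {\epsilon\ge0.01}.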

Local times are a very useful and interesting part of stochastic calculus, and find important applications to excursion theory, stochastic integration and stochastic differential equations. However, I have not covered this subject in my notes, so I do this now. Recalling Ito’s lemma for a function {f(X)} of a semimartingale X, this involves a term of the form {\int f^{\prime\prime}(X)d[X]} and, hence, requires {f} to be twice differentiable. If we were to try to apply the Ito formula for functions which are not twice differentiable, then {f^{\prime\prime}} can be understood in terms of distributions, and delta functions can appear, which brings local times into the picture. In the opposite direction, which I take in this post, we can try to generalise Ito’s formula and invert this to give a meaning to (3).

There are two main approaches to local times. One method, which is used in Markov process theory, makes use of dual projections to construct the continuous increasing process {L^x}. The other method uses stochastic integration of real-valued semimartingales. This post is concerned only with the semimartingale approach and, throughout, all semimartingales will be assumed to be real-valued. As usual, we work with respect to a complete filtered probability space {(\Omega,\mathcal F,\{\mathcal F_t\}_{t\ge0},{\mathbb P})}, and the starting point is the simple result below. Every convex function {f\colon{\mathbb R}\rightarrow{\mathbb R}} has well-defined left and right-hand derivatives, and I use {f^\prime(x)} for the left-hand derivative, although this is just convention and there would not be any significant change to any of the arguments of this post if the right-hand derivatives were used instead.

Lemma 1 Let X be a semimartingale and {f\colon{\mathbb R}\rightarrow{\mathbb R}} be convex. Then,

\displaystyle  f(X)=\int f^\prime(X_-)\,dX+V (4)

for a cadlag increasing process V.

Proof: First, in the case where {f} is convex and twice continuously differentiable, then (4) is immediately given by Ito’s formula with

\displaystyle  V_t = f(X_0)+\frac12\int_0^tf^{\prime\prime}(X)d[X]^c+\sum_{s\le t}\left(\Delta f(X_s)-f^\prime(X_{s-})\Delta X_s\right).

By convexity, {f^{\prime\prime}} and {\Delta f(X)-f^\prime(X_-)\Delta X} are nonnegative, so that V is increasing as required.

We extend to the non-twice-differentiable case by approximating with smooth functions. So, suppose that {f\colon{\mathbb R}\rightarrow{\mathbb R}} is convex. We convolve this with any twice continuously differentiable nonnegative function {\theta\colon(0,\infty)\rightarrow{\mathbb R}} with compact support and unit integral, {\int\theta(x)dx=1}, to obtain a sequence of smooth approximations,

\displaystyle  f_n(x)=\int_0^\infty f(x-y/n)\theta(y)dy.

Convexity and twice continuous differentiability of {f_n} follows from the corresponding properties of, respectively, {f} and {\theta}. So, (4) applies to {f_n} giving

\displaystyle  f_n(X)=\int f_n^\prime(X_-)dX+V^n

for increasing processes {V^n}. By continuity of {f} and left-continuity of its left-hand derivative, {f(x-y/n)} and {f^\prime(x-y/n)} converge respectively to {f(x)} and {f^\prime(x)} as {n} goes to infinity. So, by bounded convergence, {f_n} and {f_n^\prime} converge to {f} and {f^\prime} respectively. Furthermore, as {f} and its derivative are locally bounded, applying locally bounded convergence shows that {\int_0^t f_n^\prime(X_-)dX} converges in probability to {\int_0^t f^\prime(X_-)dX}. Hence, if we define V by (4) then {V^n_t\rightarrow V_t} in probability and, in particular, {V_s\le V_t} (almost surely) whenever {s < t}. As the stochastic integral is cadlag, V must also have cadlag sample paths, so is increasing up to evanescence. ⬜
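The mollification step can be reproduced concretely. The sketch below uses one specific bump function, {\theta(y)=\tfrac{35}{32}(1-(y-1)^2)^3} on {(0,2)}, which is my own choice of a twice continuously differentiable nonnegative function with compact support and unit integral, and checks for {f(x)=\lvert x\rvert} that the smoothed {f_n} remains convex and converges uniformly to {f} (the error is at most {2/n}, the width of the smoothing window).

```python
import numpy as np

def theta(y):
    """C^2 bump supported on (0, 2) with unit integral: (35/32) * (1 - (y-1)^2)^3."""
    u = np.asarray(y) - 1.0
    return np.where(np.abs(u) < 1.0, (35.0 / 32.0) * (1.0 - u * u) ** 3, 0.0)

def f(x):
    return np.abs(x)

def f_n(x, n, m=4000):
    """f_n(x) = int_0^infty f(x - y/n) theta(y) dy, via the midpoint rule on (0, 2)."""
    dy = 2.0 / m
    y = (np.arange(m) + 0.5) * dy
    w = theta(y) * dy
    return (f(x[:, None] - y[None, :] / n) * w[None, :]).sum(axis=1)

x = np.linspace(-1.0, 1.0, 201)
vals = f_n(x, 50)
err = np.max(np.abs(vals - f(x)))   # uniform error, at most 2/50 = 0.04
curv = np.min(np.diff(vals, 2))     # second differences: nonnegative for a convex function
print(err, curv)
```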

Before applying lemma 1 to construct local times, I first note some straightforward consequences. In the case where X is a continuous local martingale, then {f(X)} is a local submartingale, and {\int f^\prime(X_-)dX} is a local martingale. So, (4) is the Doob-Meyer decomposition. More generally, for any semimartingale X, using the fact that stochastic integrals and FV processes are semimartingales, (4) shows that {f(X)} is also a semimartingale.

Corollary 2 Let X be a semimartingale and {f\colon{\mathbb R}\rightarrow{\mathbb R}} be convex. Then, {f(X)} is a semimartingale.

Lemma 1 also has the following simple consequence.

Corollary 3 If X is a semimartingale, then {\int1_{\{X_-=x\}}dX} is an FV process for any {x\in{\mathbb R}}.

Proof: Set {Y=-X} and, for a convex function {f\colon{\mathbb R}\rightarrow{\mathbb R}}, set {g(x)=f(-x)}. Then, {g} is convex with {g(Y)=f(X)} and left-hand derivative {g^\prime(y)=-f^\prime_+(-y)}, using {f^\prime_+} for the right-hand derivative of {f}. So, applying (4) to {g(Y)} gives

\displaystyle  f(X_t)=f(X_0)+\int_0^tf^\prime_+(X_-)\,dX+V^\prime_t

for some cadlag increasing process {V^\prime}. In particular, taking the difference with (4) gives

\displaystyle  \int(f^\prime_+(X_-)-f^\prime_-(X_-))\,dX=V-V^\prime,

which is an FV process. In particular, for any fixed {x\in{\mathbb R}}, if we define {f(y)=(y-x)_+} then {f^\prime_+(y)-f^\prime_-(y)=1_{\{y=x\}}}, giving the result. ⬜

Corollary 3 can be significantly strengthened for continuous local martingales.

Corollary 4 Let X be a continuous local martingale. Then, {\int 1_{\{X=x\}}dX=0} for any {x\in{\mathbb R}}.

Proof: By corollary 3, {\int1_{\{X=x\}}dX} is an FV process and, as it is also a continuous local martingale then it must be constant. ⬜

I note that corollary 4 can be shown directly in the case where X is Brownian motion. Since {X_t} has a continuous probability distribution at all positive times, applying the Ito isometry gives,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle {\mathbb E}\left[\left(\int_0^\infty1_{\{X_t=x\}}\,dX\right)^2\right] &\displaystyle ={\mathbb E}\left[\int_0^\infty1_{\{X_t=x\}}d[X]_t\right]\smallskip\\ &\displaystyle =\int_0^\infty{\mathbb P}(X_t=x)\,dt=0. \end{array}

An alternative proof of the corollary follows by representing the local martingale as a time-changed Brownian motion.

Moving on, we want to use lemma 1 to construct the local times of a semimartingale. If X is continuous, then so is the process V. However, if X is discontinuous then, since local times are defined to be continuous, we decompose V into its continuous and pure-jump components. Recall that the jumps of a cadlag process V are denoted by {\Delta V_t=V_t-V_{t-}} which, for an increasing process, are nonnegative. The following can be viewed as a generalisation of Ito’s formula where the term {\frac12\int f^{\prime\prime}(X)d[X]^c} has been replaced by a continuous increasing process A.

Lemma 5 Let X be a semimartingale and {f\colon{\mathbb R}\rightarrow{\mathbb R}} be convex. Then,

\displaystyle  f(X_t)=f(X_0)+\int_0^t f^\prime(X_-)dX+A_t+\sum_{s\le t}\left(\Delta f(X_s)-f^\prime(X_{s-})\Delta X_s\right).

for a continuous increasing process A starting from zero.

Proof: Defining the increasing cadlag process V by (4), it can be decomposed as

\displaystyle  V_t=V_0+A_t+\sum_{s\le t}\Delta V_s

where A is continuous and increasing. Clearly {V_0=f(X_0)} and, by properties of the stochastic integral, we have

\displaystyle  \Delta V_s=\Delta f(X_s)-f^\prime(X_{s-})\Delta X_s

as required. ⬜

By Ito’s lemma, if {f} is twice continuously differentiable then the process A is equal to {\frac12\int f^{\prime\prime}(X_-)\,d[X]^c}. If we try applying this equality for the (not twice differentiable) function {f(x)=\lvert x\rvert} then we obtain

\displaystyle  A_t=\int_0^t\delta(X_-)d[X]^{\rm c}

where {\delta(\cdot)} is the Dirac delta function, although this is clearly not rigorous and is not a well defined expression. However, lemma 5 still applies and gives a continuous increasing process which is called the local time of X at zero. I use {{\rm sgn}(x)} to denote the function equal to -1 for {x\le0} and equal to 1 for {x > 0}. Note the lack of symmetry at {x=0} where it takes the value -1, but this convention is convenient as it is equal to the left derivative of {\lvert x\rvert}. I also use {x_+=x\vee0} and {x_-=(-x)\vee0} for, respectively, the positive and negative parts of the real number {x}.

Definition 6 The local time of a semimartingale X at zero, denoted by {L_t}, is the unique process satisfying either (and then, all) of the following identities,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle \lvert X_t\rvert=\lvert X_0\rvert + \int_0^t{\rm sgn}(X_-)dX+L_t+J_t,\smallskip\\ &\displaystyle (X_t)_+=(X_0)_++\int_0^t1_{\{X_- > 0\}}dX+\frac12L_t+\frac12J_t,\smallskip\\ &\displaystyle (X_t)_-=(X_0)_--\int_0^t1_{\{X_- \le 0\}}dX+\frac12L_t+\frac12J_t, \end{array} (5)

where J is the pure jump process {\sum_{s\le t}\left(\Delta\lvert X_s\rvert-{\rm sgn}(X_{s-})\Delta X_s\right)}.

Then, the local time at {x\in{\mathbb R}}, denoted by {L^x_t}, is defined to be the local time of {X-x} at zero.

The equivalence of the three statements (5) is straightforward. Note that taking the sum of the first identity with the trivial identity below and halving gives the second identity.

\displaystyle  X_t=X_0+\int_0^t dX

Similarly, taking this away from the second identity gives the third, and adding it to twice the third identity gives the first. So, all three identities of (5) are equivalent. The first of these identities is sometimes referred to as the Meyer-Tanaka formula.
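The first identity of (5), the Meyer-Tanaka formula, also gives a practical way of computing Brownian local time pathwise: discretise {\lvert B_t\rvert-\lvert B_0\rvert-\int_0^t{\rm sgn}(B_-)dB}. The sketch below (seed, step count and {\epsilon} are my own choices) compares this with the occupation-density heuristic of (3); the two estimates of {L_1} should roughly agree.

```python
import numpy as np

rng = np.random.default_rng(7)
n, T = 1_000_000, 1.0
dt = T / n
dB = rng.normal(0.0, np.sqrt(dt), n)
B = np.concatenate([[0.0], np.cumsum(dB)])

# sgn with the left-derivative convention used in the post: -1 on {x <= 0}, +1 on {x > 0}
sgn = np.where(B[:-1] > 0.0, 1.0, -1.0)

# First identity of (5), with J = 0 since the path is continuous:
# L_1 ~= |B_1| - |B_0| - sum sgn(B_i) dB_i
L_tanaka = np.abs(B[-1]) - np.abs(B[0]) - np.sum(sgn * dB)

# Occupation-density estimate of L^0_1 for comparison
eps = 0.02
L_occ = (np.abs(B) < eps).sum() * dt / (2.0 * eps)
print(L_tanaka, L_occ)
```

A pleasant feature of the discretised Tanaka estimate is that it is exactly nonnegative: each step contributes zero unless the path changes sign, in which case it contributes twice the overshoot past zero.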

By lemma 5, the local time {L^x_\cdot} is a continuous increasing process starting from zero, and we can show that it only increases when {X=x}.

Lemma 7 Let X be a semimartingale and {x\in{\mathbb R}}. Then, {L^x_t} is a continuous increasing process starting from zero such that {dL^x_t} is supported on the set {\{X=X_-=x\}}.

Proof: It is sufficient to prove this for {x=0}, so we just consider the local time at zero. Consider the process

\displaystyle  V_t=\lvert X_t\rvert-\int_0^t{\rm sgn}(X_-)\,dX.

By definition, the local time L is the continuous part of V. The idea is simple enough — over intervals where X does not hit zero or change sign then V is constant. To make this rigorous, fix a time {u\ge0} and define the stopping time

\displaystyle  \tau_u=\inf\left\{t\ge u\colon {\rm sgn}(X_u)X_t\le 0\right\}.

As {{\rm sgn}(X_-)={\rm sgn}(X_u)} is constant on the interval {(u,\tau_u]} we obtain,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle \int_u^{t\wedge\tau_u}{\rm sgn}(X_-)\,dX &\displaystyle=\int_0^t1_{(u,\tau_u]}{\rm sgn}(X_u)\,dX\smallskip\\ &\displaystyle={\rm sgn}(X_u)(X_{t\wedge\tau_u}-X_{t\wedge u}). \end{array}

For {t\in[u,\tau_u)}, the right hand side is equal to {\lvert X_t\rvert-\lvert X_u\rvert} and, hence, {V_t=V_u}, showing that V is almost surely constant on {[u,\tau_u)}. Hence, L is also constant on this interval and, by continuity, is almost surely constant on {[u,\tau_u]}.

Now, choose a sequence {u_1,u_2,\ldots} which is dense in {[0,\infty)}. For example, enumerate the positive rational numbers. As X is cadlag, for any {t > 0} with {X_{t-}\not=0}, X will not change sign on a sufficiently small interval {[t-\epsilon,t)}, so {\tau_{u_n}\ge t} whenever {u_n} is in this interval, giving

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle \int1_{\{X_-\not=0\}}dL &\displaystyle\le\int1_{\bigcup_n[u_n,\tau_{u_n}]}dL\smallskip\\ &\displaystyle\le\sum_n\int1_{[u_n,\tau_{u_n}]}dL=0 \end{array}

almost surely. Finally, as X is cadlag, {\Delta X=0} everywhere except for a countable set. So,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle \int1_{\{X_-\not=0\}\cup\{X\not=0\}}dL &\displaystyle=\int1_{\{X_-\not=0\}\cup\{\Delta X\not=0\}}dL\smallskip\\ &\displaystyle\le\int1_{\{X_-\not=0\}}dL+\int1_{\{\Delta X\not=0\}}dL\smallskip\\ &\displaystyle=0 \end{array}

almost surely, as required. ⬜

The local time of a continuous semimartingale can be expressed compactly as an integral with respect to {(X)_+}, which also makes clear that its increments are supported by the set {\{X=0\}}.

Lemma 8 Let X be a continuous semimartingale. Then, its local time at zero satisfies

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle L &\displaystyle =2\int1_{\{X=0\}}d(X)_+\smallskip\\ &\displaystyle=2\int1_{\{X=0\}}d(X)_-+2\int1_{\{X=0\}}dX\smallskip\\ &\displaystyle =\int1_{\{X=0\}}d\lvert X\rvert + \int1_{\{X=0\}}dX. \end{array} (6)

Proof: Integrating {1_{\{X=0\}}} with respect to the second of identities (5),

\displaystyle  \int1_{\{X=0\}}d(X)_+=\frac12\int1_{\{X=0\}}dL=\frac12L,

with the second equality from lemma 7. This is the first equality of (6). The second and third equalities then follow from {2(X)_-+2X=\lvert X\rvert+X=2(X)_+}. ⬜

We obtain particularly simple expressions for the local time of a continuous local martingale.

Lemma 9 Let X be a continuous local martingale. Then, its local time at zero satisfies

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle L &\displaystyle =\int1_{\{X=0\}}d\lvert X\rvert\smallskip\\ &\displaystyle =2\int1_{\{X=0\}}d(X)_+\smallskip\\ &\displaystyle =2\int1_{\{X=0\}}d(X)_-. \end{array} (7)

Proof: Combine lemma 8 with corollary 4. ⬜

The local time of a continuous semimartingale can also be constructed as a running maximum. I use the notation {X^*_t=\sup_{s\le t}X_s} for the running maximum of process X, which is automatically increasing and is continuous whenever X is.

Lemma 10 Let X be a continuous semimartingale. Then, {\lvert X\rvert=L-Y}, where Y is a continuous semimartingale and L is both the local time at zero of X and is equal to {Y^*\vee0}.

Specifically, {Y=-\lvert X_0\rvert-\int{\rm sgn}(X)dX}.

Proof: First, if L is the local time then, from the definition, {Y=-\lvert X_0\rvert-\int{\rm sgn}(X)dX}, which is a continuous semimartingale.

Now, consider a fixed time {t\ge0}. On the event that {X_s\not=0} for all {s\le t}, lemma 7 states that {L_s=0} for {s\le t} and, so, {Y_s=-\lvert X_s\rvert}. This gives {L_t=0=Y^*_t\vee0}.

On the other hand, on the event that {X_s=0} for some {s\le t}, we can let {\sigma} be the last time in the interval {[0,t]} at which {X_\sigma=0}. Then, {L_\sigma=Y_\sigma} and lemma 7 states that {L_s=L_\sigma} for all {\sigma\le s\le t}. As L is increasing, {L_s\le L_\sigma} for all {s\le t} so,

\displaystyle  Y_s=L_s-\lvert X_s\rvert\le L_\sigma=Y_\sigma.

This shows that {L_\sigma=Y_\sigma=Y^*_t}. Therefore, {L_t=L_\sigma=Y^*_t=Y^*_t\vee0} as required. ⬜

Lemma 10 is particularly helpful when applied to Brownian motion as, in that case, it can be shown that the semimartingale Y is itself a Brownian motion. Recall that B is a Brownian motion with respect to a filtration {\mathcal F_t} if it is a continuous process such that {B_t-B_s} has a centered normal distribution with variance {t-s} and is independent of {\mathcal F_s}, for all times {s < t}. For standard Brownian motion, it is common to also require that it starts from zero, although that will not be required here. An example plot demonstrating theorem 11 is shown in figure 1 above.

Theorem 11 Let B be a Brownian motion. Then, {\lvert B\rvert=L-W} for a Brownian motion W with {W_0=-\lvert B_0\rvert}, where L is both the local time at zero of B and is equal to {W^*\vee0}.

Proof: We can apply lemma 10, so that it just needs to be shown that W is a Brownian motion. Since we know that {W=-\lvert B_0\rvert-\int{\rm sgn}(B)dB}, which is a local martingale,

\displaystyle  [W]_t=\int_0^t{\rm sgn}(B)^2d[B]=\int_0^t{\rm sgn}(B_s)^2ds=t.

The first equality is applying the quadratic variation of stochastic integrals, the second is plugging in the quadratic variation of Brownian motion {[B]_s=s}, and the third is using {{\rm sgn}(B)^2=1}. So, by Lévy’s characterisation, W is a Brownian motion. ⬜
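Theorem 11 can be checked pathwise in simulation. The sketch below (seed and step count are my own choices) builds the discretised {W=-\lvert B_0\rvert-\int{\rm sgn}(B)dB} from a simulated Brownian path and verifies that {\lvert B\rvert} and {W^*\vee0-W} agree up to a discretisation error of the order of the largest Brownian increment.

```python
import numpy as np

rng = np.random.default_rng(3)
n, T = 1_000_000, 1.0
dt = T / n
dB = rng.normal(0.0, np.sqrt(dt), n)
B = np.concatenate([[0.0], np.cumsum(dB)])  # B_0 = 0, so -|B_0| = 0

# W = -|B_0| - int sgn(B) dB, with sgn = -1 on {x <= 0}, +1 on {x > 0}
sgn = np.where(B[:-1] > 0.0, 1.0, -1.0)
W = np.concatenate([[0.0], -np.cumsum(sgn * dB)])

# Theorem 11: |B| = L - W, where L = W* v 0 is the running maximum floored at zero
L = np.maximum(np.maximum.accumulate(W), 0.0)
gap = np.max(np.abs(np.abs(B) - (L - W)))
print(gap)  # small: shrinks with the grid spacing
```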

Theorem 11 has the following immediate consequence, which allows us to completely determine the distribution of Brownian local times.

Theorem 12 Let B and W be standard Brownian motions such that {\lvert B_0\rvert} and {-W_0} have the same distribution. If L is the local time of B at zero, then {(\lvert B\rvert,L)} and {((W^*)_+-W,(W^*)_+)} have the same joint distribution.

Proof: Theorem 11 says that {(\lvert B\rvert,L)=((W^*)_+-W,(W^*)_+)} for some Brownian motion W with {W_0=-\lvert B_0\rvert}. As the distribution of a Brownian motion only depends on its starting distribution, we can replace W by any Brownian motion such that {W_0} has the same distribution as {-\lvert B_0\rvert}. ⬜

For example, if B is a Brownian motion starting from zero, then it is well-known that {B^*_t} has the same distribution as {\lvert B_t\rvert}. This is commonly shown as an application of the reflection principle. Theorem 12 then states that the local time of B at zero, {L_t}, also has the same distribution as {\lvert B_t\rvert}. That is, it is distributed as the absolute value of a centred normal of variance t.
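As a quick Monte Carlo check of this distributional identity (sample sizes and tolerance are my own choices): by theorem 12 with {B_0=0}, {L_1} has the law of {(W^*_1)_+=W^*_1}, so its mean should be close to {{\mathbb E}\lvert B_1\rvert=\sqrt{2/\pi}\approx0.798}, up to a small downward bias from only monitoring the maximum at discrete times.

```python
import numpy as np

rng = np.random.default_rng(0)
paths, steps, T = 10_000, 500, 1.0
dt = T / steps
W = np.cumsum(rng.normal(0.0, np.sqrt(dt), (paths, steps)), axis=1)  # W_0 = 0

# Sample mean of (W*_1)_+ across paths; should approximate sqrt(2/pi)
mean_L = np.maximum(W.max(axis=1), 0.0).mean()
print(mean_L, np.sqrt(2.0 / np.pi))
```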

Continuing our investigation of local times of general semimartingales, we show that they behave as expected under optional stopping. As usual, for a random time {\tau}, {X^\tau_t=X_{t\wedge\tau}} represents the process X stopped at time {\tau}. We also use {X^{\tau-}_t} for the pre-stopped process, defined to be {X_t} for {t < \tau} and {X_{\tau-}} for {t\ge\tau}. I also use the notation {L^x_t(X)} to denote the local time of a semimartingale X at level {x\in{\mathbb R}} and time {t > 0}.

Lemma 13 Let X be a semimartingale and {x\in{\mathbb R}}. Then,

\displaystyle  L^x(X^\tau)=L^x(X^{\tau-})=L^x(X)^\tau

for all stopping times {\tau}.

Proof: It is sufficient to prove the result for {x=0}. Stopping (4) at time {\tau}, with {f(x)=\lvert x\rvert}, gives

\displaystyle  \lvert X^\tau\rvert=\int{\rm sgn}(X_-)dX^\tau+V^\tau,

where we have used the fact that stopping the stochastic integral at time {\tau} gives the integral with respect to {X^\tau}. As the continuous part of V is just {L^0(X)}, the continuous part of {V^\tau} is {L^0(X)^\tau}, from which we obtain {L^0(X^\tau)=L^0(X)^\tau}. Similarly, stopping just before time {\tau} gives

\displaystyle  \lvert X^{\tau-}\rvert=\int{\rm sgn}(X_-)dX^{\tau-}+V^{\tau-},

from which we obtain {L^0(X^{\tau-})=L^0(X)^{\tau-}} which, by continuity of the local time, is equal to {L^0(X)^\tau}. ⬜

Local times are also stable under continuous time changes which, in particular, allows us to compute local times of any continuous local martingale in terms of Brownian local times.

Lemma 14 Let X be a semimartingale with local time {L^x_t} at {x\in{\mathbb R}}, and {\{\tau_t\}_{t\in{\mathbb R}^+}} be a collection of stopping times which are continuous and increasing in t. Then, {\tilde X_t=X_{\tau_t}} is a semimartingale with respect to the filtration {\tilde{\mathcal F}_t=\mathcal F_{\tau_t}}, and has local time at {x} given by

\displaystyle  \tilde L^x_t=L^x_{\tau_t}-L^x_{\tau_0}. (8)

Proof: From the post on time changes, we know that {\tilde X} is a semimartingale with respect to the time-changed filtration. We just need to prove the time-change property for the local time at {x=0}, and the general case follows from applying this to {X-x}. Let us define the cadlag increasing processes V and {\tilde V} by

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle \lvert X_t\rvert = \lvert X_0\rvert + \int_0^t{\rm sgn}(X_-)dX+V_t,\smallskip\\ &\displaystyle \lvert \tilde X_t\rvert = \lvert \tilde X_0\rvert + \int_0^t{\rm sgn}(\tilde X_-)d\tilde X+\tilde V_t. \end{array}

Again, applying lemma 4 of the post on time changes,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle \tilde V_t &\displaystyle=\lvert X_{\tau_t}\rvert-\lvert X_{\tau_0}\rvert-\int_{\tau_0}^{\tau_t}{\rm sgn}(X_-)dX\smallskip\\ &= V_{\tau_t}-V_{\tau_0}. \end{array}

So, {\tilde V} is the time change of V, shifted to start from zero. It follows that the same holds for the continuous and pure jump components of {\tilde V} and V and, in particular, taking the continuous component gives (8). ⬜

I finish by looking at an important property of local times of continuous local martingales. That is, whenever the process hits a level x then, unless it remains constant at that level, the local time {L^x} has to increase. This is similar to the property of the quadratic variation {[X]} which, as we saw in a previous post, must always increase whenever X moves. The difference here is that we are localising to a level {x}. I start by looking at time zero.

Lemma 15 Let X be a continuous local martingale with {X_0=0}. Then, with probability one, {L_t > 0} for all times {t > 0} at which {X_t\not=0}.

Proof: By continuity of X and L, it is sufficient to prove the result for t restricted to a countable dense subset of the positive reals. Then, by countable additivity, it is enough to prove the result for each fixed positive time t. Now, fix {\epsilon > 0} and define the stopping time

\displaystyle  \tau=\inf\left\{s\ge0\colon L_s\ge\epsilon\right\}.

We use the fact that {\lvert X\rvert-L=\lvert X_0\rvert+\int{\rm sgn}(X)dX} is a local martingale. Then, {\lvert X^\tau\rvert-L^\tau} is a local martingale bounded below by {-\epsilon} and, hence, is a supermartingale. So,

\displaystyle  {\mathbb E}[\lvert X_{t\wedge\tau}\rvert]\le{\mathbb E}[L_{t\wedge\tau}]\le\epsilon

at each positive time t. We then compute

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle {\mathbb P}(L_t < \epsilon, \lvert X_t\rvert > \delta) &\displaystyle \le{\mathbb P}(\lvert X_{t\wedge\tau}\rvert > \delta)\smallskip\\ &\displaystyle \le\delta^{-1}{\mathbb E}[\lvert X_{t\wedge\tau}\rvert]\le\delta^{-1}\epsilon. \end{array}

Letting {\epsilon} decrease to zero gives {{\mathbb P}(L_t=0,\lvert X_t\rvert > \delta)=0} and, then, letting {\delta} go to zero gives {{\mathbb P}(L_t=0,X_t\not=0)=0}. ⬜

We extend the previous result to all positive times.

Lemma 16 Let X be a continuous local martingale and {x\in{\mathbb R}}. Then, with probability one, the local time {L^x} strictly increases whenever X hits level {x} without remaining there.

More precisely, with probability one, {L^x_s < L^x_u} for all times {s < t < u} for which {X_t=x\not=X_u}.

Proof: Without loss of generality, we take {x=0}. Then, for any time t define the stopping time

\displaystyle  \tau_t=\inf\left\{s\ge t\colon X_s=0\right\}.

Then, {\tilde X_s=1_{\{\tau_t < \infty\}}X_{\tau_t+s}} is a continuous local martingale starting from zero, and its local time at zero satisfies {\tilde L_s=1_{\{\tau_t < \infty\}}(L_{\tau_t+s}-L_{\tau_t})}. This is straightforward, and follows from lemma 14 restricted to the event {\{\tau_t < \infty\}}. Applying lemma 15 gives {L_s > L_{\tau_t}} for any time {s > \tau_t} at which {X_s\not=0}.

Now choose a sequence of times {t_1,t_2,\ldots} which is dense in {{\mathbb R}^+}. With probability one, we have {L_s > L_{\tau_{t_n}}} at all times {s > \tau_{t_n}} for which {X_s\not=0}, simultaneously for all n. Now, for any such sample path, consider times {s < t < u} for which {X_t=0\not=X_u}. As the sequence is dense, we have {t_n\in(s,t)} for some n and, then, {s\le\tau_{t_n}\le t}. As {X_u\not=0}, this gives {L_u > L_{\tau_{t_n}}\ge L_s} as required. ⬜
