Almost Sure

18 July 11

Predictable FV Processes

By definition, an FV process is a cadlag adapted stochastic process which almost surely has finite variation over finite time intervals. These are always semimartingales, because the stochastic integral for bounded integrands can be constructed by taking the Lebesgue-Stieltjes integral along sample paths. Also, from the previous post on continuous semimartingales, we know that the class of continuous FV processes is particularly well behaved under stochastic integration. For one thing, given a continuous FV process X and predictable {\xi}, then {\xi} is X-integrable in the stochastic sense if and only if it is almost surely Lebesgue-Stieltjes integrable along the sample paths of X. In that case the stochastic and Lebesgue-Stieltjes integrals coincide. Furthermore, the stochastic integral preserves the class of continuous FV processes, so that {\int\xi\,dX} is again a continuous FV process. It was also shown that all continuous semimartingales decompose in a unique way as the sum of a local martingale and a continuous FV process, and that the stochastic integral preserves this decomposition.
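To make the pathwise construction concrete, here is a minimal Python sketch (the helper name and sample data are invented for illustration) of a Lebesgue-Stieltjes integral along a pure-jump FV sample path, where the integral reduces to a finite sum over the jump times.

```python
# Minimal sketch: Lebesgue-Stieltjes integral int_0^t xi dX along a single
# pure-jump sample path of an FV process X, given as jump times and sizes.
def stieltjes_integral(xi, jump_times, jump_sizes, t):
    """Return int_0^t xi(s) dX(s) for a piecewise-constant path X."""
    return sum(xi(s) * dx
               for s, dx in zip(jump_times, jump_sizes) if s <= t)

# X jumps by +1 at time 0.5 and by -2 at time 1.5; integrate xi(s) = s.
times, sizes = [0.5, 1.5], [1.0, -2.0]
print(stieltjes_integral(lambda s: s, times, sizes, 2.0))  # 0.5*1 + 1.5*(-2) = -2.5
```

For bounded integrands this pathwise sum agrees with the stochastic integral, which is the point of the paragraph above.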

Moving on to studying non-continuous semimartingales, it would be useful to extend the results just mentioned beyond the class of continuous FV processes. The first thought might be to simply drop the continuity requirement and look at all FV processes. After all, we know that every FV process is a semimartingale and, by the Bichteler-Dellacherie theorem, that every semimartingale decomposes as the sum of a local martingale and an FV process. However, this does not work out very well. The existence of local martingales with finite variation means that the decomposition given by the Bichteler-Dellacherie theorem is not unique, and need not commute with stochastic integration for integrands which are not locally bounded. Also, it is possible for the stochastic integral of a predictable {\xi} with respect to an FV process X to be well-defined even if {\xi} is not Lebesgue-Stieltjes integrable with respect to X along its sample paths. In this case, the integral {\int\xi\,dX} is not itself an FV process. See this post for examples where this happens.

Instead, when we do not want to restrict ourselves to continuous processes, it turns out that the class of predictable FV processes is the correct generalisation to use. By definition, a process is predictable if it is measurable with respect to the sigma-algebra generated by the adapted and left-continuous processes so, in particular, continuous FV processes are predictable. We can show that all predictable FV local martingales are constant (Lemma 2 below), which will imply that decompositions into the sum of local martingales and predictable FV processes are unique (up to constant processes). I do not look at general semimartingales in this post, so will not prove the existence of such decompositions, although they do follow quickly from the results stated here. We can also show that predictable FV processes are very well behaved with respect to stochastic integration. A predictable process {\xi} is integrable with respect to a predictable FV process X in the stochastic sense if and only if it is Lebesgue-Stieltjes integrable along the sample paths, in which case the stochastic and Lebesgue-Stieltjes integrals agree. Also, {\int\xi\,dX} will again be a predictable FV process. See Theorem 6 below.

In the previous post on continuous semimartingales, it was also shown that the continuous FV processes can be characterised in terms of their quadratic variations and covariations. They are precisely the semimartingales with zero quadratic variation. Alternatively, they are continuous semimartingales which have zero quadratic covariation with all local martingales. We start by extending this characterisation to the class of predictable FV processes. As always, we work with respect to a complete filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})} and two stochastic processes are considered to be equal if they are equivalent up to evanescence. Recall that, in these notes, the notation {[X]^c_t=[X]_t-\sum_{s\le t}(\Delta X_s)^2} is used to denote the continuous part of the quadratic variation of a semimartingale X.

Theorem 1 For a process X, the following are equivalent.

  1. X is a predictable FV process.
  2. X is a predictable semimartingale with {[X]^c=0}.
  3. X is a semimartingale such that {[X,M]} is a local martingale for all local martingales M.
  4. X is a semimartingale such that {[X,M]} is a local martingale for all uniformly bounded cadlag martingales M.

The equivalence of the statements in Theorem 1 is a stronger result than it might appear at first sight. Suppose, for example, that {\tau} is a stopping time whose graph {[\tau]} is predictable. Then, {X=1_{[\tau,\infty)}} is a predictable FV process and Theorem 1 implies that {1_{[\tau,\infty)}\Delta M_\tau=[X,M]} is a local martingale for all local martingales M. So, the statement that `predictable times are fair’ is encapsulated by the implication 1 ⇒ 3. Going in the other direction, the set of processes X satisfying property 4 of the theorem is just the set of semimartingales orthogonal to the square integrable local martingales, as used in the proof of the Bichteler-Dellacherie theorem. The fact that these are all predictable FV processes leads to relatively simple proofs of results such as the Doob-Meyer decomposition. Furthermore, due to the fact that cadlag predictable processes are locally bounded, this completes the comments following the proof of the Bichteler-Dellacherie theorem where it is argued that the local martingale term in the decomposition can be taken to be locally bounded. The proof of Theorem 1 is left until the end of this post.

Moving on, we can prove the following.

Lemma 2 A local martingale is a predictable FV process if and only if it is constant.

Proof: As an adapted constant process is trivially predictable, only the converse needs to be shown. So, suppose that X is a local martingale and a predictable FV process. Being a predictable local martingale, X is continuous. However, continuous FV local martingales are constant. \Box

Suppose that {X=M+V} is the decomposition of a process X into a local martingale M and predictable FV process V. If {X=M^\prime+V^\prime} were any other such decomposition then {M-M^\prime=V^\prime-V} would be a predictable FV local martingale and, hence, constant. This will be used in later posts to establish uniqueness of semimartingale decompositions.

In the proof of Lemma 2 we used the fact that predictable local martingales are continuous, which was proven in an earlier post involving some rather advanced results on predictable stopping times. This enabled us to give a quick proof here, but it is interesting to note that very little stochastic calculus is required to establish this result. In fact, for any continuously differentiable {f\colon{\mathbb R}\rightarrow{\mathbb R}}, using the `change of variables formula’ for Lebesgue-Stieltjes integration along the sample paths of X gives

\displaystyle  f(X_t)=f(X_0)+\int_0^t\left(1_{\{\Delta X=0\}}f^\prime(X)+1_{\{\Delta X\not=0\}}\Delta f(X)/\Delta X\right)\,dX.

If {f} has bounded derivative then the integrand is bounded and, by preservation of the local martingale property, this shows that {f(X)} is a local martingale. In fact, it is not even necessary to have a theory of stochastic integration to state this. For an FV local martingale X and bounded predictable {\xi} it can be shown that {\int\xi\,dX} defined by Lebesgue-Stieltjes integration is itself a local martingale. This follows from an application of the monotone class theorem to extend from the simple case where {\xi} is elementary predictable to arbitrary bounded predictable integrands. Now, choosing {f} to be bounded with bounded derivative, {f(X-X_0)} is a bounded local martingale and, hence, a martingale, so {{\mathbb E}[f(X_t-X_0)]=f(0)}. If {f(x) > f(0)} for all {x\not=0} (e.g., {f(x)=x^2/(1+x^2)}) then this shows that {X_t=X_0} almost surely.
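As a numerical aside (this example does not appear in the argument above), the predictability assumption really is needed here: the compensated Poisson process {X_t=N_t-t} is an FV martingale which is not predictable, and a quick Monte Carlo estimate with {f(x)=x^2/(1+x^2)} gives {{\mathbb E}[f(X_t)]} well above {f(0)=0}, so X is certainly not constant.

```python
import random

def f(x):
    # bounded with bounded derivative, and f(x) > f(0) = 0 for all x != 0
    return x * x / (1.0 + x * x)

def poisson_count(t, rng):
    """Sample N_t for a unit-rate Poisson process run up to time t."""
    count, s = 0, rng.expovariate(1.0)
    while s <= t:
        count += 1
        s += rng.expovariate(1.0)
    return count

rng = random.Random(1)
t = 1.0
# X_t = N_t - t is an FV martingale, but NOT predictable
mean = sum(f(poisson_count(t, rng) - t) for _ in range(20000)) / 20000
print(mean)  # noticeably positive, so E[f(X_t)] > f(0) and X is not constant
```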

Lemma 2 also follows as an immediate consequence of Theorem 1. If X was both a predictable FV process and a local martingale, then Theorem 1 says that {[X]=[X,X]} is a local martingale. Then the Ito Isometry {{\mathbb E}[(X_t-X_0)^2]={\mathbb E}[[X]_t]=0} shows that X is constant. As we have not yet established Theorem 1, this method was not employed for the proof of Lemma 2 above.

For a cadlag process X and semimartingale Y, the stochastic integrals {\int X\,dY} and {\int\Delta X\,dY} are not well-defined unless X is predictable. However, if X is a predictable FV process, we can use these integrals to express the quadratic covariation {[X,Y]} and, in equation (2) below, give a simple form of integration by parts avoiding covariation terms. Compare also with the integration by parts formula previously stated for FV processes.

Lemma 3 Let X be a cadlag predictable process and Y be a semimartingale. Then,

\displaystyle  \int_0^t\Delta X\,dY = \sum_{s\le t}\Delta X_s\Delta Y_s (1)

for all times t. If, furthermore, X is an FV process then (1) is equal to the quadratic covariation {[X,Y]_t}, and the following integration by parts formula holds.

\displaystyle  XY=X_0Y_0+\int X\,dY+\int Y_-\,dX. (2)

Proof: Let’s start by considering the integral of a predictable process of the form {U1_{[\tau]}} for some predictable stopping time {\tau > 0} and bounded {\mathcal{F}_{\tau-}}-measurable random variable U. Then, there exists a sequence of stopping times {\tau_n < \tau} tending to {\tau} as n goes to infinity and a bounded predictable process {\xi} with {\xi_\tau=U}. Setting {Z\equiv\int\xi\,dY} then, by standard properties of integration, {\Delta Z_\tau=U\Delta Y_\tau}. Bounded convergence (in probability) for stochastic integration gives

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle\int_0^tU1_{[\tau]}\,dY&\displaystyle=\int_0^t\xi1_{[\tau]}\,dY=\lim_{n\rightarrow\infty}\int_0^t\xi1_{(\tau_n,\tau]}\,dY\smallskip\\ &\displaystyle=\lim_{n\rightarrow\infty}(Z_{\tau\wedge t}-Z_{\tau_n\wedge t})\smallskip\\ &\displaystyle=1_{\{\tau\le t\}}U\Delta Y_\tau \end{array} (3)

Now consider any cadlag predictable X. As this is locally bounded, {\Delta X} is automatically Y-integrable and, by stopping, it can be assumed that X is uniformly bounded. Then, there exists a sequence of predictable stopping times {\tau_n} such that {{\mathbb P}(\tau_m\not=\tau_n<\infty)=0} for all {m\not=n}, {\{\Delta X\not=0\}=\bigcup_n[\tau_n]}, and {\Delta X_{\tau_n}} is {\mathcal{F}_{\tau_n-}}-measurable. This implies that {\Delta X} can be written as the sum {\sum_n\Delta X_{\tau_n}1_{[\tau_n]}}. So, using bounded convergence for the stochastic integral and applying (3) gives,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle\int_0^t\Delta X\,dY&\displaystyle=\sum_{n=1}^\infty\int_0^t\Delta X_{\tau_n}1_{[\tau_n]}\,dY\smallskip\\ &\displaystyle=\sum_{n=1}^\infty1_{\{\tau_n\le t\}}\Delta X_{\tau_n}\Delta Y_{\tau_n}\smallskip\\ &\displaystyle=\sum_{s\le t}\Delta X_s\Delta Y_s. \end{array}

Finally, suppose that X is a predictable FV process. Then, the covariation {[X,Y]_t} is equal to {\sum_{s\le t}\Delta X_s\Delta Y_s}. By equation (1) this is equal to {\int_0^t\Delta X\,dY} so, applying the standard integration by parts formula gives,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle XY&\displaystyle=X_0Y_0+\int X_-\,dY+\int Y_-\,dX + \int\Delta X\,dY\smallskip\\ &\displaystyle= X_0Y_0+\int(X_-+\Delta X)\,dY + \int Y_-\,dX. \end{array}

As {X=X_-+\Delta X}, this proves (2). \Box
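As a quick sanity check of (2) (with invented data, not from the post), consider piecewise-constant paths X and Y jumping on a common grid, so that all the integrals reduce to sums over jump times and the identity can be verified term by term.

```python
# Toy check of (2): XY = X_0 Y_0 + int X dY + int Y_- dX, for
# piecewise-constant paths with jumps on a common grid, where the
# stochastic integrals reduce to sums over the jump times.
x = [1.0, 3.0, 2.0, 5.0]   # right-continuous values of X at times 0,1,2,3
y = [2.0, 2.0, 4.0, 1.0]   # right-continuous values of Y at the same times

lhs = x[-1] * y[-1]        # (XY) at the final time
rhs = x[0] * y[0]          # X_0 Y_0
for k in range(1, len(x)):
    dX, dY = x[k] - x[k - 1], y[k] - y[k - 1]
    # integrand X evaluated at the jump time itself, Y at its left limit
    rhs += x[k] * dY + y[k - 1] * dX
print(lhs, rhs)  # equal
```

Each jump contributes {\Delta(XY)=X\Delta Y+Y_-\Delta X}, which is exactly why the integrand X, rather than its left limit, appears in (2).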

Next, we give a version of the Radon-Nikodym theorem applicable to predictable FV processes. Suppose that X and Y are two increasing processes such that dX is `absolutely continuous’ with respect to dY, in the sense that the process {\int1_A\,dX} is identically zero for all jointly measurable sets A with {\int1_A\,dY} equal to zero. Then, the Radon-Nikodym derivative {\xi=dX/dY} exists, and is a jointly measurable Y-integrable process satisfying {dX=\xi\,dY}. In the case considered below, where X and Y are predictable, the derivative {dX/dY} can also be taken to be predictable.
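Before the formal statement, a toy discrete sketch may help (the jump data are invented): for increasing pure-jump paths, absolute continuity of dX with respect to dY just means that X never jumps where Y does not, and the derivative {\xi=dX/dY} is the ratio of jump sizes.

```python
# Discrete sketch of the Radon-Nikodym derivative dX/dY for increasing
# pure-jump paths, given as lists of jump sizes on a common time grid.
dY = [2.0, 0.0, 1.0, 4.0]   # jumps of Y
dX = [1.0, 0.0, 0.5, 8.0]   # X never jumps where Y does not (dX << dY)
assert all(dy != 0 or dx == 0 for dx, dy in zip(dX, dY))

xi = [dx / dy if dy != 0 else 0.0 for dx, dy in zip(dX, dY)]
# X is recovered as X_0 + int xi dY, i.e. dX = xi dY jump by jump.
print(xi)
assert [x * dy for x, dy in zip(xi, dY)] == dX
```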

Lemma 4 Let X and Y be cadlag increasing predictable processes such that {\int1_A\,dX=0} (almost surely) for all predictable sets A satisfying {\int1_A\,dY=0}. Then, there exists a nonnegative Y-integrable and predictable process {\xi} such that {X=X_0+\int\xi\,dY}.

Proof: Denoting the predictable sigma-algebra by {\mathcal{P}}, define the following measures on {({\mathbb R}_+\times\Omega,\mathcal{P})},

\displaystyle  \mu(\alpha)={\mathbb E}\left[\int_0^\infty\alpha\,dX\right],\ \nu(\alpha)={\mathbb E}\left[\int_0^\infty\alpha\,dY\right],

for nonnegative predictable processes {\alpha}. As X and Y are predictable, they are locally bounded. That is, there exists a sequence of stopping times {\tau_n\uparrow\infty} such that {1_{\{\tau_n > 0\}}X^{\tau_n}} and {1_{\{\tau_n > 0\}}Y^{\tau_n}} are uniformly bounded and, hence, {\mu([0,\tau_n])} and {\nu([0,\tau_n])} are finite. So, {\mu} and {\nu} are sigma-finite. If {\nu(A)=0} for some predictable set A then {\int1_A\,dY=0} (almost surely). So, {\int1_A\,dX=0} and we see that {\mu(A)=0}. This shows that {\mu} is absolutely continuous with respect to {\nu}. The Radon-Nikodym derivative, {\xi\equiv d\mu/d\nu}, is a nonnegative predictable process such that {\mu(\alpha)=\nu(\alpha\xi)} for all nonnegative predictable {\alpha}. In particular, if {\alpha} is bounded then

\displaystyle  {\mathbb E}\left[\int\alpha\xi\,dY^{\tau_n}\right]={\mathbb E}\left[\int\alpha\,dX^{\tau_n}\right]\le{\mathbb E}\left[X_{\tau_n}-X_0\right] < \infty.

Defining {M\equiv\int\xi\,dY-X+X_0}, the stopped process {M^{\tau_n}} is integrable and satisfies {{\mathbb E}[\int\alpha\,dM^{\tau_n}]=0} for all bounded predictable {\alpha}. This shows that {M^{\tau_n}} is a martingale. Furthermore, {\Delta M=\xi\Delta Y-\Delta X} is predictable, so M is a predictable FV local martingale. Lemma 2 says that it is constant, so {M=0} and {X=X_0+\int\xi\,dY}. \Box

A consequence of this result is that the variation and increasing and decreasing parts of a predictable FV process can be expressed as an integral. As discussed in the post on continuous semimartingales, the variation V of a process X is the minimum nonnegative increasing process such that {X+V} and {X-V} are increasing. It follows from this that {\Delta V=\vert\Delta X\vert} so, in particular, if X is predictable then, as {V_-} is left-continuous and adapted, {V=V_-+\vert\Delta X\vert} is predictable. The increasing and decreasing parts of X, {X^+=(V+X-X_0)/2} and {X^-=(V-X+X_0)/2} respectively, will also be predictable.
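These formulas are easily checked on a toy pure-jump path (the jump sizes below are invented): the variation accumulates {\vert\Delta X\vert}, and the increasing and decreasing parts pick up the positive and negative parts of each jump.

```python
# Toy check: variation V, increasing part X+ and decreasing part X- of a
# pure-jump path X, using X+ = (V+X-X_0)/2 and X- = (V-X+X_0)/2.
x0, jumps = 1.0, [1.0, -2.0, 0.5, 3.0]
X, V = [x0], [0.0]
for dx in jumps:
    X.append(X[-1] + dx)
    V.append(V[-1] + abs(dx))
Xp = [(v + x - x0) / 2 for v, x in zip(V, X)]
Xm = [(v - x + x0) / 2 for v, x in zip(V, X)]

# X = X_0 + X+ - X-, with X+ and X- increasing from zero
assert all(x == x0 + p - m for x, p, m in zip(X, Xp, Xm))
assert all(a <= b for a, b in zip(Xp, Xp[1:]))
assert all(a <= b for a, b in zip(Xm, Xm[1:]))
```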

Lemma 5 Let X be a predictable FV process. Then there exists a predictable {\xi} with {\vert\xi\vert=1} such that {\int\xi\,dX} is increasing. Furthermore, in that case, {\int\xi\,dX}, {\int1_{\{\xi > 0\}}\,dX} and {-\int1_{\{\xi < 0\}}\,dX} are respectively the variation, increasing part and decreasing part of X.

Proof: Let V be the variation process of X and suppose that {\int1_A\,dV=0} for some jointly measurable set A. Then, {\int1_A\,d(V+X)=-\int1_A\,d(V-X)} is both increasing and decreasing, so is zero. Lemma 4 implies that there exists a predictable and V-integrable process {\alpha} such that {V+X=X_0+\int\alpha\,dV}. Letting {\xi={\rm sgn}(\alpha-1)} (which we take to be 1 when {\alpha-1=0}) then,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle\int\xi\,dX&\displaystyle=\int\xi\alpha\,dV-\int\xi\,dV=\int\xi(\alpha-1)\,dV\smallskip\\ &\displaystyle=\int\vert\alpha-1\vert\,dV \end{array}

is increasing. The `furthermore’ part of the statement is given by Lemma 4 from the previous post on continuous semimartingales. \Box

Next, as promised above, we show that stochastic integration preserves the class of predictable FV processes, on which it coincides with Lebesgue-Stieltjes integration along the sample paths.

Theorem 6 Let X be a predictable FV process and {\xi} be predictable. Then, the following are equivalent.

  • {\xi} is X-integrable (in the stochastic sense).
  • {\int_0^t\vert\xi\vert\,\vert dX\vert} is almost surely finite, for each time t.

In that case, the stochastic integral {\int\xi\,dX} is also a predictable FV process and almost surely coincides with the Lebesgue-Stieltjes integral along sample paths.

Proof: By Lemma 5, there exists a predictable process {\alpha} such that {\vert\alpha\vert=1} and {\int\alpha\,dX} is increasing. Then, the equivalence of the conditions in the statement of the theorem and the fact that the stochastic integral coincides with Lebesgue-Stieltjes integration is given by Lemma 5 of the previous post on continuous semimartingales. It only remains to show that {Y\equiv\int\xi\,dX} is a predictable FV process. However, its variation is given by {\int_0^t\vert\xi\vert\,\vert dX\vert}, which is finite. Also, {\Delta Y=\xi\Delta X} is a product of predictable processes and, hence, is predictable. So {Y=Y_-+\Delta Y} is predictable. \Box

Finally, it is possible to extend Lemma 4 to all predictable FV processes.

Theorem 7 If X and Y are predictable FV processes then the following are equivalent.

  • For all {A\in\mathcal{P}}, if {\int1_A\,dY=0} then {\int1_A\,dX=0} (almost surely).
  • {X=X_0+\int\xi\,dY} for some Y-integrable predictable process {\xi}.

Proof: First, suppose that {X=X_0+\int\xi\,dY} for a Y-integrable process {\xi}. Given {A\in\mathcal{P}}, write {V=\int1_A\,dY}. Applying associativity of integration, {\int1_A\,dX=\int1_A\xi\,dY=\int\xi\,dV}. So, if {V=0} almost surely then it follows that {\int1_A\,dX=0}.

Now, suppose that the first condition holds. By Lemma 5, there are predictable processes {\vert\alpha\vert=\vert\beta\vert=1} such that {V\equiv\int\alpha\,dX} and {W\equiv\int\beta\,dY} are increasing. Then, {\int1_A\,dW=0} implies that {\int1_A\,dY=\int1_A\beta\,dW=0} so, by hypothesis, {\int1_A\,dX=0}. Hence, {\int1_A\,dV=\int1_A\alpha\,dX=0}. Then, Lemma 4 says that there is a predictable W-integrable process {\gamma} such that {V=\int\gamma\,dW}. Using associativity of integration, {\xi\equiv\alpha\gamma\beta} is Y-integrable and

\displaystyle  \int\xi\,dY=\int\alpha\gamma\,dW=\int\alpha\,dV=X-X_0

as required. \Box

Proof of Theorem 1

I will now give a proof of the equivalence of the four statements in Theorem 1. As this is rather involved, it will be split up into several smaller lemmas. First, using results already established in these notes, the implications 1 ⇒ 2 ⇒ 3 ⇒ 4 are not difficult to prove.

Proof of 1 ⇒ 2: It is an elementary property of quadratic variations that {[X]^c} vanishes if X is an FV process. {\Box}

Proof of 2 ⇒ 3: As the Cauchy-Schwarz inequality gives {([X,M]^c)^2\le[X]^c[M]^c=0}, the quadratic covariation {[X,M]_t} just consists of the pure jump component {\sum_{s\le t}\Delta X_s\Delta M_s}. Applying Lemma 3,

\displaystyle  [X,M]=\int\Delta X\,dM.

Since stochastic integration with locally bounded integrands preserves the local martingale property, {[X,M]} is a local martingale. {\Box}

Proof of 3 ⇒ 4: As cadlag bounded martingales are special cases of local martingales, this is trivial. {\Box}

This only leaves the implication 4 ⇒ 1, which is the most difficult but, also, the furthest reaching part of the theorem. This implication will be used later in these notes to give quick proofs of the main decomposition theorem for locally integrable semimartingales, from which celebrated results such as the Doob-Meyer decomposition theorem follow as corollaries. We break the proof of 4 ⇒ 1 into several lemmas.

Lemma 8 Let {\tau} be a stopping time and U be an integrable and {\mathcal{F}_\tau}-measurable random variable such that {\mathbb{E}[U\vert\mathcal{F}_{\tau-}]=0}. Then, {M\equiv U1_{[\tau,\infty)}} is a martingale.

Proof: As U is {\mathcal{F}_\tau}-measurable, M is adapted. For it to be a martingale, it is sufficient to show that {{\mathbb E}[1_AM_t]={\mathbb E}[1_AM_\infty]} for all {t\in{\mathbb R}_+} and {A\in\mathcal{F}_t}. However,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\mathbb E}[1_AM_\infty]-{\mathbb E}[1_AM_t]&\displaystyle={\mathbb E}[1_A1_{\{\tau < \infty\}}U]-{\mathbb E}[1_A1_{\{t\ge\tau\}}U]\smallskip\\ &=\displaystyle{\mathbb E}\left[1_A1_{\{t<\tau<\infty\}}U\right]. \end{array}

This vanishes, as {A\cap\{t < \tau < \infty\}\in\mathcal{F}_{\tau-}} and {\mathbb{E}[U\vert\mathcal{F}_{\tau-}]=0}. \Box

Substituting martingales of the form given by Lemma 8 into the quadratic covariation term {[X,M]} gives the following. As stopping times are allowed to be infinite, I take {\Delta X_\tau} to be zero whenever {\tau=\infty}.

Lemma 9 Let X be a semimartingale satisfying property 4 of Theorem 1. Then, {\Delta X_\tau} is {\mathcal{F}_{\tau-}}-measurable for all stopping times {\tau}.

Proof: If U is a bounded {\mathcal{F}_\tau}-measurable random variable satisfying {{\mathbb E}[U\vert\mathcal{F}_{\tau-}]=0}, Lemma 8 says that {M\equiv1_{[\tau,\infty)}U} is a martingale. So, the quadratic variation

\displaystyle  [X,M]_t=\sum_{s\le t}\Delta X_s\Delta M_s=1_{\{\tau\le t\}}\Delta X_\tau U

is a local martingale. Let {\sigma_n} be a sequence of stopping times increasing to infinity such that {[X,M]^{\sigma_n}} are uniformly integrable martingales. Writing

\displaystyle  [X,M]_{\sigma_n}=1_{\{\sigma_n\ge\tau\}}\Delta X_\tau U

shows that {1_{\{\sigma_n\ge\tau\}}\Delta X_\tau U} is integrable and has zero expectation. We have to be careful here, because we do not know that {\Delta X_\tau} is itself integrable. However, choosing any bounded {\mathcal{F}_{\tau}}-measurable random variable V, we can take {U=V-{\mathbb E}[V\vert\mathcal{F}_{\tau-}]}. If V is nonnegative,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\mathbb E}\left[1_{\{\sigma_n\ge\tau\}}V{\mathbb E}[\vert\Delta X_\tau\vert\vert\mathcal{F}_{\tau-}]\right] &\displaystyle={\mathbb E}\left[1_{\{\sigma_n\ge\tau\}}{\mathbb E}[V\vert\mathcal{F}_{\tau-}]\vert\Delta X_\tau\vert\right]\smallskip\\ &\displaystyle={\mathbb E}\left[1_{\{\sigma_n\ge\tau\}}\left\vert\Delta X_\tau(V-U)\right\vert\right]. \end{array}

In particular, taking {V=1_{\{\vert\Delta X_\tau\vert < K\}}} for positive K, then {\Delta X_\tau V} is bounded and this expectation is finite. So, {1_{\{\sigma_n\ge\tau\}}V{\mathbb E}[\vert\Delta X_\tau\vert\vert\mathcal{F}_{\tau-}]} is almost surely finite. Letting K and n increase to infinity shows that {{\mathbb E}[\vert\Delta X_\tau\vert\vert\mathcal{F}_{\tau-}]} is almost surely finite.

We can now take {V={\rm sgn}(\Delta X_\tau-{\mathbb E}[\Delta X_\tau\vert\mathcal{F}_{\tau-}])} to obtain

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\mathbb E}\left[\left\vert\Delta X_\tau-{\mathbb E}[\Delta X_\tau\vert\mathcal{F}_{\tau-}]\right\vert\vert\mathcal{F}_{\tau-}\right] &\displaystyle={\mathbb E}\left[(\Delta X_\tau-{\mathbb E}[\Delta X_\tau\vert\mathcal{F}_{\tau-}])V\vert\mathcal{F}_{\tau-}\right]\smallskip\\ &\displaystyle={\mathbb E}\left[\Delta X_\tau U\vert\mathcal{F}_{\tau-}\right]. \end{array}

So, {1_{\{\sigma_n\ge\tau\}}\left\vert\Delta X_\tau-{\mathbb E}[\Delta X_\tau\vert\mathcal{F}_{\tau-}]\right\vert} has zero expectation. Letting n increase to infinity shows that {\Delta X_\tau={\mathbb E}[\Delta X_\tau\vert\mathcal{F}_{\tau-}]} is {\mathcal{F}_{\tau-}}-measurable. \Box

This is getting close to showing that X is predictable. For a process X to be predictable, it is certainly a necessary condition that {1_{\{\tau < \infty\}}X_\tau} is {\mathcal{F}_{\tau-}}-measurable for stopping times {\tau}. It is not a sufficient condition though. For example, Poisson processes satisfy this property with respect to their natural filtration (in fact, {\mathcal{F}_\tau=\mathcal{F}_{\tau-}} in this case), but are not predictable. However, this condition is strong enough to be able to find a predictable process which coincides with {\Delta X} at all jump times of X.

Lemma 10 Let X be a cadlag adapted process such that {\Delta X_\tau} is {\mathcal{F}_{\tau-}}-measurable for all stopping times {\tau}. Then, there exists a predictable process {\xi} with {\Delta X=1_{\{\Delta X\not=0\}}\xi}.

Proof: Fix {\epsilon > 0} and define an increasing sequence of stopping times {\tau_n} by {\tau_0=0} and

\displaystyle  \tau_{n+1}=\inf\left\{t > \tau_n\colon\vert\Delta X_t\vert\ge\epsilon\right\}.

Now, as {\Delta X_{\tau_n}} is {\mathcal{F}_{\tau_n-}}-measurable, there exist predictable processes {\xi^n} such that {1_{\{\tau_n < \infty\}}\xi^n_{\tau_n}=\Delta X_{\tau_n}}. So,

\displaystyle  \xi\equiv\sum_{n=1}^\infty1_{(\tau_{n-1},\tau_n]}\xi^n

is a predictable process satisfying {1_{\{\tau_n < \infty\}}\xi_{\tau_n}=\Delta X_{\tau_n}}. This shows that {\xi_t=\Delta X_t} at all times for which {\vert\Delta X_t\vert\ge\epsilon}.

For any positive integer m, the argument above shows that there exists a predictable process {\xi^m} with {\xi^m_t=\Delta X_t} at all times for which {\vert\Delta X_t\vert\ge1/m}. Then, {\xi^m_t\rightarrow\Delta X_t} as m goes to infinity, whenever {\Delta X_t\not=0}. So, we can take

\displaystyle  \xi_t=1_{\{\lim_{m\rightarrow\infty}\xi^m_t{\rm\ exists}\}}\lim_{m\rightarrow\infty}\xi^m_t.

\Box
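The thresholding construction above can be mimicked in a toy discrete setting (grid and jump values invented): build processes {\xi^m} agreeing with {\Delta X} on jumps of size at least 1/m, then take the pointwise limit wherever it exists.

```python
# Toy sketch of the construction in Lemma 10 on a discrete time grid.
jumps = [0.0, 0.5, 0.0, -0.05, 2.0]   # Delta X on the grid

def xi_m(m):
    # agrees with Delta X wherever |Delta X| >= 1/m; the value 0 elsewhere
    # stands in for the (predictable) values off the big jumps
    return [dx if abs(dx) >= 1.0 / m else 0.0 for dx in jumps]

# as m grows, xi^m picks up every nonzero jump, so the limit matches
# Delta X wherever Delta X != 0
xi = xi_m(1000)
assert all(x == dx for x, dx in zip(xi, jumps) if dx != 0)
```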


With these lemmas completed, we can now prove the implication 4 ⇒ 1 in Theorem 1. That X has locally finite variation follows from an intermediate lemma given in the proof of the Bichteler-Dellacherie theorem. The only technical issue is that, there, only locally integrable semimartingales were considered whereas now we do not assume any such restriction. This issue will be avoided by applying a small trick to reduce to the case where X has bounded jumps. Next, in the case where {X=1_{[\tau,\infty)}} consists of a single jump of size 1, the fact that X is predictable is given by the characterisation of predictable times as being ‘fair’. The general case can be reduced to the single unit jump situation by looking at the integral of a predictable process with respect to X, forcing its jumps to be of size 1.

Proof of 4 ⇒ 1: By Lemmas 9 and 10, there exists a predictable process {\xi} with {\Delta X=1_{\{\Delta X\not=0\}}\xi}. Let us set {Y=\int(1+\vert\xi\vert)^{-1}\,dX}, so that {\Delta Y=(1+\vert\Delta X\vert)^{-1}\Delta X} is uniformly bounded by 1. So, Y is a locally bounded process and

\displaystyle  [Y,M]=\int(1+\vert\xi\vert)^{-1}\,d[X,M]

for any uniformly bounded cadlag martingale M. By property 4 together with the fact that stochastic integration preserves the local martingale property, this shows that {[Y,M]} is a local martingale. Using Lemma 5 from the post on the Bichteler-Dellacherie theorem, this implies that Y is an FV process.

To show that Y is predictable, let {\tau_{n,\epsilon}} denote the n'th time at which X has a jump {\vert\Delta X\vert > \epsilon}. These are stopping times such that {\bigcup_{n\in{\mathbb N}}\bigcup_{\epsilon\in{\mathbb Q}_+}[\tau_{n,\epsilon}]} covers the jump times of X and Y. We already know that {Y_{\tau_{n,\epsilon}}} is {\mathcal{F}_{\tau_{n,\epsilon}-}}-measurable. If we can show that {\tau_{n,\epsilon}} are predictable times, then Lemma 4 of the previous post will show that Y is predictable. So, Theorem 6 will imply that {X=X_0+\int(1+\vert\xi\vert)\,dY} is a predictable FV process.

Fix {\epsilon > 0} and {n\in{\mathbb N}}. The process {Z=\int1_{\{\vert\xi\vert > \epsilon\}}\xi^{-1}\,dX} has jumps {\Delta Z=1_{\{\vert\Delta X\vert > \epsilon\}}}, so that {\tau_{n,\epsilon}} is the n'th jump time of Z. Also, Z is the integral of the bounded process {1_{\{\vert\xi\vert > \epsilon\}}(1+\vert\xi\vert)\xi^{-1}} with respect to Y, so is an FV process. This means that, for any bounded cadlag martingale, we can calculate the quadratic covariation

\displaystyle  [Z,M]_t=\sum_{s\le t}\Delta Z_s\Delta M_s=\sum_{m=1}^\infty1_{\{\tau_{m,\epsilon}\le t\}}\Delta M_{\tau_{m,\epsilon}}.

Since this is a local martingale,

\displaystyle  1_{[\tau_{n,\epsilon},\infty)}\Delta M_{\tau_{n,\epsilon}}=[Z,M]^{\tau_{n,\epsilon}}-[Z,M]^{\tau_{n-1,\epsilon}}

is also a local martingale. Finally, Theorem 1 of the previous post on predictable stopping times shows that {\tau_{n,\epsilon}} is predictable.{\Box}


Theorem 1 goes a long way towards proving decomposition results such as the Doob-Meyer decomposition, which states that every class (D) submartingale uniquely decomposes as the sum of a martingale and a cadlag predictable increasing process (starting from 0). I will cover this in a later post. It is also true that most other approaches to the Doob-Meyer decomposition do, at some point, require proving the equivalence of statements 1 and 4 — at least, for the case where X has integrable variation.

A cadlag adapted process X of integrable variation and satisfying property 4 above is alternatively known as natural. In that case, there are various ways of re-stating the definition of natural processes (of integrable variation and starting from zero) other than requiring {[X,M]} to be a local martingale for all cadlag bounded martingales M. Any of the following equations can be used instead, and are used in the literature,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle{\mathbb E}\left[[X,M]_\infty\right]=0,\smallskip\\ &\displaystyle{\mathbb E}\left[\int_0^\infty\Delta M\,dX\right] =0,\smallskip\\ &\displaystyle{\mathbb E}\left[\int_0^\infty M_{-}\,dX\right]={\mathbb E}\left[M_\infty X_\infty\right],\smallskip\\ &\displaystyle{\mathbb E}\left[\int_0^\infty{}^p\xi\,dX\right]={\mathbb E}\left[\int_0^\infty\xi\,dX\right]. \end{array}

In the final equation here, {{}^p\xi} denotes the predictable projection of bounded measurable processes {\xi}.

The equivalence stated in Theorem 1 does not require X to be of integrable variation. So, it is a rather stronger result than that normally used when proving the equivalence of predictable FV and natural processes. In these notes, Theorem 1 will represent the main part of the proof of the Doob-Meyer decomposition whereas, in most approaches, it represents a much smaller part.

Interestingly, in the original statement of the Doob-Meyer decomposition the increasing part of the decomposition was required to be natural, rather than predictable. It was only later that Catherine Doléans proved the equivalence of the properties of being natural and being predictable.
