Almost Sure

3 October 11

Special Semimartingales

For stochastic processes in discrete time, the Doob decomposition uniquely decomposes any integrable process into the sum of a martingale and a predictable process. If {\{X_n\}_{n=0,1,\ldots}} is an integrable process adapted to a filtration {\{\mathcal{F}_n\}_{n=0,1,\ldots}} then we write {X_n=M_n+A_n}. Here, M is a martingale, so that {M_{n-1}={\mathbb E}[M_n\vert\mathcal{F}_{n-1}]}, and A is predictable with {A_0=0}. By saying that A is predictable, we mean that {A_n} is {\mathcal{F}_{n-1}} measurable for each {n\ge1}. It can be seen that this implies that

\displaystyle  A_n-A_{n-1}={\mathbb E}[A_n-A_{n-1}\vert\mathcal{F}_{n-1}]={\mathbb E}[X_n-X_{n-1}\vert\mathcal{F}_{n-1}].

Then it is possible to write A and M as

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle A_n&\displaystyle=\sum_{k=1}^n{\mathbb E}[X_k-X_{k-1}\vert\mathcal{F}_{k-1}],\smallskip\\ \displaystyle M_n&\displaystyle=X_n-A_n. \end{array} (1)

So, the Doob decomposition is unique and, conversely, the processes A and M constructed according to equation (1) can be seen to be, respectively, a predictable process starting from zero and a martingale. For many purposes, this allows us to reduce problems concerning processes in discrete time to simpler statements about martingales and, separately, about predictable processes. In the case where X is a submartingale, things reduce further since A will then be an increasing process.
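To make equation (1) concrete, here is a minimal Python sketch (my own illustration, not part of the original argument): for a simple symmetric random walk S, the submartingale {X_n=S_n^2} has conditional increments {{\mathbb E}[X_k-X_{k-1}\vert\mathcal{F}_{k-1}]=1}, so the Doob decomposition has {A_n=n} increasing and {M_n=S_n^2-n} a martingale.

```python
import random

random.seed(1)

# Simple symmetric random walk S_n.  X_n = S_n^2 is a submartingale and,
# since e_k = +/-1 is independent of F_{k-1} with mean zero,
#   E[X_k - X_{k-1} | F_{k-1}] = E[2 S_{k-1} e_k + e_k^2 | F_{k-1}] = 1.
# Equation (1) then gives A_n = n and M_n = S_n^2 - n.

n_steps = 20
S = [0]
for _ in range(n_steps):
    S.append(S[-1] + random.choice([-1, 1]))

X = [s * s for s in S]

# Build A from the exact conditional increments, as in equation (1)
A = [0]
for k in range(1, n_steps + 1):
    A.append(A[k - 1] + 1)  # E[X_k - X_{k-1} | F_{k-1}] = 1

M = [x - a for x, a in zip(X, A)]

assert A == list(range(n_steps + 1))                # A_n = n: predictable, increasing, A_0 = 0
assert all(x == m + a for x, m, a in zip(X, M, A))  # X = M + A pathwise
```

The decomposition holds pathwise by construction; the point of the example is that A is computed one step ahead of time (it is predictable), while all the randomness of X is carried by the martingale part M.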

The situation is considerably more complicated when looking at processes in continuous time. The extension of the Doob decomposition to continuous time processes, known as the Doob-Meyer decomposition, was an important result historically in the development of stochastic calculus. First, we would usually restrict attention to sufficiently nice modifications of the processes and, in particular, suppose that X is cadlag. When attempting an analogous decomposition to the one above, it is not immediately clear what should be meant by the predictable component. The continuous time predictable processes are defined to be the set of all processes which are measurable with respect to the predictable sigma algebra, which is the sigma algebra generated by the space of processes which are adapted and continuous (or, equivalently, left-continuous). In particular, all continuous and adapted processes are predictable but, due to the existence of continuous martingales such as Brownian motion, this means that decompositions as sums of martingales and predictable processes are not unique. It is therefore necessary to impose further conditions on the term A in the decomposition. It turns out that we obtain unique decompositions if, in addition to being predictable, A is required to be cadlag with locally finite variation (an FV process). The processes which can be decomposed into a local martingale and a predictable FV process are known as special semimartingales. This is precisely the space of locally integrable semimartingales. As usual, we work with respect to a complete filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})} and two stochastic processes are considered to be the same if they are equivalent up to evanescence.

Theorem 1 For a process X, the following are equivalent.

  • X is a locally integrable semimartingale.
  • X decomposes as

    \displaystyle  X=M+A (2)

    for a local martingale M and predictable FV process A.

Furthermore, choosing {A_0=0}, decomposition (2) is unique.

Theorem 1 is a general version of the Doob-Meyer decomposition. However, the name ‘Doob-Meyer decomposition’ is often used to specifically refer to the important special case where X is a submartingale. Historically, the theorem was first stated and proved for that case, and I will look at the decomposition for submartingales in more detail in a later post. Compare with the Bichteler-Dellacherie theorem, which says that a process is a semimartingale if and only if it decomposes as the sum of a local martingale and an FV process. Unfortunately, that decomposition is not unique. By Theorem 1, uniqueness is obtained by requiring the finite variation term to also be predictable, at the cost of restricting the result to apply only to locally integrable processes. Compare also with the unique decomposition of continuous semimartingales into a continuous local martingale and a continuous FV process. When looking at non-continuous processes, predictable FV processes are the correct generalisation of continuous FV processes.

The proof of Theorem 1 follows almost immediately from the proof of the Bichteler-Dellacherie theorem described in these notes, together with the classification of predictable FV processes. That is, we orthogonally project the process X onto a local martingale M, then show that {A=X-M} is a predictable FV process. However, this only immediately applies to locally square-integrable processes. The proof for a general locally integrable process will require approximating by the square-integrable case, for which the following lemma will be used. Here, {\int_0^\tau\,\vert dX\vert} denotes the variation of X across the interval {[0,\tau]}, so the inequality below is trivial unless {X^\tau} has integrable variation.

Lemma 2 If M is a local martingale, A is a predictable FV process and {X=M+A} then

\displaystyle  {\mathbb E}\left[\int_0^\tau\,\vert dA\vert\right] \le {\mathbb E}\left[\int_0^\tau\,\vert dX\vert\right]

for all stopping times {\tau}.

Proof: As A is a predictable FV process, there exists a predictable process {\xi} with {\vert\xi\vert=1} such that {\int\xi\,dA} is equal to the variation {\int\,\vert dA\vert}. Then,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle\int\,\vert dA\vert&\displaystyle=\int\xi\,dX-\int\xi\,dM\smallskip\\&\displaystyle\le\int\,\vert dX\vert-\int\xi\,dM. \end{array}

As {N\equiv\int\xi\,dM} is a local martingale, there exist bounded stopping times {\tau_n} increasing to infinity such that the stopped processes {N^{\tau_n}} are martingales. For any stopping time {\tau}, {N_{\tau\wedge\tau_n}} has zero expectation. So,

\displaystyle  {\mathbb E}\left[\int_0^{\tau\wedge\tau_n}\,\vert dA\vert\right]\le{\mathbb E}\left[\int_0^{\tau\wedge\tau_n}\,\vert dX\vert\right].

Letting n go to infinity and using monotone convergence gives the result. \Box

With this lemma out of the way, we can now give the proof of Theorem 1.

Proof of Theorem 1: If {X=M+A=M^\prime+A^\prime} were two such decompositions with {A_0=A^\prime_0=0} then {M-M^\prime=A^\prime-A} is a predictable FV local martingale starting from 0, hence is identically zero. So, subject to A starting from zero, decomposition (2) is unique.

Next, sums of local martingales and FV processes are semimartingales. Also, local martingales are locally integrable, as are cadlag predictable processes. So, the first statement follows from the second. In the case where X is a locally square-integrable semimartingale, the reverse implication follows quickly from previous results of these notes. As in the proof of the Bichteler-Dellacherie theorem, any such X decomposes as {M+A} for a locally square-integrable martingale M and a cadlag adapted A such that {[A,N]} is a local martingale for all cadlag bounded martingales N. Then, A is a predictable FV process.

It just remains to show that decomposition (2) exists for all locally integrable semimartingales X. Choose a sequence of stopping times {\tau_n} increasing to infinity such that the pre-stopped processes

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle X^{\tau_n-}_t&\displaystyle=1_{\{t < \tau_n\}}X_t+1_{\{t\ge\tau_n > 0\}}X_{\tau_n-}\smallskip\\ &\displaystyle=1_{\{\tau_n > 0\}}X^{\tau_n}-\Delta X_{\tau_n}1_{[\tau_n,\infty)} \end{array}

are locally square integrable. For example, taking {\tau_n} to be the first time at which {\vert X\vert} exceeds n, {\vert X^{\tau_n-}\vert} will be uniformly bounded by n. As {X^{\tau_n-}} differs from the semimartingale {X^{\tau_n}} by a step process consisting of a single jump, it is a semimartingale. As the result has already been established for locally square-integrable semimartingales, we can decompose

\displaystyle  X^{\tau_n-}=M^n+A^n

for locally square integrable martingales {M^n} and predictable FV processes {A^n} with {A^n_0=0}. It needs to be shown that {M^n,A^n} converge to processes {M,A} in decomposition (2) for X.
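The pre-stopping construction has a straightforward discrete-time analogue, which may help intuition; the following Python sketch (the function name pre_stop and the sample paths are hypothetical, for illustration only) freezes a path at its value just before the first time it exceeds a level, mirroring the bound {\vert X^{\tau_n-}\vert\le n} above.

```python
def pre_stop(path, level):
    """Discrete-time analogue of the pre-stopped process X^{tau-}:
    tau is the first index at which |X| exceeds level, and the path is
    frozen at its value just before tau (its 'left limit')."""
    tau = next((t for t, x in enumerate(path) if abs(x) > level), len(path))
    if tau == 0:
        # corresponds to the 1_{tau > 0} factor: the process vanishes
        return [0] * len(path)
    return [path[min(t, tau - 1)] for t in range(len(path))]

path = [0, 1, 2, 5, 3, 7]
stopped = pre_stop(path, 4)
assert stopped == [0, 1, 2, 2, 2, 2]       # frozen just before |X| exceeds 4
assert all(abs(x) <= 4 for x in stopped)   # uniformly bounded by the level
```

Note that stopping at {\tau_n} itself would include the possibly large jump {\Delta X_{\tau_n}}; pre-stopping discards it, which is exactly why {X^{\tau_n-}} is bounded while {X^{\tau_n}} need not be.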

Choosing m,n and a stopping time {\sigma\le\tau_m\wedge\tau_n}, Lemma 2 gives

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\mathbb E}\left[\sup_{t\le\sigma}\vert A^m_t-A^n_t\vert\right]&\displaystyle\le{\mathbb E}\left[\int_0^\sigma\,\vert d(A^m-A^n)\vert\right]\smallskip\\ &\displaystyle\le{\mathbb E}\left[\int_0^\sigma\,\vert d(X^{\tau_m-}-X^{\tau_n-})\vert\right]\smallskip\\ &\displaystyle\le{\mathbb E}\left[1_{\{\tau_m\wedge\tau_n=\sigma\}}\vert\Delta X_\sigma\vert\right]. \end{array}

As X is locally integrable, there exists a sequence of finite stopping times {\sigma_k} increasing to infinity such that {\Delta X_{\sigma_k}} is integrable. Then,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\mathbb E}\left[\sup_{t\le\sigma_k\wedge\tau_m\wedge\tau_n}\vert A^m_t-A^n_t\vert\right]&\displaystyle\le{\mathbb E}\left[1_{\{\tau_m\wedge\tau_n\le\sigma_k\}}\vert\Delta X_{\sigma_k}\vert\right]\smallskip\\ &\rightarrow0 \end{array}

as m,n go to infinity. So {\sup_{t\le\sigma_k}\vert A^m_t-A^n_t\vert} tends to zero in probability as m,n tend to infinity. This shows that {A^n} is a Cauchy sequence under the ucp topology. By completeness it has a limit A and, by passing to a subsequence if necessary, we can assume that {\sup_{t\le T}\vert A^n_t-A_t\vert} tends to zero as {n\rightarrow\infty} for all finite times T with probability one. So, A is predictable (up to a set of zero probability). Applying Fatou’s lemma,

\displaystyle  {\mathbb E}\left[\int_0^{\sigma_k\wedge\tau_n}\,\vert d(A-A^n)\vert\right]\le\liminf_{m\rightarrow\infty}{\mathbb E}\left[\int_0^{\sigma_k\wedge\tau_n}\,\vert d(A^m-A^n)\vert\right]\rightarrow0

as n goes to infinity. In particular, A has finite variation on the interval {[0,\sigma_k\wedge\tau_n]}, so it is an FV process. Finally, we have just shown that {A-A^n} locally tends to zero in {L^1}. Setting {M=X-A}, this shows that the local martingales {M^n=X^{\tau_n-}-A^n} locally converge to M in {L^1}. Also, {M^m-M^n=(X^{\tau_m-}-X^{\tau_n-})-(A^m-A^n)} has integrable variation and, hence, is a martingale over {[0,\tau_m\wedge\tau_n]}. So, M is a local martingale as required. {\Box}

Special Semimartingales as Integrators

Having established the canonical decomposition (2) for special semimartingales, we can now move on to stochastic integration. In fact, the decomposition is particularly well behaved under integration, inasmuch as it commutes with the integral. The integral of a predictable process with respect to a special semimartingale can be split up into separate integrals with respect to the martingale and predictable FV parts. This compatibility of stochastic integration with semimartingale decompositions was previously seen in the special case of continuous semimartingales. However, for noncontinuous semimartingales, there is one restriction for this to hold. We are only guaranteed that the integrals with respect to the separate components are well-defined if it is assumed that the integral is itself locally integrable.

In some approaches, the compatibility of integration with the canonical decomposition is used in the construction of the stochastic integral. This is the case, for example, in Protter (Stochastic Integration and Differential Equations). There, a predictable process {\xi} is said to be X-integrable if there exists a sequence of stopping times {\tau_n} increasing to infinity such that the pre-stopped processes {X^{\tau_n-}} are locally integrable and {\xi} satisfies property 3 of Theorem 3 with respect to the decomposition of {X^{\tau_n-}}.

Compare Theorem 3 with the more general decomposition of a semimartingale into local martingale and FV terms given by the Bichteler-Dellacherie theorem. In that case, it is possible for a predictable process {\xi} to be X-integrable but not be integrable with respect to either of the terms in the decomposition. This can happen even when the integral {\int\xi\,dX} is very well behaved (such as locally integrable, locally bounded, etc). It can be seen that, as a consequence of Theorem 3, this definition of X-integrable processes coincides with that used in these notes.

Theorem 3 Let X be a special semimartingale and {X=M+A} be decomposition (2). Then, given a predictable process {\xi}, the following are equivalent.

  1. {\xi} is X-integrable and {\int\xi\,dX} is locally integrable.
  2. {\xi} is both M-integrable and A-integrable, such that {\int\xi\,dM} is a local martingale.
  3. {\sqrt{\int\xi^2\,d[M]}+\int\vert\xi\vert\,\vert dA\vert} is locally integrable.

Then,

\displaystyle  \int\xi\,dX=\int\xi\,dM+\int\xi\,dA (3)

is the unique decomposition of {\int\xi\,dX} into a local martingale and predictable FV process starting from 0.

Proof: 1 ⇒ 2: As {Y=\int\xi\,dX} is a locally integrable semimartingale, decomposition (2) can be applied to write {Y=N+B} for a local martingale N and predictable FV process B with {B_0=0}. Choose a bounded nonzero predictable process {\alpha} such that {\alpha\xi} is bounded. For example, {\alpha=1/(1+\vert\xi\vert)}. Then,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle\int\alpha\,dN+\int\alpha\,dB& \displaystyle=\int\alpha\,dY\smallskip\\ &\displaystyle=\int\alpha\xi\,dX\smallskip\\ &\displaystyle=\int\alpha\xi\,dM+\int\alpha\xi\,dA. \end{array}

However, {\int\alpha\,dN} and {\int\alpha\xi\,dM} are local martingales, and {\int\alpha\,dB} and {\int\alpha\xi\,dA} are predictable FV processes. So, by the uniqueness of decomposition (2) applied to {\int\alpha\,dY},

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle\int\alpha\,dN=\int\alpha\xi\,dM,\smallskip\\ &\displaystyle\int\alpha\,dB=\int\alpha\xi\,dA. \end{array}

Integrating {\alpha^{-1}} with respect to both sides and applying associativity of integration, we see that {\xi} is M-integrable with {\int\xi\,dM=N} being a local martingale and {\xi} is A-integrable with {\int\xi\,dA=B} being a predictable FV process. This shows that the second statement of the theorem and decomposition (3) both hold.
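The reduction to bounded integrands above rests on an elementary fact: for {\alpha=1/(1+\vert\xi\vert)}, both {\alpha} and {\alpha\xi} are bounded while {\alpha} stays strictly positive, so nothing is lost when dividing back out. A minimal numeric check (the function name alpha is mine, for illustration):

```python
# alpha = 1/(1 + |xi|) satisfies 0 < alpha <= 1 and
# |alpha * xi| = |xi| / (1 + |xi|) < 1, for every real xi.
def alpha(xi):
    return 1.0 / (1.0 + abs(xi))

for xi in [0.0, 1.0, -3.5, 1e9, -1e12]:
    a = alpha(xi)
    assert 0 < a <= 1       # alpha is bounded and nonzero
    assert abs(a * xi) < 1  # alpha * xi is bounded
```

Since {\alpha} is nonzero everywhere, {\alpha^{-1}} is well-defined, which is what allows the associativity argument to recover the original integrals.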

2 ⇒ 3: By assumption, {N\equiv\int\xi\,dM} is a local martingale, so is locally integrable. Therefore, {\Delta N} is locally integrable. As the quadratic variation {[N]} has jumps {\Delta[N]=(\Delta N)^2}, the increasing process {[N]^{1/2}} has locally integrable jumps {\vert\Delta N\vert} and, hence, is itself locally integrable. Next, as {\xi} is A-integrable, {\int\vert\xi\vert\,\vert dA\vert} is almost surely finite at all times. Since this integral is also predictable, it is locally integrable. Adding these together,

\displaystyle  \sqrt{\int\xi^2\,d[M]}+\int\vert\xi\vert\,\vert dA\vert=\sqrt{[N]}+\int\vert\xi\vert\,\vert dA\vert

is locally integrable.

3 ⇒ 1: It was previously shown that, as a consequence of the Burkholder-Davis-Gundy inequality, local integrability of {\left(\int\xi^2\,d[M]\right)^{1/2}} is enough to guarantee that {\xi} is M-integrable and that {\int\xi\,dM} is a local martingale, hence locally integrable. Also, local integrability of {\int\vert\xi\vert\,\vert dA\vert} is enough to infer that {\xi} is A-integrable and, being bounded by {\int\vert\xi\vert\,\vert dA\vert}, the integral {\int\xi\,dA} is also locally integrable. Adding these together shows that {\xi} is X-integrable and

\displaystyle  \int\xi\,dX=\int\xi\,dM+\int\xi\,dA

is locally integrable as required. \Box

Comments

  1. Hi,

    There is a notation in Lemma 2 and after that I don’t fully understand: what do you mean exactly by \int_0^{\tau} |dX|?

    For example, I don’t see what it means for a Brownian Motion.

    Regards

    Comment by TheBridge — 3 October 11 @ 4:21 PM | Reply

    • That’s just the variation of X on the range [0,τ]. It is infinite for Brownian motion. Maybe I could have been a bit clearer. The inequality is trivial for processes with infinite variation.

      Comment by George Lowther — 3 October 11 @ 4:37 PM | Reply

      • Ok I got it, thanks

        Comment by TheBridge — 3 October 11 @ 4:46 PM | Reply

    • I added an extra sentence just before Lemma 2 explaining the notation.

      Comment by George Lowther — 3 October 11 @ 11:44 PM | Reply

  2. Hi,

    In Theorem 3, point 2, given the fact that we know that \xi is M-integrable, doesn’t it automatically imply that \int \xi dM is a local martingale?

    If true, I find the idea of pointing that out a little bit superfluous; if not true, then I can’t see a counterexample of a \xi which is predictable and M-integrable such that \int \xi dM is not a local martingale.

    Best regards, and thanks for this new post that sheds some light on corners (of stochastic integration) where I couldn’t even realise there was darkness.

    Comment by TheBridge — 4 October 11 @ 5:18 PM | Reply

    • Hi. Actually, just knowing that \xi is M-integrable is not enough to be able to say that \int\xi\,dM is a local martingale. If \xi is locally bounded then it is enough. More generally, if \int\xi\,dM is locally integrable then it is a local martingale (as local martingales are locally integrable, this is an ‘if and only if’ condition). See my earlier post Preservation of the Local Martingale Property.

      There do exist local martingales M and M-integrable predictable processes \xi such that \int\xi\,dM is not locally integrable, and not a local martingale. I have an example in the same post linked above (Loss of the Local Martingale Property). So, it does have to be explicitly stated.

      Hope that helps.

      Comment by George Lowther — 4 October 11 @ 9:03 PM | Reply

      • Ok, thanks for those detailed explanations

        Best regards

        Comment by TheBridge — 5 October 11 @ 8:17 AM | Reply

  3. Hi,

    I have a suggestion about this sentence in the proof of theorem 1,

    “As X is locally integrable, there exists a sequence of finite stopping times increasing to infinity such that \Delta X_{\sigma_k} is integrable.”

    It might be worth hyperlinking this assertion, for the reader, to “lemma 8” from the post “localization” of 23 December 2009.

    Best regards

    Comment by TheBridge — 5 October 11 @ 9:43 AM | Reply

  4. Hi,
    I have yet another question about the proof of Theorem 1. When you claim that \sup_{t\le T}|A^n_t-A_t|\to 0 a.s. entails that A is predictable, I can’t really see why this is obvious from the context. Could you elaborate a little more on this?

    Best regards

    Comment by Anonymous — 9 October 11 @ 4:57 PM | Reply

    • The limit of a sequence of real-valued measurable functions is itself measurable (from any measurable space to the real numbers, with the Borel sigma algebra). More precisely, if (E, ℰ) is a measurable space and f_n: E → ℝ is a sequence of measurable functions then S = {x ∈ E: lim_n f_n(x) exists} is in ℰ, and the limit lim_n f_n is a measurable function on S. Apply this to the processes A^n, thought of as measurable functions with respect to the predictable sigma algebra.
      I’ll add an extra line to the post to try and make this a bit clearer.

      Comment by George Lowther — 11 October 11 @ 1:53 AM | Reply

      • Hi George,

        Using your line of argument, would it be correct to claim that if we have a sequence X_n : (\Omega, \mathcal{P}) \to (F,\mathcal{B}(F,d)), where F is a metric space (or maybe Polish?) together with the Borel sigma algebra associated to the distance topology, then S=\{\omega \in \Omega: \exists X(\omega) \in F~ s.t.~ \lim_n d(X_n(\omega),X(\omega))=0 \} is a \mathcal{P}-measurable set and X is \mathcal{P}-measurable?

        Using this with F=\{{\rm cadlag\ function\ over\ }[0,T]\}, d(f,g)=sup_{t\le T}|f(t)-g(t)| and where \Omega is endowed with the predictable sigma field \mathcal{P}, would yield the result, wouldn’t it ?

        Best Regards

        Comment by TheBridge — 11 October 11 @ 1:03 PM | Reply

        • I think so, but that is more complicated than necessary. It’s more direct to consider A^n as functions from \Omega\times\mathbb{R}_+ with the predictable sigma algebra to the real numbers with the Borel sigma algebra.

          Btw, what is \Omega in your comment? It sounds like you are taking it to be the underlying probability space, but the predictable sigma algebra is defined on \Omega\times\mathbb{R}_+.

          Comment by George Lowther — 11 October 11 @ 11:58 PM

        • Ok I got it,

          Reading your answer, I indeed suspected I was over-complexifying things.

          That’s because (in my mind) processes take values in the càdlàg space and for each \omega \in \Omega the canonical application gives a càdlàg function (in this precise context).

          Seeing predictable sigma algebra \mathcal{P} over the space \Omega \times \mathbb{R}_+ really makes life much easier.

          I think there must be some measurable application from \Omega, \mathcal{P} to \mathcal{P}\times \mathbb{R}_+, \mathcal{B}(\mathbb{R}_+) that makes things strictly equivalent here, but I’m not sure this really helps clarify things.

          Best regards

          Comment by TheBridge — 12 October 11 @ 7:36 AM


