The Doob-Meyer decomposition was a very important result, historically, in the development of stochastic calculus. This theorem states that every cadlag submartingale uniquely decomposes as the sum of a local martingale and an increasing predictable process. For one thing, if *X* is a square-integrable martingale then Jensen's inequality implies that $X^2$ is a submartingale, so the Doob-Meyer decomposition guarantees the existence of an increasing predictable process $\langle X\rangle$ such that $X^2-\langle X\rangle$ is a local martingale. The term $\langle X\rangle$ is called the *predictable quadratic variation* of *X* and, by using a version of the Itô isometry, can be used to define stochastic integration with respect to square-integrable martingales. For another, semimartingales were historically defined as sums of local martingales and finite variation processes, so the Doob-Meyer decomposition ensures that all local submartingales are also semimartingales. Going further, the Doob-Meyer decomposition is an important ingredient in many proofs of the Bichteler-Dellacherie theorem.

The approach taken in these notes is somewhat different from the historical development, however. We introduced stochastic integration and semimartingales early on, without requiring much prior knowledge of the general theory of stochastic processes. We have also developed the theory of semimartingales, such as proving the Bichteler-Dellacherie theorem, using a stochastic integration based method. So, the Doob-Meyer decomposition does not play such a pivotal role in these notes as in some other approaches to stochastic calculus. In fact, the special semimartingale decomposition already states a form of the Doob-Meyer decomposition in a more general setting. So, the main part of the proof given in this post will be to show that all local submartingales are semimartingales, allowing the decomposition for special semimartingales to be applied.

The Doob-Meyer decomposition is especially easy to understand in discrete time, where it reduces to the much simpler Doob decomposition. If $X_n$, $n=0,1,2,\ldots$, is an integrable discrete-time process adapted to a filtration $\{\mathcal{F}_n\}_{n=0,1,\ldots}$, then the Doob decomposition expresses *X* as

(1) $\displaystyle X_n=M_n+A_n,\qquad A_n=\sum_{k=1}^n\mathbb{E}\left[X_k-X_{k-1}\,\middle\vert\,\mathcal{F}_{k-1}\right].$

As previously discussed, *M* is then a martingale and *A* is an integrable process which is also predictable, in the sense that $A_n$ is $\mathcal{F}_{n-1}$-measurable for each $n\ge1$. Furthermore, *X* is a submartingale if and only if $\mathbb{E}[X_n-X_{n-1}\mid\mathcal{F}_{n-1}]\ge0$ almost surely for each $n$ or, equivalently, if *A* is almost surely increasing.
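As an illustration (not part of the original post), the discrete Doob decomposition (1) can be computed numerically. The sketch below uses the submartingale $X_n=S_n^2$ for a simple random walk $S$, for which the predictable increments $\mathbb{E}[X_n-X_{n-1}\mid\mathcal{F}_{n-1}]$ are identically $1$, so $A_n=n$ and $M_n=S_n^2-n$ is a martingale:

```python
import numpy as np

rng = np.random.default_rng(0)

# A simple submartingale: X_n = S_n^2 where S is a simple random walk.
n_steps, n_paths = 20, 50_000
steps = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))
S = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(steps, axis=1)], axis=1)
X = S**2  # E[X_n - X_{n-1} | F_{n-1}] = 1, so X is a submartingale

# Doob decomposition: A_n = sum_{k<=n} E[X_k - X_{k-1} | F_{k-1}] = n here.
A = np.arange(n_steps + 1, dtype=float)  # predictable, increasing, A_0 = 0
M = X - A                                # martingale part

# Sanity check: a martingale has constant expectation, E[M_n] = E[M_0] = 0,
# so the path averages of M should stay near zero at every time.
print(np.abs(M.mean(axis=0)).max())
```

With 50,000 paths the maximum deviation of the empirical means from zero is small, consistent with *M* being a martingale.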

Moving to continuous time, we work with respect to a complete filtered probability space $(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},\mathbb{P})$ with time index *t* ranging over the nonnegative real numbers. Then, the continuous-time version of (1) takes *A* to be a right-continuous and increasing process which is predictable, in the sense that it is measurable with respect to the σ-algebra generated by the class of left-continuous and adapted processes. Often, the Doob-Meyer decomposition is stated under additional assumptions, such as *X* being of class (D) or satisfying some similar uniform integrability property. To be as general as possible, the statement I give here only requires *X* to be a local submartingale, and furthermore states how the decomposition is affected by various stronger hypotheses that *X* may satisfy.

Theorem 1 (Doob-Meyer) Any local submartingale X has a unique decomposition

(2) $\displaystyle X=M+A,$

where M is a local martingale and A is a predictable increasing process starting from zero. Furthermore,

- if X is a proper submartingale, then A is integrable and satisfies

  (3) $\displaystyle \mathbb{E}\left[A_\tau\right]\le\mathbb{E}\left[X_\tau-X_0\right]$

  for all uniformly bounded stopping times $\tau$.

- X is of class (DL) if and only if M is a proper martingale and A is integrable, in which case

  (4) $\displaystyle \mathbb{E}\left[A_\tau\right]=\mathbb{E}\left[X_\tau-X_0\right]$

  for all uniformly bounded stopping times $\tau$.

- X is of class (D) if and only if M is a uniformly integrable martingale and $A_\infty$ is integrable. Then, $M_\infty=\lim_{t\to\infty}M_t$ and $A_\infty=\lim_{t\to\infty}A_t$ exist almost surely, and (4) holds for all (not necessarily finite) stopping times $\tau$.

Note that, by definition, local submartingales and local martingales are always cadlag processes and, hence, in decomposition (2), *A* is automatically required to be cadlag. So, right-continuity of *A* did not have to be explicitly stated. Also, Theorem 1 implies that any local submartingale *X* decomposes as a local martingale plus a finite variation process. As mentioned above, this implies that *X* is a semimartingale. Here, we work in the opposite direction, proving first that every local submartingale is a semimartingale and, then, use this to give a proof of Theorem 1. Recall that, in these notes, a cadlag adapted process *X* is defined to be a semimartingale if and only if the stochastic integral $\int\xi\,dX$ is defined for all bounded predictable integrands $\xi$.

Lemma 2Every local submartingale is a semimartingale.

*Proof:* It was previously shown that *X* is a semimartingale if, for each fixed $t\ge0$, the set

(5) $\displaystyle \left\{\int_0^t\xi\,dX\colon\ \xi\text{ is elementary},\ \vert\xi\vert\le1\right\}$

is bounded in probability. This characterization was also stated as part of the Bichteler-Dellacherie theorem, and is what we will use to show that a local submartingale *X* is a semimartingale. The proof is similar to the earlier proof that local martingales are semimartingales. By localization, we can suppose that *X* is a proper submartingale.

For any elementary process $\vert\xi\vert\le1$ and fixed time $t>0$, there exist times $0=t_0\le t_1\le\cdots\le t_n=t$ such that

$\displaystyle \int_0^t\xi\,dX=\sum_{k=1}^nZ_k\left(X_{t_k}-X_{t_{k-1}}\right)$

and $Z_k$ are $\mathcal{F}_{t_{k-1}}$-measurable random variables with $\vert Z_k\vert\le1$. Consider the discrete-time process $Y_k\equiv X_{t_k}$, which is a submartingale adapted to the discrete filtration $\mathcal{G}_k\equiv\mathcal{F}_{t_k}$. Applying the Doob decomposition (1) to *Y*, we can write $Y=M+A$, where *M* is a discrete-time martingale and *A* is increasing with $A_0=0$. So, if $\vert Z_k\vert\le1$,

(6) $\displaystyle \mathbb{P}\left(\left\vert\sum_{k=1}^nZ_k\left(A_k-A_{k-1}\right)\right\vert\ge K\right)\le K^{-1}\mathbb{E}\left[A_n\right]=K^{-1}\mathbb{E}\left[X_t-X_0\right]$

for all $K>0$. Also, as previously shown in the proof that martingales are semimartingales, there exists a constant $c>0$, independent of the choice of *X*, $\xi$ and *K*, such that

(7) $\displaystyle \mathbb{P}\left(\left\vert\sum_{k=1}^nZ_k\left(M_k-M_{k-1}\right)\right\vert\ge K\right)\le cK^{-1}\mathbb{E}\left[\vert M_n\vert\right]\le cK^{-1}\mathbb{E}\left[\vert X_t\vert+X_t-X_0\right].$

So, choosing *K* large, the probability on the left hand side can be made as small as we like, independently of $\xi$. Together with (6), this shows that the set (5) is bounded in probability as required. ⬜
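The estimate used in the proof above, namely $\vert\sum_k Z_k(A_k-A_{k-1})\vert\le A_n$ whenever $\vert Z_k\vert\le1$ and $A$ is increasing with $A_0=0$, is just the triangle inequality. A minimal Python sketch (illustrative only, with randomly generated increments and coefficients) checking it:

```python
import numpy as np

rng = np.random.default_rng(1)

# An increasing process A with A_0 = 0, built from nonnegative increments,
# together with arbitrary coefficients |Z_k| <= 1, as in the Doob
# decomposition step of the proof of Lemma 2.
n = 100
dA = rng.exponential(size=n)           # nonnegative increments A_k - A_{k-1}
A = np.concatenate([[0.0], np.cumsum(dA)])
Z = rng.uniform(-1.0, 1.0, size=n)     # coefficients bounded by 1 in absolute value

elementary_integral = np.sum(Z * dA)   # sum_k Z_k (A_k - A_{k-1})

# Triangle inequality: |sum_k Z_k dA_k| <= sum_k dA_k = A_n.
assert abs(elementary_integral) <= A[-1]
print(abs(elementary_integral), A[-1])
```

The bound holds for any choice of the increments and coefficients, which is exactly what makes the estimate in the proof uniform over elementary $\vert\xi\vert\le1$.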

With Lemma 2 out of the way, all the ingredients are now in place to give a proof of the Doob-Meyer decomposition. This is just an application of the special semimartingale decomposition from an earlier post, although there is still a small amount of work required to show that the compensator *A* is indeed increasing. Let us start by proving the first part of Theorem 1 where it is only assumed that *X* is a local submartingale.

Lemma 3 Every local submartingale X uniquely decomposes as $X=M+A$, where M is a local martingale and A is an increasing predictable process starting from zero.

*Proof:* By Lemma 2, we know that *X* is a semimartingale. Also, as it is a local submartingale, *X* is locally integrable. So, it uniquely decomposes as $X=M+A$, where *M* is a local martingale and *A* is a predictable FV process with $A_0=0$. It only remains to be shown that *A* is increasing.

As *X* is a local submartingale, there exist stopping times $\tau_n$ increasing to infinity such that the stopped processes $X^{\tau_n}$ are submartingales and $M^{\tau_n}$ are proper martingales. Also, as *A* is predictable, $\tau_n$ can be chosen such that $A^{\tau_n}$ has uniformly bounded and, hence, integrable total variation. Then, by the properties of submartingales,

(8) $\displaystyle \mathbb{E}\left[\int_0^t\xi\,dA^{\tau_n}\right]\ge0$

for any $t\ge0$ and nonnegative bounded elementary process $\xi$. Let us denote the set of nonnegative uniformly bounded predictable processes for which (8) holds by *S*. Then, as just shown, *S* contains all nonnegative bounded elementary processes. Next, consider a sequence $\xi_m$ of predictable processes in *S*, uniformly bounded by a constant *K*, and converging pointwise to a limit $\xi$. As $A^{\tau_n}$ has integrable total variation, dominated convergence gives

$\displaystyle \mathbb{E}\left[\int_0^t\xi\,dA^{\tau_n}\right]=\lim_{m\to\infty}\mathbb{E}\left[\int_0^t\xi_m\,dA^{\tau_n}\right]\ge0.$

So, $\xi\in S$. Therefore, by the monotone class theorem, *S* contains all nonnegative uniformly bounded predictable processes.

Now, as *A* is a predictable FV process, it can be written as $A=A^+-A^-$ where $A^+$ and $A^-$ are increasing processes, and $\int_0^t\xi\,dA=-A^-_t$ for some predictable process $\xi$ taking values in $\{0,1\}$. By (8),

$\displaystyle \mathbb{E}\left[A^-_{t\wedge\tau_n}\right]=-\mathbb{E}\left[\int_0^t\xi\,dA^{\tau_n}\right]\le0.$

Letting *n* increase to infinity, monotone convergence gives $\mathbb{E}\left[A^-_t\right]=0$. Finally, as it is nonnegative and increasing, we almost surely have $A^-_t=0$ for all *t* and, hence, $A=A^+$ is increasing. ⬜

To complete the proof of Theorem 1, just the 'furthermore' part remains to be shown. That is, under the additional hypotheses for *X* we obtain stronger properties satisfied by *M* and *A*. We prove this now. Note that nothing in this part of the proof requires the fact that *A* is predictable, just that it is an increasing process starting from zero.

*Proof of Theorem 1:*

Suppose that *X* is a proper submartingale. Then, choose stopping times $\tau_n$ increasing to infinity such that the stopped processes $M^{\tau_n}$ are proper martingales. For any uniformly bounded stopping time $\tau$, optional sampling says that $X_{\tau\wedge\tau_n}$ is integrable with expectation bounded by $\mathbb{E}\left[X_\tau\right]$. So,

$\displaystyle \mathbb{E}\left[A_{\tau\wedge\tau_n}\right]=\mathbb{E}\left[X_{\tau\wedge\tau_n}-X_0\right]\le\mathbb{E}\left[X_\tau-X_0\right].$

Letting *n* increase to infinity and using monotone convergence gives inequality (3) so, in particular, *A* is integrable.

Now, suppose that *X* is of class (DL). Then, it is a proper submartingale so, as shown above, *A* is integrable. As it is also nonnegative and increasing, *A* is dominated in $L^1$ on each bounded interval. So, *A* is of class (DL) and, hence, $M=X-A$ is a proper martingale. Then, for any uniformly bounded stopping time $\tau$, optional sampling gives

$\displaystyle \mathbb{E}\left[A_\tau\right]=\mathbb{E}\left[X_\tau-M_\tau\right]=\mathbb{E}\left[X_\tau-X_0\right].$

This proves equality (4).

Conversely, suppose that *M* is a proper martingale and *A* is integrable. Then, *M* is of class (DL) and, as it is dominated in $L^1$ on finite intervals, *A* is also of class (DL). Therefore, $X=M+A$ is of class (DL).

Now, suppose that *X* is of class (D). In particular, it is of class (DL) and, as shown above, *M* is a proper martingale and *A* is integrable. By submartingale convergence, the limit $X_\infty=\lim_{t\to\infty}X_t$ exists almost surely. Applying (3) and monotone convergence,

$\displaystyle \mathbb{E}\left[A_\infty\right]=\lim_{t\to\infty}\mathbb{E}\left[A_t\right]\le\lim_{t\to\infty}\mathbb{E}\left[X_t-X_0\right]=\mathbb{E}\left[X_\infty-X_0\right].$

The last equality here uses uniform integrability of $X_t$ over $t\ge0$, since *X* is of class (D). So, (3) holds for all stopping times $\tau$ and, by taking $\tau=\infty$, we see that $A_\infty$ is integrable. As it is dominated in $L^1$ by $A_\infty$, *A* is in class (D). So, $M=X-A$ is also in class (D) and, hence, is a uniformly integrable martingale.

Conversely, suppose that *M* is a uniformly integrable martingale and that $A_\infty$ is integrable. Then, *M* is of class (D) and, as it is dominated in $L^1$ by $A_\infty$, *A* is also of class (D). So, $X=M+A$ is of class (D) as required. ⬜

#### Approximating the Compensator

The process *A* appearing in decomposition (2) is called the compensator of *X*. By definition, it is the unique predictable FV process, starting from zero, such that $X-A$ is a local martingale. However, this is considerably different from the definition of the compensator in the discrete-time Doob decomposition (1). In this section, I will show how the continuous-time compensator does arise as the limit of discrete-time compensators along partitions. As previously discussed for approximating the compensator of integrable variation processes, we start by defining the notion of a stochastic partition, *P*, of $\mathbb{R}^+$. This is just a sequence of stopping times

$\displaystyle 0=\tau_0\le\tau_1\le\tau_2\le\cdots\uparrow\infty.$

The mesh of the partition is denoted by $\vert P\vert=\sup_n(\tau_n-\tau_{n-1})$. For a process *X*, we calculate the compensator along the partition *P* as the process $A^P$ defined by

(9) $\displaystyle A^P_t=\sum_{n=1}^\infty\mathbb{E}\left[X_{\tau_n\wedge t}-X_{\tau_{n-1}\wedge t}\,\middle\vert\,\mathcal{F}_{\tau_{n-1}}\right].$

Here, we are only going to consider processes *X* which are submartingales of class (D). This ensures that the limit $X_\infty=\lim_{t\to\infty}X_t$ exists, and that $X_\tau$ is integrable for all stopping times $\tau$. So, the expectations in (9) are well defined.

If *X* is a class (D) submartingale, then Theorem 1 says that $X=M+A$ for a uniformly integrable martingale *M* and predictable increasing process *A* starting from zero and such that $A_\infty$ is integrable. Therefore, by optional sampling, we have $\mathbb{E}\left[M_{\tau_n\wedge t}\,\middle\vert\,\mathcal{F}_{\tau_{n-1}}\right]=M_{\tau_{n-1}\wedge t}$ for each *n*. This implies that $\mathbb{E}[X_{\tau_n\wedge t}-X_{\tau_{n-1}\wedge t}\mid\mathcal{F}_{\tau_{n-1}}]=\mathbb{E}[A_{\tau_n\wedge t}-A_{\tau_{n-1}\wedge t}\mid\mathcal{F}_{\tau_{n-1}}]$, so (9) can be rewritten as

(10) $\displaystyle A^P_t=\sum_{n=1}^\infty\mathbb{E}\left[A_{\tau_n\wedge t}-A_{\tau_{n-1}\wedge t}\,\middle\vert\,\mathcal{F}_{\tau_{n-1}}\right].$

This expresses $A^P$ in terms of the integrable and increasing process *A*. In the case where *X* is quasi-left-continuous, so that *A* is continuous, the approximation $A^P$ to the compensator calculated along partitions *P* converges uniformly to *A* as the mesh goes to zero. Here, $\lim_{\vert P\vert\to0}$ denotes the limit as the mesh goes to zero, in probability.
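As a concrete illustration (not from the original post), take $X=B^2$ for a Brownian motion $B$ on $[0,1]$: a quasi-left-continuous submartingale of class (D) there, whose compensator is $A_t=t$. Since $\mathbb{E}[B_{t_k}^2-B_{t_{k-1}}^2\mid\mathcal{F}_{t_{k-1}}]=t_k-t_{k-1}$, the approximation (9) along a deterministic partition reproduces *A* exactly; the Python sketch below estimates these conditional expectations by Monte Carlo averaging over independent paths:

```python
import numpy as np

rng = np.random.default_rng(2)

# X = B^2 for Brownian motion B on [0, 1]: compensator A_t = t.  Along a
# deterministic partition 0 = t_0 < ... < t_n = 1, the approximation (9) is
#   A^P_t = sum_{t_k <= t} E[X_{t_k} - X_{t_{k-1}} | F_{t_{k-1}}],
# and each conditional expectation equals t_k - t_{k-1} (deterministic), so a
# plain average over paths is a valid Monte Carlo estimator of it.
n_paths, n_times = 100_000, 16
t = np.linspace(0.0, 1.0, n_times + 1)
dB = rng.normal(scale=np.sqrt(np.diff(t)), size=(n_paths, n_times))
B = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)], axis=1)
X = B**2

# Average the increments of X across paths, then sum along the partition.
A_P = np.concatenate([[0.0], np.cumsum(np.diff(X, axis=1).mean(axis=0))])

print(np.abs(A_P - t).max())  # deviation of A^P from the compensator A_t = t
```

The maximum deviation shrinks like the Monte Carlo error, illustrating that $A^P$ recovers the continuous compensator in this quasi-left-continuous example.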

Theorem 4 Let X be a cadlag and quasi-left-continuous submartingale of class (D), and A be as in decomposition (2). Then, $A^P\to A$ uniformly in *t* as $\vert P\vert\to0$ in probability. That is,

(11) $\displaystyle \lim_{\vert P\vert\to0}\mathbb{P}\left(\sup_{t\ge0}\left\vert A^P_t-A_t\right\vert>\epsilon\right)=0$

for all $\epsilon>0$.

*Proof:* As *X* is quasi-left-continuous, its compensator *A* is continuous. Then, (10) expresses $A^P$ in terms of the continuous process *A* of integrable total variation. Theorem 10 of the post on compensators then states that the limit (11) holds. ⬜

In the case where *X* is not quasi-left-continuous, convergence to the compensator is not guaranteed in such a strong sense as above. As was previously shown by an example, even in the very simple case where $X=1_{[\tau,\infty)}$ for some stopping time $\tau$, it is not guaranteed that the limit of $A^P_t$ exists in probability as the mesh goes to zero. Instead, we have to work with respect to weak convergence in $L^1$.

Theorem 5 Let X be a cadlag submartingale of class (D), and A be as in decomposition (2). Then, $A^P_t\to A_t$ weakly in $L^1$ as $\vert P\vert\to0$, for any random time *t*. That is,

(12) $\displaystyle \lim_{\vert P\vert\to0}\mathbb{E}\left[\left(A^P_t-A_t\right)Y\right]=0$

for all uniformly bounded random variables Y.

*Proof:* As in the proof of Theorem 4, we note that (10) expresses $A^P$ in terms of the integrable variation process *A*. Theorem 11 of the post on compensators states that the limit (12) holds. ⬜

#### Notes

As discussed above, the approach used to prove the Doob-Meyer decomposition in this post is quite different from the methods which are often used elsewhere. This is because we have already developed much of the theory of semimartingales independently of the Doob-Meyer decomposition, so it seems natural to simply apply this theory here rather than starting from scratch. I will, however, briefly outline some alternative approaches which are sometimes used.

One method of proving the existence of the Doob-Meyer decomposition is via the *Doléans measure* of a class (D) cadlag submartingale *X*. This is a finite measure $\mu$ on the predictable σ-algebra satisfying

$\displaystyle \mu(\xi)=\mathbb{E}\left[\int_0^\infty\xi\,dX\right]$

for all bounded elementary processes $\xi$. Proving that the function $\mu$ does extend to a measure can be done by a relatively straightforward application of the Caratheodory extension theorem. Next, the Doléans measure is used to define the following finite measures on the underlying measurable space $(\Omega,\mathcal{F})$. Given any bounded $\mathcal{F}$-measurable random variable *Y*, choose a cadlag modification *N* of the $\{\mathcal{F}_t\}$-martingale $N_t=\mathbb{E}[Y\mid\mathcal{F}_t]$. Then, letting $N_-$ denote its left limits, define $\nu_t$, for each time $t\ge0$, by

$\displaystyle \nu_t(Y)=\mu\left(N_-1_{(0,t]}\right).$

It can be shown that $\nu_t$ is absolutely continuous with respect to $\mathbb{P}$, so the Radon-Nikodym derivative

$\displaystyle A_t=\frac{d\nu_t}{d\mathbb{P}}$

exists. Furthermore, *A* is adapted, increasing and right-continuous in probability. So, it has a right-continuous modification. Also, from the construction, it is straightforward to show that $X-A$ is a martingale and

(13) $\displaystyle \mathbb{E}\left[\int_0^\infty M_-\,dA\right]=\mathbb{E}\left[M_\infty A_\infty\right]$

for all cadlag and bounded martingales *M*. The fact that *A* is predictable follows from (13) together with the techniques developed in the earlier post on predictable FV processes. Filling in the details, this provides a proof of the Doob-Meyer decomposition for class (D) submartingales.

The method just described, making use of the Doléans measure of *X*, is close to the standard 'classical' proof of the Doob-Meyer decomposition. The method is closely related to the idea of *dual predictable projection*, which is sometimes employed in the proof. The most difficult part is showing that any process *A* satisfying (13) is indeed predictable.

An alternative method which is sometimes used is to construct the compensator *A* directly, by taking the limit of the discrete approximations along a sequence of partitions. That is, prove Theorem 5 without assuming a priori that the compensator *A* exists. Again, showing that the process *A* is indeed predictable is often the trickiest part of this approach. Similarly, the decomposition could be constructed by first proving Theorem 4. This would prove the Doob-Meyer decomposition for quasi-left-continuous cadlag submartingales where, now, the compensator *A* is the unique *continuous* increasing process starting from zero such that $X-A$ is a martingale. Extending to general cadlag submartingales is relatively straightforward. We would simply subtract out the jumps of *X* which occur at predictable times, and construct the compensators at these jump times explicitly.

Finally, as quite a lot of the theory of semimartingales and stochastic integration has already been developed in these notes, and was drawn upon in the proof given here, I think that it is worth briefly mentioning what was really needed for the proof above. The main point is that, for every submartingale *X*, the set (5) is bounded in probability. Furthermore, this implies that the quadratic variation $[X]$ exists. Sometimes, semimartingales are *defined* as cadlag adapted processes such that (5) is bounded in probability. Then, quadratic variations and covariations of semimartingales always exist. This allows us to define the vector space *V* of semimartingales *X* such that $[X]_\infty$ is integrable, with the inner product $\langle X,Y\rangle=\mathbb{E}\left[X_0Y_0+[X,Y]_\infty\right]$. So, for any submartingale *X* such that $[X]_\infty$ is integrable, we can define *M* to be the orthogonal projection in *V* of *X* onto the subspace of uniformly square integrable martingales. In these notes, this projection was done as part of the proof of the Bichteler-Dellacherie theorem. The process $A=X-M$ obtained is characterized by the property that $[A,N]$ is a local martingale for all local martingales *N*, which we showed is equivalent to *A* being a predictable FV process.

For a recent paper on the Doob-Meyer decomposition theorem, see

M. Beiglböck, W. Schachermayer, B. Veliyev. A short proof of the Doob-Meyer Theorem. Preprint, 2010.

It uses Komlós' lemma, as in previous works of W. Schachermayer.

Comment by adrien — 30 December 11 @ 7:31 PM

Thanks, I see it is on the arXiv (A Short Proof Of The Doob-Meyer Theorem). I’ll check that out.

Comment by George Lowther — 30 December 11 @ 8:06 PM

Ok, I’ve read it now. It seems like a very neat approach, and is something that I have thought about before. It constructs the compensator $A$ by calculating the limit of discrete Doob decompositions along a sequence of partitions. This is (roughly) similar to proving Theorem 5 above, for a specific sequence of partitions (dyadic partitions) without a priori assuming the existence of a compensator. One way is to use the fact that $A^P_t$ is uniformly integrable as *P* runs through the partitions, which implies that this sequence is compact in the weak topology on $L^1$ (uniformly integrable subsets of $L^1$ are weakly relatively compact). You still need to show that any limit point $A$ obtained is predictable. In the paper you mention, they use the fact that by taking convex combinations, you can pass from a uniformly integrable sequence to one converging both in $L^1$ and almost surely (and, once you know this fact, you can pass to convex combinations without having to explicitly mention weak convergence). Then, by showing that $A$ can be obtained almost surely as a limit of convex combinations of the left-continuous and adapted (hence, predictable) processes $A^P$, you get that $A$ is predictable.

Constructing $A$ using a limit (or limit point) of Doob-style approximations on a sequence of partitions is not particularly new — this is what Kallenberg does in *Foundations of Modern Probability* (iirc). The trick is in how you show that $A$ is predictable. In the linked paper, Komlós' lemma allows you to show that it is a limit of convex combinations of the predictable processes $A^P$, which establishes predictability in a particularly simple way.

Thanks for bringing this paper to my attention.

Comment by George Lowther — 31 December 11 @ 12:05 AM |