As was mentioned in the initial post of these stochastic calculus notes, it is important to choose good versions of stochastic processes. In some cases, such as with Brownian motion, it is possible to explicitly construct the process to be continuous. However, in many more cases, it is necessary to appeal to more general results to assure the existence of such modifications.

The theorem below guarantees that many of the processes studied in stochastic calculus have a right-continuous version and, furthermore, these versions necessarily have left limits everywhere. Such processes are known as *càdlàg* from the French for “continu à droite, limites à gauche” (I often drop the accents, as seems common). Alternative terms used to refer to a cadlag process are rcll (right-continuous with left limits), R-process and right process. For a cadlag process $X$, the left limit at any time $t>0$ is denoted by $X_{t-}=\lim_{s\uparrow\uparrow t}X_s$ (and $X_{0-}\equiv X_0$). The jump at time $t$ is denoted by $\Delta X_t=X_t-X_{t-}$.
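As a toy numerical illustration of this notation (my own example, not part of the original notes): for a path which jumps at $t=1$, the one-sided limits can be approximated by evaluating the path along a dyadic sequence approaching $t$.

```python
def step(s):
    """A cadlag sample path: equal to 0 before time 1 and to 2 from time 1 on."""
    return 2.0 if s >= 1 else 0.0

def right_limit(f, t, n=40):
    # approximate lim_{s downarrow t} f(s) by evaluating just to the right of t
    return f(t + 2.0 ** -n)

def left_limit(f, t, n=40):
    # approximate lim_{s uparrow t} f(s) by evaluating just to the left of t
    return f(t - 2.0 ** -n)

t = 1.0
print(step(t))                        # X_1 = 2.0 (right-continuity: X_1 equals the right limit)
print(left_limit(step, t))            # X_{1-} = 0.0
print(step(t) - left_limit(step, t))  # jump: X_1 - X_{1-} = 2.0
```

For this path the jump at $t=1$ is $2$, and the right limit agrees with the value at $t=1$, as right-continuity requires.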

We work with respect to a complete filtered probability space $(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge 0},\mathbb{P})$.

Theorem 1 below provides us with cadlag versions under the condition that elementary integrals of the processes cannot, in a sense, get too large. Recall that elementary predictable processes are of the form

$$\xi_t = Z_0 1_{\{t=0\}} + \sum_{k=1}^n Z_k 1_{\{s_k < t \le t_k\}}$$

for times $s_k\le t_k$, an $\mathcal{F}_0$-measurable random variable $Z_0$ and $\mathcal{F}_{s_k}$-measurable random variables $Z_k$. Its integral with respect to a stochastic process $X$ is

$$\int_0^\infty \xi\,dX = \sum_{k=1}^n Z_k\left(X_{t_k}-X_{s_k}\right).$$
An elementary predictable set is a subset of $\mathbb{R}_+\times\Omega$ which is a finite union of sets of the form $\{0\}\times F$ for $F\in\mathcal{F}_0$ and $(s,t]\times F$ for nonnegative reals $s<t$ and $F\in\mathcal{F}_s$. Then, a process $\xi$ is the indicator function $1_A$ of some elementary predictable set $A$ if and only if it is elementary predictable and takes values in $\{0,1\}$.
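To make the definition concrete, here is a small numerical sketch (the function name and the discretised random-walk path are my own illustration, not from the post): it evaluates the elementary integral $\sum_k Z_k(X_{t_k}-X_{s_k})$ against a simulated path.

```python
import random

def elementary_integral(path, terms):
    """Integrate an elementary predictable process against a path.

    path  -- dict mapping time -> process value X_t
    terms -- list of (Z_k, s_k, t_k) with s_k <= t_k; each Z_k must only
             depend on information available at time s_k.
    Returns sum_k Z_k * (X_{t_k} - X_{s_k}).
    """
    return sum(z * (path[t] - path[s]) for z, s, t in terms)

# a simulated symmetric random walk on integer times 0..10
random.seed(1)
steps = [random.choice([-1, 1]) for _ in range(10)]
path = {0: 0}
for n, step in enumerate(steps, start=1):
    path[n] = path[n - 1] + step

# an elementary integrand: hold 2 units over (1,4], then 1 unit over (4,9]
terms = [(2, 1, 4), (1, 4, 9)]
print(elementary_integral(path, terms))
```

Taking a single term $(1,0,10)$, i.e. the indicator of $(0,10]\times\Omega$, recovers $X_{10}-X_0$, matching the definition above.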

The following theorem guarantees the existence of cadlag versions for many types of processes. The first statement applies in particular to martingales, submartingales and supermartingales, whereas the second statement is important for the study of general semimartingales.

**Theorem 1** Let $X$ be an adapted stochastic process which is right-continuous in probability and such that either of the following conditions holds. Then, it has a cadlag version.

- $X$ is integrable and, for every $t\ge0$, the set $\left\{\mathbb{E}\left[\int_0^t 1_A\,dX\right]\colon A\text{ is elementary predictable}\right\}$ is bounded.
- For every $t\ge0$, the set $\left\{\int_0^t 1_A\,dX\colon A\text{ is elementary predictable}\right\}$ is bounded in probability.

The existence of cadlag versions stated by this theorem is a special case of the following slightly more general result, which drops the requirement for the process to be right-continuous in probability. In the following, the modification $Y$ has a right limit $Y_{t+}$ at every point. If $X$ is right-continuous in probability then, at each fixed time, $Y_{t+}=Y_t$ with probability one. By countable additivity, this remains true simultaneously at all times in any given countable set and, therefore, $Y$ is a cadlag version of $X$.

**Theorem 2** Let $X$ be an adapted stochastic process, and suppose that either of the two conditions of Theorem 1 holds. Then, it has a version $Y$ which has left and right limits everywhere and such that there is a countable subset $S\subset\mathbb{R}_+$ for which $Y$ is right-continuous at every $t\notin S$.

A proof of this theorem is given below, using the ideas on upcrossings of a process as discussed in the previous post.

**Cadlag Martingales**

The main result for existence of cadlag martingales is as follows.

**Theorem 3** Let $X$ be a martingale, submartingale or supermartingale which is right-continuous in probability. Then, it has a cadlag version.

*Proof:* By applying the statement to $-X$, it suffices to prove the result for submartingales. However, in this case, for any time $t\ge0$ and elementary predictable set $A$,

$$0 \le \mathbb{E}\left[\int_0^t 1_A\,dX\right] \le \mathbb{E}[X_t]-\mathbb{E}[X_0]. \qquad (1)$$

Both inequalities follow from the submartingale property, applied to the nonnegative elementary integrands $1_A$ and $1_{(0,t]}-1_A$ respectively. The first condition of Theorem 1 is satisfied, showing that cadlag versions exist.
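As a quick numerical sanity check of (1) (my own example, not from the post): take the submartingale $X_n=S_n^2$ for a simple symmetric random walk $S$, and the elementary set $A=(2,5]\times\{S_2>0\}$. Then $\mathbb{E}[\int_0^5 1_A\,dX]=3\,\mathbb{P}(S_2>0)=3/4$, which indeed lies between $0$ and $\mathbb{E}[X_5]-\mathbb{E}[X_0]=5$.

```python
import random

rng = random.Random(0)
n_samples = 20000
total = 0.0
for _ in range(n_samples):
    # simple symmetric random walk S_0,...,S_5; X_n = S_n^2 is a submartingale
    s = [0]
    for _ in range(5):
        s.append(s[-1] + rng.choice([-1, 1]))
    x = [v * v for v in s]
    # elementary integrand 1_A with A = (2,5] x {S_2 > 0}: an F_2-measurable bet
    if s[2] > 0:
        total += x[5] - x[2]
estimate = total / n_samples
print(estimate)  # should be close to 3 * P(S_2 > 0) = 0.75, inside [0, 5]
```

The Monte Carlo estimate sits close to the exact value $0.75$, consistent with (1).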

Even if the condition of right-continuity in probability is dropped, it is still possible to pass to well-behaved modifications. Although a martingale can then fail to have a cadlag version, it is always the case that there exists a version which is cadlag everywhere outside of a fixed countable set of times.

**Theorem 4** Let $X$ be a martingale, submartingale or supermartingale. Then, it has a version $Y$ which has left and right limits everywhere and such that there is a countable subset $S\subset\mathbb{R}_+$ for which $Y$ is right-continuous at every $t\notin S$.

*Proof:* By applying the statement to $-X$, we may suppose that the process is a submartingale. Then, inequality (1) applies, so the first condition of Theorem 1 holds. Theorem 2 gives the modification $Y$.

Often, the underlying filtrations used are assumed to satisfy the *usual conditions*. That is, they are required to be right-continuous as well as being complete. In this case the existence of cadlag versions is particularly general. Every martingale has a cadlag version.

**Theorem 5** Suppose that the filtration is right-continuous. Then, every martingale has a cadlag version.

More generally, a submartingale or supermartingale $X$ has a cadlag version if and only if $t\mapsto\mathbb{E}[X_t]$ is right-continuous.

*Proof:* By applying the result to $-X$, we may suppose that the process is a submartingale. Furthermore, for martingales the function $t\mapsto\mathbb{E}[X_t]$, being constant, is trivially right-continuous. So, it suffices to prove the second, more general statement. Theorem 4 gives a version $Y$ which has left and right limits everywhere and is cadlag outside of some countable set $S$. It only remains to be shown that $Y_{t+}=Y_t$, almost surely, for each $t\in S$.

Choose a sequence of times $t_n\notin S$ strictly decreasing to $t$, so that $Y_{t_n}=X_{t_n}$ almost surely. Then $X_t\le\mathbb{E}[X_{t_n}\mid\mathcal{F}_t]$, by the submartingale property. The idea is to commute the limit as $n\to\infty$ with the conditional expectation to obtain

$$X_t \le \mathbb{E}\left[Y_{t+}\mid\mathcal{F}_t\right]. \qquad (2)$$

This can be done under the condition that the sequence $X_{t_n}$ is uniformly integrable. For a martingale, these are all conditional expectations $X_{t_n}=\mathbb{E}[X_{t_1}\mid\mathcal{F}_{t_n}]$, so uniform integrability is guaranteed. In fact, Lemma 6 below states that this sequence is uniformly integrable whenever $X$ is a submartingale, so inequality (2) holds.

Similarly, using uniform integrability together with the right-continuity of $t\mapsto\mathbb{E}[X_t]$ gives

$$\mathbb{E}[Y_{t+}] = \lim_{n\to\infty}\mathbb{E}[X_{t_n}] = \mathbb{E}[X_t],$$

so that $\mathbb{E}[Y_{t+}\mid\mathcal{F}_t]-X_t$ is a nonnegative random variable with zero expectation. This shows that $X_t=\mathbb{E}[Y_{t+}\mid\mathcal{F}_t]$ almost surely. Finally, the right-continuity of the filtration, $\mathcal{F}_t=\mathcal{F}_{t+}$, is applied. As $Y_{t+}$ is necessarily $\mathcal{F}_{t+}$-measurable, $X_t=\mathbb{E}[Y_{t+}\mid\mathcal{F}_{t+}]=Y_{t+}$ almost surely.

The proof above made use of the following simple, but useful, statement regarding uniform integrability of submartingales.

**Lemma 6** Let $X$ be a submartingale with respect to a filtered probability space $(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\in\mathbb{T}},\mathbb{P})$, with time index set $\mathbb{T}\subseteq\mathbb{R}$.

Then, $\{X_{t_n}\colon n\in\mathbb{N}\}$ is uniformly integrable for any decreasing sequence of times $t_n$ bounded below in $\mathbb{T}$.

*Proof:* The idea is to apply a ‘Doob-style decomposition’ of the submartingale into a martingale and an increasing process at the sequence of times. Let $t$ be a lower bound for the sequence and set

$$A_n = \sum_{k=n}^{\infty}\left(\mathbb{E}\left[X_{t_k}\,\middle|\,\mathcal{F}_{t_{k+1}}\right]-X_{t_{k+1}}\right),$$

which, by the submartingale property, is a sum of nonnegative terms. Furthermore, by monotone convergence,

$$\mathbb{E}[A_n] = \sum_{k=n}^{\infty}\left(\mathbb{E}[X_{t_k}]-\mathbb{E}[X_{t_{k+1}}]\right) = \mathbb{E}[X_{t_n}]-\lim_{k\to\infty}\mathbb{E}[X_{t_k}] \le \mathbb{E}[X_{t_n}]-\mathbb{E}[X_t] < \infty,$$

so the $A_n$ are integrable random variables, and $A_n$ is decreasing in $n$. Furthermore, setting $M_n=X_{t_n}-A_n$, the martingale property

$$M_{n+1} = \mathbb{E}\left[M_n\,\middle|\,\mathcal{F}_{t_{n+1}}\right]$$

is satisfied from the definition, so $M_n=\mathbb{E}[M_1\mid\mathcal{F}_{t_n}]$ is a uniformly integrable sequence. Finally, $0\le A_n\le A_1$ are uniformly integrable, and the result follows for $X_{t_n}=M_n+A_n$.

**Proof of Cadlag Versions**

I now give a proof of Theorem 2. This will make use of the ideas from the previous post on upcrossings and downcrossings to show that, when restricted to a countable set of times, $X$ almost surely has left and right limits everywhere. The result will follow from this.

For any $t>0$ and $K>0$, let $S$ be a finite subset of $[0,t]$. The number $U_S[a,b]$ of upcrossings of an interval $[a,b]$ for times in $S$ satisfies the bound

$$(b-a)\,U_S[a,b] \le \int_0^t 1_A\,dX + (X_t-a)^- \qquad (3)$$

for some elementary set $A$. Furthermore, letting $\tau$ be the first time in $S$ at which $X_\tau\ge K$, the stochastic interval $(\tau,t]$ is elementary and,

$$K\,1_{\{\max_{s\in S}X_s\ge K\}} \le 1_{\{\tau\le t\}}X_t - \int_0^t 1_{(\tau,t]}\,dX \le |X_t| - \int_0^t 1_{(\tau,t]}\,dX.$$

Applying the same idea to $-X$ shows that there are elementary predictable sets $A,B$ such that

$$K\,1_{\{\max_{s\in S}|X_s|\ge K\}} \le 2|X_t| - \int_0^t 1_A\,dX + \int_0^t 1_B\,dX. \qquad (4)$$

Now suppose that the first condition of the theorem holds, so that $\left|\mathbb{E}\left[\int_0^t 1_A\,dX\right]\right|$ is bounded by some positive constant $L$ for all elementary sets $A$. Taking expectations of inequalities (3) and (4) gives

$$(b-a)\,\mathbb{E}\left[U_S[a,b]\right] \le L + \mathbb{E}\left[(X_t-a)^-\right],\qquad K\,\mathbb{P}\left(\max_{s\in S}|X_s|\ge K\right) \le 2\,\mathbb{E}[|X_t|]+2L.$$

Letting $S$ increase to a countably infinite subset of $[0,t]$ and applying monotone convergence, these inequalities generalize to all countable subsets $S\subseteq[0,t]$. In particular, the number of upcrossings of $[a,b]$ and the supremum of $|X|$ are almost surely finite on $S$.

Alternatively, suppose that the second condition of the theorem holds. Then, there exists a function $\theta\colon\mathbb{R}_+\to\mathbb{R}_+$ with $\theta(K)\to0$ as $K\to\infty$ and such that $\mathbb{P}\left(\left|\int_0^t 1_A\,dX\right|\ge K\right)\le\theta(K)$ for all $K>0$ and elementary predictable sets $A$. Inequalities (3) and (4) give

$$\mathbb{P}\left(U_S[a,b]\ge K\right) \le \theta\left(\tfrac{(b-a)K}{2}\right) + \mathbb{P}\left((X_t-a)^-\ge\tfrac{(b-a)K}{2}\right),\qquad \mathbb{P}\left(\max_{s\in S}|X_s|\ge K\right) \le 2\,\theta\left(\tfrac{K}{3}\right) + \mathbb{P}\left(|X_t|\ge\tfrac{K}{6}\right).$$

Again, letting $S$ increase to a countably infinite subset of $[0,t]$ and applying monotone convergence extends these inequalities to all countable subsets $S\subseteq[0,t]$. Letting $K$ increase to infinity then shows that the number of upcrossings of $[a,b]$ and the supremum of $|X|$ are almost surely finite on $S$.

In either case, applying countable additivity, the above shows that, for time restricted to a countable subset $S\subseteq\mathbb{R}_+$, the process $X$ is, with probability one, bounded and has finitely many upcrossings of $[a,b]$, for all rationals $a<b$, on bounded time intervals. Replacing $X$ by the identically zero process outside this set of probability one, it can be assumed that this holds everywhere.
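For a path sampled at finitely many times, the upcrossing count $U_S[a,b]$ used above is straightforward to compute. The following sketch (the function name is my own) counts completed upcrossings of $[a,b]$ by a finite sequence of values.

```python
def upcrossings(values, a, b):
    """Count upcrossings of the interval [a, b] by the sequence `values`.

    An upcrossing is completed each time the sequence, having been at or
    below a, subsequently reaches b or above.
    """
    count = 0
    below = False  # have we been at or below a since the last upcrossing?
    for x in values:
        if not below:
            if x <= a:
                below = True
        elif x >= b:
            count += 1
            below = False
    return count

# the path 0, 2, -1, 3, 1, -2, 4 completes three upcrossings of [0, 1]
print(upcrossings([0, 2, -1, 3, 1, -2, 4], 0, 1))  # → 3
```

The almost-sure finiteness of such counts over countable time sets is exactly what rules out oscillation and forces left and right limits to exist.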

By the results of the previous post, this implies that, for time restricted to such a countable set $S$, $X$ has left and right limits everywhere. Assuming that $S$ is also dense in $\mathbb{R}_+$ (e.g., $S\supseteq\mathbb{Q}_+$), this defines a cadlag process

$$\tilde X_t = \lim_{s\downarrow\downarrow t}X_s \qquad (5)$$

for all $t\ge0$, and where $s$ is restricted to $S$ in this limit.

Note that enlarging $S$ by any countable set of times will not change the process $\tilde X$ (up to a set of probability zero), by the arbitrariness of sequences in (5).

It remains to be shown that $\tilde X_t=X_t$, almost surely, outside of some countable set of times. In fact, for any positive integer $n$, the set of times $t\le n$ at which $\mathbb{P}(|\tilde X_t-X_t|>1/n)\ge 1/n$ is finite. If not, there would be an increasing or decreasing sequence $t_k$ of such times converging to a limit and, enlarging $S$ to include this sequence, $\tilde X_{t_k}-X_{t_k}$ would converge to zero, giving the contradiction $\mathbb{P}(|\tilde X_{t_k}-X_{t_k}|>1/n)\to0$ as $k\to\infty$. Consequently, letting $n$ increase to infinity, there are only countably many times at which $\tilde X_t\ne X_t$ with positive probability. Without loss of generality, we may suppose that $S$ includes all such times.

Finally, the process $Y$ is defined by

$$Y_t = \begin{cases} X_t,& t\in S,\\ \tilde X_t,& t\notin S.\end{cases}$$

As $X$ with time restricted to the set $S$ has left and right limits everywhere, it follows that $Y$ also has left and right limits everywhere. Furthermore, $Y$ is cadlag outside of the countable set $S$.

Dear George,

I have a question concerning Theorem 4^{1}: Given a right-continuous filtration, is any left-continuous martingale in fact continuous, since it has a cadlag version? Is this the reason why one does not usually consider left-continuous martingales?

Thanks for your Stochastic Calculus Notes, I enjoy reading them!

Pyramus

^{1} G.L.: this is now Theorem 5, since the previous edit.

Comment by Pyramus — 26 April 11 @ 2:40 PM

Hi.

No, it is not true that a left-continuous martingale has to be continuous. Consider, for example, a compensated Poisson process $X$. Then, $X$ is just a Poisson process with a constant drift subtracted, so its jumps are the same as for a homogeneous Poisson process (which have zero probability of occurring at any given fixed time). So, $X$ is a cadlag martingale and its left limit $Y_t=X_{t-}=\lim_{s\uparrow\uparrow t}X_s$ is a left-continuous process. As $\mathbb{P}(Y_t=X_t)=1$, it follows that $Y$ is a left-continuous martingale, but is not continuous.

However, there is some truth in what you suggest. A left-continuous martingale has to be continuous, with probability one, at any given fixed time. It can only jump at times which are continuously distributed. There are also other issues with using left-continuous martingales. Optional sampling would not hold. That is, the martingale property would not extend to bounded stopping times $T$ at which the process can jump, so $\mathbb{E}[X_T]=\mathbb{E}[X_0]$ would not hold. In fact, the only left-continuous martingales for which this holds are the continuous ones. Also, the definition of local martingales would not make a lot of sense, since it relies on the fact that the space of cadlag martingales is stable under optional stopping which, as with the optional sampling theorem, doesn’t hold for general left-continuous martingales.

Hope that helps!

George.

Comment by George Lowther — 29 April 11 @ 11:21 AM
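As an aside, the compensated Poisson example in the comment above is easy to simulate. A minimal sketch (the rate, horizon and sample count are arbitrary choices of mine): it builds $X_t=N_t-\lambda t$ from exponential interarrival times and checks by Monte Carlo that $\mathbb{E}[X_t]=0$.

```python
import random

def poisson_jump_times(rate, horizon, rng):
    """Jump times of a Poisson process with the given rate on [0, horizon]."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            return times
        times.append(t)

def compensated(t, rate, jump_times):
    """Compensated Poisson process X_t = N_t - rate*t (a cadlag martingale)."""
    n_t = sum(1 for s in jump_times if s <= t)
    return n_t - rate * t

rng = random.Random(0)
rate, horizon = 2.0, 5.0
# E[X_t] = 0 for every t; check by Monte Carlo at t = 5
samples = [compensated(horizon, rate, poisson_jump_times(rate, horizon, rng))
           for _ in range(2000)]
print(abs(sum(samples) / len(samples)))  # should be close to 0
```

The simulated paths jump by $+1$ at the Poisson arrival times and drift downwards at rate $\lambda$ in between, so they are cadlag with left limits differing from the path value exactly at the jumps.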

Update: I added an extra statement to this post, Theorem 4, which provides sufficiently nice versions of martingales in the case where they are not right-continuous in probability.

Comment by George Lowther — 28 November 11 @ 12:34 AM

Dear George,

I am curious whether completeness of the filtration is required for the existence of measurable cadlag modifications. The proof you present does not seem to require such an assumption, but books such as Karatzas and Shreve include that assumption. There is a paper by Hans Follmer on exit times of supermartingales which doesn’t require completeness, but which has additional requirements on the filtration. Hence I am curious about your view on the matter (whether completeness is required).

Thank you again for these excellent posts.

Note 1: I think you need the filtration to be right-continuous in your proof in order to get an adapted modification.

Note 2: It wasn’t fully clear why the sequence of times you construct in the second to last paragraph would have a subsequence converging from the right. Convergence from the right of $t_k$ to some limit point $t$ seems a key requirement in order to get

(i) $\tilde X_{t_k}$ to converge to $\tilde X_t$

and

(ii) $X_{t_k}$ to converge to $\tilde X_t$,

the combination of which leads to the contradiction you mention.

Comment by Tigran — 10 February 12 @ 12:55 PM

No, you don’t need completeness of the filtration for the existence of a cadlag modification, but it is necessary to assume this (or a similar condition) if you want the modification to be adapted. I did assume completeness throughout this post. Rather than completeness, you only need the weaker condition that $\mathcal{F}_t$ contains all sets in $\mathcal{F}$ with zero probability. Right-continuity of the filtration is not required except in special cases, such as Theorem 5 above where it is needed to guarantee that all martingales are right-continuous in probability.

In fact, suppose that $X$ is any adapted process which has a cadlag modification with respect to the completion of the filtration (or, with respect to any enlargement of the filtration). Then we can define

$$\tilde X_t = \limsup_{s\downarrow\downarrow t,\,s\in\mathbb{Q}} X_s.$$

As $X$ does have a cadlag modification, with respect to some enlargement of the filtration, $\tilde X_t$ must be equal to $X_t$ almost surely. So, $\tilde X$ is almost-surely cadlag. Also, as $\tilde X_t=X_t$ almost surely (for each fixed $t$) and $X_t$ is $\mathcal{F}_t$-measurable, it follows that $\tilde X$ will be an adapted and almost-surely cadlag modification of $X$ so long as $\mathcal{F}_t$ contains all sets in $\mathcal{F}$ with zero probability. If we want a modification which is actually cadlag rather than just almost surely cadlag, you can set $\tilde X$ to be identically 0 on the event where it is not cadlag (which can be seen to be $\mathcal{F}$-measurable by expressing it in terms of upcrossings of $X$ on $\mathbb{Q}$).

So, this shows that: for a cadlag modification, no assumptions on the filtration are required. For a cadlag adapted modification, requiring $\mathcal{F}_t$ to contain all sets in $\mathcal{F}$ of zero probability is enough. For an adapted and almost-surely cadlag modification, requiring $\mathcal{F}_t$ to contain all zero probability sets in $\mathcal{F}_{t+}$ is sufficient.

For example, let $\varepsilon_1,\varepsilon_2,\ldots$ be a sequence of IID Bernoulli random variables, each equal to 1 and −1 with probability 1/2, and let $t_1,t_2,\ldots$ be a sequence of times strictly increasing to 1. Define

$$X_t = \sum_{n\colon t_n\le t}\frac{\varepsilon_n}{n}$$

(and set $X_t$ to zero if this sum does not converge). This is right-continuous and has left limits everywhere except, possibly, at $t=1$ in the case where $\sum_n\varepsilon_n/n$ fails to converge. By martingale convergence of uniformly square integrable martingales, this converges almost surely, so $X$ is almost surely cadlag – but not necessarily cadlag. If $\tilde X$ is a modification of $X$ and adapted to the natural filtration of $X$, then $\tilde X_t=X_t$ for each $t<1$ (and not just almost surely, because $\mathcal{F}_t$ is finite with no nonempty zero probability sets). Then, $\tilde X$ fails to have a left limit at $t=1$ whenever $X$ does. In fact, the natural filtration here is already right-continuous, showing that even in the right-continuous case we need to enlarge the filtration by adding zero probability sets to $\mathcal{F}_t$ to get a modification which is cadlag.

For another example, let $Y$ be a Poisson process and $X_t=Y_{t-}$. It can be seen that $Y$ is a cadlag modification of $X$ but is not adapted with respect to the natural filtration of $X$. To get an adapted modification which is almost surely cadlag, you need to enlarge the filtration by adding zero probability sets of $\mathcal{F}_{t+}$ to $\mathcal{F}_t$ (for almost all times $t$).

In answer to your notes,

Note 1: No, right-continuity is not required except where explicitly stated (e.g., Theorem 5). Hopefully my comment above helps explain this – otherwise could you state where you think right continuity is required?

Note 2: Any uncountable set of times will contain a strictly decreasing subsequence. This is not required though, as the argument will also work for increasing sequences. Letting $t$ be the limit of the sequence $t_k$, we are using the fact that $X$ has left and right limits at $t$ along $S$, so both $X_{t_k}$ and $\tilde X_{t_k}$ converge to the left (resp. right) limit if the sequence is increasing (resp. decreasing).

Comment by George Lowther — 12 February 12 @ 11:02 AM
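The Bernoulli-sum example from the comment above can be concretised numerically. A minimal sketch, with my own choices of square-summable coefficients ($\varepsilon_n/n$) and jump times ($t_n=1-2^{-n}$), neither of which is prescribed exactly in the comment:

```python
import random

def bernoulli_martingale(eps, jump_times, t):
    """X_t = sum over {n: t_n <= t} of eps_n / n -- a square-integrable
    martingale with a jump of size eps_n/n at each time t_n (the 1/n
    coefficients are an illustrative choice)."""
    return sum(e / n for n, (e, s) in enumerate(zip(eps, jump_times), start=1)
               if s <= t)

jump_times = [1 - 2.0 ** -n for n in range(1, 21)]  # strictly increasing to 1
rng = random.Random(7)
eps = [rng.choice([-1, 1]) for _ in range(20)]

# the path is constant between jumps and right-continuous on [0, 1)
print(bernoulli_martingale(eps, jump_times, 0.6))  # includes only t_1 = 0.5
print(bernoulli_martingale(eps, jump_times, 0.9))  # includes t_1, t_2, t_3
```

On $[0,1)$ the path is a finite sum and is trivially cadlag; it is only the behaviour of the infinite sum at $t=1$ that can fail on a null set, which is the point of the example.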

Thanks for this detailed answer!

In note 1, I meant you need right continuity for $\tilde X$ to be adapted.

In note 2, I still think you need a decreasing sequence (even if left limits exist) since you define $\tilde X_t$ as the limit from the right and since you are comparing the value of $X_{t_k}$ with $\tilde X_{t_k}$ to get the contradiction. But maybe I am missing a detail and haven’t thought it through carefully enough yet 🙂

Comment by Tigran — 17 February 12 @ 8:55 PM

In note 1: we have $\tilde X_t=X_t$ almost surely. Combining this with completeness of the filtration, the fact that $X$ is adapted implies that $\tilde X$ is adapted. So, right-continuity is not required.

In note 2: I think the bit that is concerning you is where I state that “$\tilde X_{t_k}-X_{t_k}$ would converge to zero”. This is true, even for increasing sequences. Fix a sample path which has left limits on $S$ (which is true for almost all sample paths). Consider choosing an increasing sequence of times with $s_{2k}=t_k$ for even terms and $s_{2k+1}\in S\cap(t_k,t_{k+1})$ for odd terms. The fact that this is an increasing sequence means that $X_{s_j}$ converges to a limit. So, $X_{s_{2k}}-X_{s_{2k+1}}$ converges to 0. However, by choosing $s_{2k+1}$ very close to $t_k$, we can ensure that $|X_{s_{2k+1}}-\tilde X_{t_k}|$ is bounded by $1/k$ and, hence, tends to zero. So, $\tilde X_{t_k}-X_{t_k}$ tends to zero. Maybe that was a bit of a big step to make in the proof…

Comment by George Lowther — 22 February 12 @ 1:40 AM

Dear George,

Thanks a lot for the interesting blog. Concerning Theorems 1 and 2, I was looking for references in the literature, but so far I haven’t found any. Can you help here?

Marcus

Comment by Marcus — 9 August 17 @ 1:33 PM