Now that it has been shown that stochastic integration can be performed with respect to any local martingale, we can move on to the following important result. *Stochastic integration preserves the local martingale property*. At least, this is true under very mild hypotheses. That the martingale property is preserved under integration of bounded elementary processes is straightforward. The generalization to predictable integrands can be achieved using a limiting argument. It is necessary, however, to restrict to locally bounded integrands and, for the sake of generality, I start with local sub- and supermartingales.
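As a quick numerical sanity check of the elementary case, the following sketch (hypothetical code; the random walk, integrand and parameters are illustrative choices, not taken from this post) computes a martingale transform $Y_n=\sum_{k\le n}\xi_k\,\Delta X_k$ of a simple random walk against a bounded predictable $\xi$, and checks that the transform remains driftless:

```python
import numpy as np

rng = np.random.default_rng(0)
N, steps = 50_000, 40

# X: simple symmetric random walk, a discrete-time martingale.
dX = rng.choice([-1.0, 1.0], size=(N, steps))
X = np.cumsum(dX, axis=1)

# xi: bounded predictable integrand -- at step k it only uses
# information available strictly before step k (here, sign of X_{k-1}).
X_prev = np.concatenate([np.zeros((N, 1)), X[:, :-1]], axis=1)
xi = np.sign(X_prev) + (X_prev == 0)  # values in {-1, +1}

# Martingale transform (elementary stochastic integral):
# Y_n = sum_{k <= n} xi_k * (X_k - X_{k-1}).
Y = np.cumsum(xi * dX, axis=1)

# The transform should again be a martingale, so E[Y_n] = 0 for all n.
print(np.abs(Y.mean(axis=0)).max())
```

Since $\xi_k$ is determined before the increment $\Delta X_k$ is revealed, each term has zero conditional mean, and the empirical drift of $Y$ stays at Monte Carlo noise level.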

Theorem 1 *Let X be a local submartingale (resp., local supermartingale) and $\xi$ be a nonnegative and locally bounded predictable process. Then, $Y=\int\xi\,dX$ is a local submartingale (resp., local supermartingale).*

*Proof:* We only need to consider the case where *X* is a local submartingale, as the result will also follow for supermartingales by applying it to *-X*. By localization, we may suppose that $\xi$ is uniformly bounded and that *X* is a proper submartingale. So, $0\le\xi\le K$ for some constant *K*. Then, as previously shown, there exists a sequence of elementary predictable processes $\xi^n$ such that $\int\xi^n\,dX$ converges to $\int\xi\,dX$ in the semimartingale topology and, hence, converges ucp. We may replace $\xi^n$ by $(\xi^n\vee0)\wedge K$ if necessary so that, being nonnegative elementary integrals of a submartingale, $Y^n=\int\xi^n\,dX$ will be submartingales. Also, $\Delta Y=\xi\,\Delta X$. Recall that a cadlag adapted process *X* is locally integrable if and only if its jump process $\Delta X$ is locally integrable, and all local submartingales are locally integrable. So,

$$|\Delta Y|=|\xi\,\Delta X|\le K|\Delta X|$$

is locally integrable. Then, by ucp convergence for local submartingales, *Y* will satisfy the local submartingale property. ⬜

For local martingales, which are both local submartingales and local supermartingales, applying this result to the nonnegative processes $\xi^+=\max(\xi,0)$ and $\xi^-=\max(-\xi,0)$ gives,

Theorem 2 *Let X be a local martingale and $\xi$ be a locally bounded predictable process. Then, $\int\xi\,dX$ is a local martingale.*

This result can immediately be extended to the class of local $L^p$-integrable martingales, denoted by $\mathcal{M}^p_{\rm loc}$.

Corollary 3 *Let $X\in\mathcal{M}^p_{\rm loc}$ for some $p\ge1$ and $\xi$ be a locally bounded predictable process. Then, $\int\xi\,dX\in\mathcal{M}^p_{\rm loc}$.*

*Proof:* The previous theorem shows that $Y=\int\xi\,dX$ is a local martingale. By localization, we may suppose that $\xi$ is uniformly bounded, so $|\xi|\le K$ for some constant *K*. As $\Delta X$ is locally $L^p$-integrable,

$$|\Delta Y|=|\xi\,\Delta X|\le K|\Delta X|$$

will also be locally $L^p$-integrable, as required. ⬜

Moving on, it seems natural to ask if Theorem 2 also applies to proper martingales. That is, if *X* is a cadlag martingale and $\xi$ is a uniformly bounded predictable process, then is the integral $\int\xi\,dX$ necessarily a martingale? Unfortunately, this is not true. Theorem 2 shows that the integral will be a local martingale, but there are examples where it is not a proper martingale. However, by placing some restriction on *X*, it is possible to get a positive result. In particular, the following result says that $\int\xi\,dX$ will be a martingale if *X* is also square integrable. In fact, the result generalizes to $L^p$-integrable martingales for all $p>1$, although the proof of that more general statement will have to wait until after we have introduced the Burkholder-Davis-Gundy inequalities.

Lemma 4 *Let X be a cadlag square integrable martingale and $\xi$ be a bounded predictable process. Then, $Y=\int\xi\,dX$ is a square integrable martingale.*

*Proof:* Suppose that $|\xi|\le K$ for some constant *K* and set $Y=\int\xi\,dX$. Then, choose elementary predictable $|\xi^n|\le K$ such that $Y^n=\int\xi^n\,dX\to Y$ in the semimartingale topology and, hence, $Y^n_t\to Y_t$ in probability for each time *t*. Then, $Y^n$ are martingales and, by the inequality for elementary integrals of square integrable martingales given in the previous post,

$$E\left[(Y^n_t)^2\right]\le K^2E\left[X_t^2\right].$$

So, $\{Y^n_t\}_{n}$ is $L^2$-bounded and, hence, is uniformly integrable. As convergence in probability of a uniformly integrable sequence implies $L^1$-convergence, $E[|Y^n_t-Y_t|]\to0$ as *n* goes to infinity. Therefore, *Y* is a martingale. Finally, passing to a subsequence so that $Y^n_t\to Y_t$ almost surely, Fatou's lemma can be applied,

$$E\left[Y_t^2\right]\le\liminf_{n\to\infty}E\left[(Y^n_t)^2\right]\le K^2E\left[X_t^2\right]<\infty.$$

So *Y* is square integrable as required. ⬜
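The discrete-time counterpart of the $L^2$ bound used above can be checked directly: for a martingale transform with $|\xi|\le K$, orthogonality of martingale increments gives $E[Y_n^2]=\sum_kE[\xi_k^2(\Delta X_k)^2]\le K^2E[X_n^2]$. Here is a numerical sanity check (a hypothetical sketch, not code from the post; the integrand $K\cos(X_{k-1})$ is just an arbitrary bounded function of the past):

```python
import numpy as np

rng = np.random.default_rng(1)
N, steps, K = 50_000, 40, 2.0

# X: simple symmetric random walk (martingale), so E[X_n^2] = n.
dX = rng.choice([-1.0, 1.0], size=(N, steps))
X = np.cumsum(dX, axis=1)

# Bounded predictable integrand: |xi| <= K, depending on the past only.
X_prev = np.concatenate([np.zeros((N, 1)), X[:, :-1]], axis=1)
xi = K * np.cos(X_prev)

# Martingale transform Y_n = sum_{k <= n} xi_k * dX_k.
Y = np.cumsum(xi * dX, axis=1)

# Orthogonality of martingale increments:
#   E[Y_n^2] = sum_k E[xi_k^2 (dX_k)^2] <= K^2 E[X_n^2].
lhs = (Y[:, -1] ** 2).mean()
rhs = K**2 * (X[:, -1] ** 2).mean()
print(lhs, rhs)
```

The empirical second moment of the transform stays below $K^2E[X_n^2]$, mirroring the bound applied to the elementary integrals $Y^n$ in the proof.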

Finally, let us consider how Theorem 2 can be extended to arbitrary *X*-integrable integrands. In order for $Y=\int\xi\,dX$ to be a local martingale it must, at the very least, be locally integrable. In fact, it only needs to be shown that either the positive part or the negative part of *Y* is locally integrable, as the following result shows.

Recall that we say that a process *Y* is locally integrable if there is a sequence of stopping times $\tau_n\uparrow\infty$ such that the stopped processes $1_{\{\tau_n>0\}}(Y^*)^{\tau_n}$ are all integrable, where $Y^*_t=\sup_{s\le t}|Y_s|$ is the maximum process of *Y*. The standard definition is often taken to apply only to nonnegative increasing processes, for which $Y^*=Y$. The more general definition is used here, as it seems to give slightly cleaner statements and proofs.

Theorem 5 *Let $Y=\int\xi\,dX$ for a local martingale X and X-integrable process $\xi$. Then, the following are equivalent.*

1. *Y* is a local martingale.
2. *Y* is locally integrable.
3. $Y^+$ is locally integrable.
4. $Y^-$ is locally integrable.

*Proof:* Property 1 implies 2, because all local martingales are locally integrable. Then, 3 and 4 follow directly from 2. It only remains to show that 3 implies 1 because, applying the same result to *-Y*, 4 would then also imply 1, showing that all the conditions are equivalent.

So, suppose that $Y^+$ is locally integrable. Consider the bounded predictable processes $\xi^n=\xi1_{\{|\xi|\le n\}}$. These satisfy $|\xi^n|\le|\xi|$ and have the same sign as $\xi$ at all times. Therefore, setting $Y^n=\int\xi^n\,dX$ gives $\Delta Y^n=\xi^n\,\Delta X$ and $\Delta Y=\xi\,\Delta X$. So, $\Delta Y^n$ and $\Delta Y$ have the same sign, with $|\Delta Y^n|\le|\Delta Y|$. In particular, $(\Delta Y^n)^+\le(\Delta Y)^+$. Furthermore, by Theorem 2, $Y^n$ are local martingales and, by dominated convergence, tend ucp to *Y*. Passing to a subsequence, we may suppose that $Y^n$ converge uniformly to *Y* on bounded intervals, with probability one.

Set

$$M_t=\sup_n\,(Y^n_t)^+,$$

which is a cadlag adapted process. Being left-continuous, $Y_-$ and $M_-$ are locally bounded and, hence, locally integrable. So $(\Delta Y)^+\le Y^++(Y_-)^-$ is also locally integrable. Then,

$$M\le M_-+\sup_n(\Delta Y^n)^+\le M_-+(\Delta Y)^+$$

is locally integrable, and so is *M*. By localizing, we may suppose that $M_t$ has finite expectation for each *t*.

As convex functions of martingales are submartingales, $Y^n\vee a$ are local submartingales, for any constant *a*. Furthermore, as $|Y^n\vee a|$ is bounded on each interval $[0,t]$ by the integrable random variable $M_t+|a|$, they are proper submartingales. Then, by the dominated convergence theorem for convergence of random variables, $E[|Y^n_t\vee a-Y_t\vee a|]\to0$ as *n* goes to infinity. It follows that $Y\vee a$ is a submartingale. For any $s\le t$, monotone convergence gives

$$E[Y_t\mid\mathcal{F}_s]=\lim_{a\to-\infty}E[Y_t\vee a\mid\mathcal{F}_s]\ge\lim_{a\to-\infty}Y_s\vee a=Y_s.$$

This shows that, after localizing, *Y* becomes a submartingale. Therefore, in general, *Y* is a local submartingale.

As local submartingales are locally integrable, it follows that *Y* and, in particular, $Y^-$ are locally integrable. Then, the same argument as above can be applied to –*Y* to show that it is also a local submartingale. Finally, we have shown that, locally, *Y* and –*Y* are both submartingales, so *Y* is a local martingale. ⬜

#### Example: Loss of the Local Martingale Property

Theorem 5 is as far as we can go in establishing the preservation of the local martingale property. There do exist integrals, with respect to local martingales, which are not locally integrable and, hence, cannot themselves be local martingales. To demonstrate this, consider the following simple example.

Let *T* and *U* be independent random variables such that *T* is uniformly distributed over the interval [0,1] and *U* has the discrete distribution $P(U=1)=P(U=-1)=1/2$. Then define the process

$$X_t=U1_{\{T\le t\}}.$$

With respect to its (completed) natural filtration $\mathcal{F}_t$, *X* is a martingale. The only stopping times defined on this filtration are of the form $\tau\wedge T=t\wedge T$ (almost surely) for a fixed time *t*. To see this, let *t* be the infimum of all times *s* satisfying $P(\tau<s,T>s)>0$. Then, for any *s*, the random variables $X_u$, $u\le s$, are all zero when restricted to the set $\{T>s\}$ and, therefore, any set $A\in\mathcal{F}_s$ satisfies $P(A\mid T>s)=0$ or 1. In particular, applying this to $A=\{\tau<s\}$ for $s>t$ shows that $\{\tau\ge s\}\cap\{T>s\}$ has zero probability. This holds for all $s>t$, giving $\tau\le t$ on the set $\{T>t\}$. By construction, $\tau\ge t\wedge T$ almost surely, giving $\tau\wedge T=t\wedge T$.

Now consider the integral,

$$Y_t=\int_0^t s^{-1}\,dX_s=T^{-1}U1_{\{T\le t\}}.$$

If $\tau$ is any stopping time with a positive probability of being non-zero, then $\tau\wedge T=t\wedge T$ for some fixed positive time *t*. So,

$$E\left[1_{\{\tau>0\}}Y^*_\tau\right]\ge E\left[T^{-1}1_{\{T\le t\}}\right]=\int_0^{t\wedge1}s^{-1}\,ds=\infty.$$
Therefore *Y* is not locally integrable, and cannot be a local martingale.
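Taking the example as reconstructed above (*T* uniform on [0,1], $U=\pm1$, integrand $\xi_s=1/s$; these specific choices are an assumption where the original formulas were lost), the failure of integrability is visible numerically: $X_1=U$ has mean zero, while the truncated means $E[\min(|Y_1|,K)]=E[\min(1/T,K)]=1+\log K$ grow without bound as the cutoff *K* increases:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 1_000_000

# Reconstructed example: T uniform on [0,1] and U = +/-1, independent.
T = rng.uniform(size=N)
U = rng.choice([-1.0, 1.0], size=N)

X1 = U      # X_1 = U * 1{T <= 1} = U, so E[X_1] = 0 (martingale)
Y1 = U / T  # Y_1 = (1/T) * U * 1{T <= 1}, with E|Y_1| = E[1/T] = infinity

print("E[X_1] ~", X1.mean())
# Truncated means E[min(|Y_1|, K)] = 1 + log(K) diverge as K -> infinity.
for K in (10.0, 100.0, 1000.0):
    emp = np.minimum(np.abs(Y1), K).mean()
    print(K, emp, 1 + np.log(K))
```

The empirical truncated means track $1+\log K$ closely, so no truncation level gives a finite limit: $|Y_1|$ has infinite mean even though the integrator *X* is a perfectly good martingale.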

Regarding “locally bounded predictable process”: does this just mean that the predictable process is finite?

Or in other words, what is an example for a predictable process that is not locally bounded?

Thanks a lot. Not just for a prospective answer but for the blog in general. It’s the best textbook on stochastic calculus that I’ve ever seen.

Comment by nutshell — 7 April 11 @ 7:06 PM

Hi. For a process *X* to be locally bounded means that there is a sequence of stopping times $\tau_n$ increasing to infinity such that the stopped processes $1_{\{\tau_n>0\}}X^{\tau_n}$ are uniformly bounded (as defined here). Equivalently, the random variables $1_{\{\tau_n>0\}}X^*_{\tau_n}$ are uniformly bounded.

This certainly implies that *X* is finite. In fact, it implies that $X^*_t$ is almost surely finite for each time *t*. The converse does not hold in general, but for a right-continuous filtration and predictable process *X*, the converse does hold (it’s not easy to prove this without using some advanced results though).

An example of a predictable but not locally bounded process is $X_t=1/t$ for $t>0$ (and $X_0=0$). An example of a predictable process for which $X^*_t$ is finite but which is not locally bounded can be constructed as follows. Let *U* be an unbounded random variable (e.g., standard normal). Set $X_t=U1_{\{t>0\}}$. With respect to its natural filtration, this is not locally bounded. As $\mathcal{F}_0$ is trivial, any stopping time $\tau$ with positive probability of being positive is almost-surely positive. Then, the supremum of $1_{\{\tau>0\}}X^{\tau}$ is |*U*|, which is unbounded. Note that if we passed to the right-continuous filtration $\mathcal{F}_{t+}$ then *X* will be locally bounded. As |*U*| is $\mathcal{F}_{0+}$-measurable, you can choose $\tau_n$ to be equal to zero whenever |*U*| is greater than *n* and infinity otherwise.

Examples of non-predictable processes (but adapted to a right-continuous filtration) which are not locally bounded but for which $X^*_t$ is finite are given by Lévy processes with unbounded jump size.

Comment by George Lowther — 8 April 11 @ 12:58 AM