# Almost Sure

## 18 June 12

### Rao’s Quasimartingale Decomposition

In this post I’ll give a proof of Rao’s decomposition for quasimartingales. That is, every quasimartingale decomposes as the sum of a submartingale and a supermartingale. Equivalently, every quasimartingale is a difference of two submartingales, or alternatively, of two supermartingales. This was originally proven by Rao (Quasi-martingales, 1969), and is an important result in the general theory of continuous-time stochastic processes.

As always, we work with respect to a filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}$. It is not required that the filtration satisfies either of the usual conditions — the filtration need not be complete or right-continuous. The methods used in this post are elementary, requiring only basic measure theory along with the definitions and first properties of martingales, submartingales and supermartingales. Other than referring to the definitions of quasimartingales and mean variation given in the previous post, there is no dependency on any of the general theory of semimartingales, nor on stochastic integration other than for elementary integrands.

Recall that, for an adapted integrable process X, the mean variation on an interval ${[0,t]}$ is

$\displaystyle {\rm Var}_t(X)=\sup{\mathbb E}\left[\int_0^t\xi\,dX\right],$

where the supremum is taken over all elementary processes ${\xi}$ with ${\vert\xi\vert\le1}$. Then, X is a quasimartingale if and only if ${{\rm Var}_t(X)}$ is finite for all positive reals t. It was shown that all supermartingales are quasimartingales with mean variation given by

 $\displaystyle {\rm Var}_t(X)={\mathbb E}\left[X_0-X_t\right].$ (1)
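As a numerical sanity check (my own illustration, not part of the original post), the following Python sketch verifies (1) for the supermartingale ${X_t=-W_t^2}$, where W is a standard Brownian motion. Here ${X_0=0}$ and ${{\mathbb E}[X_0-X_t]={\mathbb E}[W_t^2]=t}$, while the conditional increments are ${{\mathbb E}[X_{t_k}-X_{t_{k-1}}\vert\mathcal{F}_{t_{k-1}}]=-(t_k-t_{k-1})}$, so every partition sum in the definition of the mean variation equals t exactly; only the expectation on the right of (1) is estimated by Monte Carlo.

```python
import numpy as np

# Monte Carlo check of Var_t(X) = E[X_0 - X_t] for the supermartingale
# X_t = -W_t^2 (W a standard Brownian motion).  Here X_0 = 0 and
# E[X_0 - X_t] = E[W_t^2] = t, while the conditional increments satisfy
# E[X_{t_k} - X_{t_{k-1}} | F_{t_{k-1}}] = -(t_k - t_{k-1}), so every
# partition sum in the definition of Var_t(X) equals t exactly.

rng = np.random.default_rng(0)
t, n_paths = 1.0, 200_000

W_t = rng.normal(0.0, np.sqrt(t), n_paths)   # W_t ~ N(0, t)
est = np.mean(W_t**2)                        # estimates E[X_0 - X_t] = t

print(f"E[X_0 - X_t] estimate: {est:.3f}  (exact value: {t})")
```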

Rao’s decomposition can be stated in several different ways, depending on the conditions required of the quasimartingale X. As the definition of quasimartingales differs between texts, different versions of Rao’s theorem exist, although they are all equivalent up to martingale terms. In this post, I’ll give three different statements with increasingly strong conditions on X. The first statement applies to all quasimartingales as defined in these notes. Theorem 1 can be compared to the Jordan decomposition, which says that any function ${f\colon{\mathbb R}_+\rightarrow{\mathbb R}}$ with finite variation on bounded intervals can be written as the difference of two increasing functions or, equivalently, of two decreasing functions. Replacing finite variation functions by quasimartingales and decreasing functions by supermartingales gives the following.

Theorem 1 (Rao) A process X is a quasimartingale if and only if it decomposes as

 $\displaystyle X=Y-Z$ (2)

for supermartingales Y and Z. Furthermore,

• this decomposition can be done in a minimal sense, so that if ${X=Y^\prime-Z^\prime}$ is any other such decomposition then ${Y^\prime-Y=Z^\prime-Z}$ is a supermartingale.
• the inequality
 $\displaystyle {\rm Var}_t(X)\le{\mathbb E}[Y_0-Y_t]+{\mathbb E}[Z_0-Z_t],$ (3)

holds, with equality for all ${t\ge0}$ if and only if the decomposition is minimal.

• the minimal decomposition is unique up to a martingale. That is, if ${X=Y-Z=Y^\prime-Z^\prime}$ are two such minimal decompositions, then ${Y^\prime-Y=Z^\prime-Z}$ is a martingale.
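For concreteness, here is a simple worked example of a minimal decomposition (my own illustration, not taken from the post): Brownian motion with positive drift, which is a quasimartingale with mean variation ${{\rm Var}_t(X)=\mu t}$.

```latex
% X_t = W_t + \mu t, for a standard Brownian motion W and \mu > 0,
% decomposes as X = Y - Z with
X_t = W_t + \mu t = Y_t - Z_t,
\qquad Y_t = W_t,
\qquad Z_t = -\mu t.
% Y is a martingale, hence a supermartingale, and Z is deterministic
% and decreasing, hence also a supermartingale.  Checking (3),
{\mathbb E}[Y_0 - Y_t] + {\mathbb E}[Z_0 - Z_t]
  = 0 + \mu t
  = {\rm Var}_t(X),
% so equality holds for all t and the decomposition is minimal.
```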

## 12 April 12

### Quasimartingales

Quasimartingales are a natural generalization of martingales, submartingales and supermartingales. They were first introduced by Fisk in order to extend the Doob-Meyer decomposition to a larger class of processes, showing that continuous quasimartingales can be decomposed into martingale and finite variation terms (Quasi-martingales, 1965). This was later extended to right-continuous processes by Orey (F-Processes, 1967). The way in which quasimartingales relate to sub- and super-martingales is very similar to how functions of finite variation relate to increasing and decreasing functions. In particular, by the Jordan decomposition, any finite variation function on an interval decomposes as the sum of an increasing and a decreasing function. Similarly, a stochastic process is a quasimartingale if and only if it can be written as the sum of a submartingale and a supermartingale. This important result was first shown by Rao (Quasi-martingales, 1969), and means that much of the theory of submartingales can be extended without much work to also cover quasimartingales.

Often, given a process, it is important to show that it is a semimartingale so that the techniques of stochastic calculus can be applied. If there is no obvious decomposition into local martingale and finite variation terms, then one way of doing this is to show that it is a quasimartingale, since all right-continuous quasimartingales are semimartingales. This result is also important in the general theory of semimartingales; for example, many proofs of the Bichteler-Dellacherie theorem involve quasimartingales.

In this post, I will mainly be concerned with the definition and very basic properties of quasimartingales, and look at the more advanced theory in the following post. We work with respect to a filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}$. It is not necessary to assume that either of the usual conditions, of right-continuity or completeness, hold. First, the mean variation of a process is defined as follows.

Definition 1 The mean variation of an integrable stochastic process X on an interval ${[0,t]}$ is

 $\displaystyle {\rm Var}_t(X)=\sup{\mathbb E}\left[\sum_{k=1}^n\left\vert{\mathbb E}\left[X_{t_k}-X_{t_{k-1}}\;\vert\mathcal{F}_{t_{k-1}}\right]\right\vert\right].$ (1)

Here, the supremum is taken over all finite sequences of times,

$\displaystyle 0=t_0\le t_1\le\cdots\le t_n=t.$

A quasimartingale, then, is a process with finite mean variation on each bounded interval.

Definition 2 A quasimartingale, X, is an integrable adapted process such that ${{\rm Var}_t(X)}$ is finite for each time ${t\in{\mathbb R}_+}$.
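As a numerical illustration (not part of the original post), the following Python sketch evaluates the partition sum in (1) for Brownian motion with drift, ${X_t=W_t+\mu t}$. Since the increments of X are independent of the past, the conditional expectations in (1) equal ${\mu(t_k-t_{k-1})}$, so every partition sum equals ${\vert\mu\vert t}$; the code estimates the conditional means empirically over a uniform partition, which here already attains the supremum.

```python
import numpy as np

# Numerical illustration of Definition 1 for X_t = W_t + mu*t, Brownian
# motion with drift.  Its increments are independent of the past, so
# E[X_{t_k} - X_{t_{k-1}} | F_{t_{k-1}}] = mu*(t_k - t_{k-1}) and every
# partition sum in (1) equals |mu|*t; hence Var_t(X) = |mu|*t < infinity
# and X is a quasimartingale.

rng = np.random.default_rng(1)
mu, t, n_steps, n_paths = -0.5, 2.0, 16, 100_000

dt = t / n_steps
# increments of X over a uniform partition, for many sample paths
dX = mu * dt + rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps))
# empirical conditional means (valid here since increments are
# independent of the past), then the partition sum from Definition 1
partition_sum = np.sum(np.abs(dX.mean(axis=0)))

print(f"partition sum: {partition_sum:.3f},  |mu|*t = {abs(mu) * t}")
```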

## 30 December 11

### The Doob-Meyer Decomposition

The Doob-Meyer decomposition was a very important result, historically, in the development of stochastic calculus. This theorem states that every cadlag submartingale uniquely decomposes as the sum of a local martingale and an increasing predictable process. For one thing, if X is a square-integrable martingale then Jensen’s inequality implies that ${X^2}$ is a submartingale, so the Doob-Meyer decomposition guarantees the existence of an increasing predictable process ${\langle X\rangle}$ such that ${X^2-\langle X\rangle}$ is a local martingale. The term ${\langle X\rangle}$ is called the predictable quadratic variation of X and, by using a version of the Ito isometry, can be used to define stochastic integration with respect to square-integrable martingales. For another, semimartingales were historically defined as sums of local martingales and finite variation processes, so the Doob-Meyer decomposition ensures that all local submartingales are also semimartingales. Going further, the Doob-Meyer decomposition is used as an important ingredient in many proofs of the Bichteler-Dellacherie theorem.

The approach taken in these notes is somewhat different from the historical development, however. We introduced stochastic integration and semimartingales early on, without requiring much prior knowledge of the general theory of stochastic processes. We have also developed the theory of semimartingales, such as proving the Bichteler-Dellacherie theorem, using a stochastic integration based method. So, the Doob-Meyer decomposition does not play such a pivotal role in these notes as in some other approaches to stochastic calculus. In fact, the special semimartingale decomposition already states a form of the Doob-Meyer decomposition in a more general setting. So, the main part of the proof given in this post will be to show that all local submartingales are semimartingales, allowing the decomposition for special semimartingales to be applied.

The Doob-Meyer decomposition is especially easy to understand in discrete time, where it reduces to the much simpler Doob decomposition. If ${\{X_n\}_{n=0,1,2,\ldots}}$ is an integrable discrete-time process adapted to a filtration ${\{\mathcal{F}_n\}_{n=0,1,2,\ldots}}$, then the Doob decomposition expresses X as

 $\displaystyle \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle X_n&\displaystyle=M_n+A_n,\smallskip\\ \displaystyle A_n&\displaystyle=\sum_{k=1}^n{\mathbb E}\left[X_k-X_{k-1}\;\vert\mathcal{F}_{k-1}\right]. \end{array}$ (1)

As previously discussed, M is then a martingale and A is an integrable process which is also predictable, in the sense that ${A_n}$ is ${\mathcal{F}_{n-1}}$-measurable for each ${n > 0}$. Furthermore, X is a submartingale if and only if ${{\mathbb E}[X_n-X_{n-1}\vert\mathcal{F}_{n-1}]\ge0}$ for each n or, equivalently, if A is almost surely increasing.
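The Doob decomposition (1) can be computed explicitly in simple cases. As an illustration (my own, not from the post), take ${X_n=S_n^2}$ for a simple symmetric random walk S; then ${{\mathbb E}[X_n-X_{n-1}\vert\mathcal{F}_{n-1}]={\mathbb E}[(S_{n-1}\pm1)^2\vert\mathcal{F}_{n-1}]-S_{n-1}^2=1}$, giving ${A_n=n}$ and the martingale ${M_n=S_n^2-n}$, as the following Python sketch checks numerically.

```python
import numpy as np

# Doob decomposition (1) for the submartingale X_n = S_n^2, with S a
# simple symmetric random walk.  Here E[X_n - X_{n-1} | F_{n-1}] = 1,
# so the compensator is A_n = n, and M_n = S_n^2 - n should be a
# martingale; in particular E[M_n] = 0 for every n.

rng = np.random.default_rng(2)
n_steps, n_paths = 50, 200_000

steps = rng.choice([-1, 1], size=(n_paths, n_steps))
S = np.cumsum(steps, axis=1)                 # random walk paths
X = S**2                                     # the submartingale
A = np.arange(1, n_steps + 1)                # predictable part A_n = n
M = X - A                                    # martingale part

print("E[M_n] at n = 10, 25, 50:",
      [f"{M[:, n - 1].mean():.3f}" for n in (10, 25, 50)])
```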

Moving to continuous time, we work with respect to a complete filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}$ with time index t ranging over the nonnegative real numbers. Then, the continuous-time version of (1) takes A to be a right-continuous and increasing process which is predictable, in the sense that it is measurable with respect to the σ-algebra generated by the class of left-continuous and adapted processes. Often, the Doob-Meyer decomposition is stated under additional assumptions, such as X being of class (D) or satisfying some similar uniform integrability property. To be as general as possible, the statement I give here only requires X to be a local submartingale, and furthermore states how the decomposition is affected by various stronger hypotheses that X may satisfy.

Theorem 1 (Doob-Meyer) Any local submartingale X has a unique decomposition

 $\displaystyle X=M+A,$ (2)

where M is a local martingale and A is a predictable increasing process starting from zero.

Furthermore,

1. if X is a proper submartingale, then A is integrable and satisfies

 $\displaystyle {\mathbb E}[A_\tau]\le{\mathbb E}[X_\tau-X_0]$ (3)

for all uniformly bounded stopping times ${\tau}$.

2. X is of class (DL) if and only if M is a proper martingale and A is integrable, in which case
 $\displaystyle {\mathbb E}[A_\tau]={\mathbb E}[X_\tau-X_0]$ (4)

for all uniformly bounded stopping times ${\tau}$.

3. X is of class (D) if and only if M is a uniformly integrable martingale and ${A_\infty}$ is integrable. Then, ${X_\infty=\lim_{t\rightarrow\infty}X_t}$ and ${M_\infty=\lim_{t\rightarrow\infty}M_t}$ exist almost surely, and (4) holds for all (not necessarily finite) stopping times ${\tau}$.

## 27 December 11

### Compensators of Counting Processes

A counting process, X, is defined to be an adapted stochastic process starting from zero which is piecewise constant and right-continuous with jumps of size 1. That is, letting ${\tau_n}$ be the first time at which ${X_t=n}$, then

$\displaystyle X_t=\sum_{n=1}^\infty 1_{\{\tau_n\le t\}}.$

By the debut theorem, ${\tau_n}$ are stopping times. So, X is an increasing integer valued process counting the arrivals of the stopping times ${\tau_n}$. A basic example of a counting process is the Poisson process, for which ${X_t-X_s}$ has a Poisson distribution independently of ${\mathcal{F}_s}$, for all times ${t > s}$, and for which the gaps ${\tau_n-\tau_{n-1}}$ between the stopping times are independent exponentially distributed random variables. As we will see, although Poisson processes are just one specific example, every quasi-left-continuous counting process can actually be reduced to the case of a Poisson process by a time change. As always, we work with respect to a complete filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}$.

Note that, as a counting process X has jumps bounded by 1, it is locally integrable and, hence, the compensator A of X exists. This is the unique right-continuous predictable and increasing process with ${A_0=0}$ such that ${X-A}$ is a local martingale. For example, if X is a Poisson process of rate ${\lambda}$, then the compensated Poisson process ${X_t-\lambda t}$ is a martingale. So, the compensator of X is the continuous process ${A_t=\lambda t}$. More generally, X is said to be quasi-left-continuous if ${{\mathbb P}(\Delta X_\tau=0)=1}$ for all predictable stopping times ${\tau}$, which is equivalent to the compensator of X being almost surely continuous. Another simple example of a counting process is ${X=1_{[\tau,\infty)}}$ for a stopping time ${\tau > 0}$, in which case the compensator of X is just the same thing as the compensator of ${\tau}$.
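The martingale property of the compensated Poisson process is easy to check numerically. The following Python sketch (an illustration of my own, not from the post) samples ${X_t\sim{\rm Poisson}(\lambda t)}$ directly and confirms that ${{\mathbb E}[X_t-\lambda t]=0}$ at several times.

```python
import numpy as np

# The compensated Poisson process M_t = X_t - lambda*t is a martingale,
# so in particular E[M_t] = 0 for each t.  A quick Monte Carlo check,
# sampling X_t ~ Poisson(lambda*t) directly.

rng = np.random.default_rng(3)
lam, n_paths = 2.0, 500_000

for t in (0.5, 1.0, 3.0):
    X_t = rng.poisson(lam * t, n_paths)
    print(f"t = {t}: E[X_t - lambda*t] ~ {X_t.mean() - lam * t:+.4f}")
```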

As I will show in this post, compensators of quasi-left-continuous counting processes have many parallels with the quadratic variation of continuous local martingales. For example, Lévy’s characterization states that a local martingale X starting from zero is standard Brownian motion if and only if its quadratic variation is ${[X]_t=t}$. Similarly, as we show below, a counting process is a homogeneous Poisson process of rate ${\lambda}$ if and only if its compensator is ${A_t=\lambda t}$. It was also shown previously in these notes that a continuous local martingale X has a finite limit ${X_\infty=\lim_{t\rightarrow\infty}X_t}$ if and only if ${[X]_\infty}$ is finite. Similarly, a counting process X has finite value ${X_\infty}$ at infinity if and only if the same is true of its compensator. Another property of a continuous local martingale X is that it is constant over all intervals on which its quadratic variation is constant. Similarly, a counting process X is constant over any interval on which its compensator is constant. Finally, it is known that every continuous local martingale is simply a continuous time change of standard Brownian motion. In the main result of this post (Theorem 5), we show that a similar statement holds for counting processes. That is, every quasi-left-continuous counting process is a continuous time change of a Poisson process of rate 1.
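The time-change result can be illustrated numerically (my own sketch, not from the post). Take an inhomogeneous Poisson process with rate ${\lambda(t)=2t}$, whose compensator is ${A_t=t^2}$; mapping its arrival times through ${\Lambda(t)=t^2}$ should produce the arrival times of a Poisson process of rate 1, i.e. with independent Exp(1) gaps. The process is simulated by thinning a homogeneous process of rate ${2T}$ on ${[0,T]}$.

```python
import numpy as np

# Time-change sketch: an inhomogeneous Poisson process with rate
# lambda(t) = 2t has compensator A_t = t^2.  Applying Lambda(t) = t^2
# to its arrival times should give the arrival times of a rate-1
# Poisson process, so the transformed gaps should look Exp(1).
# Simulation is by thinning, using the bound lambda(t) <= 2T on [0, T].

rng = np.random.default_rng(4)
T = 10.0
lam_max = 2 * T

# homogeneous rate lam_max arrivals on [0, T]
n = rng.poisson(lam_max * T)
times = np.sort(rng.uniform(0.0, T, n))
# thin: keep each time s with probability lambda(s)/lam_max = s/T
kept = times[rng.uniform(size=n) < times / T]

transformed = kept**2                 # apply Lambda(t) = t^2
gaps = np.diff(np.concatenate(([0.0], transformed)))

print(f"{len(gaps)} gaps, mean {gaps.mean():.3f} (Exp(1) has mean 1)")
```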

## 20 December 11

### Compensators of Stopping Times

The previous post introduced the concept of the compensator of a process, which is known to exist for all locally integrable semimartingales. In this post, I’ll just look at the very special case of compensators of processes consisting of a single jump of unit size.

Definition 1 Let ${\tau}$ be a stopping time. The compensator of ${\tau}$ is defined to be the compensator of ${1_{[\tau,\infty)}}$.

So, the compensator A of ${\tau}$ is the unique predictable FV process such that ${A_0=0}$ and ${1_{[\tau,\infty)}-A}$ is a local martingale. Compensators of stopping times are sufficiently special that we can give an accurate description of how they behave. For example, if ${\tau}$ is predictable, then its compensator is just ${1_{\{\tau > 0\}}1_{[\tau,\infty)}}$. If, on the other hand, ${\tau}$ is totally inaccessible and almost surely finite then, as we will see below, its compensator, A, continuously increases to a value ${A_\infty}$ which has the exponential distribution.
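The exponential distribution of ${A_\infty}$ is easy to see in the simplest example. As an illustration (my own, not from the post), take ${\tau\sim{\rm Exp}(\lambda)}$, the first jump time of a Poisson process with respect to its natural augmented filtration, for which ${\tau}$ is totally inaccessible with compensator ${A_t=\lambda(t\wedge\tau)}$. Then ${A_\infty=\lambda\tau}$, which is Exp(1) whatever the rate ${\lambda}$, as the following Python sketch checks.

```python
import numpy as np

# Illustration of A_infinity having the Exp(1) distribution.  For
# tau ~ Exp(lam), the first jump of a Poisson process in its natural
# filtration, the compensator of tau is A_t = lam * min(t, tau), so
# A_infinity = lam * tau, which is Exp(1) regardless of the rate lam.

rng = np.random.default_rng(5)
lam, n_samples = 3.7, 1_000_000

tau = rng.exponential(1.0 / lam, n_samples)  # tau ~ Exp(lam)
A_inf = lam * tau                            # A_infinity = lam * tau

print(f"mean of A_inf: {A_inf.mean():.3f} (Exp(1) mean: 1)")
print(f"P(A_inf > 1): {(A_inf > 1).mean():.3f} (exact: {np.exp(-1):.3f})")
```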

However, compensators of stopping times are sufficiently general to be able to describe the compensator of any cadlag adapted process X with locally integrable variation. We can break X down into a continuous part plus a sum over its jumps,

 $\displaystyle X_t=X_0+X^c_t+\sum_{n=1}^\infty\Delta X_{\tau_n}1_{[\tau_n,\infty)}.$ (1)

Here, ${\tau_n > 0}$ are disjoint stopping times such that the union ${\bigcup_n[\tau_n]}$ of their graphs contains all the jump times of X. That they are disjoint just means that ${\tau_m\not=\tau_n}$ whenever ${\tau_n < \infty}$, for any ${m\not=n}$. As was shown in an earlier post, not only is such a sequence ${\tau_n}$ of stopping times guaranteed to exist, but each of the times can be chosen to be either predictable or totally inaccessible. As the first term, ${X^c_t}$, on the right hand side of (1) is a continuous FV process, it is by definition equal to its own compensator. So, the compensator of X is equal to ${X^c}$ plus the sum of the compensators of ${\Delta X_{\tau_n}1_{[\tau_n,\infty)}}$. This reduces the computation of compensators of locally integrable FV processes to those of processes consisting of a single jump at either a predictable or a totally inaccessible time.

## 22 November 11

### Compensators

A very common technique when looking at general stochastic processes is to break them down into separate martingale and drift terms. This is easiest to describe in the discrete time situation. So, suppose that ${\{X_n\}_{n=0,1,\ldots}}$ is a stochastic process adapted to the discrete-time filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_n\}_{n=0,1,\ldots},{\mathbb P})}$. If X is integrable, then it is possible to decompose it into the sum of a martingale M and a process A, starting from zero, and such that ${A_n}$ is ${\mathcal{F}_{n-1}}$-measurable for each ${n\ge1}$. That is, A is a predictable process. The martingale condition on M enforces the identity

$\displaystyle A_n-A_{n-1}={\mathbb E}[A_n-A_{n-1}\vert\mathcal{F}_{n-1}]={\mathbb E}[X_n-X_{n-1}\vert\mathcal{F}_{n-1}].$

So, A is uniquely defined by

 $\displaystyle A_n=\sum_{k=1}^n{\mathbb E}\left[X_k-X_{k-1}\vert\mathcal{F}_{k-1}\right],$ (1)

and is referred to as the compensator of X. This is just the predictable term in the Doob decomposition described at the start of the previous post.

In continuous time, where we work with respect to a complete filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}$, the situation is much more complicated. There is no simple explicit formula such as (1) for the compensator of a process. Instead, it is defined as follows.

Definition 1 The compensator of a cadlag adapted process X is a predictable FV process A, with ${A_0=0}$, such that ${X-A}$ is a local martingale.

For an arbitrary process, there is no guarantee that a compensator exists. From the previous post, however, we know exactly when it does. The processes for which a compensator exists are precisely the special semimartingales or, equivalently, the locally integrable semimartingales. Furthermore, if it exists, then the compensator is uniquely defined up to evanescence. Definition 1 is considerably different from equation (1) describing the discrete-time case. However, we will show that, at least for processes with integrable variation, the continuous-time definition does follow from the limit of discrete time compensators calculated along ever finer partitions (see below).

Although we know that compensators exist for all locally integrable semimartingales, the notion is often defined and used specifically for the case of adapted processes with locally integrable variation or, even, just integrable increasing processes. As with all FV processes, these are semimartingales, with stochastic integration for locally bounded integrands coinciding with Lebesgue-Stieltjes integration along the sample paths. As an example, consider a homogeneous Poisson process X with rate ${\lambda}$. The compensated Poisson process ${M_t=X_t-\lambda t}$ is a martingale. So, X has compensator ${\lambda t}$.

We start by describing the jumps of the compensator, which can be done simply in terms of the jumps of the original process. Recall that the set of jump times ${\{t\colon\Delta X_t\not=0\}}$ of a cadlag process is contained in the union of the graphs of a sequence of stopping times, each of which is either predictable or totally inaccessible. We therefore only need to calculate ${\Delta A_\tau}$ separately for the cases where ${\tau}$ is a predictable stopping time and where it is totally inaccessible.

For the remainder of this post, it is assumed that the underlying filtered probability space is complete. Whenever we refer to the compensator of a process X, it will be understood that X is a special semimartingale. Also, the jump ${\Delta X_t}$ of a process is defined to be zero at time ${t=\infty}$.

Lemma 2 Let A be the compensator of a process X. Then, for a stopping time ${\tau}$,

1. ${\Delta A_\tau=0}$ if ${\tau}$ is totally inaccessible.
2. ${\Delta A_\tau={\mathbb E}\left[\Delta X_\tau\vert\mathcal{F}_{\tau-}\right]}$ if ${\tau}$ is predictable.

## 3 October 11

### Special Semimartingales

For stochastic processes in discrete time, the Doob decomposition uniquely decomposes any integrable process into the sum of a martingale and a predictable process. If ${\{X_n\}_{n=0,1,\ldots}}$ is an integrable process adapted to a filtration ${\{\mathcal{F}_n\}_{n=0,1,\ldots}}$ then we write ${X_n=M_n+A_n}$. Here, M is a martingale, so that ${M_{n-1}={\mathbb E}[M_n\vert\mathcal{F}_{n-1}]}$, and A is predictable with ${A_0=0}$. By saying that A is predictable, we mean that ${A_n}$ is ${\mathcal{F}_{n-1}}$ measurable for each ${n\ge1}$. It can be seen that this implies that

$\displaystyle A_n-A_{n-1}={\mathbb E}[A_n-A_{n-1}\vert\mathcal{F}_{n-1}]={\mathbb E}[X_n-X_{n-1}\vert\mathcal{F}_{n-1}].$

Then it is possible to write A and M as

 $\displaystyle \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle A_n&\displaystyle=\sum_{k=1}^n{\mathbb E}[X_k-X_{k-1}\vert\mathcal{F}_{k-1}],\smallskip\\ \displaystyle M_n&\displaystyle=X_n-A_n. \end{array}$ (1)

So, the Doob decomposition is unique and, conversely, the processes A and M constructed according to equation (1) can be seen to be respectively, a predictable process starting from zero and a martingale. For many purposes, this allows us to reduce problems concerning processes in discrete time to simpler statements about martingales and separately about predictable processes. In the case where X is a submartingale then things reduce further as, in this case, A will be an increasing process.

The situation is considerably more complicated when looking at processes in continuous time. The extension of the Doob decomposition to continuous time processes, known as the Doob-Meyer decomposition, was an important result historically in the development of stochastic calculus. First, we would usually restrict attention to sufficiently nice modifications of the processes and, in particular, suppose that X is cadlag. When attempting an analogous decomposition to the one above, it is not immediately clear what should be meant by the predictable component. The continuous time predictable processes are defined to be the set of all processes which are measurable with respect to the predictable sigma algebra, which is the sigma algebra generated by the space of processes which are adapted and continuous (or, equivalently, left-continuous). In particular, all continuous and adapted processes are predictable but, due to the existence of continuous martingales such as Brownian motion, this means that decompositions as sums of martingales and predictable processes are not unique. It is therefore necessary to impose further conditions on the term A in the decomposition. It turns out that we obtain unique decompositions if, in addition to being predictable, A is required to be cadlag with locally finite variation (an FV process). The processes which can be decomposed into a local martingale and a predictable FV process are known as special semimartingales. This is precisely the space of locally integrable semimartingales. As usual, we work with respect to a complete filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}$ and two stochastic processes are considered to be the same if they are equivalent up to evanescence.

Theorem 1 For a process X, the following are equivalent.

• X is a locally integrable semimartingale.
• X decomposes as
 $\displaystyle X=M+A$ (2)

for a local martingale M and predictable FV process A.

Furthermore, choosing ${A_0=0}$, decomposition (2) is unique.

Theorem 1 is a general version of the Doob-Meyer decomposition. However, the name ‘Doob-Meyer decomposition’ is often used to refer specifically to the important special case where X is a submartingale. Historically, the theorem was first stated and proved for that case, and I will look at the decomposition for submartingales in more detail in a later post.

## 18 July 11

### Predictable FV Processes

By definition, an FV process is a cadlag adapted stochastic process which almost surely has finite variation over finite time intervals. These are always semimartingales, because the stochastic integral for bounded integrands can be constructed by taking the Lebesgue-Stieltjes integral along sample paths. Also, from the previous post on continuous semimartingales, we know that the class of continuous FV processes is particularly well behaved under stochastic integration. For one thing, given a continuous FV process X and predictable ${\xi}$, then ${\xi}$ is X-integrable in the stochastic sense if and only if it is almost surely Lebesgue-Stieltjes integrable along the sample paths of X. In that case the stochastic and Lebesgue-Stieltjes integrals coincide. Furthermore, the stochastic integral preserves the class of continuous FV processes, so that ${\int\xi\,dX}$ is again a continuous FV process. It was also shown that all continuous semimartingales decompose in a unique way as the sum of a local martingale and a continuous FV process, and that the stochastic integral preserves this decomposition.

Moving on to studying non-continuous semimartingales, it would be useful to extend the results just mentioned beyond the class of continuous FV processes. The first thought might be to simply drop the continuity requirement and look at all FV processes. After all, we know that every FV process is a semimartingale and, by the Bichteler-Dellacherie theorem, that every semimartingale decomposes as the sum of a local martingale and an FV process. However, this does not work out very well. The existence of local martingales with finite variation means that the decomposition given by the Bichteler-Dellacherie theorem is not unique, and need not commute with stochastic integration for integrands which are not locally bounded. Also, it is possible for the stochastic integral of a predictable ${\xi}$ with respect to an FV process X to be well-defined even if ${\xi}$ is not Lebesgue-Stieltjes integrable with respect to X along its sample paths. In this case, the integral ${\int\xi\,dX}$ is not itself an FV process. See this post for examples where this happens.

Instead, when we do not want to restrict ourselves to continuous processes, it turns out that the class of predictable FV processes is the correct generalisation to use. By definition, a process is predictable if it is measurable with respect to the sigma-algebra generated by the adapted and left-continuous processes so, in particular, continuous FV processes are predictable. We can show that all predictable FV local martingales are constant (Lemma 2 below), which will imply that decompositions into the sum of local martingales and predictable FV processes are unique (up to constant processes). I do not look at general semimartingales in this post, so will not prove the existence of such decompositions, although they do follow quickly from the results stated here. We can also show that predictable FV processes are very well behaved with respect to stochastic integration. A predictable process ${\xi}$ is integrable with respect to a predictable FV process X in the stochastic sense if and only if it is Lebesgue-Stieltjes integrable along the sample paths, in which case stochastic and Lebesgue-Stieltjes integrals agree. Also, ${\int\xi\,dX}$ will again be a predictable FV process. See Theorem 6 below.

In the previous post on continuous semimartingales, it was also shown that the continuous FV processes can be characterised in terms of their quadratic variations and covariations. They are precisely the semimartingales with zero quadratic variation. Alternatively, they are continuous semimartingales which have zero quadratic covariation with all local martingales. We start by extending this characterisation to the class of predictable FV processes. As always, we work with respect to a complete filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}$ and two stochastic processes are considered to be equal if they are equivalent up to evanescence. Recall that, in these notes, the notation ${[X]^c_t=[X]_t-\sum_{s\le t}(\Delta X_s)^2}$ is used to denote the continuous part of the quadratic variation of a semimartingale X.

Theorem 1 For a process X, the following are equivalent.

1. X is a predictable FV process.
2. X is a predictable semimartingale with ${[X]^c=0}$.
3. X is a semimartingale such that ${[X,M]}$ is a local martingale for all local martingales M.
4. X is a semimartingale such that ${[X,M]}$ is a local martingale for all uniformly bounded cadlag martingales M.

## 26 May 11

### Predictable Stopping Times

Although this post is under the heading of ‘the general theory of semimartingales’ it is not, strictly speaking, about semimartingales at all. Instead, I will be concerned with a characterization of predictable stopping times. The reason for including this now is twofold. First, the results are too advanced to have been proven in the earlier post on predictable stopping times, and reasonably efficient self-contained proofs can only be given now that we have already built up a certain amount of stochastic calculus theory. Secondly, the results stated here are indispensable to the further study of semimartingales. In particular, standard semimartingale decompositions require some knowledge of predictable processes and predictable stopping times.

Recall that a stopping time ${\tau}$ is said to be predictable if there exists a sequence of stopping times ${\tau_n\le\tau}$ increasing to ${\tau}$ and such that ${\tau_n < \tau}$ whenever ${\tau > 0}$. Also, the predictable sigma-algebra ${\mathcal{P}}$ is defined as the sigma-algebra generated by the left-continuous and adapted processes. Stated like this, these two concepts can appear quite different. However, as was previously shown, stochastic intervals of the form ${[\tau,\infty)}$ for predictable times ${\tau}$ are all in ${\mathcal{P}}$ and, in fact, generate the predictable sigma-algebra.

The main result (Theorem 1) of this post is to show that a converse statement holds, so that ${[\tau,\infty)}$ is in ${\mathcal{P}}$ if and only if the stopping time ${\tau}$ is predictable. This rather simple sounding result has many far-reaching consequences. We can use it to show that all cadlag predictable processes are locally bounded, that local martingales are predictable if and only if they are continuous, and to give a characterization of cadlag predictable processes in terms of their jumps. Some very strong statements about stopping times also follow without much difficulty for certain special stochastic processes. For example, if the underlying filtration is generated by a Brownian motion then every stopping time is predictable. Actually, this is true whenever the filtration is generated by a continuous Feller process. It is also possible to give a surprisingly simple characterization of stopping times for filtrations generated by arbitrary non-continuous Feller processes. Precisely, a stopping time ${\tau}$ is predictable if the underlying Feller process is almost surely continuous at time ${\tau}$, and is totally inaccessible if the process is almost surely discontinuous at ${\tau}$.

As usual, we work with respect to a complete filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\in{\mathbb R}_+},{\mathbb P})}$. I now give a statement and proof of the main result of this post. Note that the equivalence of the four conditions below means that any of them can be used as alternative definitions of predictable stopping times. Often, the first condition below is used instead. Stopping times satisfying the definition used in these notes are sometimes called announceable, with the sequence ${\tau_n\uparrow\tau}$ said to announce ${\tau}$ (this terminology is used by, e.g., Rogers & Williams). Stopping times satisfying property 3 below, which is easily seen to be equivalent to 2, are sometimes called fair. Then, the following theorem says that the sets of predictable, fair and announceable stopping times all coincide.

Theorem 1 Let ${\tau}$ be a stopping time. Then, the following are equivalent.

1. ${[\tau]\in\mathcal{P}}$.
2. ${\Delta M_\tau1_{[\tau,\infty)}}$ is a local martingale for all local martingales M.
3. ${{\mathbb E}[1_{\{\tau < \infty\}}\Delta M_\tau]=0}$ for all cadlag bounded martingales M.
4. ${\tau}$ is predictable.
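Condition 3 can be checked numerically in a simple example. For the compensated Poisson process ${M_t=N_t-\lambda t}$, the first jump time ${\tau}$ has ${\Delta M_\tau=\Delta N_\tau=1}$ (the compensator is continuous), so ${{\mathbb E}[1_{\{\tau\le T\}}\Delta M_\tau]={\mathbb P}(\tau\le T)=1-e^{-\lambda T}\ne0}$: the time is not fair, hence not predictable (it is in fact totally inaccessible). This is only a hedged Monte Carlo sketch — condition 3 is stated for bounded martingales, so a fully rigorous version would stop or truncate M, and all parameter values below are illustrative.

```python
import numpy as np

# Monte Carlo sketch of condition 3 ("fairness") failing for the first jump
# time tau of a Poisson process N with rate lam. For M_t = N_t - lam*t, the
# jump of M at tau is Delta N_tau = 1 since the compensator is continuous, so
#   E[1_{tau <= T} Delta M_tau] = P(tau <= T) = 1 - exp(-lam*T)  !=  0.
# Parameters are illustrative; a rigorous check would use a stopped/truncated
# version of M to match the bounded-martingale hypothesis.

rng = np.random.default_rng(1)
lam, T, n_samples = 1.0, 2.0, 200_000

tau = rng.exponential(1.0 / lam, n_samples)    # first jump time of N
estimate = np.mean((tau <= T) * 1.0)           # E[1_{tau<=T} Delta M_tau]

exact = 1.0 - np.exp(-lam * T)                 # = P(tau <= T)
assert abs(estimate - exact) < 0.01            # nonzero, so tau is not fair
```

By contrast, a deterministic time ${t_0}$ is trivially predictable, and indeed ${\Delta M_{t_0}=0}$ almost surely for the compensated Poisson process, so the same expectation vanishes.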

## 3 May 11

### Continuous Semimartingales

A stochastic process is a semimartingale if and only if it can be decomposed as the sum of a local martingale and an FV process. This is stated by the Bichteler-Dellacherie theorem or, alternatively, is often taken as the definition of a semimartingale. For continuous semimartingales, which are the subject of this post, things simplify considerably. The terms in the decomposition can be taken to be continuous, in which case they are also unique. As usual, we work with respect to a complete filtered probability space ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}$, all processes are real-valued, and two processes are considered to be the same if they are indistinguishable.

Theorem 1 A continuous stochastic process X is a semimartingale if and only if it decomposes as

 $\displaystyle X=M+A$ (1)

for a continuous local martingale M and continuous FV process A. Furthermore, assuming that ${A_0=0}$, decomposition (1) is unique.

Proof: As sums of local martingales and FV processes are semimartingales, X is a semimartingale whenever it satisfies the decomposition (1). Furthermore, if ${X=M+A=M^\prime+A^\prime}$ were two such decompositions with ${A_0=A^\prime_0=0}$ then ${M-M^\prime=A^\prime-A}$ is both a local martingale and a continuous FV process. As continuous FV local martingales are constant, ${A^\prime-A}$ is constant, so ${A=A^\prime}$ and ${M=M^\prime}$.

It just remains to prove the existence of decomposition (1). However, X is continuous and, hence, is locally square integrable. So, Lemmas 4 and 5 of the previous post say that we can decompose ${X=M+A}$ where M is a local martingale, A is an FV process and the quadratic covariation ${[M,A]}$ is a local martingale. As X is continuous we have ${\Delta M=-\Delta A}$ so that, by the properties of covariations,

 $\displaystyle -[M,A]_t=-\sum_{s\le t}\Delta M_s\Delta A_s=\sum_{s\le t}(\Delta A_s)^2.$ (2)

We have shown that ${-[M,A]}$ is a nonnegative local martingale so, in particular, it is a supermartingale. This gives ${\mathbb{E}[-[M,A]_t]\le\mathbb{E}[-[M,A]_0]=0}$. Then (2) implies that ${\Delta A}$ is zero and, hence, A and ${M=X-A}$ are continuous. $\Box$
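The conclusion of the proof can be seen numerically: for a continuous semimartingale ${X=M+A}$, the continuous FV part contributes nothing to the quadratic variation, so the realized quadratic variation of X over a fine grid is close to ${[M]_t}$. The sketch below takes M a Brownian motion and ${A_t=t}$; the grid sizes are illustrative, and the sums are discrete approximations, not the limiting quadratic variations themselves.

```python
import numpy as np

# Numerical sketch: for X = M + A with M Brownian and A_t = t (continuous FV),
# the realized quadratic variation of X on a fine grid is close to [M]_t = t,
# while the FV part contributes only O(dt). Grid parameters are illustrative.

rng = np.random.default_rng(2)
dt, n_steps = 1e-4, 100_000                    # horizon t = 10
dW = rng.normal(0.0, np.sqrt(dt), n_steps)     # increments of M
dA = np.full(n_steps, dt)                      # increments of A_t = t
dX = dW + dA

t = n_steps * dt
qv_X = np.sum(dX ** 2)                         # realized [X]_t
qv_M = np.sum(dW ** 2)                         # realized [M]_t, close to t
qv_A = np.sum(dA ** 2)                         # = n*dt^2, vanishes as dt -> 0

assert abs(qv_M - t) < 0.5                     # [M]_t = t for Brownian motion
assert qv_A < 1e-2                             # FV part: negligible
assert abs(qv_X - qv_M) < 0.1                  # so [X] = [M] in the limit
```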

Using decomposition (1), it can be shown that a predictable process ${\xi}$ is X-integrable if and only if it is both M-integrable and A-integrable. Then, the integral with respect to X breaks down into the sum of the integrals with respect to M and A. This greatly simplifies the construction of the stochastic integral for continuous semimartingales. The integral with respect to the continuous FV process A is equivalent to Lebesgue-Stieltjes integration along sample paths, and it is possible to construct the integral with respect to the continuous local martingale M for the full set of M-integrable integrands using the Ito isometry. Many introductions to stochastic calculus focus on integration with respect to continuous semimartingales, which is made much easier because of these results.

Theorem 2 Let ${X=M+A}$ be the decomposition of the continuous semimartingale X into a continuous local martingale M and continuous FV process A. Then, a predictable process ${\xi}$ is X-integrable if and only if

 $\displaystyle \int_0^t\xi^2\,d[M]+\int_0^t\vert\xi\vert\,\vert dA\vert < \infty$ (3)

almost surely, for each time ${t\ge0}$. In that case, ${\xi}$ is both M-integrable and A-integrable and,

 $\displaystyle \int\xi\,dX=\int\xi\,dM+\int\xi\,dA$ (4)

gives the decomposition of ${\int\xi\,dX}$ into its local martingale and FV terms.
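At the level of elementary integrands, the split (4) is just linearity of the Riemann sums, which the following sketch makes explicit. The integrand is sampled at the left endpoint of each grid interval, the discrete analogue of predictability; all names and parameters below are illustrative.

```python
import numpy as np

# Discrete sketch of decomposition (4): with X = M + A built on a grid, the
# elementary integral sum_i xi_i (X_{i+1} - X_i) splits exactly into the
# corresponding sums against M and A, by linearity. The integrand xi is
# evaluated at the left endpoint of each interval (discrete predictability).

rng = np.random.default_rng(3)
dt, n_steps = 1e-3, 10_000
dM = rng.normal(0.0, np.sqrt(dt), n_steps)     # Brownian increments of M
dA = 0.3 * dt * np.ones(n_steps)               # FV part: A_t = 0.3 t
dX = dM + dA

W = np.concatenate([[0.0], np.cumsum(dM)])
xi = np.sign(W[:-1])                           # left-endpoint sampling

int_X = np.sum(xi * dX)
int_M = np.sum(xi * dM)
int_A = np.sum(xi * dA)

# The split is exact for Riemann sums, up to floating-point rounding.
assert abs(int_X - (int_M + int_A)) < 1e-8
```

The content of Theorem 2 is of course the nontrivial converse direction: that the integrability condition (3) on the pair ${(M,A)}$ exactly captures X-integrability, so this splitting persists in the limit for all X-integrable ${\xi}$.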

