Almost Sure

12 April 12


Quasimartingales are a natural generalization of martingales, submartingales and supermartingales. They were first introduced by Fisk in order to extend the Doob-Meyer decomposition to a larger class of processes, showing that continuous quasimartingales can be decomposed into martingale and finite variation terms (Quasi-martingales, 1965). This was later extended to right-continuous processes by Orey (F-Processes, 1967). The way in which quasimartingales relate to sub- and super-martingales is very similar to how functions of finite variation relate to increasing and decreasing functions. In particular, by the Jordan decomposition, any finite variation function on an interval decomposes as the sum of an increasing and a decreasing function. Similarly, a stochastic process is a quasimartingale if and only if it can be written as the sum of a submartingale and a supermartingale. This important result was first shown by Rao (Quasi-martingales, 1969), and means that much of the theory of submartingales can be extended without much work to also cover quasimartingales.

Often, given a process, it is important to show that it is a semimartingale so that the techniques of stochastic calculus can be applied. If there is no obvious decomposition into local martingale and finite variation terms, then one way of doing this is to show that it is a quasimartingale, since all right-continuous quasimartingales are semimartingales. This result is also important in the general theory of semimartingales; for example, many proofs of the Bichteler-Dellacherie theorem involve quasimartingales.

In this post, I will mainly be concerned with the definition and very basic properties of quasimartingales, leaving the more advanced theory to the following post. We work with respect to a filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}. It is not necessary to assume that either of the usual conditions, right-continuity or completeness, holds. First, the mean variation of a process is defined as follows.

Definition 1 The mean variation of an integrable stochastic process X on an interval {[0,t]} is

\displaystyle  {\rm Var}_t(X)=\sup{\mathbb E}\left[\sum_{k=1}^n\left\vert{\mathbb E}\left[X_{t_k}-X_{t_{k-1}}\;\vert\mathcal{F}_{t_{k-1}}\right]\right\vert\right]. (1)

Here, the supremum is taken over all finite sequences of times,

\displaystyle  0=t_0\le t_1\le\cdots\le t_n=t.

A quasimartingale, then, is a process with finite mean variation on each bounded interval.

Definition 2 A quasimartingale, X, is an integrable adapted process such that {{\rm Var}_t(X)} is finite for each time {t\in{\mathbb R}_+}.
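As a small aside (my own illustration, not part of the post), the supremum in (1) can be evaluated exactly for a toy discrete-time process by enumerating sample paths. The sketch below takes X to be a {\pm1} random walk with constant drift c, so each conditional increment equals c and the sum in (1) over the partition {\{0,1,\ldots,n\}} evaluates to {n\vert c\vert}:

```python
from itertools import product

def mean_variation(n, c):
    """Exact value of the sum in (1) over the partition {0,1,...,n} for a
    +-1 random walk with drift c: X_k - X_{k-1} = eps_k + c, where eps_k
    is +-1 with probability 1/2. Computed by enumerating all 2^n paths."""
    paths = list(product([1, -1], repeat=n))
    p = 1.0 / len(paths)  # uniform probability of each driving sequence
    total = 0.0
    for k in range(1, n + 1):
        # Group paths by their history up to time k-1; within each group
        # the conditional mean of the increment is s / prob.
        groups = {}
        for path in paths:
            hist = path[:k - 1]
            dx = path[k - 1] + c
            prob, s = groups.get(hist, (0.0, 0.0))
            groups[hist] = (prob + p, s + p * dx)
        # E[|E[dX | F_{k-1}]|] = sum over histories of prob * |s/prob| = sum |s|
        total += sum(abs(s) for _, s in groups.values())
    return total

print(mean_variation(3, 0.25))  # n*|c| = 0.75
```

Since the conditional increments here are the constant c, any refinement of the partition gives the same value, in line with the martingale/submartingale case of Lemma 3 below.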

Before going further, I’ll briefly mention that the definitions of quasimartingales given by different authors differ slightly, and are often more restrictive than the one above. This should be noted before comparing the precise statements of the results given here with those in other texts. It is sometimes required that the mean variation is uniformly bounded on all of {{\mathbb R}_+} and, also, that the process is {L^1} bounded. Furthermore, some authors require right-continuity. The definition given here is equivalent to that used by Métivier (Semimartingales: a course on stochastic processes). An immediate consequence of Definition 2 is that all martingales, submartingales and supermartingales are also quasimartingales.

Lemma 3 Any martingale, submartingale, or supermartingale X is a quasimartingale. Furthermore,

\displaystyle  {\rm Var}_t(X)=\left\vert{\mathbb E}[X_t-X_0]\right\vert. (2)

Proof: Replacing X by {-X} if necessary, we can suppose that X is a submartingale. Then, {{\mathbb E}[X_{t_k}-X_{t_{k-1}}\vert\mathcal{F}_{t_{k-1}}]\ge0} almost surely. So, the absolute value signs can be removed from (1) and, by linearity of expectations, the right hand side of (1) reduces to (2) regardless of the choice of partition. In particular, the mean variation is finite on each bounded interval, so X is a quasimartingale. ⬜
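For a concrete check of (2), here is a toy computation of my own (exact enumeration, not from the original argument). Take {X_k=S_k^2} for a simple {\pm1} random walk S; this is a submartingale, since {{\mathbb E}[X_k-X_{k-1}\vert\mathcal{F}_{k-1}]=2S_{k-1}{\mathbb E}[\epsilon_k]+1=1}, and Lemma 3 then gives {{\rm Var}_n(X)={\mathbb E}[X_n-X_0]=n}:

```python
from itertools import product

def mean_variation(X, n):
    """Exact sum in (1) over the partition {0,1,...,n}, where X maps a
    +-1 driving prefix to the process value at that time (X(()) = X_0)."""
    paths = list(product([1, -1], repeat=n))
    p = 1.0 / len(paths)
    total = 0.0
    for k in range(1, n + 1):
        groups = {}  # history -> (probability, sum of p * increment)
        for path in paths:
            hist = path[:k - 1]
            dx = X(path[:k]) - X(path[:k - 1])
            prob, s = groups.get(hist, (0.0, 0.0))
            groups[hist] = (prob + p, s + p * dx)
        # each |s| equals prob * |conditional mean of the increment|
        total += sum(abs(s) for _, s in groups.values())
    return total

n = 4
X = lambda prefix: sum(prefix) ** 2  # X_k = S_k^2, with X_0 = 0
EXn = sum(X(path) for path in product([1, -1], repeat=n)) / 2 ** n
print(mean_variation(X, n), EXn)  # both equal n = 4
```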

However, unlike sub- and super-martingales, the space of quasimartingales is closed under taking linear combinations, so forms a vector space.

Lemma 4 The space of quasimartingales is closed under taking linear combinations. Furthermore,

  • if X is an integrable process and {\lambda\in{\mathbb R}}, then

    \displaystyle  {\rm Var}_t(\lambda X)=\vert\lambda\vert{\rm Var}_t(X). (3)

  • if X and Y are integrable processes, then

    \displaystyle  {\rm Var}_t(X+Y)\le{\rm Var}_t(X)+{\rm Var}_t(Y). (4)

In other words, {\{{\rm Var}_t\}_{t\ge0}} forms a family of seminorms on the space of quasimartingales. Before moving on to the proof of Lemma 4, I’ll define the following simple bit of notation, just to keep the formulas in the remainder of the post reasonably short. Given a process X and a sequence of times {t_k} ({k=0,1,\ldots}), the notation

\displaystyle  \delta X_k\equiv X_{t_k}-X_{t_{k-1}}

will be used to denote the increments of X across the given time steps. Now, moving on to the proof of Lemma 4.

Proof: Equation (3) is simply the result of substituting {\lambda X} in place of X in (1). If X is a quasimartingale then {\lambda X} has finite mean variation on bounded intervals and, hence, is also a quasimartingale.

Equation (4) is the result of substituting {X+Y} in place of X on the right hand side of (1) and applying the inequality

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle\left\vert{\mathbb E}\left[\delta(X+Y)_k\vert\mathcal{F}_{t_{k-1}}\right]\right\vert\le\left\vert{\mathbb E}\left[\delta X_k\vert\mathcal{F}_{t_{k-1}}\right]\right\vert+ \left\vert{\mathbb E}\left[\delta Y_k\vert\mathcal{F}_{t_{k-1}}\right]\right\vert. \end{array}

So, in particular, if X and Y are quasimartingales then so is {X+Y}. ⬜
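The seminorm properties can also be seen concretely. In the sketch below (my own toy example, with arbitrarily chosen drifts), {X_k=S_k+k/2} and {Y_k=-S_k-k} are driven by the same {\pm1} walk S; exact enumeration gives {{\rm Var}_n(X)=n/2}, {{\rm Var}_n(Y)=n} and {{\rm Var}_n(X+Y)=n/2}, so (4) holds strictly here:

```python
from itertools import product

def mean_variation(X, n):
    """Exact sum in (1) over the partition {0,1,...,n}, where X maps a
    +-1 driving prefix to the process value at that time."""
    paths = list(product([1, -1], repeat=n))
    p = 1.0 / len(paths)
    total = 0.0
    for k in range(1, n + 1):
        groups = {}  # history -> (probability, sum of p * increment)
        for path in paths:
            hist = path[:k - 1]
            dx = X(path[:k]) - X(path[:k - 1])
            prob, s = groups.get(hist, (0.0, 0.0))
            groups[hist] = (prob + p, s + p * dx)
        total += sum(abs(s) for _, s in groups.values())
    return total

n = 3
X = lambda pre: sum(pre) + 0.5 * len(pre)   # walk with drift +1/2
Y = lambda pre: -sum(pre) - 1.0 * len(pre)  # same walk, drift -1
XY = lambda pre: X(pre) + Y(pre)

vx, vy, vxy = mean_variation(X, n), mean_variation(Y, n), mean_variation(XY, n)
print(vx, vy, vxy)  # n/2, n, n/2: triangle inequality (4) is strict
```

The drifts were chosen with opposite signs precisely so that cancellation makes (4) strict; matching signs would give equality instead.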

Another simple class of quasimartingales is the processes of finite expected variation, since we can bound the mean variation by the expected variation. In general, the expected variation just gives an upper bound for the mean variation, and we do not have equality in (5). For example, martingales always have zero mean variation regardless of the pathwise variation. However, as we will see in a later post, (5) does become an equality in the case that X is a predictable FV process.

Lemma 5 If X is an integrable FV process then,

\displaystyle  {\rm Var}_t(X) \le{\mathbb E}\left[\int_0^t\,\lvert dX\rvert\right]. (5)

In particular, if X has integrable variation over each finite interval {[0,t]} then it is a quasimartingale.

Proof: We can compute the mean variation over {[0,t]} by taking the supremum of (1) over all sequences {0=t_0\le\cdots\le t_n=t},

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle {\rm Var}_t(X) &\displaystyle= \sup{\mathbb E}\left[\sum_{k=1}^n\left\lvert{\mathbb E}[X_{t_k}-X_{t_{k-1}}\;\vert\mathcal{F}_{t_{k-1}}]\right\rvert\right]\smallskip\\ &\displaystyle\le\sup{\mathbb E}\left[\sum_{k=1}^n{\mathbb E}[\lvert X_{t_k}-X_{t_{k-1}}\rvert\;\vert\mathcal{F}_{t_{k-1}}]\right]\smallskip\\ &\displaystyle=\sup{\mathbb E}\left[\sum_{k=1}^n\lvert X_{t_k}-X_{t_{k-1}}\rvert\right]\smallskip\\ &\displaystyle\le{\mathbb E}\left[\int_0^t\,\lvert dX\rvert\right] \end{array}

as required. ⬜
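As a numerical aside of my own illustrating the gap in (5): a driftless {\pm1} random walk is a martingale, so every term of (1) vanishes and the mean variation is zero, while the expected pathwise variation over n steps is n:

```python
from itertools import product

def mean_variation(n):
    """Exact sum in (1) over {0,1,...,n} for a driftless +-1 random walk.
    Each conditional increment E[dX | F_{k-1}] is zero, so this vanishes."""
    paths = list(product([1, -1], repeat=n))
    p = 1.0 / len(paths)
    total = 0.0
    for k in range(1, n + 1):
        groups = {}  # history -> (probability, sum of p * increment)
        for path in paths:
            hist = path[:k - 1]
            prob, s = groups.get(hist, (0.0, 0.0))
            groups[hist] = (prob + p, s + p * path[k - 1])
        total += sum(abs(s) for _, s in groups.values())
    return total

n = 3
paths = list(product([1, -1], repeat=n))
# each path has pathwise variation sum |dX_k| = n, hence so does the mean
expected_pathwise = sum(sum(abs(step) for step in path) for path in paths) / len(paths)
print(mean_variation(n), expected_pathwise)  # 0 versus n
```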

The mean variation can alternatively be defined via stochastic integrals of elementary predictable integrands. In practice, I usually find this definition slightly easier to work with than the one given above.

Lemma 6 The mean variation of an integrable process X is given by

\displaystyle  {\rm Var}_t(X)=\sup\left\{{\mathbb E}\left[\int_0^t\xi\,dX\right]\colon\vert\xi\vert\le1,\;\xi{\rm\ is\ elementary}\right\}. (6)

Proof: For any partition {0=t_0\le t_1\le\cdots\le t_n=t}, define the elementary process

\displaystyle  \xi=\sum_{k=1}^n{\rm sgn}({\mathbb E}[\delta X_k\vert\mathcal{F}_{t_{k-1}}])1_{(t_{k-1},t_k]}.

Then,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\mathbb E}\left[\int_0^t\xi\,dX\right]&\displaystyle=\sum_{k=1}^n{\mathbb E}\left[\xi_{t_k}\delta X_k\right]\smallskip\\ &\displaystyle=\sum_{k=1}^n{\mathbb E}\left[\xi_{t_k}{\mathbb E}[\delta X_k\vert\mathcal{F}_{t_{k-1}}]\right]\smallskip\\ &\displaystyle=\sum_{k=1}^n{\mathbb E}\left[\left\vert{\mathbb E}[\delta X_k\vert\mathcal{F}_{t_{k-1}}]\right\vert\right]. \end{array}

Taking the supremum over all such partitions shows that (6) holds with equality replaced by ≤. To prove the reverse inequality, consider any elementary {\xi}. Then, there exists {0=t_0\le t_1\le\cdots\le t_n=t} such that, on each interval {(t_{k-1},t_k]}, {\xi} takes a constant {\mathcal{F}_{t_{k-1}}}-measurable value. Then, if {\vert\xi\vert\le1}, we have

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\mathbb E}\left[\int_0^t\xi\,dX\right]&\displaystyle={\mathbb E}\left[\sum_{k=1}^n\xi_{t_k}\delta X_k\right]\smallskip\\ &\displaystyle={\mathbb E}\left[\sum_{k=1}^n\xi_{t_k}{\mathbb E}[\delta X_k\vert\mathcal{F}_{t_{k-1}}]\right]\smallskip\\ &\displaystyle\le{\mathbb E}\left[\sum_{k=1}^n\left\vert{\mathbb E}[\delta X_k\vert\mathcal{F}_{t_{k-1}}]\right\vert\right]\smallskip\\ &\displaystyle\le{\rm Var}_t(X). \end{array}

So, (6) holds with equality replaced by ≥. ⬜
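The construction in the first half of the proof can be checked on a toy example (my own sketch, exact enumeration). Choosing {\xi} to be the sign of the conditional increments of a drifted {\pm1} walk, {{\mathbb E}[\int_0^t\xi\,dX]} reproduces the partition sum in (1):

```python
from itertools import product
from math import copysign

def integral_vs_partition_sum(n, c):
    """For a +-1 walk with drift c, build the elementary integrand
    xi = sgn(E[dX_k | F_{k-1}]) on each (t_{k-1}, t_k] and compare
    E[integral of xi dX] with the partition sum in (1)."""
    paths = list(product([1, -1], repeat=n))
    p = 1.0 / len(paths)
    integral, partition_sum = 0.0, 0.0
    for k in range(1, n + 1):
        groups = {}  # history -> (probability, sum of p * increment)
        for path in paths:
            hist = path[:k - 1]
            prob, s = groups.get(hist, (0.0, 0.0))
            groups[hist] = (prob + p, s + p * (path[k - 1] + c))
        # the optimizing integrand: the sign of each conditional mean
        xi = {h: copysign(1.0, s) for h, (_, s) in groups.items()}
        partition_sum += sum(abs(s) for _, s in groups.values())
        for path in paths:
            integral += p * xi[path[:k - 1]] * (path[k - 1] + c)
    return integral, partition_sum

print(integral_vs_partition_sum(3, 0.25))  # both equal n*|c| = 0.75
```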

As mentioned above, some texts require quasimartingales to be right-continuous or cadlag. This requirement is actually rather weak, as just right-continuity in probability is sufficient to guarantee the existence of cadlag modifications.

Theorem 7 Every quasimartingale which is right-continuous in probability has a cadlag modification.

Proof: This is stated by Theorem 1 of the post on cadlag modifications. Note that, by Lemma 6 above, the first condition in the statement of that theorem is equivalent to X being a quasimartingale. ⬜

Another immediate consequence of Lemma 6 is that, as with martingales, the space of quasimartingales is closed under the integration of bounded elementary processes.

Lemma 8 If X is a quasimartingale and {Y=\int\xi\,dX} for a bounded elementary process {\xi}, then Y is a quasimartingale. Furthermore, if {\vert\xi\vert\le K} for a constant K, then

\displaystyle  {\rm Var}_t(Y)\le K{\rm Var}_t(X). (7)

Proof: We have {\int\alpha\,dY=\int\alpha\xi\,dX} for any elementary process {\alpha}. If {K=0} then (7) is trivially true. Otherwise, if {\vert\alpha\vert\le1} then {K^{-1}\alpha\xi} is also bounded by 1. So,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\mathbb E}\left[\int_0^t\alpha\,dY\right]&\displaystyle=K{\mathbb E}\left[\int_0^tK^{-1}\alpha\xi\,dX\right]\smallskip\\ &\displaystyle\le K{\rm Var}_t(X). \end{array}

Here, Lemma 6 gives the inequality. Then taking the supremum over all elementary {\vert\alpha\vert\le1} and applying Lemma 6 to the left hand side gives (7). In particular, Y is a quasimartingale. ⬜

It should be immediate, either from Definition 1 or Lemma 6, that {{\rm Var}_t(X)} is an increasing function of t. This means that the mean variation on {[0,\infty)} can be defined by taking the limit as t goes to infinity which, by Lemma 6, is given by

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\rm Var}(X)&\displaystyle\equiv\lim_{t\rightarrow\infty}{\rm Var}_t(X)\smallskip\\ &\displaystyle=\sup{\mathbb E}\left[\int_0^\infty\xi\,dX\right]. \end{array} (8)

Here, the supremum is taken as {\xi} runs through the space of elementary processes with {\vert\xi\vert\le1}. Then, an adapted integrable process X satisfies {{\rm Var}(X) < \infty} if and only if it is a quasimartingale whose mean variation is bounded on all of {{\mathbb R}_+}. This is always satisfied by martingales, where {{\rm Var}_t(X)=0}. For submartingales and supermartingales, however, Lemma 3 shows that {{\rm Var}(X)} being finite is equivalent to {{\mathbb E}[X_t]} being a bounded function of t. The requirement for {{\rm Var}(X)} to be finite is used as part of the definition of quasimartingales in some texts, such as Rao’s original paper (Quasi-martingales, 1969) and in Dellacherie & Meyer (Probabilities and Potential B), where it is referred to as a quasimartingale on {[0,\infty)}.

An alternative definition which is sometimes used for the mean variation of an adapted integrable process, and which I will denote by {{\rm Var}^*_t(X)}, is given by

Definition 9 If X is an integrable process on the interval {[0,t]} then define,

\displaystyle  {\rm Var}^*_t(X)={\rm Var}_t(X)+{\mathbb E}[\vert X_t\vert].

Otherwise, if {X_s} fails to be integrable for some {s\le t}, then we set {{\rm Var}^*_t(X)=\infty}.

As I show below, {{\rm Var}^*_t(X)} is an increasing function of t so that, for any {s < t}, the inequality {{\mathbb E}[\lvert X_s\rvert]\le{\rm Var}^*_t(X)} holds. So, extending to non-integrable processes on {[0,t]} by setting {{\rm Var}^*_t(X)=\infty} makes sense, and preserves this inequality. Also, by monotonicity in t, the limit

\displaystyle  {\rm Var}^*(X)\equiv\lim_{t\rightarrow\infty}{\rm Var}^*_t(X)

is well-defined. This is sometimes called the mean variation of X on the interval {[0,\infty]} (see equation (9) below). Also, it follows from Lemma 4 that {{\rm Var}^*} is a seminorm on the space of all adapted processes satisfying {{\rm Var}^*(X) < \infty}. That is, for adapted processes X and Y, and {\lambda\in{\mathbb R}},

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle{\rm Var}^*(\lambda X)=\vert\lambda\vert{\rm Var}^*(X),\smallskip\\ &\displaystyle{\rm Var}^*(X+Y)\le{\rm Var}^*(X)+{\rm Var}^*(Y). \end{array}

By Lemma 3 it can be seen that, for sub- and super-martingales, the condition {{\rm Var}^*(X) < \infty} is equivalent to X being {L^1} bounded. Many texts on the subject impose the, rather strong, condition on quasimartingales that {{\rm Var}^*(X)} is finite. This is the case, for example, in Protter (Stochastic Integration and Differential Equations), Rogers & Williams (Diffusions, Markov Processes, and Martingales), Kallenberg (Foundations of Modern Probability), He Wang & Yan (Semimartingale Theory and Stochastic Calculus), and also in Dellacherie & Meyer (Probabilities and Potential B) where it is referred to as a quasimartingale on {[0,\infty]}.

Finally, for this post, I show that {{\rm Var}^*_t(X)} is indeed increasing in t. Compare the similarity between equation (9) below and (8) above. The only difference now is that the elementary integrand {\xi} is indexed by the whole of the extended nonnegative real line, {[0,\infty]}.

Lemma 10 If X is an integrable adapted process then {{\rm Var}^*_t(X)} is increasing in t. Furthermore, if we extend X to be a stochastic process with index set {[0,\infty]} by setting {X_\infty=0} then,

\displaystyle  {\rm Var}^*(X)=\sup{\mathbb E}\left[\int_0^\infty\xi\,dX\right]. (9)

Here, the supremum is over all elementary processes {\xi} with index set {[0,\infty]} and satisfying {\vert\xi\vert\le1}.

Proof: For any {t\in{\mathbb R}_+}, we have

\displaystyle  {\mathbb E}[\vert X_t\vert]=\sup{\mathbb E}\left[U(X_\infty-X_t)\right].

The supremum is taken over all {\mathcal{F}_t}-measurable random variables U with {\vert U\vert\le1}, and the maximum is attained at {U=-{\rm sgn}(X_t)}. So, by Lemma 6,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle{\rm Var}^*_t(X)&\displaystyle=\sup{\mathbb E}\left[\int_0^t\xi\,dX+U(X_\infty-X_t)\right]\smallskip\\ &\displaystyle=\sup{\mathbb E}\left[\int_0^\infty(\xi1_{[0,t]}+U1_{(t,\infty]})\,dX\right], \end{array} (10)

with the supremum taken over all elementary processes {\xi} and {\mathcal{F}_t}-measurable U with {\vert\xi\vert} and {\vert U\vert} both bounded by 1. Let {S_t} denote the set of elementary processes {\alpha} with index set {[0,\infty]} satisfying {\vert\alpha\vert\le1}, and which are constant on the interval {(t,\infty]}. Then, every {\alpha\in S_t} can be written in the form {\xi1_{[0,t]}+U1_{(t,\infty]}} for {\xi,U} as above. So, (10) can be written as

\displaystyle  {\rm Var}^*_t(X)=\sup\left\{{\mathbb E}\left[\int_0^\infty\xi\,dX\right]\colon\xi\in S_t\right\}.

Since {S_s\subseteq S_t} for any {s\le t}, it follows that {{\rm Var}^*_s(X)\le{\rm Var}^*_t(X)}. Finally, taking the supremum over {t\in{\mathbb R}_+},

\displaystyle  {\rm Var}^*(X)=\sup\left\{{\mathbb E}\left[\int_0^\infty\xi\,dX\right]\colon\xi\in \bigcup_{t\in{\mathbb R}_+}S_t\right\}.

As {\bigcup_{t\in{\mathbb R}_+}S_t} is just the set of elementary processes {\xi} with index set {[0,\infty]} and satisfying {\vert\xi\vert\le1}, this is equivalent to (9). ⬜

Alternatively, {{\rm Var}^*(X)} can be expressed in terms of partitions of the interval {[0,\infty]}, similar to the way we defined the mean variation above.

Corollary 11 If X is an integrable adapted process, then

\displaystyle  {\rm Var}^*(X)=\sup{\mathbb E}\left[\sum_{k=1}^n\left\lvert{\mathbb E}[X_{t_k}-X_{t_{k-1}}\;\vert\mathcal{F}_{t_{k-1}}]\right\rvert\right]. (11)

Here, the supremum is taken over all finite sequences of times

\displaystyle  0=t_0\le t_1\le\cdots\le t_n=\infty

and we extend X to a process on {[0,\infty]} by taking {X_\infty=0}.

Proof: This follows in exactly the same way as Lemma 6 above, showing that the right hand sides of equations (11) and (9) agree. The only difference here is that we apply the argument on the closed interval {[0,\infty]} instead of the closed bounded interval {[0,t]} but, otherwise, it is unchanged. Then, Lemma 10 shows that this is equal to {{\rm Var}^*(X)}. ⬜
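To make the role of the final time {t_n=\infty} concrete, here is a toy check of my own (exact enumeration for a drifted {\pm1} walk): since {X_t} is {\mathcal{F}_t}-measurable, the last term of the sum in (11) contributes {{\mathbb E}[\vert{\mathbb E}[X_\infty-X_t\vert\mathcal{F}_t]\vert]={\mathbb E}[\vert X_t\vert]}, so the partition sum recovers {{\rm Var}_t(X)+{\mathbb E}[\vert X_t\vert]={\rm Var}^*_t(X)}:

```python
from itertools import product

def var_star(n, c):
    """Var*_n for a +-1 walk with drift c, computed via the partition
    {0,1,...,n,infinity} of Corollary 11 with X_infinity = 0."""
    paths = list(product([1, -1], repeat=n))
    p = 1.0 / len(paths)
    X = lambda prefix: sum(prefix) + c * len(prefix)
    var_t = 0.0  # mean variation over {0,1,...,n}, as in (1)
    for k in range(1, n + 1):
        groups = {}  # history -> (probability, sum of p * increment)
        for path in paths:
            hist = path[:k - 1]
            dx = X(path[:k]) - X(path[:k - 1])
            prob, s = groups.get(hist, (0.0, 0.0))
            groups[hist] = (prob + p, s + p * dx)
        var_t += sum(abs(s) for _, s in groups.values())
    # Final step t_n -> infinity with X_infinity = 0: X_n is F_n-measurable,
    # so |E[0 - X_n | F_n]| = |X_n|, contributing E|X_n| to the sum.
    e_abs_Xn = sum(p * abs(X(path)) for path in paths)
    return var_t + e_abs_Xn, var_t, e_abs_Xn

total, var_t, e_abs = var_star(3, 0.25)
# total = var_t + E|X_n|, with var_t = n*|c|
```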
