Almost Sure

25 October 16

Optional Projection For Right-Continuous Processes

In filtering theory, we have a filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge 0},{\mathbb P})} and a signal process {\{X_t\}_{t\in{\mathbb R}_+}}. The sigma-algebra {\mathcal{F}_t} represents the collection of events which are observable up to and including time t. The process X is not assumed to be adapted, so need not be directly observable. For example, we may only be able to measure an observation process {Z_t=X_t+\epsilon_t}, which incorporates some noise {\epsilon_t} and generates the filtration {\mathcal{F}_t}, so that Z is adapted. The problem, then, is to compute an estimate for {X_t} based on the observable data at time t. Looking at the expected value of X conditional on the observable data, we obtain the following estimate for X at each time {t\in{\mathbb R}_+},

\displaystyle  Y_t={\mathbb E}[X_t\;\vert\mathcal{F}_t]{\rm\ \ (a.s.)} (1)

The process Y is adapted. However, as (1) only defines Y up to a zero probability set, it does not give us the paths of Y, which requires specifying its values simultaneously at the uncountable set of times in {{\mathbb R}_+}. Consequently, (1) does not tell us the distribution of Y at random times. So, it is necessary to specify a good version for Y.
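As a numerical aside, not part of the argument of this post, the estimate (1) can be illustrated in a simple hypothetical Gaussian model where the conditional expectation has a closed form: take the signal to be a single random variable {W\sim N(0,1)} observed through noisy measurements. The model, the noise variance and all variable names below are my own assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model: the signal is a single Gaussian random variable
# W ~ N(0, 1), observed at times i = 1, ..., n through
# Z_i = W + eps_i, with eps_i ~ N(0, noise_var) independent.
# The conditional expectation E[W | Z_1, ..., Z_n] then has the
# closed form sum(Z) / (n + noise_var).
noise_var = 0.5
W = rng.standard_normal()
n = 20
Z = W + np.sqrt(noise_var) * rng.standard_normal(n)

# Recursive (Kalman-style) update of the posterior mean and variance.
mean, var = 0.0, 1.0  # prior N(0, 1)
for z in Z:
    gain = var / (var + noise_var)
    mean = mean + gain * (z - mean)
    var = (1 - gain) * var

closed_form = Z.sum() / (n + noise_var)
assert abs(mean - closed_form) < 1e-10
```

The recursion shows the estimate being updated as each new observation arrives, which is the discrete analogue of conditioning on {\mathcal{F}_t} as t increases.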

Optional projection gives a uniquely defined process which satisfies (1), not just at every time t in {{\mathbb R}_+}, but also at all stopping times. The full theory of optional projection for jointly measurable processes requires the optional section theorem. As I will demonstrate, in the case where X is right-continuous, optional projection can be done by more elementary methods.

Throughout this post, it will be assumed that the underlying filtered probability space satisfies the usual conditions, meaning that it is complete and right-continuous, {\mathcal{F}_{t+}=\mathcal{F}_t}. Stochastic processes are considered to be defined up to evanescence. That is, two processes are considered to be the same if they are equal up to evanescence. In order to apply (1), some integrability requirements need to be imposed on X. Often, to avoid such issues, optional projection is defined for uniformly bounded processes. For a bit more generality, I will relax this requirement and use prelocal integrability. Recall that, in these notes, a process X is prelocally integrable if there exists a sequence of stopping times {\tau_n} increasing to infinity and such that

\displaystyle  1_{\{\tau_n > 0\}}\sup_{t < \tau_n}\lvert X_t\rvert (2)

is integrable. This is a strong enough condition for the conditional expectation (1) to exist, not just at each fixed time, but also whenever t is a stopping time. The main result of this post can now be stated.

Theorem 1 (Optional Projection) Let X be a right-continuous and prelocally integrable process. Then, there exists a unique right-continuous process Y satisfying (1).

Uniqueness is immediate, as (1) determines Y, almost-surely, at each fixed time, and this is enough to uniquely determine right-continuous processes up to evanescence. Existence of Y is the important part of the statement, and the proof will be left until further down in this post.

The process defined by Theorem 1 is called the optional projection of X, and is denoted by {{}^{\rm o}\!X}. That is, {{}^{\rm o}\!X} is the unique right-continuous process satisfying

\displaystyle  {}^{\rm o}\!X_t={\mathbb E}[X_t\;\vert\mathcal{F}_t]{\rm\ \ (a.s.)} (3)

for all times t. In practice, the process X will usually not just be right-continuous, but will also have left limits everywhere. That is, it is cadlag.

Theorem 2 Let X be a cadlag and prelocally integrable process. Then, its optional projection is cadlag.

A simple example of optional projection is where {X_t} is constant in t and equal to an integrable random variable U. Then, {{}^{\rm o}\!X_t} is the cadlag version of the martingale {{\mathbb E}[U\;\vert\mathcal{F}_t]}.
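This constant-process example can be checked by hand in a discrete setting, which is my own illustration and not part of the original text: take U to be the total number of heads in N fair coin flips, with {\mathcal{F}_t} revealing the first t flips, so that {{\mathbb E}[U\,\vert\,\mathcal{F}_t]=S_t+(N-t)/2} where {S_t} counts heads so far.

```python
from itertools import product

# Hypothetical discrete example: U is the total number of heads in
# N fair coin flips, and F_t reveals the first t flips.  The optional
# projection of the constant process X_t = U is the martingale
# M_t = E[U | F_t] = S_t + (N - t)/2, where S_t counts heads so far.
N = 4
paths = list(product([0, 1], repeat=N))

for t in range(N + 1):
    for prefix in product([0, 1], repeat=t):
        # Exact conditional expectation: average U over all paths
        # extending this prefix (all completions are equally likely).
        completions = [p for p in paths if p[:t] == prefix]
        cond_exp = sum(sum(p) for p in completions) / len(completions)
        assert abs(cond_exp - (sum(prefix) + (N - t) / 2)) < 1e-12
```

The enumeration is exact, so the assertion verifies the martingale formula on every possible observation prefix.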

The proof of Theorem 2 will also be left until later. Existence of the optional projection is simpler for cadlag processes than for general right-continuous ones, so I will prove existence separately for the two cases below. For now, we show the basic properties of the optional projection assuming that it exists. Linearity of the projection is straightforward.

Lemma 3 Let X and Y be right-continuous and prelocally integrable processes. Then, for {\mathcal{F}_0}-measurable random variables {\lambda}, {\mu},

\displaystyle  {}^{\rm o}(\lambda X+\mu Y)=\lambda\,^{\rm o}\!X+\mu\,^{\rm o}Y.

Proof: It is clear that {\lambda X+\mu Y} is right-continuous and prelocally integrable. Setting {Z=\lambda\,^{\rm o}\!X+\mu\,^{\rm o}Y} we have,

\displaystyle  Z_t=\lambda{\mathbb E}[X_t\;\vert\mathcal{F}_t]+\mu{\mathbb E}[Y_t\;\vert\mathcal{F}_t] ={\mathbb E}[\lambda X_t+\mu Y_t\;\vert\mathcal{F}_t]

almost surely. So, by definition, Z is the optional projection of {\lambda X+\mu Y}. ⬜

It is also easy to show that optional projection commutes with multiplication by an adapted process.

Lemma 4 Let U and X be right-continuous processes such that U is adapted and X and UX are prelocally integrable. Then,

\displaystyle  {}^{\rm o}(UX)=U\,^{\rm o}\!X. (4)

Proof: The process {Z=U\,^{\rm o}\!X} is right-continuous and, as U is adapted,

\displaystyle  Z_t=U_t{\mathbb E}[X_t\;\vert\mathcal{F}_t]={\mathbb E}[U_tX_t\;\vert\mathcal{F}_t]

almost surely. So, by definition, Z is the optional projection of UX. ⬜

One of the most important properties of the optional projection is the fact that equation (3) still holds when t is generalized to be any stopping time.

Theorem 5 If X is a right-continuous and prelocally integrable process then, for all stopping times {\tau},

\displaystyle  1_{\{\tau < \infty\}}{}^{\rm o}\!X_\tau={\mathbb E}\left[1_{\{\tau < \infty\}}X_\tau\;\vert\mathcal{F}_\tau\right]{\rm\ \ (a.s.)}. (5)

Proof: In order to avoid having to keep multiplying by the term {1_{\{\tau < \infty\}}} in expressions such as (5), I will take all processes to be 0 at infinity. Suppose, first, that {\sup_t\lvert X_t\rvert} is integrable. Let U be a bounded {\mathcal{F}_\tau}-measurable random variable. If {\tau} is a simple stopping time taking values in a finite set {S\subseteq{\mathbb R}_+} then, using the fact that {1_{\{\tau=t\}}U} is {\mathcal{F}_t}-measurable for all times t,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle {\mathbb E}[U\,^{\rm o}\!X_\tau]&\displaystyle=\sum_{t\in S}{\mathbb E}[1_{\{\tau = t\}}U\,^{\rm o}\!X_t]\smallskip\\ &\displaystyle=\sum_{t\in S}{\mathbb E}[1_{\{\tau = t\}}UX_t] ={\mathbb E}[UX_\tau]. \end{array}

As {{}^{\rm o}\!X_\tau} is {\mathcal{F}_\tau}-measurable, this proves (5) and, in particular, implies that the set of random variables

\displaystyle  \lvert{}^{\rm o}\!X_\tau\rvert=\lvert{\mathbb E}[X_\tau\;\vert\mathcal{F}_\tau]\rvert\le{\mathbb E}[\sup_t\lvert X_t\rvert\;\vert\mathcal{F}_\tau],

over simple stopping times {\tau} is uniformly integrable.

Next, let {\tau} be an arbitrary stopping time and {\tau_n\ge\tau} be a sequence of simple stopping times decreasing to {\tau}. Then, using right-continuity of X and {{}^{\rm o}\!X} and the fact that U is {\mathcal{F}_{\tau_n}}-measurable,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle {\mathbb E}[U\,^{\rm o}\!X_\tau]&\displaystyle=\lim_{n\rightarrow\infty}{\mathbb E}[U\,^{\rm o}\!X_{\tau_n}]\smallskip\\ &\displaystyle=\lim_{n\rightarrow\infty}{\mathbb E}[UX_{\tau_n}]={\mathbb E}[UX_\tau]. \end{array}

This proves (5) in the case where {\sup_t\lvert X_t\rvert} is integrable.
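The simple stopping times used in this argument can be taken to be the standard dyadic approximations {\tau_n=\lceil 2^n\tau\rceil/2^n}, which dominate {\tau}, take countably many (and, on bounded ranges, finitely many) values, and decrease to {\tau}. A small numerical check of these three properties, with an arbitrary value standing in for {\tau}:

```python
import math

# Dyadic approximation of a (deterministic stand-in for a) stopping
# time: tau_n = ceil(2^n * tau) / 2^n satisfies tau_n >= tau, is
# decreasing in n, and converges to tau.
def dyadic(tau, n):
    return math.ceil(tau * 2**n) / 2**n

tau = 0.7310987  # arbitrary illustrative value
approx = [dyadic(tau, n) for n in range(1, 20)]

assert all(a >= tau for a in approx)                       # dominates tau
assert all(a >= b for a, b in zip(approx, approx[1:]))     # decreasing in n
assert abs(approx[-1] - tau) < 2 ** -18                    # converges to tau
```

In the probabilistic setting, {\tau_n} is a stopping time because {\{\tau_n\le t\}=\{\tau\le \lfloor 2^nt\rfloor/2^n\}\in\mathcal{F}_t}.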

If X is prelocally integrable, then let {\tau_n} be stopping times increasing to infinity such that (2) is integrable. From Lemma 4,

\displaystyle  {}^{\rm o}(1_{[0,\tau_n)}X)=1_{[0,\tau_n)}{}^{\rm o}\!X

and the above shows that

\displaystyle  1_{\{\tau < \tau_n\}}{}^{\rm o}\!X_\tau = {\mathbb E}[1_{\{\tau < \tau_n\}}X_\tau\;\vert\mathcal{F}_\tau] =1_{\{\tau < \tau_n\}}{\mathbb E}[X_\tau\;\vert\mathcal{F}_\tau]

(almost surely). Letting {\tau_n} increase to infinity gives the result. ⬜

Finally, we show that optional projection is indeed a projection operator. That is, applying it twice gives the same result as applying it once.

Lemma 6 If X is a right-continuous and prelocally integrable process, then so is its optional projection {{}^{\rm o}\!X} and,

\displaystyle  {}^{\rm o}({}^{\rm o}\!X)={}^{\rm o}\!X.

Proof: The fact that {{}^{\rm o}\!X} is adapted and right-continuous means that it is equal to its own optional projection, just so long as we can show that it satisfies the requirement of prelocal integrability. Supposing first that {\sup_t\lvert X_t\rvert} is integrable, we have the bound,

\displaystyle  \lvert{}^{\rm o}\!X_t\rvert=\lvert{\mathbb E}[X_t\;\vert\mathcal{F}_t]\rvert\le{\mathbb E}\left[\sup_{t\ge0}\lvert X_t\rvert\;\Big\vert\mathcal{F}_t\right]

The right hand side is a martingale so, taking a cadlag version, is locally integrable and hence prelocally integrable. Now, for arbitrary prelocally integrable X, let {\tau_n} be a sequence of stopping times increasing to infinity such that (2) is integrable. Then, using Lemma 4,

\displaystyle  1_{[0,\tau_n)}{}^{\rm o}\!X={}^{\rm o}(1_{[0,\tau_n)}X)

is prelocally integrable for each n. So, {{}^{\rm o}\!X} is prelocally integrable as required. ⬜

Optional Projection of Increasing Processes

Existence of the optional projection for increasing processes is relatively simple, and follows from the existence of cadlag modifications of submartingales.

Lemma 7 Let X be an integrable, right-continuous and increasing process. Then, its optional projection exists and is a cadlag submartingale.

Proof: As X is integrable, the process Y can be defined at each time t by (1). For times {s\le t}, the fact that X is increasing gives

\displaystyle  {\mathbb E}[Y_t\;\vert\mathcal{F}_s]={\mathbb E}[X_t\;\vert\mathcal{F}_s]\ge{\mathbb E}[X_s\;\vert\mathcal{F}_s]=Y_s.

So, Y is a submartingale. If {t_n} is a sequence of times decreasing to {t} then, by dominated convergence,

\displaystyle  {\mathbb E}[Y_{t_n}]={\mathbb E}[X_{t_n}]\rightarrow{\mathbb E}[X_t]={\mathbb E}[Y_t]

as n goes to infinity. Therefore, Y has a cadlag modification which, by definition, is the optional projection of X. ⬜

Optional Projection of Integrable Variation Processes

Now, suppose that X is right-continuous with finite variation over each bounded interval. By the Jordan decomposition, {X-X_0} can be represented as the difference of two increasing right-continuous processes {X^+} and {X^-}, starting from 0,

\displaystyle  X_t=X_0 + X^+_t-X^-_t

and the sum of {X^+} and {X^-} is the variation of X,

\displaystyle  X^+_t+X^-_t=\int_0^t\,\lvert dX\rvert. (6)

It can be seen that {X^+_t} and {X^-_t} are measurable, by writing

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle X^+&\displaystyle=\lim\sum_{k=1}^n(X_{t_k}-X_{t_{k-1}})_+,\smallskip\\ \displaystyle X^-&\displaystyle=\lim\sum_{k=1}^n(X_{t_k}-X_{t_{k-1}})_-, \end{array}

where the limit is taken over a sequence of partitions {0=t_0\le t_1\le\cdots\le t_n=t} with mesh {\max_k\lvert t_k-t_{k-1}\rvert} going to zero. This decomposition gives a quick method of extending Lemma 7 to processes with integrable variation.
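On a finite partition this decomposition is a direct computation: the positive and negative parts of the increments accumulate into the increasing components. A sketch on a hypothetical sampled path (the path values are my own example):

```python
import numpy as np

# Jordan decomposition on a finite partition: for a path sampled at
# the partition times, the positive and negative parts of the
# increments give the increasing components X+ and X-.
X = np.array([1.0, 3.0, 2.5, 2.5, 4.0, 1.0])  # hypothetical sampled path
dX = np.diff(X)

X_plus = np.concatenate([[0.0], np.cumsum(np.maximum(dX, 0.0))])
X_minus = np.concatenate([[0.0], np.cumsum(np.maximum(-dX, 0.0))])

# X = X_0 + X+ - X-, and X+ + X- is the (discrete) total variation.
assert np.allclose(X[0] + X_plus - X_minus, X)
assert np.isclose(X_plus[-1] + X_minus[-1], np.abs(dX).sum())
```

Refining the partition and taking limits, as in the displayed formula above, recovers {X^+} and {X^-} for a general finite variation path.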

Lemma 8 Let X be a right-continuous integrable process with integrable variation over all finite time periods. Then, its optional projection exists and is cadlag.

Proof: As X has integrable variation over finite time periods, (6) shows that its increasing and decreasing components are integrable. So, applying Lemma 7, the optional projections of {X_0+X^+} and {X^-} exist and are cadlag submartingales. The optional projection of X can then be constructed as

\displaystyle  {}^{\rm o}\!X_t={}^{\rm o}(X_0+X^+)_t-{}^{\rm o}(X^-)_t. ⬜

This proof tells us more than just that {{}^{\rm o}\!X} exists and is cadlag. It expresses {{}^{\rm o}\!X} as the difference of submartingales. Although I will not make use of it here, this additional information is incorporated by the following lemma.

Lemma 9 Under the hypothesis of Lemma 8, {{}^{\rm o}\!X} is a quasimartingale with mean variation

\displaystyle  {\rm Var}_t({}^{\rm o}\!X)\le{\mathbb E}\left[\int_0^t\,\lvert dX\rvert\right].

Proof: As in the proof of Lemma 8, the optional projections of {Y=X_0+X^+} and {Z=X^-} are submartingales and {{}^{\rm o}\!X={}^{\rm o}Y-{}^{\rm o}\!Z}. Using subadditivity of the mean variation,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle {\rm Var}_t({}^{\rm o}\!X)&\displaystyle\le{\rm Var}_t({}^{\rm o}Y)+{\rm Var}_t({}^{\rm o}\!Z)\smallskip\\ &\displaystyle={\mathbb E}[{}^{\rm o}Y_t-{}^{\rm o}Y_0]+{\mathbb E}[{}^{\rm o}Z_t-{}^{\rm o}Z_0]\smallskip\\ &\displaystyle={\mathbb E}[Y_t-Y_0]+{\mathbb E}[Z_t-Z_0]\smallskip\\ &\displaystyle={\mathbb E}[X^+_t+X^-_t]={\mathbb E}\left[\int_0^t\,\lvert dX\rvert\right] \end{array}

The first equality here is just the formula for the mean variation of a submartingale and the final equality is (6). In particular, {{\rm Var}_t({}^{\rm o}\!X)} is finite so, by definition, {{}^{\rm o}\!X} is a quasimartingale. ⬜

Optional Projection of Cadlag Processes

I will construct the optional projection of cadlag processes by taking limits of integrable variation processes, and apply the results above. The approximation by processes of integrable variation is given by the following lemma.

Lemma 10 Let X be a cadlag process such that {\sup_{t\ge0}\lvert X_t\rvert} is integrable. Then, there exists a sequence {\{X^n\}_{n=1,2,\ldots}} of cadlag processes of integrable variation such that, for each {t\in{\mathbb R}_+},

\displaystyle  {\mathbb E}\left[\sup_{s\le t}\lvert X_s-X^n_s\rvert\right]\rightarrow0 (7)

as n goes to infinity.

Proof: For any fixed {\epsilon > 0} and {t\ge0}, it is enough to construct an integrable variation process Y such that

\displaystyle  {\mathbb E}\left[\sup_{s\le t}\lvert X_s-Y_s\rvert\right] < \epsilon. (8)

Applying this to a sequence of times {t_n} increasing to infinity and {\epsilon_n} decreasing to zero will then give the required processes {X^n}. Choosing {0 < \delta < \epsilon}, define the sequence {\tau_0,\tau_1,\ldots} of random times,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle \tau_0 &\displaystyle=0,\smallskip\\ \displaystyle \tau_{n+1}&\displaystyle=\inf\left\{s\ge\tau_n\colon\lvert X_s-X_{\tau_n}\rvert\ge\delta\right\}. \end{array}

As X is not adapted, these will not be stopping times. However, the debut theorem for right-continuous processes, applied with respect to the constant filtration {\mathcal{G}_s=\mathcal{F}}, shows that {\tau_n} are {\mathcal{F}}-measurable. Furthermore, the sequence {\tau_n} increases to infinity. Otherwise, as X has left limits everywhere, the sequence {X_{\tau_n}} would converge to a finite limit, contradicting the inequality {\lvert X_{\tau_n}-X_{\tau_{n-1}}\rvert\ge\delta}.

Now, for each {n\ge0}, define the process

\displaystyle  Z^n_t=\sum_{k=1}^n1_{\{\tau_{k-1}\le t < \tau_k\}}X_{\tau_{k-1}}.

These are cadlag with variation

\displaystyle  \sum_{k=1}^n\lvert X_{\tau_k}-X_{\tau_{k-1}}\rvert\le (2n+1)\sup_{t\ge0}\lvert X_t\rvert,

which is integrable. Also, by construction, we have {\lvert X_s-Z^n_s\rvert < \delta} for {\tau_n > s} and {Z^n_s=0} otherwise. So, using dominated convergence,

\displaystyle  \lim_{n\rightarrow\infty}{\mathbb E}\left[\sup_{s\le t}\lvert X_s-Z^n_s\rvert\right]\le\delta < \epsilon.

Taking {Y=Z^n} for large enough n gives (8). ⬜
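The {\delta}-skeleton construction in this proof is easy to visualise on a discrete grid: hold the last sampled value until the path has moved by at least {\delta}, then resample. The path below is a hypothetical random walk of my own choosing, used only to check that the approximation error stays below {\delta}.

```python
import numpy as np

# Delta-skeleton approximation: hold the value X_{tau_k} until the
# path moves by at least delta from it, then resample.  On a discrete
# grid, the resulting piecewise-constant process stays within delta
# of the path.
rng = np.random.default_rng(1)
X = np.cumsum(0.05 * rng.standard_normal(500))  # hypothetical sampled path
delta = 0.3

Z = np.empty_like(X)
anchor = X[0]
for i, x in enumerate(X):
    if abs(x - anchor) >= delta:  # time tau_{k+1}: a move of size >= delta
        anchor = x
    Z[i] = anchor

assert np.max(np.abs(X - Z)) < delta
```

In the proof, the skeleton is additionally truncated after n resampling times, and the resulting process has variation bounded by a multiple of {\sup_t\lvert X_t\rvert}, which is what makes it integrable.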

Now, the optional projection of a cadlag process can be constructed as the limit of optional projections of integrable variation processes. We start with the case where X is dominated in {L^1}.

Lemma 11 Let X be a cadlag process such that {\sup_{t\ge0}\lvert X_t\rvert} is integrable. Then the optional projection of X exists and is cadlag.

Proof: Let {X^n} be a sequence of cadlag integrable variation processes satisfying (7). Lemma 8 guarantees that the optional projections {{}^{\rm o}\!X^n} exist and are cadlag. Looking at the difference of the optional projections at a time {s\le t},

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle\left\lvert{}^{\rm o}\!X^n_s-{}^{\rm o}\!X^m_s\right\rvert &\displaystyle=\left\lvert{\mathbb E}[X^n_s-X^m_s\;\vert\mathcal{F}_s]\right\rvert\smallskip\\ &\displaystyle\le{\mathbb E}\left[\sup_{u\le t}\lvert X^n_u-X^m_u\rvert\;\Big\vert\mathcal{F}_s\right]. \end{array}

The right hand side of this inequality is a martingale, so Doob’s maximal inequality can be applied for any {K > 0}.

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle {\mathbb P}\left(\sup_{s\le t}\lvert{}^{\rm o}\!X^n_s-{}^{\rm o}\!X^m_s\rvert\ge K\right) &\displaystyle\le K^{-1}{\mathbb E}\left[\sup_{u\le t}\lvert X^n_u-X^m_u\rvert\right]\smallskip\\ &\displaystyle\rightarrow0 \end{array}

as m and n go to infinity. So, the sequence {{}^{\rm o}\!X^n} converges ucp to a cadlag limit Y. By dominated convergence, taking limits in probability,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle Y_t=\lim_{n\rightarrow\infty}{}^{\rm o}\!X^n_t&\displaystyle=\lim_{n\rightarrow\infty}{\mathbb E}[X^n_t\;\vert\mathcal{F}_t]\smallskip\\ &\displaystyle={\mathbb E}[X_t\;\vert\mathcal{F}_t], \end{array}

and Y is the optional projection of X. ⬜

To complete the construction of the optional projection for cadlag processes, Lemma 11 needs to be extended to the case where X is only prelocally integrable. This generalization is done by the following.

Lemma 12 Let X be a prelocally integrable right-continuous process, and suppose that {\tau_n} are stopping times increasing to infinity such that the optional projections of {1_{[0,\tau_n)}X} exist. Then, the optional projection of X exists and

\displaystyle  1_{[0,\tau_n)}{}^{\rm o}\!X={}^{\rm o}(1_{[0,\tau_n)}X). (9)

Proof: For {m < n}, as {\tau_m\le\tau_n}, equation (4) gives

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle 1_{[0,\tau_m)}{}^{\rm o}(1_{[0,\tau_n)}X)&\displaystyle={}^{\rm o}(1_{[0,\tau_m)}1_{[0,\tau_n)}X)\smallskip\\ &\displaystyle={}^{\rm o}(1_{[0,\tau_m)}X). \end{array}

So, the optional projections agree at times {t < \tau_m}, and we can define a process Y by {Y_t={}^{\rm o}(1_{[0,\tau_n)}X)_t} for {t < \tau_n}. Then, Y is right-continuous on each interval {[0,\tau_n)} and hence, letting n increase to infinity, is right-continuous everywhere. Also,

\displaystyle  1_{\{t < \tau_n\}}Y_t={}^{\rm o}(1_{[0,\tau_n)}X)_t={\mathbb E}[1_{\{t < \tau_n\}}X_t\;\vert\mathcal{F}_t].

Taking the limit as n goes to infinity gives (1), so Y is the optional projection of X and satisfies (9). ⬜

Finally, for prelocally integrable cadlag processes, we complete the construction of the optional projection, proving Theorem 2.

Lemma 13 Let X be a cadlag and prelocally integrable process. Then, its optional projection {{}^{\rm o}\!X} exists and is cadlag.

Proof: As X is prelocally integrable, there exists a sequence of stopping times {\tau_n} increasing to infinity such that (2) is integrable. Lemma 11 says that {1_{[0,\tau_n)}X} has a cadlag optional projection. So, by Lemma 12, the optional projection of X exists and satisfies (9). This shows that {{}^{\rm o}\!X_t} is cadlag over {t < \tau_n} and, taking the limit as n goes to infinity, is cadlag. ⬜

Optional Projection of Right-Continuous Processes

I now look at the optional projection of right-continuous processes. This is more difficult than the cadlag case, because it is not possible to find integrable variation processes approximating X in the sense of Lemma 10. Indeed, the convergence in (7) implies uniform convergence in probability, and ucp limits of cadlag processes are necessarily cadlag. Instead, I will use sequences approximating X pointwise from above and below. In the following, for any processes X and Y, the inequality {X\ge Y} means that X is greater than Y up to evanescence. That is, outside of a zero probability set, {X_t\ge Y_t} for all t. Similarly, convergence of a sequence of processes {X^n} to a limit X is to be taken as pointwise convergence up to evanescence. So, outside of a zero probability set, {X^n_t\rightarrow X_t} for all times t.

Lemma 14 Let X be a right-continuous process such that {\sup_t\lvert X_t\rvert} is integrable. Then, there exists a sequence, {\{X^n\}_{n=1,2,\ldots}} of cadlag integrable processes of integrable total variation such that {X^n} is decreasing in n and tends to X as n goes to infinity.

Proof: Choose a sequence of finite sets {S_n\subseteq(0,\infty)} such that {S_n\subseteq S_{n+1}} for each n, and {\bigcup_nS_n} is dense in {{\mathbb R}_+}. For any given n we write {S_n} as {\{t_1 < t_2 < \cdots < t_r\}}. Setting {t_0=0} and {t_{r+1}=\infty}, define the process {X^n} by

\displaystyle  X^n_t=\sup\left\{X_s\colon s\in[t,t_k)\right\}

for all t in {[t_{k-1},t_k)}. As {S_m} is a refinement of {S_n} for {m\ge n}, the supremum in the definition of {X^m_t} is taken over a subinterval of {[t,t_k)} and, hence, {X^m\le X^n}. Also, as {\bigcup_nS_n} is dense in {{\mathbb R}_+}, right-continuity ensures that {X^n_t} tends to {X_t} as n goes to infinity.

It is straightforward to see that {X^n_t} is right-continuous and decreasing across each interval {[t_{k-1},t_k)}, so it is cadlag. Furthermore, {\lvert X^n\rvert} is bounded by {\sup_t\lvert X_t\rvert}, so the variation of {X^n} across each of the intervals {[t_{k-1},t_k)} and at each of the times {t_k} is bounded by {2\sup_t\lvert X_t\rvert}. Hence, the total variation on {{\mathbb R}_+} is

\displaystyle  \int_0^\infty\,\lvert dX^n\rvert\le 2(2r+1)\sup_{t\ge0}\lvert X_t\rvert,

which is integrable. ⬜
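The upper approximation in this proof can be computed explicitly on a discrete grid, which is my own illustration: for each t, take the supremum of X over the cell running from t to the next partition point, and check that refining the partition can only decrease the result.

```python
import numpy as np

# Upper approximation from Lemma 14, on an index grid: X^n_t is the
# supremum of X over [t, t_k), where t_k is the next point of the
# finite set S after t.  Refining S shrinks the interval of the
# supremum, so the approximation decreases.
rng = np.random.default_rng(2)
T = 64
X = np.cumsum(rng.standard_normal(T))  # hypothetical sampled path

def upper_approx(X, S):
    """sup of X over [t, next point of S after t), on the index grid."""
    cuts = sorted(set(S) | {len(X)})
    Y = np.empty_like(X)
    for t in range(len(X)):
        t_k = min(c for c in cuts if c > t)
        Y[t] = X[t:t_k].max()  # supremum over the cell [t, t_k)
    return Y

S1 = [16, 32, 48]
S2 = [8, 16, 24, 32, 40, 48, 56]  # a refinement of S1
Y1, Y2 = upper_approx(X, S1), upper_approx(X, S2)

assert np.all(Y1 >= X) and np.all(Y2 >= X)  # dominates X
assert np.all(Y2 <= Y1)                     # refinement decreases X^n
```

Applying the same construction to {-X} gives the increasing lower approximations used in Lemma 17 below.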

I will make use of the approximation of X by integrable variation processes given by Lemma 14. This is much weaker than the convergence stated in Lemma 10 for the cadlag case, so it makes the proof that their optional projections converge more difficult than the argument used in the proof of Lemma 11 above. I start by showing that it is enough to show convergence, almost surely, at each stopping time.

Lemma 15 Let {\{X^n\}_{n=1,2,\ldots}} be a sequence of non-negative right-continuous adapted processes, and suppose that it is decreasing in n. If, for all stopping times {\tau},

\displaystyle  1_{\{\tau < \infty\}}X^n_\tau\rightarrow0{\rm\ \ (a.s.)}

as {n\rightarrow\infty}, then {X^n\rightarrow0}.

Proof: Choosing any {\epsilon > 0} it is enough to show that {\lim_nX^n < \epsilon}. Let {\mathcal{T}} denote the set of stopping times {\tau} such that {\lim_n1_{[0,\tau)}X^n < \epsilon}. Then {\mathcal{T}} is closed under taking the maximum of pairs of stopping times, and under taking the limit of increasing sequences. So, the essential supremum {\tau^*} of {\mathcal{T}} is itself in {\mathcal{T}}. It just remains to show that {\tau^*=\infty} almost surely.

Choosing any m, define the stopping time

\displaystyle  \sigma=\inf\left\{t\ge\tau^*\colon X^m_t\ge\epsilon\right\}.

So {X^m < \epsilon} on the interval {[\tau^*,\sigma)}, implying that {\sigma\in\mathcal{T}}. By maximality of {\tau^*}, this gives {\sigma=\tau^*} and, by right-continuity, {X^m_{\tau^*}\ge\epsilon} almost surely whenever {\tau^* < \infty}. If {\tau^*} is finite with nonzero probability then this contradicts the condition that {X^m_{\tau^*}\rightarrow0} (almost surely). Therefore, {\tau^*=\infty} almost surely. ⬜

The optional projection of a process at a stopping time is given simply by the conditional expectation as in (5). By applying this together with Lemma 15, it can be shown that optional projection satisfies a monotone convergence property.

Lemma 16 (Monotone Convergence) Let {\{X^n\}_{n=1,2,\ldots}} be a sequence of prelocally integrable right-continuous processes decreasing to 0 as n goes to infinity. Then, supposing that their optional projections {{}^{\rm o}\!X^n} exist, they also decrease to 0.

Proof: For {m\ge n} we have

\displaystyle  {}^{\rm o}\!X^m_t={\mathbb E}[X^m_t\;\vert\mathcal{F}_t]\le{\mathbb E}[X^n_t\;\vert\mathcal{F}_t]={}^{\rm o}\!X^n_t

almost surely. By right-continuity this implies that, outside of a zero probability set, {{}^{\rm o}\!X^n_t} is decreasing in n for all t. For any stopping time {\tau}, Theorem 5 gives

\displaystyle  1_{\{\tau < \infty\}}{}^{\rm o}\!X^n_\tau={\mathbb E}[1_{\{\tau < \infty\}}X^n_\tau\;\vert\mathcal{F}_\tau] \rightarrow0

almost surely. The limit here is using monotone convergence for the conditional expectation. Finally, Lemma 15 gives {{}^{\rm o}\!X^n\rightarrow0} as required. ⬜

We can now construct the optional projection of a right-continuous process by approximating with processes of integrable variation, as in Lemma 14, and applying monotone convergence.

Lemma 17 Let X be a right-continuous process such that {\sup_t\lvert X_t\rvert} is integrable. Then, its right-continuous optional projection exists.

Proof: By Lemma 14, there exists a sequence {Y^n} of integrable cadlag processes of integrable variation decreasing to X as n goes to infinity. Applying the same result to {-X}, there also exists a sequence {Z^n} of integrable cadlag processes of integrable variation increasing to X. So, {Y^n-Z^n} is decreasing to 0. Applying Lemma 8, the optional projections

\displaystyle  {}^{\rm o}(Y^n-Z^n)={}^{\rm o}Y^n-{}^{\rm o}\!Z^n

exist. As {Y^n} and {Z^n} are respectively decreasing and increasing in n, the same holds for their optional projections. Lemma 16 says that {{}^{\rm o}Y^n-{}^{\rm o}\!Z^n} is decreasing to zero. So, we can define a process W up to evanescence by

\displaystyle  W_t=\lim_{n\rightarrow\infty}{}^{\rm o}Y^n_t=\lim_{n\rightarrow\infty}{}^{\rm o}\!Z^n_t.

By dominated convergence,

\displaystyle  W_t=\lim_{n\rightarrow\infty}{\mathbb E}[Y^n_t\;\vert\mathcal{F}_t]={\mathbb E}[X_t\;\vert\mathcal{F}_t]

almost surely. To show that W is the optional projection of X, only right-continuity remains. If {t_m} is a sequence decreasing to t then,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle W_{t_m}&\displaystyle\le{}^{\rm o}Y^n_{t_m}\rightarrow {}^{\rm o}Y^n_t,\smallskip\\ \displaystyle W_{t_m}&\displaystyle\ge{}^{\rm o}Z^n_{t_m}\rightarrow {}^{\rm o}Z^n_t, \end{array}

as m goes to infinity, for each fixed n. Then letting n go to infinity gives

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle\limsup_{m\rightarrow\infty}W_{t_m}\le W_t,\smallskip\\ \displaystyle\liminf_{m\rightarrow\infty}W_{t_m}\ge W_t. \end{array}

Therefore, {W_{t_m}\rightarrow W_t} as required. ⬜

All that remains is to extend Lemma 17 to prelocally integrable processes.

Proof of Theorem 1: As X is prelocally integrable, there exists a sequence {\tau_n} of stopping times increasing to infinity such that (2) is integrable. Then, Lemma 17 says that the optional projection of {1_{[0,\tau_n)}X} exists for each n. Finally, Lemma 12 says that the optional projection of X exists. ⬜
