Almost Sure

22 November 16

Predictable Processes

In contrast to optional processes, the class of predictable processes was used extensively in the development of stochastic integration in these notes. They appeared first as integrands in stochastic integrals and, later on, as compensators and in the Doob-Meyer decomposition. Since they are also central to the theory of predictable section and projection, I will revisit the basic properties of predictable processes now. In particular, any of the collections of sets and processes in the following theorem can equivalently be used to define the predictable sigma-algebra. As usual, we work with respect to a complete filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\in{\mathbb R}_+},{\mathbb P})}, although completeness is not actually required for the following result. All processes are assumed to be real-valued, or to take values in the extended reals {\bar{\mathbb R}={\mathbb R}\cup\{\pm\infty\}}.

Theorem 1 The following collections of sets and processes each generate the same sigma-algebra on {{\mathbb R}_+\times\Omega}.

  1. {{[\tau,\infty)}: {\tau} is a predictable stopping time}.
  2. {Z1_{[\tau,\infty)}} as {\tau} ranges over the predictable stopping times and Z over the {\mathcal{F}_{\tau-}}-measurable random variables.
  3. {\{A\times(t,\infty)\colon t\in{\mathbb R}_+,A\in\mathcal{F}_t\}\cup\{A\times\{0\}\colon A\in\mathcal{F}_0\}}.
  4. The elementary predictable processes.
  5. {{(\tau,\infty)}: {\tau} is a stopping time}{\cup}{{A\times\{0\}\colon A\in\mathcal{F}_0}}.
  6. The left-continuous adapted processes.
  7. The continuous adapted processes.

Compare this with the analogous result for sets/processes generating the optional sigma-algebra given in the previous post. The proof of Theorem 1 is given further below. First, recall that the predictable sigma-algebra was previously defined to be generated by the left-continuous adapted processes. However, it can equivalently be defined by any of the collections stated in Theorem 1. To make this clear, I now restate the definition making use of this equivalence.

Definition 2 The predictable sigma-algebra, {\mathcal{P}}, is the sigma-algebra on {{\mathbb R}_+\times\Omega} generated by any of the collections of sets/processes in Theorem 1.

A stochastic process is predictable iff it is {\mathcal{P}}-measurable.

We previously showed that the running maximum of a progressive process is optional. If the process is predictable, then it is straightforward to show that its running maximum is also predictable.

Lemma 3 If X is a predictable process then so is {X^*_t\equiv\sup_{s\le t}X_s}.

Proof: As in the proof of Lemma 3 of the post on measurable projection, the process {Y_t\equiv\sup_{s < t}X_s} is left-continuous and adapted. So, Y is predictable and {X^*=Y\vee X} is the maximum of two predictable processes, so is predictable. ⬜
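As a purely numerical aside (not part of the proof), the pathwise content of Lemma 3 is just that the running maximum is computed cumulatively along each sample path. On a sampled path (the values below are hypothetical), this is a one-line accumulation:

```python
from itertools import accumulate

# Hypothetical sampled path of a process X on an increasing time grid.
path = [0.0, 1.2, 0.5, 2.0, 1.5, 1.8]

# Running maximum X*_t = sup_{s <= t} X_s, computed pathwise.
running_max = list(accumulate(path, max))
print(running_max)  # [0.0, 1.2, 1.2, 2.0, 2.0, 2.0]
```

Note that the running maximum is nondecreasing and dominates the path pointwise, mirroring {X^*=Y\vee X} in the proof.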

I proved, in the post on measurable projection, that the limit supremum, and the left and right limit suprema, of a progressively measurable process are again progressive. Then, it was seen that most of these limits are in fact optional. I now show that the (strict) left limit supremum is predictable.

Lemma 4 If X is a progressively measurable process then

\displaystyle  t\mapsto\limsup_{s\uparrow\uparrow t}X_s

is predictable. If X is predictable then so is

\displaystyle  t\mapsto\limsup_{s\uparrow t}X_s.

Proof: For each positive integer n, choose a sequence of times {0=t^n_0 < t^n_1 < \cdots} increasing to infinity, and such that {\sup_k(t^n_k-t^n_{k-1})} tends to zero as n goes to infinity. For example, {t^n_k=k/n}. Define the process {Y^n} by {Y^n_0=X_0} and

\displaystyle  Y^n_t=\sup\left\{X_s\colon s\in[t^n_{k-1},t)\right\}

for {t\in(t^n_{k-1},t^n_k]}. This is left-continuous and, by Lemma 4 of the post on measurable projection, is adapted. So, {Y^n} is predictable. Then,

\displaystyle  \limsup_{s\uparrow\uparrow t}X_s=\lim_{n\rightarrow\infty}Y^n_t

is predictable. Finally, if X is predictable then

\displaystyle  \limsup_{s\uparrow t}X_s=X_t\vee\limsup_{s\uparrow\uparrow t}X_s

is the maximum of two predictable processes, so is predictable. ⬜
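The discretization {Y^n} in this proof can be illustrated numerically. Below is a hedged sketch (all names hypothetical; the supremum over {[t^n_{k-1},t)} is replaced by a finite sample, which is only a numerical stand-in). For a path with a jump at time 1, the strict left limsup at {t=1} ignores the jump, and so does {Y^n}:

```python
import math

def strict_left_sup_approx(X, t, n):
    """Y^n_t = sup{ X(s) : s in [t_{k-1}, t) } with grid t_k = k/n,
    as in the proof (sup approximated on a fine finite sub-grid)."""
    k = math.ceil(t * n)            # index k with t in (t_{k-1}, t_k]
    lo = (k - 1) / n
    m = 1000                        # sample points in [lo, t)
    return max(X(lo + j * (t - lo) / m) for j in range(m))

# A right-continuous path jumping at time 1: X(s) = 0 for s < 1, X(1) = 1.
X = lambda s: 1.0 if s >= 1.0 else 0.0
print(strict_left_sup_approx(X, 1.0, 10))   # 0.0: the strict left limsup misses the jump
```

Taking the maximum with {X_t} itself, as in the last step of the proof, recovers the non-strict left limsup, which here equals 1 at {t=1}.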

A map {\tau\colon\Omega\rightarrow{\mathbb R}_+} is measurable if and only if its graph, {[\tau]}, is jointly measurable. Furthermore, {\tau} is a stopping time if and only if {[\tau]} is progressive, and also if and only if {[\tau]} is optional. In the current context, it would seem natural to ask if {\tau} being a predictable stopping time is equivalent to its graph being predictable. In fact, this is true. However, the proof of this requires the predictable section theorem or other advanced results. So, I will leave this until after introducing predictable section in the next post of these notes.

It can be shown that every optional process is equal to a predictable process outside of a thin set. Recall that a process X is thin if it is optional and the set {\{X\not=0\}} is contained in the union of the graphs of a countable sequence of stopping times.

Lemma 5 For any optional process X, there exists a predictable process Y such that {X-Y} is thin.

Proof: Consider the collection, {\mathcal C}, of all processes which are the sum of a predictable and a thin process. We need to show that every optional process is in {\mathcal C}, which will be done with the functional monotone class theorem.

First, letting {\mathcal E} be the collection of stochastic intervals {[\tau,\infty)} for stopping times {\tau}, then {\mathcal E} is a pi-system generating the optional sigma-algebra. For any {[\tau,\infty)} in {\mathcal E}, we can decompose the indicator function

\displaystyle  1_{[\tau,\infty)}=1_{(\tau,\infty)}+1_{[\tau]}

as the sum of a predictable and a thin process, so {1_{[\tau,\infty)}} is in {\mathcal C}. Next, consider processes X and Y in {\mathcal C}. Then, {X=X^\prime + H} and {Y=Y^\prime+J} for predictable processes {X^\prime,Y^\prime} and thin processes {H,J}. Writing

\displaystyle  aX+bY=(aX^\prime+bY^\prime)+(aH+bJ)

for any real numbers {a,b} shows that {aX+bY} is in {\mathcal C}.

Consider a sequence {X^n\in\mathcal C} converging pointwise to a limit X. We can write {X^n=Y^n+H^n} for predictable {Y^n} and thin {H^n}. Letting S be the collection of {(t,\omega)} in {{\mathbb R}_+\times\Omega} for which {Y^n_t(\omega)} converges to a finite limit, predictability of {Y^n} implies that {S\in\mathcal P}. So, {1_SY^n} converges pointwise to a predictable process Y. On the set {\bigcap_n\{H^n=0\}} we have {Y^n=X^n\rightarrow X} and, so, {X=Y} on this set. As {\{H^n\not=0\}} is contained in {\bigcup_m[\tau_{nm}]} for a sequence of stopping times {\tau_{n1},\tau_{n2},\ldots},

\displaystyle  \{X-Y\not=0\}\subseteq\bigcup_n\{H^n\not=0\}\subseteq\bigcup_{m,n}[\tau_{nm}].

Therefore, {X\in\mathcal C}. Then, the functional monotone class theorem says that every optional process is in {\mathcal C} as required. ⬜
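As a concrete (hypothetical) illustration of the decomposition in Lemma 5: a Poisson-type counting path splits as {N_t=N_{t-}+\Delta N_t}, where the left-limit path {N_{t-}} is left-continuous and adapted (hence predictable) and the jump part {\Delta N} is supported on the graphs of the jump times (a thin set). A toy numerical check, with made-up jump times:

```python
jumps = [0.7, 1.9, 2.3]                 # hypothetical jump times of a counting process

def N(t):
    """Right-continuous counting path N_t = #{jumps <= t}."""
    return sum(1 for j in jumps if j <= t)

def N_left(t):
    """Left limit N_{t-}, a left-continuous (hence predictable) path."""
    return sum(1 for j in jumps if j < t)

for t in [0.5, 0.7, 1.0, 1.9, 2.3, 3.0]:
    delta = N(t) - N_left(t)            # nonzero only at the jump times (a thin set)
    assert N(t) == N_left(t) + delta
    assert delta in (0, 1)
print("N = predictable part + thin part at all sample times")
```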

Proof of Theorem 1

Finally for this post, I will give the proof of Theorem 1 stated above.

Letting {\mathcal{P}_k}, {k=1,2,\ldots,7}, be the sigma-algebra generated by each of the respective collections of sets/processes of the theorem, it needs to be shown that these are all equal. As {\mathcal{P}_6} is generated by the left-continuous adapted processes, it equals what we originally defined as the predictable sigma-algebra.

We have already shown the following equalities.

  • The predictable sigma-algebra is generated by the continuous adapted processes, {\mathcal{P}_7=\mathcal{P}_6}. See Lemma 2 of the post on filtrations and adapted processes.
  • The predictable sigma-algebra is generated by sets as in the third statement, {\mathcal{P}_3=\mathcal{P}_6}. See Lemma 3 of the post on filtrations and adapted processes.
  • The predictable sigma-algebra is generated by the stochastic intervals {[\tau,\infty)} for predictable stopping times {\tau}, {\mathcal{P}_1=\mathcal{P}_6}. See Lemma 7 of the post on predictable stopping times.

I now prove the remaining equivalences.

Proof of {\mathcal{P}_4=\mathcal{P}_6}: As elementary predictable processes are left-continuous and adapted, we have {\mathcal{P}_4\subseteq\mathcal{P}_6}. Also, if {S\subseteq{\mathbb R}_+\times\Omega} is as in the third statement, then {1_{[0,T]}1_S} is an elementary predictable process for each {T > 0} and, letting T go to infinity, shows that {S\in\mathcal{P}_4}. Hence, {\mathcal{P}_3\subseteq\mathcal{P}_4} and, as we have already shown that {\mathcal{P}_3=\mathcal{P}_6}, this completes the proof. ⬜

Proof of {\mathcal{P}_5=\mathcal{P}_6}: If {\tau} is a stopping time then {1_{(\tau,\infty)}} is left-continuous and adapted, so we see that {\mathcal{P}_5\subseteq\mathcal{P}_6}. Next, for any {t\in{\mathbb R}_+} and {A\in\mathcal{F}_t},

\displaystyle  \tau(\omega)=\begin{cases} t,&\textrm{if }\omega\in A,\\ \infty,&\textrm{if }\omega\not\in A \end{cases}

is a stopping time. Since {A\times(t,\infty)} is equal to {(\tau,\infty)}, it follows that {\mathcal{P}_3\subseteq\mathcal{P}_5}. As {\mathcal{P}_3=\mathcal{P}_6} has already been shown, this completes the proof. ⬜
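The stopping time constructed in this proof can be checked pointwise on a toy sample space (the sample points and the event A below are entirely hypothetical):

```python
import math

t = 1.0
A = {"w1"}                          # hypothetical event in F_t
omegas = ["w1", "w2"]

def tau(w):
    # tau = t on A, infinity off A, as in the proof
    return t if w in A else math.inf

# Check that A x (t, inf) = (tau, inf) pointwise on a sample grid.
for w in omegas:
    for s in [0.5, 1.0, 1.5, 2.0]:
        in_set = (w in A) and (s > t)       # (s, w) in A x (t, inf)
        in_interval = s > tau(w)            # (s, w) in (tau, inf)
        assert in_set == in_interval
print("A x (t, inf) = (tau, inf) on all sample points")
```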

Proof of {\mathcal{P}_2=\mathcal{P}_1}: It is clear by taking {Z=1} in the processes in the second statement that we have {\mathcal{P}_1\subseteq\mathcal{P}_2}.

Now, letting {\tau} be a predictable stopping time, it just remains to demonstrate that {Z1_{[\tau,\infty)}} is {\mathcal{P}_1}-measurable for all {\mathcal{F}_{\tau-}}-measurable Z. However, as {\mathcal{F}_{\tau-}} is generated by {\{X_\tau\}} as X ranges over the left-continuous and adapted processes, it is enough to consider {Z=X_\tau} for such a process X. In that case

\displaystyle  Z1_{[\tau,\infty)}=X^\tau1_{[\tau,\infty)}

where {X^\tau_t=X_{t\wedge\tau}} is the stopped process, which is left-continuous and adapted, hence {\mathcal{P}_6}-measurable. As we have already shown that {\mathcal{P}_6=\mathcal{P}_1}, this shows that {Z1_{[\tau,\infty)}} is {\mathcal{P}_1}-measurable as required. ⬜
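The identity {Z1_{[\tau,\infty)}=X^\tau1_{[\tau,\infty)}} used above can be sanity-checked pointwise on a toy path. This is only a numerical sketch (the path and stopping time below are made up), not part of the argument:

```python
tau = 1.5                             # a fixed (deterministic) stopping time
X = lambda s: s ** 2                  # a continuous, hence left-continuous, path
Z = X(tau)                            # Z = X_tau

for s in [0.5, 1.0, 1.5, 2.0, 3.0]:
    lhs = Z if s >= tau else 0.0      # (Z 1_{[tau, inf)})_s
    stopped = X(min(s, tau))          # stopped path X^tau_s = X_{s ^ tau}
    rhs = stopped if s >= tau else 0.0
    assert lhs == rhs
print("Z 1_{[tau,inf)} = X^tau 1_{[tau,inf)} at all sample times")
```

The point is simply that, on {[\tau,\infty)}, the stopped path is frozen at the value {X_\tau=Z}.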



  1. Hi George, a process X is predictable implies that it is progressive and X_t is {\mathcal F}_{t-} measurable for all t. I am wondering if the converse is also true. It seems to me quite “reasonable” to define “predictable” by the requirement that X_t is {\mathcal F}_{t-} measurable. Thanks!

    Comment by Yu Ding — 28 October 18 @ 6:38 PM | Reply

    • Hi. No, the converse is not true. For one thing, X_t being \mathcal{F}_{t-} measurable is not enough to ensure that the same holds at all stopping times. Consider Poisson processes. These would be predictable by your definition, but they are not predictable (e.g., a compensated Poisson process is a non-continuous martingale, but predictable martingales are always continuous).

      Comment by George Lowther — 6 December 18 @ 11:46 PM | Reply

  2. […] we like. There is also a corresponding predictable section theorem, which says that if S is in the predictable sigma-algebra, its section can be chosen to be a predictable stopping […]

    Pingback by Proof of Optional and Predictable Section | Almost Sure — 7 January 19 @ 12:03 PM | Reply
