Almost Sure

29 November 16

The Section Theorems

Consider a probability space {(\Omega,\mathcal{F},{\mathbb P})} and a subset S of {{\mathbb R}_+\times\Omega}. The projection {\pi_\Omega(S)} is the set of {\omega\in\Omega} such that there exists a {t\in{\mathbb R}_+} with {(t,\omega)\in S}. We can ask whether there exists a map

\displaystyle  \tau\colon\pi_\Omega(S)\rightarrow{\mathbb R}_+

such that {(\tau(\omega),\omega)\in S}. From the definition of the projection, values of {\tau(\omega)} satisfying this exist for each individual {\omega}. By invoking the axiom of choice, then, we see that functions {\tau} with the required property do exist. However, to be of use for probability theory, it is important that {\tau} should be measurable. Whether or not there are measurable functions with the required properties is a much more difficult problem, and is answered affirmatively by the measurable selection theorem. For the question to have any hope of having a positive answer, we require S to be measurable, so that it lies in the product sigma-algebra {\mathcal{B}({\mathbb R}_+)\otimes\mathcal{F}}, with {\mathcal{B}({\mathbb R}_+)} denoting the Borel sigma-algebra on {{\mathbb R}_+}. Also, less obviously, the underlying probability space should be complete. Throughout this post, {(\Omega,\mathcal{F},{\mathbb P})} will be assumed to be a complete probability space.

It is convenient to extend {\tau} to the whole of {\Omega} by setting {\tau(\omega)=\infty} for {\omega} outside of {\pi_\Omega(S)}. Then, {\tau} is a map to the extended nonnegative reals {\bar{\mathbb R}_+={\mathbb R}_+\cup\{\infty\}} for which {\tau(\omega) < \infty} precisely when {\omega} is in {\pi_\Omega(S)}. Next, the graph of {\tau}, denoted by {[\tau]}, is defined to be the set of {(t,\omega)\in{\mathbb R}_+\times\Omega} with {t=\tau(\omega)}. The property that {(\tau(\omega),\omega)\in S} whenever {\tau(\omega) < \infty} is expressed succinctly by the inclusion {[\tau]\subseteq S}. With this notation, the measurable selection theorem is as follows.

Theorem 1 (Measurable Selection) For any {S\in\mathcal{B}({\mathbb R}_+)\otimes\mathcal{F}}, there exists a measurable {\tau\colon\Omega\rightarrow\bar{\mathbb R}_+} such that {[\tau]\subseteq S} and

\displaystyle  \left\{\tau < \infty\right\}=\pi_\Omega(S). (1)

As noted above, were it not for the measurability requirement, this theorem would just be a simple application of the axiom of choice. Requiring {\tau} to be measurable, on the other hand, makes the theorem much more difficult to prove. For instance, it would not hold if the underlying probability space were not complete; indeed, the projection of a Borel subset of the plane onto one coordinate need only be analytic, and analytic sets can fail to be Borel. Note also that, stated as above, measurable selection implies that the projection of S is equal to the measurable set {\{\tau < \infty\}}, so the measurable projection theorem is an immediate corollary. I will leave the proof of Theorem 1 for a later post, together with the proofs of the section theorems stated below.

A closely related problem is the following. Given a measurable space {(X,\mathcal{E})} and a measurable function, {f\colon X\rightarrow\Omega}, does there exist a measurable right-inverse on the image of {f}? This is asking for a measurable function, {g}, from {f(X)} to {X} such that {f(g(\omega))=\omega}. In the case where {(X,\mathcal{E})} is the Borel space {({\mathbb R}_+,\mathcal{B}({\mathbb R}_+))}, Theorem 1 says that it does exist. If S is the graph {\{(t,f(t))\colon t\in{\mathbb R}_+\}} then {\tau} will be the required right-inverse. In fact, as all uncountable Polish spaces are Borel-isomorphic to each other and, hence, to {{\mathbb R}_+}, this result applies whenever {(X,\mathcal{E})} is a Polish space together with its Borel sigma-algebra.
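To spell this out in the notation of Theorem 1 (a brief restatement of the construction just described, under the assumption that the graph of f is product-measurable), take

\displaystyle  S=\left\{(t,f(t))\colon t\in{\mathbb R}_+\right\},\qquad \pi_\Omega(S)=f({\mathbb R}_+).

Then, any measurable {\tau} with {[\tau]\subseteq S} and {\{\tau < \infty\}=\pi_\Omega(S)} satisfies {f(\tau(\omega))=\omega} for all {\omega\in f({\mathbb R}_+)}, so the restriction of {\tau} to {f({\mathbb R}_+)} is the required right-inverse g.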

22 November 16

Predictable Processes

In contrast to optional processes, the class of predictable processes was used extensively in the development of stochastic integration in these notes. They appeared as integrands in stochastic integrals and, later on, as compensators and in the Doob-Meyer decomposition. Since they are also central to the theory of predictable section and projection, I will revisit the basic properties of predictable processes now. In particular, any of the collections of sets and processes in the following theorem can equivalently be used to define the predictable sigma-algebra. As usual, we work with respect to a complete filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\in{\mathbb R}_+},{\mathbb P})}. However, completeness is not actually required for the following result. All processes are assumed to be real valued, or take values in the extended reals {\bar{\mathbb R}={\mathbb R}\cup\{\pm\infty\}}.

Theorem 1 The following collections of sets and processes each generate the same sigma-algebra on {{\mathbb R}_+\times\Omega}.

  1. The stochastic intervals {[\tau,\infty)}, for predictable stopping times {\tau}.
  2. The processes {Z1_{[\tau,\infty)}}, as {\tau} ranges over the predictable stopping times and Z over the {\mathcal{F}_{\tau-}}-measurable random variables.
  3. The sets {A\times(t,\infty)}, for {t\in{\mathbb R}_+} and {A\in\mathcal{F}_t}, together with the sets {A\times\{0\}}, for {A\in\mathcal{F}_0}.
  4. The elementary predictable processes.
  5. The stochastic intervals {(\tau,\infty)}, for stopping times {\tau}, together with the sets {A\times\{0\}}, for {A\in\mathcal{F}_0}.
  6. The left-continuous adapted processes.
  7. The continuous adapted processes.

Compare this with the analogous result for sets/processes generating the optional sigma-algebra given in the previous post. The proof of Theorem 1 is given further below. First, recall that the predictable sigma-algebra was previously defined to be generated by the left-continuous adapted processes. However, it can equivalently be defined by any of the collections stated in Theorem 1. To make this clear, I now restate the definition making use of this equivalence.

Definition 2 The predictable sigma-algebra, {\mathcal{P}}, is the sigma-algebra on {{\mathbb R}_+\times\Omega} generated by any of the collections of sets/processes in Theorem 1.

A stochastic process is predictable iff it is {\mathcal{P}}-measurable.
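As a small indication of how the equivalences in Theorem 1 are proved (this is only a sketch of a single inclusion, not the proof given in the post), consider a left-continuous adapted process X and, for each positive integer n, define

\displaystyle  X^n_t = X_01_{\{t=0\}}+\sum_{k=0}^\infty X_{k/n}1_{\{k/n < t\le(k+1)/n\}}.

Each {X^n} is measurable with respect to the sigma-algebra generated by the sets in the third collection since, for any Borel set B, {\{X^n\in B\}} is a countable union of sets of the form {A\times\{0\}} with {A\in\mathcal{F}_0} and {A\times(s,u]=A\times(s,\infty)\setminus A\times(u,\infty)} with {A\in\mathcal{F}_s\subseteq\mathcal{F}_u}. By left-continuity, {X^n\rightarrow X} pointwise, so the sigma-algebra generated by the left-continuous adapted processes is contained in the one generated by the third collection.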


1 November 16

Predictable Projection For Left-Continuous Processes

In the previous post, I looked at optional projection. Given a non-adapted process X, we construct a new adapted process Y by taking the expected value of {X_t} conditional on the information available up until time t. I will now concentrate on predictable projection. This is a very similar concept, except that we now condition on the information available strictly before time t.

It will be assumed, throughout this post, that the underlying filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\in{\mathbb R}_+},{\mathbb P})} satisfies the usual conditions, meaning that it is complete and right-continuous. This is just for convenience, as most of the results stated here extend easily to non-right-continuous filtrations. The sigma-algebra

\displaystyle  \mathcal{F}_{t-} = \sigma\left(\mathcal{F}_s\colon s < t\right)

represents the collection of events which are observable before time t and, by convention, we take {\mathcal{F}_{0-}=\mathcal{F}_0}. Then, the conditional expectation of X is written as,

\displaystyle  Y_t={\mathbb E}[X_t\;\vert\mathcal{F}_{t-}]{\rm\ \ (a.s.)} (1)

By definition, Y is adapted. However, at each time, (1) only defines Y up to a zero probability set. It does not determine the paths of Y, which would require specifying its values simultaneously at the uncountable set of times in {{\mathbb R}_+}. So, (1) does not tell us the distribution of Y at random times, and it is necessary to specify an appropriate version of Y. Predictable projection gives a uniquely defined modification satisfying (1). The full theory of predictable projection for jointly measurable processes requires the predictable section theorem. However, as I demonstrate here, in the case where X is left-continuous, predictable projection can be done by more elementary methods. The statements and most of the proofs in this post will follow very closely those given previously for optional projection. The main difference is that left and right limits are exchanged, predictable stopping times are used in place of general stopping times, and the sigma-algebra {\mathcal{F}_{t-}} is used in place of {\mathcal{F}_t}.

Stochastic processes will be defined up to evanescence, so two processes are considered to be the same if they are equal up to evanescence. In order to apply (1), some integrability requirements need to be imposed. I will use local integrability. Recall that, in these notes, a process X is locally integrable if there exists a sequence of stopping times {\tau_n} increasing to infinity and such that

\displaystyle  1_{\{\tau_n > 0\}}\sup_{t \le \tau_n}\lvert X_t\rvert (2)

is integrable. This is a strong enough condition for the conditional expectation (1) to exist, not just at each fixed time, but also whenever t is a stopping time.
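For example (a standard sketch, included here as an aside rather than taken from the post), every continuous adapted process X is locally integrable: the first hitting times

\displaystyle  \tau_n=\inf\left\{t\ge0\colon\lvert X_t\rvert\ge n\right\}

are stopping times increasing to infinity and, by continuity, {1_{\{\tau_n > 0\}}\sup_{t\le\tau_n}\lvert X_t\rvert\le n}, which is certainly integrable. The main result of this post can now be stated.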

Theorem 1 (Predictable Projection) Let X be a left-continuous and locally integrable process. Then, there exists a unique left-continuous process Y satisfying (1).

As it is left-continuous, the fact that Y is specified, almost surely, at any time t by (1) means that it is uniquely determined up to evanescence. The main content of Theorem 1 is the existence of Y, and the proof of this is left until later in this post.
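To expand slightly on the uniqueness (a brief remark, not part of the post's statement of the theorem): if Y and Y' are left-continuous and both satisfy (1), then they agree almost surely at time 0 and at each rational time and hence, outside a single null set, at all of these times simultaneously. For every {t > 0}, left-continuity gives

\displaystyle  Y_t=\lim_{s\uparrow t,\,s\in{\mathbb Q}_+}Y_s,

and similarly for Y', so Y and Y' are equal up to evanescence.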

The process defined by Theorem 1 is called the predictable projection of X, and is denoted by {{}^{\rm p}\!X}. So, {{}^{\rm p}\!X} is the unique left-continuous process satisfying

\displaystyle  {}^{\rm p}\!X_t={\mathbb E}[X_t\;\vert\mathcal{F}_{t-}]{\rm\ \ (a.s.)} (3)

for all times t. In practice, X will usually not just be left-continuous, but will also have right limits everywhere. That is, it is caglad (“continu à gauche, limites à droite”: left-continuous with right limits).

Theorem 2 Let X be a caglad and locally integrable process. Then, its predictable projection is caglad.

The simplest non-trivial example of predictable projection is where {X_t} is constant in t and equal to an integrable random variable U. Then, {{}^{\rm p}\!X_t=M_{t-}} is given by the left limits of the cadlag martingale {M_t={\mathbb E}[U\;\vert\mathcal{F}_t]}, so {{}^{\rm p}\!X} is easily seen to be a caglad process.
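To verify (3) in this example (a quick justification, using only martingale convergence), note that, for each {t > 0}, Lévy's upward convergence theorem applied along any sequence of times increasing to t gives

\displaystyle  M_{t-}=\lim_{s\uparrow t}{\mathbb E}[U\;\vert\mathcal{F}_s]={\mathbb E}[U\;\vert\mathcal{F}_{t-}]{\rm\ \ (a.s.)},

while, at {t=0}, the conventions {M_{0-}=M_0} and {\mathcal{F}_{0-}=\mathcal{F}_0} make both sides of (3) equal to {{\mathbb E}[U\;\vert\mathcal{F}_0]}.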
