Optional Sampling

Doob’s optional sampling theorem states that the defining properties of martingales, submartingales and supermartingales generalize from deterministic times to stopping times. For simple stopping times, which take only finitely many values in {{\mathbb R}_+}, the argument is a relatively basic application of elementary integrals. For simple stopping times {\sigma\le\tau}, the stochastic interval {(\sigma,\tau]} and its indicator function {1_{(\sigma,\tau]}} are elementary predictable. For any submartingale {X}, the properties of elementary integrals give the inequality

\displaystyle  {\mathbb E}\left[X_\tau-X_\sigma\right]={\mathbb E}\left[\int_0^\infty 1_{(\sigma,\tau]}\,dX\right]\ge 0. (1)

For any set {A\in \mathcal{F}_\sigma}, the random time

\displaystyle  \sigma^\prime(\omega)=\begin{cases} \sigma(\omega),&\textrm{if }\omega\in A,\\ \tau(\omega),&\textrm{otherwise}, \end{cases}

is easily seen to be a stopping time. Replacing {\sigma} by {\sigma^\prime} extends inequality (1) to the following,

\displaystyle  {\mathbb E}\left[1_A(X_\tau-X_\sigma)\right]={\mathbb E}\left[X_\tau-X_{\sigma^\prime}\right]\ge 0. (2)

As this inequality holds for all sets {A\in\mathcal{F}_\sigma}, it implies the extension of the submartingale property {X_\sigma\le{\mathbb E}[X_\tau\vert\mathcal{F}_\sigma]} to these random times. This argument applies to all simple stopping times, and is sufficient to prove the optional sampling result for discrete time submartingales. In continuous time, the additional hypothesis that the process is right-continuous is required. Then, the result follows by taking limits of simple stopping times.
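As an illustration (not part of the original argument), inequality (2) can be checked numerically for a discrete-time submartingale. The sketch below uses the submartingale {X_n=S_n^2} of a simple symmetric random walk {S}, with {\sigma} the first time {\vert S\vert} reaches 2 (capped at {N}) and {\tau=N}; the parameters and trial count are arbitrary choices.

```python
import random

# Numerical sketch (illustrative, arbitrary parameters): check that
# E[X_sigma] <= E[X_tau] for the discrete-time submartingale X_n = S_n^2,
# where S is a simple symmetric random walk, sigma is the first time
# |S| >= 2 (capped at N), and tau = N.
random.seed(0)
N = 20

def sample():
    S, path = 0, [0]
    for _ in range(N):
        S += random.choice((-1, 1))
        path.append(S)
    sigma = next((n for n, s in enumerate(path) if abs(s) >= 2), N)
    return path[sigma] ** 2, path[N] ** 2  # (X_sigma, X_tau)

trials = 100000
xs, xt = zip(*(sample() for _ in range(trials)))
print(sum(xs) / trials, sum(xt) / trials)
```

Here {{\mathbb E}[X_\tau]={\mathbb E}[S_N^2]=N} exactly, while {{\mathbb E}[X_\sigma]} is close to 4, consistent with the submartingale inequality.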

Theorem 1 Let {\sigma\le\tau} be bounded stopping times. For any cadlag martingale, submartingale or supermartingale {X}, the random variables {X_\sigma, X_\tau} are integrable and the following are satisfied.

  1. If {X} is a martingale then, {X_\sigma={\mathbb E}\left[X_{\tau}\vert\mathcal{F}_\sigma\right].}
  2. If {X} is a submartingale then, {X_\sigma\le{\mathbb E}\left[X_{\tau}\vert\mathcal{F}_\sigma\right].}
  3. If {X} is a supermartingale then, {X_\sigma\ge{\mathbb E}\left[X_{\tau}\vert\mathcal{F}_\sigma\right].}

Proof: It is enough to prove the result for submartingales: the supermartingale case then follows by applying this to {-X}, and the martingale case by combining the submartingale and supermartingale statements. So, suppose that {X} is a submartingale.

The idea is to approximate {\sigma,\tau} from the right by sequences of simple stopping times {\sigma_n,\tau_n}. It is easily seen that this is achieved by the following

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle\sigma_n=\min\left\{m/n\colon m\in{\mathbb N}, m/n\ge\sigma\right\},\smallskip\\ &\displaystyle\tau_n=\min\left\{m/n\colon m\in{\mathbb N}, m/n\ge\tau\right\}. \end{array}
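As a quick numerical illustration (the value of {\sigma} below is an arbitrary choice), these approximations are just {\sigma_n=\lceil n\sigma\rceil/n}; restricting to {n=2^k} gives nested grids, so the approximations decrease monotonically to {\sigma}.

```python
import math

# Illustrative sketch: sigma_n = min{m/n : m/n >= sigma} = ceil(n*sigma)/n
# approximates sigma from the right. Along n = 2^k the grids are nested,
# so the approximations are nonincreasing in k.
def simple_approx(sigma, n):
    return math.ceil(n * sigma) / n

sigma = 0.7321  # an arbitrary sample value of the stopping time
approxs = [simple_approx(sigma, 2 ** k) for k in range(6)]
print(approxs)
```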

Then, {{\mathbb E}[X_{\tau_n}-X_{\sigma_n}]\ge 0}. By right-continuity, {X_{\sigma_n}\rightarrow X_\sigma} and {X_{\tau_n}\rightarrow X_\tau}. If it can be shown that {X_{\sigma_n},X_{\tau_n}} are uniformly integrable sequences then their limits {X_\sigma,X_\tau} must be integrable, and interchanging the limit {n\rightarrow\infty} with the expectation gives {{\mathbb E}[X_\tau-X_\sigma]\ge 0}.

In the martingale case, the property {X_{\tau_n}={\mathbb E}[X_{\tau_1}\vert\mathcal{F}_{\tau_n}]} for simple stopping times expresses {X_{\tau_n}} as conditional expectations of a fixed integrable random variable. This implies uniform integrability. To prove uniform integrability for submartingales, the following result can be applied: a submartingale sampled at a decreasing sequence of times which are bounded below is a uniformly integrable sequence. This was proven in the previous post using a simple ‘Doob-style’ decomposition. More generally, the submartingale property at simple stopping times implies that the process {Y_{-n}=X_{\tau_n}} is a submartingale with time running over the negative integers, and with respect to the filtration {\mathcal{G}_{-n}=\mathcal{F}_{\tau_n}}. Taking {\tau_\infty\equiv 0} extends {Y_t} to {t=-\infty}, bounding the negative integers from below. So, {X_{\tau_n}\equiv Y_{-n}} is a uniformly integrable sequence. Similarly for {X_{\sigma_n}}. As shown above, this gives the inequality {{\mathbb E}[X_{\tau}-X_{\sigma}]\ge 0}.

As with the argument above for simple stopping times, inequality (2) follows, giving the result {X_\sigma\le{\mathbb E}[X_\tau\vert\mathcal{F}_\sigma]}. ⬜

Similarly, the following optional stopping result shows that the class of cadlag martingales is closed under stopping at arbitrary stopping times.

Theorem 2 Let {X} be a cadlag martingale (resp. submartingale, supermartingale) and {\tau} be a stopping time. Then, {X^\tau} is also a martingale (resp. submartingale, supermartingale).

Proof: As above, it is sufficient to prove the result for submartingales, with the supermartingale and martingale cases following by applying it to {-X}. As {X} is right-continuous and adapted (and hence, progressive), the stopped process {X^\tau} will also be adapted. Choosing times {s<t} and a set {A\in\mathcal{F}_s},

\displaystyle  {\mathbb E}\left[1_A(X^\tau_t-X^\tau_s)\right]={\mathbb E}\left[1_A(X_{(\tau\wedge t)\vee s}-X_s)\right].

Theorem 1 applied to the stopping times {s\le (\tau\wedge t)\vee s} says that this is nonnegative, so {X^\tau} is indeed a submartingale. ⬜


First Exit Time of Standard Brownian motion

As an example of optional stopping, consider the first time {\tau} at which standard Brownian motion {B} exits an interval {[-a,b]}. Here, {a,b} are arbitrary positive real numbers. It must almost surely exit this interval at some time, so {\tau<\infty} almost surely. Indeed, the probability that it is still in the interval at any time {t} is bounded by the probability that the standard normal {B_t/\sqrt{t}} is in the interval {[-a/\sqrt{t},b/\sqrt{t}]}, which goes to zero as {t\rightarrow\infty}.

One question we can ask is, what is the probability that it exits the interval at {b} rather than at {-a}? That is, what is the probability that standard Brownian motion hits {b} before {-a}? As {B^\tau} is a martingale uniformly bounded by {\max(a,b)}, so that {{\mathbb E}[B^\tau_t]=0} for all {t} and dominated convergence applies,

\displaystyle  b{\mathbb P}(B_\tau=b)-a{\mathbb P}(B_\tau=-a)={\mathbb E}[B_\tau]=\lim_{t\rightarrow\infty}{\mathbb E}[B^\tau_t]=0.

Rearranging this gives the following probabilities

\displaystyle  {\mathbb P}(B_\tau=b)=a/(a+b),\ {\mathbb P}(B_\tau=-a)=b/(a+b). (3)

Similarly, the fact that the increment {B_t-B_s} has mean zero and variance {t-s} independently of {\mathcal{F}_s} shows that {B_t^2-t} is a martingale. So, {B^2_{t\wedge\tau}-t\wedge\tau} is also a martingale and, applying dominated convergence to {B^2_{t\wedge\tau}} (which is bounded by {\max(a,b)^2}) and monotone convergence to {t\wedge\tau},

\displaystyle  b^2{\mathbb P}(B_\tau=b)+a^2{\mathbb P}(B_\tau=-a)-{\mathbb E}[\tau]=\lim_{t\rightarrow\infty}{\mathbb E}[B^2_{t\wedge\tau}-t\wedge\tau]=0.

Substituting in (3) gives the expected time that the Brownian motion exits the range

\displaystyle  {\mathbb E}[\tau] = ab. (4)
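Explicitly, substituting (3) into the preceding display gives

\displaystyle  {\mathbb E}[\tau]=b^2\frac{a}{a+b}+a^2\frac{b}{a+b}=\frac{ab(b+a)}{a+b}=ab.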

Finally, note that letting {a} go to infinity in equations (3,4) shows that Brownian motion will eventually hit any value {b>0} with probability one, but the expected time when this first happens is infinite.
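As a sanity check, not part of the original argument, the probabilities (3) and expectation (4) can be approximated by simulating Brownian motion with small Gaussian increments. The step size, trial count, and the values of a and b below are arbitrary choices, and the discretization introduces a small positive bias at the barriers.

```python
import random

# Illustrative Monte Carlo check of (3) and (4): approximate Brownian
# motion by small Gaussian increments, run until the path exits [-a, b].
random.seed(1)
a, b, dt = 1.0, 2.0, 0.0025
sd = dt ** 0.5  # standard deviation of each increment

def exit_sample():
    B, t = 0.0, 0.0
    while -a < B < b:
        B += random.gauss(0.0, sd)
        t += dt
    return B, t

trials = 4000
results = [exit_sample() for _ in range(trials)]
p_b = sum(B >= b for B, _ in results) / trials
mean_tau = sum(t for _, t in results) / trials
print(p_b, a / (a + b))
print(mean_tau, a * b)
```

With a = 1 and b = 2, the estimates should be close to {{\mathbb P}(B_\tau=b)=1/3} and {{\mathbb E}[\tau]=2}.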


Non-Right-Continuous Martingales

The optional sampling theorems stated above have the important precondition that the martingale must have sample paths which are cadlag. It is clearly necessary to choose a good modification of a process in order for these results to hold, since the distribution of an arbitrary process at a continuously distributed time need not be related in any way to its distribution at fixed times. The restriction to cadlag processes is not usually a problem, as the relatively weak condition of right-continuity in probability is sufficient to guarantee the existence of a cadlag modification and, furthermore, if the underlying filtration is right-continuous then every martingale has a cadlag version.

There are, however, some situations where we might want to relax the right-continuity constraint. For instance, if we are looking at martingales with respect to a non-right-continuous filtration or have submartingales or supermartingales which are not guaranteed to be right-continuous in probability, then cadlag versions might not exist. Even so, it is still possible to modify any martingale, submartingale or supermartingale to have left and right limits everywhere, and to be right-continuous everywhere outside of a fixed countable set of times. Fortunately, the results above still hold in this generality.

Theorem 3 Let X be a martingale, submartingale or supermartingale whose sample paths are right-continuous everywhere outside of a fixed countable set of times {S\subset{\mathbb R}_+}. Then, for any bounded stopping times {\sigma\le\tau}, the following are satisfied.

  1. If {X} is a martingale then, {X_\sigma={\mathbb E}\left[X_{\tau}\vert\mathcal{F}_\sigma\right].}
  2. If {X} is a submartingale then, {X_\sigma\le{\mathbb E}\left[X_{\tau}\vert\mathcal{F}_\sigma\right].}
  3. If {X} is a supermartingale then, {X_\sigma\ge{\mathbb E}\left[X_{\tau}\vert\mathcal{F}_\sigma\right].}

Proof: We proceed in a similar way to the proof of Theorem 1 above. Choose sequences of simple stopping times {\sigma_n,\tau_n} decreasing respectively to {\sigma} and {\tau}. Also, as S is countable, we can choose a sequence of finite subsets {S_n\subseteq S} increasing to S. Then define the times

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle\tilde\sigma_n(\omega)&\displaystyle=\begin{cases} \sigma_n(\omega),&\textrm{if }\sigma(\omega)\not\in S_n,\smallskip\\ \sigma(\omega),&\textrm{if }\sigma(\omega)\in S_n. \end{cases}\smallskip\\ \displaystyle\tilde\tau_n(\omega)&\displaystyle=\begin{cases} \tau_n(\omega),&\textrm{if }\tau(\omega)\not\in S_n,\smallskip\\ \tau(\omega),&\textrm{if }\tau(\omega)\in S_n. \end{cases} \end{array}

These are again simple stopping times decreasing to {\sigma} and {\tau} respectively. Writing

\displaystyle  \left\{\omega\colon\tilde\sigma_n(\omega)\le t\right\}=\left\{\omega\colon\sigma_n(\omega)\le t\right\}\cup\left\{\omega\colon\sigma(\omega)\in S_n\cap(0,t]\right\},

which is in {\mathcal{F}_t}, we see that {\tilde\sigma_n} are indeed stopping times, and similarly for {\tilde\tau_n}. Furthermore, whenever {\sigma\in S} we have that {\tilde\sigma_n=\sigma} eventually. Together with right-continuity of the sample paths outside of S, this gives {X_{\tilde\sigma_n}\rightarrow X_\sigma} and, similarly, {X_{\tilde\tau_n}\rightarrow X_\tau}. The result now follows as in Theorem 1. Just considering the case where X is a submartingale,

\displaystyle  {\mathbb E}\left[X_\tau-X_\sigma\right]=\lim_{n\rightarrow\infty}{\mathbb E}\left[X_{\tilde\tau_n}-X_{\tilde\sigma_n}\right]\ge0,

from which the result follows. ⬜

Similarly, the optional stopping result stated in Theorem 2 above carries over to the more general situation.

Theorem 4 Let X be a martingale (resp. submartingale or supermartingale) whose sample paths are right-continuous everywhere outside of a fixed countable set of times {S\subset{\mathbb R}_+}. Then, for any stopping time {\tau}, {X^\tau} is also a martingale (resp. submartingale, supermartingale).

Proof: We first need to show that {X^\tau} is adapted. The proof of Theorem 3 constructed simple stopping times {\tau_n\downarrow\tau} such that {X_{\tau_n}\rightarrow X_\tau}. Then,

\displaystyle  X^\tau_t=\lim_{n\rightarrow\infty}1_{\{\tau_n < t\}}X_{\tau_n}+1_{\{\tau \ge t\}}X_t.

As {\tau_n} is a simple stopping time taking only finitely many values, {1_{\{\tau_n < t\}}X_{\tau_n}=\sum_{s<t}1_{\{\tau_n=s\}}X_s} is a finite sum of {\mathcal{F}_t}-measurable terms and, hence, is itself {\mathcal{F}_t}-measurable. So, {X^\tau} is adapted.

The remainder of the proof is identical to that given above for Theorem 2, except that we apply Theorem 3 instead of 1. ⬜

13 thoughts on “Optional Sampling”

  1. Hello Mr. Lowther

    I’ve been working on a Brownian motion problem that requires me to use the joint distribution of two distinct hitting times. The distribution of a single hitting time can easily be computed via an appropriate superposition of the Brownian motion with its reflection at the barrier. By contrast, to achieve the same for the “double” case I reckon (from applications of the method of images to analogous cases I researched) that an infinite series of reflections is needed (just like two mirrors facing each other generate an infinite series of reflected images, whereas one only generates one image). Do you have any better ideas how to approach the problem?

    Best regards

    1. Tom,

      Yes, you need to do something like that. I’m assuming that you are wanting the hitting times Ta, Tb of some levels a > 0 > b (if a and b had the same sign then it is easy). To get the joint distribution, you want to calculate P(Ta ≤ s, Tb ≤ t). In the case s ≤ t, you can split this into two terms. P(max(Ta,Tb) ≤ s) is the probability of hitting either barrier by time s and can be calculated by double reflection. You can see my answer to this math.SE question, or this other question might help. This is a well-known problem in finance, and searching for double barrier option pricing formula might also help. The other term, P(Ta ≤ s < Tb ≤ t), can be calculated by first calculating the joint distribution of {Ta ≤ s < Tb} and Bs. Then, calculate the probability of Tb ≤ t conditional on Bs and s < Tb. This only requires using the single-barrier solution. Then, integrating over the possible values of Bs should give what you want. Hope that helps!

      George

    1. With probability 1 it’s equal to the hitting time of the set {-a,b} because, following that, BM crosses the horizontal line i.o. in any interval to the right, hence escapes the interval [-a,b].

  2. A typo is spotted in the third line of the last section “Non-Right-Continuous Martingales”, where “in probably” shall be “in probability”?

    1. You need to assume that you have a good version of the process. For example, let \tau be any random time without atoms (i.e., \mathbb{P}(\tau=t)=0 for all t). Define the process X_t=1 if t=\tau and X_t=0 otherwise. Then, X_t is almost surely 0 at each deterministic time t, yet X_\tau is almost surely equal to 1. More generally, X_\tau can be anything, and need not be measurable, even when X_t is almost surely 0 at each t.

  3. Before Eq. (3), what led to the conclusion \lim_{t \rightarrow \infty} \mathbb{E}(B_t^{\tau}) = 0? Also, if I just define \tau to be the first hitting time of a, certainly \mathbb{E} B_{\tau} = a \neq 0, so what makes the difference here?
