The number of times that a process passes upwards or downwards through an interval is referred to as the number of upcrossings and, respectively, the number of downcrossings of the process.
Consider a process $X$ whose time index $t$ runs through an index set $\mathbb{T}\subseteq\mathbb{R}$. For real numbers $a<b$, the number of upcrossings of $X$ across the interval $[a,b]$ is the supremum of the nonnegative integers $n$ such that there exist times $s_k,t_k\in\mathbb{T}$ satisfying

$$s_1<t_1<s_2<t_2<\cdots<s_n<t_n \qquad{\rm (1)}$$

and for which $X_{s_k}<a<b<X_{t_k}$. The number of upcrossings is denoted by $U[a,b]$, which is either a nonnegative integer or is infinite. Similarly, the number of downcrossings, denoted by $D[a,b]$, is the supremum of the nonnegative integers $n$ such that there are times $s_k,t_k\in\mathbb{T}$ satisfying (1) and such that $X_{s_k}>b>a>X_{t_k}$.
Note that between any two upcrossings there is a downcrossing and, similarly, there is an upcrossing between any two downcrossings. It follows that $U[a,b]$ and $D[a,b]$ can differ by at most 1, and they are either both finite or both infinite.
The significance of the upcrossings of a process to convergence results is due to the following criterion for convergence of a sequence.
Theorem 1 A sequence $x_1,x_2,\ldots$ converges to a limit in the extended real numbers $\bar{\mathbb{R}}=[-\infty,\infty]$ if and only if the number of upcrossings $U[a,b]$ is finite for all $a<b$.
Proof: Denoting the infimum limit and supremum limit by

$$l=\liminf_{n\to\infty}x_n,\qquad u=\limsup_{n\to\infty}x_n,$$

then $l\le u$ and the sequence converges to a limit in $\bar{\mathbb{R}}$ if and only if $l=u$.
First, suppose that the sequence converges. If $u<b$ then there is an $N$ such that $x_n<b$ for all $n\ge N$. So, all upcrossings of $[a,b]$ must start before time $N$, and we may conclude that $U[a,b]$ is finite. On the other hand, if $u\ge b$ then $l=u\ge b>a$ and we can infer that $x_n>a$ for all $n\ge N$ and some $N$. Again, this gives $U[a,b]<\infty$.
Conversely, suppose that the sequence does not converge, so that $l<u$. Then choose $a<b$ in the interval $(l,u)$. For any integer $N$, there is then an $s>N$ such that $x_s<a$ and a $t>N$ with $x_t>b$. This allows us to define infinite sequences $s_k,t_k$ by $s_1=\min\{n\ge1\colon x_n<a\}$ and

$$t_k=\min\left\{n>s_k\colon x_n>b\right\},\qquad s_{k+1}=\min\left\{n>t_k\colon x_n<a\right\}$$

for $k\ge1$. Clearly, $s_1<t_1<s_2<t_2<\cdots$ and $x_{s_k}<a<b<x_{t_k}$ for all $k$, so $U[a,b]=\infty$.
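To make the crossing counts concrete, here is a short Python sketch (my own illustration, not part of the original notes; the function names are arbitrary) that counts completed upcrossings and downcrossings of a finite sequence, following the recursive choice of times $s_k,t_k$ in the proof above.

```python
def upcrossings(x, a, b):
    """Count completed upcrossings of [a, b] by the finite sequence x.

    Scan for a term below a, then a later term above b, and repeat;
    this mirrors the recursive choice of times s_k, t_k in the proof.
    """
    count, below = 0, False
    for v in x:
        if not below and v < a:
            below = True        # found a time s_k with x_{s_k} < a
        elif below and v > b:
            below = False       # found a time t_k with x_{t_k} > b
            count += 1
    return count


def downcrossings(x, a, b):
    """Count completed downcrossings of [a, b] (above b, then below a)."""
    return upcrossings([-v for v in x], -b, -a)
```

A convergent sequence such as $x_n=1/n$ has finitely many upcrossings of every interval, while the oscillating sequence $x_n=(-1)^n$ crosses $[-1/2,1/2]$ repeatedly; in either case the two counts differ by at most 1, as noted above.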
This convergence criterion is particularly suited to applications in stochastic calculus and martingale theory because bounds for the number of upcrossings and downcrossings of a process can be obtained in terms of stochastic integrals.
In these stochastic calculus notes, I am mainly concerned with stochastic processes with time index running through the positive real numbers. However, for the purposes of this post, it is useful to consider arbitrary index sets $\mathbb{T}\subseteq\mathbb{R}$. We work under a probability space $(\Omega,\mathcal{F},\mathbb{P})$ and a filtration of sigma-algebras $(\mathcal{F}_t)_{t\in\mathbb{T}}$. Consider an elementary predictable process of the form

$$\xi=\sum_{k=1}^n Z_k 1_{(s_k,t_k]}$$

for times $s_k\le t_k$ in $\mathbb{T}$ and $\mathcal{F}_{s_k}$-measurable random variables $Z_k$. Its integral with respect to a process $X$ is given by

$$\int\xi\,dX=\sum_{k=1}^n Z_k\left(X_{t_k}-X_{s_k}\right).$$
In particular, this includes the processes $\xi=1_A$ for elementary predictable sets of the form

$$A=\bigcup_{k=1}^n A_k\times(s_k,t_k]\qquad{\rm (2)}$$

for times $s_k\le t_k$ in $\mathbb{T}$ and sets $A_k\in\mathcal{F}_{s_k}$.
The number of upcrossings of an interval $[a,b]$ can be computed as follows. Set $t_0=-\infty$ and define the times $s_k$ and $t_k$ by

$$s_k=\inf\left\{t\in\mathbb{T}\colon t>t_{k-1},\ X_t<a\right\},\qquad t_k=\inf\left\{t\in\mathbb{T}\colon t>s_k,\ X_t>b\right\}.$$

It is easily seen that $X_{s_k}<a<b<X_{t_k}$ whenever $t_k<\infty$ and, if $X_s<a<b<X_t$ for times $s<t$ in $\mathbb{T}$, then $s_k\le s<t\le t_k$ for some $k$. The number of upcrossings of $[a,b]$ is therefore equal to the maximum $n$ such that $t_n<\infty$. So, $U[a,b]=\max\{n\colon t_n<\infty\}$ and, letting $\xi$ be the elementary process $\xi=\sum_k 1_{(s_k,t_k]}$ gives the following inequality (see also the figure above),

$$(b-a)U[a,b]\le\int\xi\,dX+(X_T-a)^-,$$

where the index set $\mathbb{T}$ is assumed finite and $T=\max\mathbb{T}$.
The process $\xi$ takes only the values 0 and 1, so is equal to $1_A$ for an elementary predictable set $A$ of the form (2). This gives the following bound for the upcrossings of a process which, by the results of the previous post, is particularly suited to martingale convergence results.
Lemma 2 Let $X$ be a stochastic process with time running over the finite index set $\mathbb{T}$, and set $T=\max\mathbb{T}$. Then there exists an elementary set $A$ of the form (2) such that

$$(b-a)U[a,b]\le\int 1_A\,dX+(X_T-a)^-.\qquad{\rm (3)}$$
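The construction of the times $s_k,t_k$ and the resulting pathwise inequality of Lemma 2 can be checked numerically. The following is my own sketch (not from the notes), assuming a finite index set $\{0,1,\ldots\}$ so that the infima are minima and $T$ is the last index.

```python
def crossing_times(x, a, b):
    """Return times s_1 < t_1 < s_2 < t_2 < ... where s_k is the first
    time after t_{k-1} with x < a, and t_k the first time after s_k
    with x > b; a final unfinished upcrossing is truncated at the
    last index T."""
    s, t, last = [], [], -1
    n = len(x)
    while True:
        sk = next((i for i in range(last + 1, n) if x[i] < a), None)
        if sk is None:
            break
        s.append(sk)
        tk = next((i for i in range(sk + 1, n) if x[i] > b), None)
        if tk is None:
            t.append(n - 1)   # truncate the unfinished upcrossing at T
            break
        t.append(tk)
        last = tk
    return s, t


def check_upcrossing_bound(x, a, b):
    """Pathwise check of (b - a) U[a,b] <= integral of 1_A dX + (X_T - a)^-."""
    s, t = crossing_times(x, a, b)
    integral = sum(x[tk] - x[sk] for sk, tk in zip(s, t))
    U = sum(1 for tk in t if x[tk] > b)       # completed upcrossings only
    slack = max(a - x[-1], 0.0)               # the term (X_T - a)^-
    return (b - a) * U, integral + slack
```

Each completed upcrossing contributes more than $b-a$ to the integral, and the truncated final term is no worse than $-(X_T-a)^-$, so the left-hand value never exceeds the right-hand one.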
The considerations above lead to the following bound on the expected number of upcrossings of a supermartingale.
Lemma 3 (Doob’s upcrossing lemma) Let $X$ be a supermartingale with time running through a countable index set $\mathbb{T}$. The number of upcrossings of any interval $a<b$ satisfies

$$(b-a)\mathbb{E}\left[U[a,b]\right]\le\sup_{t\in\mathbb{T}}\mathbb{E}\left[(X_t-a)^-\right].$$
Proof: It is enough to prove this for finite index sets. The result for countably infinite $\mathbb{T}$ then follows by taking a sequence of finite sets increasing to $\mathbb{T}$ and applying monotone convergence.
So, suppose that $\mathbb{T}$ is finite and set $T=\max\mathbb{T}$. Equation (3) can be applied,

$$(b-a)\mathbb{E}\left[U[a,b]\right]\le\mathbb{E}\left[\int 1_A\,dX\right]+\mathbb{E}\left[(X_T-a)^-\right]\le\mathbb{E}\left[(X_T-a)^-\right].$$

The second inequality follows from the fact that $X$ is a supermartingale (equivalently, $-X$ is a submartingale) and $1_A$ is a bounded nonnegative elementary predictable process, so $\mathbb{E}\left[\int 1_A\,dX\right]\le0$.
Martingale convergence is a consequence of the upcrossing lemma. The following says that any $L^1$-bounded martingale in discrete time converges almost surely.
Theorem 4 (Doob’s Forward Convergence Theorem) Let $X_1,X_2,\ldots$ be a martingale (or submartingale, or supermartingale) such that $\mathbb{E}\left[|X_n|\right]$ is bounded over all $n$. Then, with probability one, the limit $X_\infty=\lim_{n\to\infty}X_n$ exists and is finite.
Proof: It is enough to consider the case where $X$ is a supermartingale, as the submartingale case then follows by applying the result to $-X$. In this case, Lemma 3 shows that the expected number of upcrossings of any interval $a<b$ is finite, so $U[a,b]<\infty$ almost surely. By countable additivity, $U[a,b]<\infty$ simultaneously for all rational $a<b$ (and therefore, for all real $a<b$) with probability one. Consequently, by Theorem 1, the limit $X_\infty=\lim_{n\to\infty}X_n$ exists in the extended real numbers, and Fatou’s lemma gives

$$\mathbb{E}\left[|X_\infty|\right]\le\liminf_{n\to\infty}\mathbb{E}\left[|X_n|\right]<\infty,$$

so $X_\infty$ is almost surely finite.
A process $X$ is said to be uniformly integrable if the set of random variables $\{X_t\colon t\in\mathbb{T}\}$ is uniformly integrable. This case is particularly nice.
Corollary 5 Let $X_1,X_2,\ldots$ be a uniformly integrable martingale and let $\mathcal{F}_\infty=\sigma\left(\bigcup_n\mathcal{F}_n\right)$ be the sigma-algebra at infinity. Then, there is an almost-surely unique and integrable $\mathcal{F}_\infty$-measurable random variable $X_\infty$ such that $X_n=\mathbb{E}\left[X_\infty\mid\mathcal{F}_n\right]$ for each $n$. Furthermore, $X_n\to X_\infty$ as $n\to\infty$, with probability one.
Proof: First, uniform integrability implies that $X$ is $L^1$-bounded so, by Doob’s forward convergence theorem, we can set $X_\infty=\lim_{n\to\infty}X_n$, which exists with probability one. By uniform integrability, this limit also holds in $L^1$ and it follows that $X_n=\mathbb{E}\left[X_m\mid\mathcal{F}_n\right]$ converges to $\mathbb{E}\left[X_\infty\mid\mathcal{F}_n\right]$ as $m\to\infty$. So, $X_\infty$ satisfies the required properties.

Suppose $Y$ is any other random variable satisfying the required properties. Then, $\mathbb{E}\left[1_A X_\infty\right]=\mathbb{E}\left[1_A X_n\right]=\mathbb{E}\left[1_A Y\right]$ for any $n$ and $A\in\mathcal{F}_n$. By the monotone class theorem, this extends to all $A\in\mathcal{F}_\infty$. As $X_\infty$ and $Y$ are $\mathcal{F}_\infty$-measurable, it follows that they are almost surely equal.
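As a toy illustration of Corollary 5 (my own example, not from the notes): take three fair coin flips, let $\mathcal{F}_n$ be generated by the first $n$ flips, and let $X_\infty$ be the total number of heads. The martingale $X_n=\mathbb{E}[X_\infty\mid\mathcal{F}_n]$ is then computed by averaging over the unseen flips, and it reaches $X_\infty$ exactly at time 3.

```python
from itertools import product

HORIZON = 3  # number of coin flips; F_n is generated by the first n flips


def x_infty(flips):
    """Terminal random variable: total number of heads (a bounded,
    hence uniformly integrable, choice)."""
    return sum(flips)


def cond_exp(prefix):
    """X_n = E[X_infty | F_n], evaluated on the event that the first
    n = len(prefix) flips took the given values (0 or 1 each):
    average x_infty over all equally likely completions."""
    rest = HORIZON - len(prefix)
    values = [x_infty(list(prefix) + list(tail))
              for tail in product([0, 1], repeat=rest)]
    return sum(values) / len(values)
```

For instance, `cond_exp(())` returns the unconditional mean $3/2$, and the martingale property holds at every node: `cond_exp(p)` equals the average of `cond_exp(p + (0,))` and `cond_exp(p + (1,))`.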
The condition that $X$ is $L^1$-bounded in Doob’s forward convergence theorem is automatically satisfied in many cases. In particular, if $X$ is a non-negative supermartingale then $\mathbb{E}\left[|X_n|\right]=\mathbb{E}\left[X_n\right]\le\mathbb{E}\left[X_1\right]$ for all $n$, so $\mathbb{E}\left[|X_n|\right]$ is bounded, giving the following corollary.
Corollary 6 Let $X$ be a non-negative martingale (or supermartingale). Then, with probability one, the limit $X_\infty=\lim_{n\to\infty}X_n$ exists and is finite.
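For a quick numerical illustration of Corollary 6 (again my own sketch): the product $M_n=U_1U_2\cdots U_n$ of IID factors taking the values 0 and 2 with probability $1/2$ each is a non-negative martingale with $\mathbb{E}[M_n]=1$, and it converges almost surely to 0, since some factor is eventually zero.

```python
import random

random.seed(1)  # arbitrary seed, for reproducibility

# Non-negative martingale: M_{n+1} = M_n * U_{n+1} with U = 0 or 2
# equally likely, so E[M_{n+1} | F_n] = M_n and E[M_n] = 1 for all n.
M = [1.0]
for _ in range(200):
    M.append(M[-1] * random.choice([0.0, 2.0]))

# Once any factor is 0, the path is absorbed at 0 forever.
```

Note that the almost-sure limit is 0 while $\mathbb{E}[M_n]=1$ for every $n$: the limit need not have the same mean, which also shows that this martingale is not uniformly integrable.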
As an example application of the martingale convergence theorem, it is easy to show that a standard random walk started at $0$ will visit every level with probability one.
Corollary 7 Let $X$ be a standard random walk. That is, $X_0=0$ and

$$\mathbb{P}\left(X_{n+1}=X_n+1\mid\mathcal{F}_n\right)=\mathbb{P}\left(X_{n+1}=X_n-1\mid\mathcal{F}_n\right)=\frac12.$$

Then, for every integer $a$, with probability one, $X_n=a$ for some $n$.
Proof: Without loss of generality, suppose that $a\le0$ (otherwise, apply the argument to $-X$). Let $T$ be the first time $n$ for which $X_n=a$. It is easy to see that the stopped process $X^T$ defined by $X^T_n=X_{\min(n,T)}$ is a martingale and $X^T-a$ is non-negative. Therefore, by the martingale convergence theorem, the limit $X^T_\infty=\lim_{n\to\infty}X^T_n$ exists and is finite (almost surely). In particular, $|X^T_{n+1}-X^T_n|$ converges to $0$ and must be less than $1$ for large $n$. However, $|X^T_{n+1}-X^T_n|=1$ whenever $n<T$, so we have $T<\infty$ and therefore $X_n=a$ for some $n$.
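The argument can be illustrated by a small simulation (my own sketch; the target level $a=-3$ and the step budget are arbitrary choices). A long run of a simple symmetric random walk will, with very high though not certain probability within the budget, hit the target level, after which the stopped path is frozen.

```python
import random

random.seed(7)  # arbitrary seed, for reproducibility

a = -3                       # target level; any fixed integer works
path = [0]                   # X_0 = 0
for _ in range(10_000):      # step budget for the simulation
    if path[-1] == a:
        break                # stopped process X^T is frozen at level a
    path.append(path[-1] + random.choice([-1, 1]))

hit = path[-1] == a          # whether the walk reached a in time
```

By construction the stopped path never goes below $a$, and every unstopped step has size exactly 1, matching the facts used in the proof.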
Lévy’s Upwards and Downwards Theorems
A very useful application of martingale convergence is to the following results concerning limits of conditional expectations along an increasing or decreasing sequence of sigma-algebras.
Theorem 8 (Lévy’s upward theorem) Let $(\Omega,\mathcal{F},\mathbb{P})$ be a probability space and $\mathcal{F}_1\subseteq\mathcal{F}_2\subseteq\cdots$ be an increasing sequence of sub-sigma-algebras of $\mathcal{F}$. Set $\mathcal{F}_\infty=\sigma\left(\bigcup_n\mathcal{F}_n\right)$. For any integrable random variable $X$, the following limit holds with probability one,

$$\mathbb{E}\left[X\mid\mathcal{F}_n\right]\to\mathbb{E}\left[X\mid\mathcal{F}_\infty\right]$$

as $n\to\infty$. This limit also holds under $L^1$-convergence.
Theorem 9 (Lévy’s downward theorem) Let $(\Omega,\mathcal{F},\mathbb{P})$ be a probability space and $\mathcal{F}_1\supseteq\mathcal{F}_2\supseteq\cdots$ be a decreasing sequence of sub-sigma-algebras of $\mathcal{F}$. Set $\mathcal{F}_\infty=\bigcap_n\mathcal{F}_n$. For any integrable random variable $X$, the following limit holds with probability one,

$$\mathbb{E}\left[X\mid\mathcal{F}_n\right]\to\mathbb{E}\left[X\mid\mathcal{F}_\infty\right]$$

as $n\to\infty$. This limit also holds under $L^1$-convergence.
The upwards theorem is nothing more than a direct application of the convergence theorem for uniformly integrable martingales. The downwards theorem follows in a similar way.
For each negative integer $n$, set $\mathcal{G}_n=\mathcal{F}_{-n}$ and $M_n=\mathbb{E}\left[X\mid\mathcal{G}_n\right]$. This is a martingale with respect to the filtration $(\mathcal{G}_n)_{n\le-1}$ and, by Doob’s upcrossing lemma, almost surely has finitely many upcrossings of any interval $a<b$. So the limit $M_{-\infty}=\lim_{n\to-\infty}M_n$ exists almost-surely. By uniform integrability of conditional expectations, this must also converge in the $L^1$-norm. In particular, $M_{-\infty}$ is integrable and, as each $M_n$ with $n\le-m$ is $\mathcal{F}_m$-measurable, the limit is $\mathcal{F}_\infty$-measurable. Taking limits as $n\to-\infty$ of the equality $\mathbb{E}\left[1_A M_n\right]=\mathbb{E}\left[1_A X\right]$, for sets $A\in\mathcal{F}_\infty$, gives

$$\mathbb{E}\left[1_A M_{-\infty}\right]=\mathbb{E}\left[1_A X\right],$$

so $M_{-\infty}=\mathbb{E}\left[X\mid\mathcal{F}_\infty\right]$, proving Lévy’s downwards theorem.
Note that, due to the uniform integrability of conditional expectations, $L^1$-convergence immediately follows from almost-sure convergence in Lévy’s upwards and downwards theorems.
The Strong Law of Large Numbers
Roughly speaking, the strong law of large numbers says that the sample mean of a set of independent and identically distributed random variables tends to the mean, with probability one, as the sample size tends to infinity.
More precisely, this can be stated as follows. Suppose that $X_1,X_2,\ldots$ is a sequence of IID random variables. Set $S_n=X_1+X_2+\cdots+X_n$. If the random variables are integrable with mean $\mu=\mathbb{E}\left[X_k\right]$ then

$$\frac{S_n}{n}\to\mu$$

with probability one as $n\to\infty$. There is also a weak law of large numbers which states that the limit above holds in probability. However, as convergence in probability always follows from almost sure convergence, the weak law is a direct consequence of the strong law.
I mention the strong law as an example of a well-known result which is quite hard to prove from elementary ideas, but follows easily from martingale convergence. The strong law actually follows from Lévy’s downwards theorem.
For each $n$, set $\mathcal{F}_{-n}=\sigma(S_n,S_{n+1},S_{n+2},\ldots)$. Symmetry under permuting $X_1,\ldots,X_n$ gives

$$\mathbb{E}\left[X_k\mid\mathcal{F}_{-n}\right]=\mathbb{E}\left[X_1\mid\mathcal{F}_{-n}\right]$$

for all $k\le n$. Taking the average of these terms,

$$\mathbb{E}\left[X_1\mid\mathcal{F}_{-n}\right]=\frac1n\sum_{k=1}^n\mathbb{E}\left[X_k\mid\mathcal{F}_{-n}\right]=\mathbb{E}\left[S_n/n\mid\mathcal{F}_{-n}\right]=\frac{S_n}{n}.$$

So, setting $\mathcal{F}_{-\infty}=\bigcap_n\mathcal{F}_{-n}$, Lévy’s downwards theorem shows that $S_n/n\to\mathbb{E}\left[X_1\mid\mathcal{F}_{-\infty}\right]$ with probability one, and this limit is a random variable with mean $\mathbb{E}\left[X_1\right]=\mu$. The final step is to apply Kolmogorov’s zero-one law. This implies that the limit of $S_n/n$ is almost-surely constant, so must equal $\mu$.
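The strong law is also easy to observe numerically. The following sketch (my own; the Uniform(0,1) distribution and the checkpoints are arbitrary choices) tracks the sample mean of IID draws, which should settle near $\mu=1/2$.

```python
import random

random.seed(0)  # arbitrary seed, for reproducibility

# Running sample means S_k / k of IID Uniform(0,1) variables, mu = 0.5.
n = 100_000
total = 0.0
running_means = {}
for k in range(1, n + 1):
    total += random.random()          # IID Uniform(0,1) draw
    if k in (100, 10_000, n):
        running_means[k] = total / k  # record S_k / k at checkpoints
```

The recorded means fluctuate around $1/2$ with typical deviation of order $1/\sqrt{k}$, in line with the almost-sure convergence $S_n/n\to\mu$.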