Almost Sure

27 December 11

Compensators of Counting Processes

A counting process, X, is defined to be an adapted stochastic process starting from zero which is piecewise constant and right-continuous with jumps of size 1. That is, letting {\tau_n} be the first time at which {X_t=n}, then

\displaystyle  X_t=\sum_{n=1}^\infty 1_{\{\tau_n\le t\}}.

By the debut theorem, {\tau_n} are stopping times. So, X is an increasing integer valued process counting the arrivals of the stopping times {\tau_n}. A basic example of a counting process is the Poisson process, for which {X_t-X_s} has a Poisson distribution independently of {\mathcal{F}_s}, for all times {t > s}, and for which the gaps {\tau_n-\tau_{n-1}} between the stopping times are independent exponentially distributed random variables. As we will see, although Poisson processes are just one specific example, every quasi-left-continuous counting process can actually be reduced to the case of a Poisson process by a time change. As always, we work with respect to a complete filtered probability space {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge0},{\mathbb P})}.
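
As a quick illustration of the connection between the exponential gaps and Poisson counts, here is a short Python sketch (not from the original post; it assumes numpy, and the rate, horizon and number of sample paths are arbitrary illustrative choices). It builds the arrival times as cumulative sums of independent exponential gaps and checks that the number of arrivals by time t has mean and variance close to {\lambda t}, as a Poisson random variable should.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, n_paths = 2.0, 5.0, 100_000  # illustrative rate, horizon and sample size

# Gaps tau_n - tau_{n-1} are independent Exp(lam); arrival times are their
# cumulative sums, and X_t counts the arrivals up to time t.  Fifty gaps per
# path comfortably covers the horizon when lam * t = 10.
gaps = rng.exponential(1.0 / lam, size=(n_paths, 50))
arrivals = np.cumsum(gaps, axis=1)
counts = (arrivals <= t).sum(axis=1)

# For a Poisson process, counts ~ Po(lam * t), so mean and variance are both lam * t.
print(counts.mean(), counts.var(), lam * t)
```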

Note that, as a counting process X has jumps bounded by 1, it is locally integrable and, hence, the compensator A of X exists. This is the unique right-continuous, predictable and increasing process with {A_0=0} such that {X-A} is a local martingale. For example, if X is a Poisson process of rate {\lambda}, then the compensated Poisson process {X_t-\lambda t} is a martingale. So, the compensator of X is the continuous process {A_t=\lambda t}. More generally, X is said to be quasi-left-continuous if {{\mathbb P}(\Delta X_\tau=0)=1} for all predictable stopping times {\tau}, which is equivalent to the compensator of X being almost surely continuous. Another simple example of a counting process is {X=1_{[\tau,\infty)}} for a stopping time {\tau > 0}, in which case the compensator of X is just the compensator of {\tau}.
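
To verify the Poisson example directly, fix times {s < t}. Since {X_t-X_s} is Poisson distributed with mean {\lambda(t-s)} independently of {\mathcal{F}_s},

\displaystyle  {\mathbb E}\left[X_t-\lambda t\,\vert\,\mathcal{F}_s\right]=X_s+{\mathbb E}\left[X_t-X_s\right]-\lambda t=X_s+\lambda(t-s)-\lambda t=X_s-\lambda s,

so {X_t-\lambda t} is indeed a martingale, and {A_t=\lambda t} is increasing, continuous and, being deterministic, predictable.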

As I will show in this post, compensators of quasi-left-continuous counting processes have many parallels with the quadratic variation of continuous local martingales. For example, Lévy’s characterization states that a local martingale X starting from zero is standard Brownian motion if and only if its quadratic variation is {[X]_t=t}. Similarly, as we show below, a counting process is a homogeneous Poisson process of rate {\lambda} if and only if its compensator is {A_t=\lambda t}. It was also shown previously in these notes that a continuous local martingale X has a finite limit {X_\infty=\lim_{t\rightarrow\infty}X_t} if and only if {[X]_\infty} is finite. Similarly, a counting process X has finite value {X_\infty} at infinity if and only if the same is true of its compensator. Another property of a continuous local martingale X is that it is constant over all intervals on which its quadratic variation is constant. Similarly, a counting process X is constant over any interval on which its compensator is constant. Finally, it is known that every continuous local martingale is simply a continuous time change of standard Brownian motion. In the main result of this post (Theorem 5), we show that a similar statement holds for counting processes. That is, every quasi-left-continuous counting process is a continuous time change of a Poisson process of rate 1.
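
To illustrate the time-change result, here is a rough simulation sketch (not part of the post; it assumes numpy, and the intensity {\lambda(t)=2+\cos t}, with compensator {A_t=2t+\sin t}, is just an illustrative choice of a quasi-left-continuous counting process). It generates an inhomogeneous Poisson process by thinning, and checks that the gaps between the time-changed arrival times {A_{\tau_n}} behave like independent standard exponentials, as they should for a Poisson process of rate 1.

```python
import numpy as np

rng = np.random.default_rng(1)
T, bound = 200.0, 3.0  # time horizon and an upper bound on the intensity

def intensity(t):
    # illustrative intensity, bounded between 1 and 3
    return 2.0 + np.cos(t)

def compensator(t):
    # A_t is the integral of the intensity over [0, t]
    return 2.0 * t + np.sin(t)

# Inhomogeneous Poisson process by thinning: candidates form a homogeneous
# rate-`bound` process on [0, T]; each is kept with probability intensity/bound.
n_cand = rng.poisson(bound * T)
candidates = np.sort(rng.uniform(0.0, T, n_cand))
arrivals = candidates[rng.uniform(size=n_cand) < intensity(candidates) / bound]

# Time-change by the compensator: the gaps of A at the arrival times should be
# approximately i.i.d. Exp(1), so their mean and variance are both close to 1.
gaps = np.diff(compensator(arrivals))
print(gaps.mean(), gaps.var())
```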

24 June 10

Poisson Processes

Figure 1: A Poisson process sample path

A Poisson process is a continuous-time stochastic process which counts the arrival of randomly occurring events. Commonly cited examples which can be modeled by a Poisson process include the radioactive decay of atoms and telephone calls arriving at an exchange, in which the numbers of events occurring in consecutive time intervals are assumed to be independent. Being piecewise constant, Poisson processes have very simple pathwise properties. However, they are very important to the study of stochastic calculus and, together with Brownian motion, form one of the building blocks for the much more general class of Lévy processes. I will describe some of their properties in this post.

A random variable N has the Poisson distribution with parameter {\lambda}, denoted by {N\sim{\rm Po}(\lambda)}, if it takes values in the set of nonnegative integers and

\displaystyle  {\mathbb P}(N=n)=\frac{\lambda^n}{n!}e^{-\lambda}\qquad(1)

for each {n\in{\mathbb Z}_+}. The mean and variance of N are both equal to {\lambda}, and the moment generating function can be calculated,

\displaystyle  {\mathbb E}\left[e^{aN}\right] = \exp\left(\lambda(e^a-1)\right),

which is valid for all {a\in{\mathbb C}}. From this, it can be seen that the sum of independent Poisson random variables with parameters {\lambda} and {\mu} is again Poisson with parameter {\lambda+\mu}. The Poisson distribution occurs as a limit of binomial distributions. The binomial distribution with success probability p and m trials, denoted by {{\rm Bin}(m,p)}, is the distribution of the sum of m independent {\{0,1\}}-valued random variables, each with probability p of being 1. Explicitly, if {N\sim{\rm Bin}(m,p)} then

\displaystyle  {\mathbb P}(N=n)=\frac{m!}{n!(m-n)!}p^n(1-p)^{m-n}.

In the limit as {m\rightarrow\infty} and {p\rightarrow 0} such that {mp\rightarrow\lambda}, it can be verified that this tends to the Poisson distribution (1) with parameter {\lambda}.
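
As a quick numerical check of this limit, the following short Python sketch (not from the original post; the values of {\lambda} and m are arbitrary illustrative choices) compares the {{\rm Bin}(m,\lambda/m)} probabilities directly with the Poisson probabilities (1).

```python
from math import comb, exp, factorial

lam, m = 3.0, 1000  # illustrative Poisson parameter and number of trials
p = lam / m

# Compare Bin(m, lam/m) probabilities with the Poisson limit (1); the two
# columns should agree closely for m = 1000.
for n in range(8):
    binom = comb(m, n) * p ** n * (1 - p) ** (m - n)
    poisson = lam ** n / factorial(n) * exp(-lam)
    print(n, round(binom, 6), round(poisson, 6))
```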

Poisson processes are then defined as processes with independent increments and Poisson distributed marginals, as follows.

Definition 1 A Poisson process X of rate {\lambda\ge0} is a cadlag process with {X_0=0} and {X_t-X_s\sim{\rm Po}(\lambda(t-s))} independently of {\{X_u\colon u\le s\}} for all {s\le t}.

An immediate consequence of this definition is that, if X and Y are independent Poisson processes of rates {\lambda} and {\mu} respectively, then their sum {X+Y} is also a Poisson process, of rate {\lambda+\mu}.
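
To see this, note that, for times {s\le t}, the increment {(X+Y)_t-(X+Y)_s} is the sum of the independent increments {X_t-X_s\sim{\rm Po}(\lambda(t-s))} and {Y_t-Y_s\sim{\rm Po}(\mu(t-s))}, so that

\displaystyle  (X+Y)_t-(X+Y)_s\sim{\rm Po}\left((\lambda+\mu)(t-s)\right).

Furthermore, by the independence of X and Y, this increment is independent of {\{X_u,Y_u\colon u\le s\}} and, in particular, of {\{(X+Y)_u\colon u\le s\}}, which is exactly the defining property of a Poisson process of rate {\lambda+\mu}.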
