# Almost Sure

## 31 August 10

### Zero-Hitting and Failure of the Martingale Property

For nonnegative local martingales, there is an interesting symmetry between the failure of the martingale property and the possibility of hitting zero, which I will describe now. I will also give a necessary and sufficient condition for solutions to a certain class of stochastic differential equations to hit zero in finite time and, using the aforementioned symmetry, infer a necessary and sufficient condition for the processes to be proper martingales. It is often the case that solutions to SDEs are clearly local martingales, but it is hard to tell whether they are proper martingales. So, the martingale condition, given in Theorem 4 below, is a useful result to know. The method described here is relatively new to me, only coming up while preparing the previous post. There, it was noted that the failure of the martingale property for solutions to the SDE ${dX=X^c\,dB}$ with ${c>1}$ is related to the fact that, for ${c<1}$, the process hits zero. This idea extends to all continuous and nonnegative local martingales. The Girsanov transform method applied here is essentially the same as that used by Carlos A. Sin (Complications with stochastic volatility models, Adv. in Appl. Probab., Vol. 30, No. 1, 1998, pp. 256-268) and B. Jourdain (Loss of martingality in asset price models with lognormal stochastic volatility, Preprint CERMICS, 2004-267).

Consider nonnegative solutions to the stochastic differential equation

 $\displaystyle \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle dX=a(X)X\,dB,\smallskip\\ &\displaystyle X_0=x_0, \end{array}$ (1)

where ${a\colon{\mathbb R}_+\rightarrow{\mathbb R}}$, B is a Brownian motion and the fixed initial condition ${x_0}$ is strictly positive. The multiplier X in the coefficient of dB ensures that if X ever hits zero then it stays there. By time-change methods, uniqueness in law is guaranteed as long as a is nonzero and ${a^{-2}}$ is locally integrable on ${(0,\infty)}$. Consider also the following SDE,

 $\displaystyle \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle dY=\tilde a(Y)Y\,dB,\smallskip\\ &\displaystyle Y_0=y_0,\smallskip\\ &\displaystyle \tilde a(y) = a(y^{-1}),\ y_0=x_0^{-1} \end{array}$ (2)

Being integrals with respect to Brownian motion, solutions to (1) and (2) are local martingales. It is possible for them to fail to be proper martingales though, and they may or may not hit zero at some time. These possibilities are related by the following result.

**Theorem 1** Suppose that (1) and (2) satisfy uniqueness in law. Then, X is a proper martingale if and only if Y never hits zero. Similarly, Y is a proper martingale if and only if X never hits zero.
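To see the theorem in action in the one case where everything is explicit, take ${a(x)=x}$, so that (1) reads ${dX=X^2\,dB}$. Then ${\tilde a(y)=a(y^{-1})=y^{-1}}$ and (2) becomes ${dY=dB}$: Y is simply a Brownian motion started at ${y_0=x_0^{-1}}$ and absorbed at zero. Brownian motion hits zero with probability one, so Theorem 1 says that X cannot be a proper martingale. The following sketch (not part of the original post; the horizon ${t=4}$ and grid size are arbitrary choices) checks the zero-hitting probability of Y numerically against the reflection-principle formula.

```python
import numpy as np
from math import sqrt
from statistics import NormalDist

# With a(x) = x, the dual process Y of (2) is a Brownian motion started at
# y0 = 1/x0 and absorbed at zero.  Estimate the probability that Y hits
# zero by time t, and compare with the reflection-principle value
# P(hit by t) = 2*Phi(-y0/sqrt(t)).

rng = np.random.default_rng(1)
y0, t, n_paths, n_steps = 1.0, 4.0, 20_000, 2_000
dt = t / n_steps

y = np.full(n_paths, y0)
running_min = y.copy()
for _ in range(n_steps):
    y += sqrt(dt) * rng.standard_normal(n_paths)
    np.minimum(running_min, y, out=running_min)  # track the path minimum

frac_hit = np.mean(running_min <= 0.0)
exact = 2 * NormalDist().cdf(-y0 / sqrt(t))
print(f"simulated {frac_hit:.3f} vs exact {exact:.3f}")
```

As ${t\rightarrow\infty}$ the hitting probability tends to one, which is the zero-hitting half of the equivalence; the corresponding failure of the martingale property for ${dX=X^2\,dB}$ is checked in the post below.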

## 16 August 10

### Failure of the Martingale Property

In this post, I give an example of a class of processes which can be expressed as integrals with respect to Brownian motion, but are not themselves martingales. As stochastic integration preserves the local martingale property, such processes are guaranteed to be at least local martingales. However, this is not enough to conclude that they are proper martingales. Whereas constructing examples of local martingales which are not martingales is a relatively straightforward exercise, such examples are often slightly contrived and the martingale property fails for obvious reasons (e.g., double-loss betting strategies). The aim here is to show that the martingale property can fail for very simple stochastic differential equations which are likely to be met in practice, and it is not always obvious when this situation arises.

Consider the following stochastic differential equation

 $\displaystyle \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle dX = aX^c\,dB + bX\,dt,\smallskip\\ &\displaystyle X_0=x, \end{array}$ (1)

for a nonnegative process X. Here, B is a Brownian motion and a,b,c,x are positive constants. This is a common SDE appearing, for example, in the constant elasticity of variance model for option pricing. Now consider the following question: what is the expected value of X at time t?

The obvious answer seems to be that ${{\mathbb E}[X_t]=xe^{bt}}$, based on the idea that X has growth rate b on average. A more detailed argument is to write out (1) in integral form

 $\displaystyle X_t=x+\int_0^t aX_s^c\,dB_s+ \int_0^t bX_s\,ds.$ (2)

The next step is to note that the first integral is with respect to Brownian motion, so has zero expectation. Therefore,

$\displaystyle {\mathbb E}[X_t]=x+\int_0^tb{\mathbb E}[X_s]\,ds.$

This can be differentiated to obtain the ordinary differential equation ${d{\mathbb E}[X_t]/dt=b{\mathbb E}[X_t]}$, which has the unique solution ${{\mathbb E}[X_t]={\mathbb E}[X_0]e^{bt}}$.

In fact, this argument is false. For ${c\le1}$ there is no problem, and ${{\mathbb E}[X_t]=xe^{bt}}$ as expected. However, for all ${c>1}$ the conclusion is wrong, and the strict inequality ${{\mathbb E}[X_t]<xe^{bt}}$ holds.

The point where the argument above falls apart is the statement that the first integral in (2) has zero expectation. This would indeed follow if it were known to be a martingale, as is often assumed to be true for stochastic integrals with respect to Brownian motion. However, stochastic integration preserves the local martingale property and not, in general, the martingale property itself. If ${c>1}$ then we have exactly this situation: only the local martingale property holds. The first integral in (2) is not a proper martingale, and has strictly negative expectation at all positive times. The reason that the martingale property fails for ${c>1}$ is that the coefficient ${aX^c}$ of dB grows too fast in X.
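For comparison, the naive argument does give the right answer when ${c=1}$. In that case (1) is geometric Brownian motion with the explicit solution ${X_t=x\exp((b-a^2/2)t+aB_t)}$, and ${{\mathbb E}[X_t]=xe^{bt}}$ holds exactly. A quick Monte Carlo sketch confirms this (the parameter values here are arbitrary choices, not from the post):

```python
import numpy as np

# Sanity check of the c = 1 case.  Geometric Brownian motion can be
# sampled exactly from its closed-form solution
#   X_t = x * exp((b - a^2/2) t + a B_t),
# and the naive argument is valid: E[X_t] = x * exp(b t).

rng = np.random.default_rng(2)
a, b, x, t = 0.5, 0.1, 1.0, 1.0
n = 400_000

B_t = np.sqrt(t) * rng.standard_normal(n)
X_t = x * np.exp((b - a**2 / 2) * t + a * B_t)

mc_mean = X_t.mean()
expected = x * np.exp(b * t)
print(f"Monte Carlo {mc_mean:.4f} vs x*e^(bt) = {expected:.4f}")
```

It is only for ${c>1}$, where the volatility coefficient grows superlinearly, that the expectation falls strictly below ${xe^{bt}}$.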

In this post, I will mainly be concerned with the special case of (1) with a=1 and zero drift.

 $\displaystyle \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle dX=X^c\,dB,\smallskip\\ &\displaystyle X_0=x. \end{array}$ (3)

The general form (1) can be reduced to this special case, as I describe below. SDEs (1) and (3) do have unique solutions, as I will prove later. As X is a nonnegative local martingale, if it ever hits zero then it must remain there (0 is an absorbing boundary).

The solution X to (3) has the following properties, which will be proven later in this post.

• If ${c\le1}$ then X is a martingale and, for ${c<1}$, it eventually hits zero with probability one.

• If ${c>1}$ then X is a strictly positive local martingale but not a martingale. In fact, the following inequality holds
 $\displaystyle {\mathbb E}[X_t\mid\mathcal{F}_s]<X_s$ (4)

(almost surely) for times ${s<t}$. Furthermore, for any positive constant ${p<2c-1}$, ${{\mathbb E}[X_t^p]}$ is bounded over ${t\ge0}$ and tends to zero as ${t\rightarrow\infty}$.
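The ${c>1}$ behaviour can be checked numerically in the one case with a simple closed form (a sketch; this representation is standard but is not derived until later in the post). For ${c=2}$, the solution of ${dX=X^2\,dB}$ started at x is the reciprocal of a 3-dimensional Bessel process started at ${x^{-1}}$, so ${X_t}$ can be sampled exactly as ${1/\lvert W_t+(x^{-1},0,0)\rvert}$ with W a 3-dimensional Brownian motion:

```python
import numpy as np

# For c = 2, X_t = 1 / |W_t + (1/x, 0, 0)| with W a 3-dimensional Brownian
# motion (the inverse Bessel process).  Sampling this exactly shows E[X_t]
# strictly below the starting point x, and decaying towards zero.

rng = np.random.default_rng(3)
x, n = 1.0, 500_000

def mean_X(t):
    w = np.sqrt(t) * rng.standard_normal((n, 3))
    w[:, 0] += 1.0 / x
    return np.mean(1.0 / np.linalg.norm(w, axis=1))

means = [mean_X(t) for t in (1.0, 4.0, 16.0)]
print(means)  # strictly below x, decreasing towards zero
```

This is consistent with inequality (4) (taking expectations gives ${{\mathbb E}[X_t]<x}$) and with the decay of the moments described above.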

## 2 June 10

### Failure of Pathwise Integration for FV Processes

Figure 1: A non-pathwise stochastic integral of an FV Process

The motivation for developing a theory of stochastic integration is that many important processes — such as standard Brownian motion — have sample paths which are extraordinarily badly behaved. With probability one, the path of a Brownian motion is nowhere differentiable and has infinite variation over all nonempty time intervals. This rules out the application of the techniques of ordinary calculus. In particular, the Stieltjes integral can be applied with respect to integrators of finite variation, but fails to give a well-defined integral with respect to Brownian motion. The Ito stochastic integral was developed to overcome this difficulty, at the cost of restricting the integrand to be an adapted process and of losing pathwise convergence in the dominated convergence theorem (convergence in probability holds instead).

However, as I demonstrate in this post, the stochastic integral represents a strict generalization of the pathwise Lebesgue-Stieltjes integral even for processes of finite variation. That is, if V has finite variation, then there can still be predictable integrands ${\xi}$ such that the integral ${\int\xi\,dV}$ is undefined as a Lebesgue-Stieltjes integral on the sample paths, but is well-defined in the Ito sense.

## 1 June 10

### Stochastic Calculus Examples and Counterexamples

Filed under: Examples and Counterexamples,Stochastic Calculus — George Lowther @ 3:00 PM

I have been posting my stochastic calculus notes on this blog for some time, and they have now reached a reasonable level of sophistication. The basics of stochastic integration with respect to local martingales and general semimartingales have been introduced from a rigorous mathematical standpoint, and important results such as Ito’s lemma, the Ito isometry, preservation of the local martingale property, and existence of solutions to stochastic differential equations have been covered.

I will now start to also post examples demonstrating results from stochastic calculus, as well as counterexamples showing how the methods can break down when the required conditions are not quite met. As well as knowing precise mathematical statements and understanding how to prove them, I generally feel that it can be just as important to understand the limits of the results and how they can break down. Knowing good counterexamples can help with this. In stochastic calculus, especially, many statements have quite subtle conditions which, if dropped, invalidate the whole result. In particular, measurability and integrability conditions are often required in subtle ways. Knowing some counterexamples can help to understand these issues.

## 25 October 09

### Integrating with respect to Brownian motion

Filed under: Stochastic Calculus — George Lowther @ 9:01 PM

In this post I attempt to give a rigorous definition of integration with respect to Brownian motion (as introduced by Itô in 1944), while keeping it as concise as possible. The stochastic integral can also be defined for a much more general class of processes called semimartingales. However, as Brownian motion is such an important special case which can be handled directly, I start with this as the subject of this post. If ${\{X_s\}_{s\ge 0}}$ is a standard Brownian motion defined on a probability space ${(\Omega,\mathcal{F},\mathop{\mathbb P})}$ and ${\alpha_s}$ is a stochastic process, the aim is to define the integral

 $\displaystyle \int_0^t\alpha_s\,dX_s.$ (1)

In ordinary calculus, this can be approximated by Riemann sums, which converge for continuous integrands whenever the integrator ${X}$ is of finite variation. This leads to the Riemann-Stieltjes integral and, generalizing to measurable integrands, the Lebesgue-Stieltjes integral. Unfortunately, this method does not work for Brownian motion which, as discussed in my previous post, has infinite variation over all nontrivial compact intervals.

The standard approach is to start by writing out the integral explicitly for piecewise constant integrands. If there are times ${0=t_0\le t_1\le\cdots\le t_n=t}$ such that ${\alpha_s=\alpha_{t_{k-1}}}$ for each ${s\in(t_{k-1},t_k)}$ then the integral is given by the summation,

 $\displaystyle \int_0^t\alpha\,dX = \sum_{k=1}^n\alpha_{t_{k-1}}(X_{t_k}-X_{t_{k-1}}).$ (2)
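Formula (2) is just a finite sum, so it is easy to evaluate directly. As a small sketch (the integrand here is an arbitrary choice for illustration), take ${t=1}$ with grid ${0=t_0<t_1=1/2<t_2=1}$ and ${\alpha}$ equal to 1 on ${(0,1/2]}$ and 2 on ${(1/2,1]}$:

```python
import numpy as np

# Evaluate formula (2) for a concrete simple integrand:
#   int_0^1 alpha dX = 1*(X_{1/2} - X_0) + 2*(X_1 - X_{1/2}).
# The two Brownian increments are independent N(0, 1/2) variables, so the
# integral is Gaussian with mean 0 and variance 1^2*(1/2) + 2^2*(1/2) = 2.5,
# which is exactly int_0^1 alpha^2 ds.

rng = np.random.default_rng(4)
n = 200_000

dX1 = np.sqrt(0.5) * rng.standard_normal(n)  # X_{1/2} - X_0
dX2 = np.sqrt(0.5) * rng.standard_normal(n)  # X_1 - X_{1/2}
integral = 1.0 * dX1 + 2.0 * dX2

print(integral.mean(), integral.var())
```

The fact that the variance of the integral equals ${\int_0^t\alpha^2\,ds}$ for simple integrands is a first glimpse of the Ito isometry.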

We could try to extend to more general integrands by approximating by piecewise constant processes but, as mentioned above, Brownian motion has infinite variation paths and this will diverge in general.

Fortunately, when working with random processes, there are a couple of observations which improve the chances of being able to consistently define the integral. They are

• The integral is not a single real number, but is instead a random variable defined on the probability space. It therefore only has to be defined up to a set of zero probability and not on every possible path of ${X}$.
• Rather than requiring limits of integrals to converge for each path of ${X}$ (e.g., dominated convergence), the much weaker convergence in probability can be used.

These observations are still not enough, and the main insight is to only look at integrands which are adapted. That is, the value of ${\alpha_t}$ can only depend on ${X}$ through its values at prior times. This condition is met in most situations where we need to use stochastic calculus, such as with (forward) stochastic differential equations. To make this rigorous, for each time ${t\ge 0}$ let ${\mathcal{F}_t}$ be the sigma-algebra generated by ${X_s}$ for all ${s\le t}$. This is a filtration (${\mathcal{F}_s\subseteq\mathcal{F}_t}$ for ${s\le t}$), and ${(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge 0},\mathop{\mathbb P})}$ is referred to as a filtered probability space. Then, ${\alpha}$ is adapted if ${\alpha_t}$ is ${\mathcal{F}_t}$-measurable for all times ${t}$. Piecewise constant and left-continuous processes, such as ${\alpha}$ in (2), which are also adapted are commonly referred to as simple processes.

However, as with standard Lebesgue integration, we must further impose a measurability property. A stochastic process ${\alpha}$ can be viewed as a map from the product space ${{\mathbb R}_+\times\Omega}$ to the real numbers, given by ${(t,\omega)\mapsto\alpha_t(\omega)}$. It is said to be jointly measurable if it is measurable with respect to the product sigma-algebra ${\mathcal{B}({\mathbb R}_+)\otimes\mathcal{F}}$, where ${\mathcal{B}}$ refers to the Borel sigma-algebra. Finally, it is called progressively measurable, or just progressive, if its restriction to ${[0,t]\times\Omega}$ is ${\mathcal{B}([0,t])\otimes\mathcal{F}_t}$-measurable for each positive time ${t}$. It is easily shown that progressively measurable processes are adapted, and the simple processes introduced above are progressive.

With these definitions, the stochastic integral of a progressively measurable process ${\alpha}$ with respect to Brownian motion ${X}$ is defined whenever ${\int_0^t\alpha_s^2\,ds<\infty}$ almost surely (that is, with probability one). The integral (1) is a random variable, defined uniquely up to sets of zero probability by the following two properties.

• The integral agrees with the explicit formula (2) for simple integrands.
• If ${\alpha^n}$ and ${\alpha}$ are progressive processes such that ${\int_0^t(\alpha^n-\alpha)^2\,ds}$ tends to zero in probability as ${n\rightarrow\infty}$, then
 $\displaystyle \int_0^t\alpha^n\,dX\rightarrow\int_0^t\alpha\,dX,$ (3)

where, again, convergence is in probability.
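These two properties can be seen at work numerically. As a sketch (the grid sizes are arbitrary choices), approximate the adapted integrand ${\alpha=X}$ by the simple processes fixed at the left endpoints of an n-point grid; since ${\int_0^t(\alpha^n-\alpha)^2\,ds\rightarrow0}$, property (3) says the simple-process sums (2) converge in probability, and the limit ${\int_0^t X\,dX=(X_t^2-t)/2}$ is the standard value given by Ito's lemma:

```python
import numpy as np

# Left-endpoint simple-process approximations of int_0^t X dX.  The sums
# from formula (2) should approach the exact value (X_t^2 - t)/2 on each
# sampled path as the grid is refined.

rng = np.random.default_rng(5)
t, n_paths = 1.0, 4_000

def left_sum_error(n_steps):
    dt = t / n_steps
    dX = np.sqrt(dt) * rng.standard_normal((n_paths, n_steps))
    X = np.cumsum(dX, axis=1)
    # Values of X at the left endpoints t_0, ..., t_{n-1} (with X_0 = 0).
    X_left = np.hstack([np.zeros((n_paths, 1)), X[:, :-1]])
    sums = np.sum(X_left * dX, axis=1)
    X_t = X[:, -1]
    return np.mean(np.abs(sums - (X_t**2 - t) / 2))

errs = [left_sum_error(n) for n in (10, 100, 1000)]
print(errs)  # mean absolute error shrinking as the grid is refined
```

Note that it is crucial that the left endpoints are used: the integrand is then a simple adapted process, exactly as in the definition above.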

## 8 October 09

### The Pathological Properties of Brownian Motion

Filed under: Stochastic processes — George Lowther @ 1:49 AM

> I turn away with fear and horror from the lamentable plague of continuous functions which do not have derivatives. – Charles Hermite (1893)

Despite being of central importance to the theory of stochastic processes and to many applications in areas such as physics and economics, Brownian motion has some nasty properties such as being nowhere differentiable, which are in stark contrast to the usual well-behaved functions studied in elementary differential calculus. As I intend to post entries on stochastic calculus, it seems that a good place to start is by describing some of the properties of Brownian motion which rule out the use of the standard techniques of differential calculus. Strictly speaking, these properties should not really be regarded as pathological although they can seem so to someone not familiar with such processes and would have been regarded as such at the time of Hermite’s statement above.

Historically, the term ‘Brownian motion’ refers to the experiments performed by Robert Brown in 1827, where pollen and dust particles floating on the surface of water were observed to move about with a jittery motion. This was explained mathematically by Albert Einstein in 1905 and Marian Smoluchowski in 1906, and is caused by the particles being continuously bombarded by water molecules. Louis Bachelier also studied the mathematical properties of Brownian motion in 1900, applying it to the evolution of stock prices.

Mathematically, Brownian motion is a stochastic process whose increments are independent and identically distributed random variables, and which has continuous sample paths. In the case of the random motion of particles due to collisions with water molecules, as in the experiments performed by Robert Brown, each bombardment by a molecule does not produce a sudden change in the position of the particle. Instead, it produces a sudden change in the particle’s velocity. So mathematical Brownian motion as described here is better used as a model of the velocity of the particle rather than its position (even better, the velocity can be modeled by an Ornstein-Uhlenbeck process). More generally, it is used as a source of random noise in many models of physical and economic systems. It is also referred to as a Wiener process, after Norbert Wiener, and is often represented using a capital W.
