A stochastic process X is said to have independent increments if $X_t-X_s$ is independent of $\{X_u\colon u\le s\}$ for all $s\le t$. For example, standard Brownian motion is a continuous process with independent increments. Brownian motion also has stationary increments, meaning that the distribution of $X_{t+s}-X_t$ does not depend on t. In fact, as I will show in this post, up to a scaling factor and linear drift term, Brownian motion is the only such process. That is, any continuous real-valued process X with stationary independent increments can be written as

(1) $\displaystyle X_t = X_0 + \sigma B_t + bt$
for a Brownian motion B and constants $\sigma\ge0$, $b\in{\mathbb R}$. This is not so surprising in light of the central limit theorem. The increment of a process across an interval $[s,t]$ can be viewed as the sum of its increments over a large number of small time intervals partitioning $[s,t]$. If these terms are independent with relatively small variance, then the central limit theorem does suggest that their sum should be normally distributed. Together with the previous posts on Lévy’s characterization and stochastic time changes, this provides yet more justification for the ubiquitous position of Brownian motion in the theory of continuous-time processes. Consider, for example, stochastic differential equations such as the Langevin equation. The natural requirement for the stochastic driving term in such equations is that it be continuous with stationary independent increments and, therefore, it can be written in terms of Brownian motion.
The definition of standard Brownian motion extends naturally to multidimensional processes and general covariance matrices. A standard d-dimensional Brownian motion is a continuous process $B=(B^1,\ldots,B^d)$ with stationary independent increments such that $B_t$ has the $N(0,tI)$ distribution for all $t\ge0$. That is, $B_t$ is joint normal with zero mean and covariance matrix $tI$. From this definition, $B_t-B_s$ has the $N(0,(t-s)I)$ distribution independently of $\{B_u\colon u\le s\}$ for all $s\le t$. This definition can be further generalized. Given any $b\in{\mathbb R}^d$ and positive semidefinite $\Sigma\in{\mathbb R}^{d\times d}$, we can consider a d-dimensional process X with continuous paths and stationary independent increments such that $X_t-X_s$ has the $N((t-s)b,(t-s)\Sigma)$ distribution for all $s\le t$. Here, $b$ is the drift of the process and $\Sigma$ is the ‘instantaneous covariance matrix’. Such processes are sometimes referred to as $(b,\Sigma)$-Brownian motions, and all continuous d-dimensional processes starting from zero and with stationary independent increments are of this form.
Theorem 1 Let X be a continuous ${\mathbb R}^d$-valued process with stationary independent increments.
Then, there exist unique $b\in{\mathbb R}^d$ and positive semidefinite $\Sigma\in{\mathbb R}^{d\times d}$ such that $X-X_0$ is a $(b,\Sigma)$-Brownian motion.
This result is a special case of Theorem 2 below. In particular, consider the case of continuous real-valued processes with stationary independent increments. Then, by this result, there are constants $b\in{\mathbb R}$ and $\sigma\ge0$ such that $X_t-X_0$ is normal with mean $bt$ and variance $\sigma^2t$ for all $t\ge0$. As long as X is not a deterministic process, so that $\sigma$ is nonzero, $B_t=\sigma^{-1}(X_t-X_0-bt)$ will be a standard Brownian motion and (1) is satisfied.
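As an illustration (not part of the proof), the scalar decomposition (1) can be checked by simulation: the drift and scale of a process with stationary independent Gaussian increments are recoverable from its increments. A minimal sketch, where the grid size, horizon and parameter values are arbitrary choices:

```python
import numpy as np

# Simulate the increments of X_t = X_0 + sigma*B_t + b*t on a uniform grid
# and recover b and sigma^2. All numerical values are illustrative choices.
rng = np.random.default_rng(0)
n, T = 500_000, 50.0
dt = T / n
b, sigma = 1.5, 0.7

# Independent N(b*dt, sigma^2*dt) increments of X over the grid.
dX = b * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)

b_hat = dX.sum() / T                            # drift estimate, (X_T - X_0)/T
sigma2_hat = np.sum((dX - dX.mean()) ** 2) / T  # realized variance estimate

print(b_hat, sigma2_hat)  # close to b = 1.5 and sigma^2 = 0.49
```

The estimates converge as the horizon grows (for the drift) and as the mesh shrinks (for the variance), mirroring the central limit heuristic in the text.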
It is also possible to define Gaussian processes with independent but non-stationary increments. Consider continuous functions $b\colon{\mathbb R}_+\to{\mathbb R}^d$ and $\Sigma\colon{\mathbb R}_+\to{\mathbb R}^{d\times d}$ with $b_0=0$, $\Sigma_0=0$, and such that $\Sigma$ is increasing in the sense that $\Sigma_t-\Sigma_s$ is positive semidefinite for all $s\le t$. Then, there will exist processes X with the independent increments property and such that $X_t-X_s$ has the $N(b_t-b_s,\Sigma_t-\Sigma_s)$ distribution for all $s\le t$. This exhausts the space of continuous d-dimensional processes with independent increments.
Theorem 2 Let X be a continuous ${\mathbb R}^d$-valued process with the independent increments property.
Then, there exist (unique, continuous) functions $b\colon{\mathbb R}_+\to{\mathbb R}^d$ and $\Sigma\colon{\mathbb R}_+\to{\mathbb R}^{d\times d}$ with $b_0=0$, $\Sigma_0=0$ such that $X_t-X_0$ has the $N(b_t,\Sigma_t)$ distribution for all $t\ge0$.
Note, in particular, that if the increments of X are also stationary, then $b_{t+s}-b_t$ and $\Sigma_{t+s}-\Sigma_t$ will be independent of t for each fixed $s\ge0$. It follows that $b_t=tb$ and $\Sigma_t=t\Sigma$ for some $b\in{\mathbb R}^d$ and positive semidefinite $\Sigma\in{\mathbb R}^{d\times d}$. Theorem 1 is then a direct consequence of this result.
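The step from stationarity to linearity can be spelled out as a Cauchy functional equation; a short sketch (the labels $b_1$, $\Sigma_1$ are just the time-one values):

```latex
% Stationarity: the law of X_{t+s}-X_t does not depend on t, so the
% mean and covariance functions are additive,
b_{t+s} = b_t + b_s, \qquad \Sigma_{t+s} = \Sigma_t + \Sigma_s .
% Continuous additive functions are linear, hence
b_t = t\,b_1, \qquad \Sigma_t = t\,\Sigma_1 ,
% and taking b = b_1, \Sigma = \Sigma_1 gives Theorem 1 from Theorem 2.
```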
Before moving on to the proof of Theorem 2, I should point out that there do indeed exist well-defined processes with the required distributions. First, for the stationary increments case, consider $b\in{\mathbb R}^d$ and positive semidefinite $\Sigma\in{\mathbb R}^{d\times d}$. Letting $\Sigma=\sigma\sigma^{\rm T}$ be the Cholesky decomposition and B be a d-dimensional Brownian motion,

$\displaystyle X_t = \sigma B_t + bt$

is easily seen to have independent increments with $X_t-X_s$ having the $N((t-s)b,(t-s)\Sigma)$ distribution. More generally, consider continuous functions $b\colon{\mathbb R}_+\to{\mathbb R}^d$ and $\Sigma\colon{\mathbb R}_+\to{\mathbb R}^{d\times d}$ with $b_0=0$, $\Sigma_0=0$ and such that $\Sigma_t-\Sigma_s$ is positive semidefinite for $s\le t$. If $\Sigma$ is absolutely continuous, so that $\Sigma_t=\int_0^tC_s\,ds$ for some measurable $C\colon{\mathbb R}_+\to{\mathbb R}^{d\times d}$, then X can similarly be expressed in terms of a d-dimensional Brownian motion B. As $\Sigma$ is increasing, $C_t$ will be positive semidefinite for almost all t. Letting $C_t=Q_tQ_t^{\rm T}$ be its Cholesky decomposition,

(2) $\displaystyle X_t = \int_0^t Q_s\,dB_s + b_t$
satisfies the required properties. The term $\int Q\,dB$ here is to be interpreted as matrix multiplication, $(\int Q\,dB)^i=\sum_j\int Q^{ij}\,dB^j$. First,

$\displaystyle \int_0^t\left(Q^{ij}_s\right)^2\,ds \le \int_0^t C^{ii}_s\,ds = \Sigma^{ii}_t$

is finite, so $Q^{ij}$ is indeed $B^j$-integrable. The integral is also normally distributed with independent increments. If Q is piecewise constant then this follows from the fact that linear combinations of joint normal random variables are normal, and the case for general deterministic integrands follows by taking limits. The covariance matrix of $X_t-X_s$ can be computed using the Ito isometry,

$\displaystyle {\rm Cov}\left(X^i_t-X^i_s,X^j_t-X^j_s\right) = {\mathbb E}\left[\sum_k\int_s^tQ^{ik}_u\,dB^k_u\;\sum_l\int_s^tQ^{jl}_u\,dB^l_u\right] = \sum_k\int_s^tQ^{ik}_uQ^{jk}_u\,du = \Sigma^{ij}_t-\Sigma^{ij}_s.$

This identity made use of the covariations $[B^k,B^l]_t=\delta_{kl}t$. So, the process given by (2) does indeed have independent increments, with $X_t-X_s$ having the $N(b_t-b_s,\Sigma_t-\Sigma_s)$ distribution.
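A quick numerical sketch of the constant-coefficient case of this construction, using a Cholesky factor to build a process with prescribed drift and instantaneous covariance (all parameter values are arbitrary illustrative choices):

```python
import numpy as np

# Given a drift b and positive definite Sigma with Cholesky factor sigma
# (Sigma = sigma sigma^T), the process X_t = sigma B_t + b t has stationary
# independent N((t-s)b, (t-s)Sigma) increments. We check the time-t marginal.
rng = np.random.default_rng(1)
b = np.array([0.5, -1.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
sigma = np.linalg.cholesky(Sigma)  # lower triangular, sigma @ sigma.T == Sigma

t, n_paths = 2.0, 100_000
B_t = np.sqrt(t) * rng.standard_normal((n_paths, 2))  # B_t ~ N(0, t*I)
X_t = B_t @ sigma.T + t * b                           # X_t ~ N(t*b, t*Sigma)

print(X_t.mean(axis=0))  # approximately t*b
print(np.cov(X_t.T))     # approximately t*Sigma
```

The empirical mean and covariance of the samples match $tb$ and $t\Sigma$ up to Monte Carlo error.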
Finally, in the general case, a deterministic time change can be applied to force $\Sigma$ to be absolutely continuous. Define $\theta\colon{\mathbb R}_+\to{\mathbb R}_+$ by $\theta_t=t+\sum_i\Sigma^{ii}_t$. This is continuous and strictly increasing, so has a continuous inverse $\theta^{-1}$. By positive semidefiniteness of $\Sigma_t-\Sigma_s$,

$\displaystyle \left|\Sigma^{ij}_t-\Sigma^{ij}_s\right| \le \tfrac12\left(\Sigma^{ii}_t-\Sigma^{ii}_s+\Sigma^{jj}_t-\Sigma^{jj}_s\right) \le \theta_t-\theta_s$

for $s\le t$. So, $\tilde\Sigma\equiv\Sigma\circ\theta^{-1}$ is Lipschitz continuous, hence absolutely continuous, and, as described above, it is possible to construct a continuous process $\tilde X$ with independent increments from a standard d-dimensional Brownian motion such that $\tilde X_t-\tilde X_s$ has the $N(\tilde b_t-\tilde b_s,\tilde\Sigma_t-\tilde\Sigma_s)$ distribution for all $s\le t$, where $\tilde b\equiv b\circ\theta^{-1}$. Then, $X=\tilde X\circ\theta$ has the required properties.
Proof of the Theorem
Assume that X is a continuous process with independent increments. If it can be shown that $X_t-X_0$ is joint normal for all $t\ge0$, then Theorem 2 will follow by setting

$\displaystyle b_t = {\mathbb E}\left[X_t-X_0\right],\qquad \Sigma_t = {\rm Cov}\left(X_t-X_0\right).$

By continuity of X, these are continuous functions. Furthermore, $\Sigma_t-\Sigma_s$ is the covariance matrix of $X_t-X_s$ and must be positive semidefinite. In fact, it is enough to compute the characteristic function of $X_t-X_0$ for all $a\in{\mathbb R}^d$,

(3) $\displaystyle \phi_t(a) \equiv {\mathbb E}\left[e^{ia\cdot(X_t-X_0)}\right].$
The characteristic function of $X_t-X_s$ can be recovered from $\phi$ by applying the independent increments property,

$\displaystyle \phi_t(a) = {\mathbb E}\left[e^{ia\cdot(X_s-X_0)}\right]{\mathbb E}\left[e^{ia\cdot(X_t-X_s)}\right].$

So, the distribution of $X_t-X_s$ is determined by

(4) $\displaystyle {\mathbb E}\left[e^{ia\cdot(X_t-X_s)}\right] = \phi_t(a)/\phi_s(a).$
Then, the proof of Theorem 2 requires showing that $\phi_t(a)$ has the form of the characteristic function of a normal distribution (for each fixed t). That is, it is the exponential of a quadratic in a, $\phi_t(a)=\exp\left(ia\cdot b_t-\frac12a^{\rm T}\Sigma_ta\right)$.
It is possible to prove the theorem directly, by splitting $X_t-X_0$ up into small time increments,

$\displaystyle X_t-X_0 = \sum_{k=1}^n\left(X_{t_k}-X_{t_{k-1}}\right)$

for $0=t_0\le t_1\le\cdots\le t_n=t$. Letting the mesh of this partition go to zero, it is possible to show that only terms up to second order in a contribute in the limit. This does involve a tricky argument, taking care to correctly bound the higher order terms.
An alternative approach, which I take here, is to use stochastic calculus. Up to a martingale term, Ito’s lemma enables us to write the logarithm of the characteristic function in terms of X and a quadratic variation term. Then, taking expectations will give the desired quadratic form for the logarithm of $\phi_t(a)$.
As always, we will work with respect to a filtered probability space $(\Omega,\mathcal F,\{\mathcal F_t\}_{t\ge0},{\mathbb P})$. In particular, if $\{\mathcal F_t\}$ is the natural filtration of a process X with the independent increments property then, for $s\le t$, $X_t-X_s$ will be independent of $\mathcal F_s$. This will be assumed throughout the remainder of this post.
Let us start by showing that the characteristic functions of X have well-defined and continuous logarithms which, in particular, requires that $\phi_t(a)$ be everywhere nonzero. On top of the independent increments property, only continuity in probability of X is required. That is, $X_{t_n}\to X_t$ in probability for all sequences of times $t_n$ tending to t. This is a much weaker condition than pathwise continuity.
Lemma 3 Let X be a d-dimensional process which is continuous in probability and has independent increments. Then, there exists a unique continuous function $\psi\colon{\mathbb R}_+\times{\mathbb R}^d\to{\mathbb C}$ with $\psi_0(a)=0$ and

(5) $\displaystyle {\mathbb E}\left[e^{ia\cdot(X_t-X_0)}\right] = e^{\psi_t(a)}.$

Furthermore,

(6) $\displaystyle e^{ia\cdot(X_t-X_0)-\psi_t(a)}$

is a martingale for each fixed $a\in{\mathbb R}^d$.
Proof: First, the function $\phi\colon{\mathbb R}_+\times{\mathbb R}^d\to{\mathbb C}$ defined by (3) will be continuous. Indeed, if $t_n\to t$ and $a_n\to a$ then $a_n\cdot(X_{t_n}-X_0)$ tends to $a\cdot(X_t-X_0)$ in probability and, by bounded convergence, $\phi_{t_n}(a_n)\to\phi_t(a)$. We need to take its logarithm, for which it is necessary to show that it is never zero.
Suppose that $\phi_t(a)=0$ for some t, a. By continuity, for the given value of a, t can be chosen to be minimal. From the definition, $\phi_0(a)=1$, so t is strictly positive. By the independent increments property, for all $s\le t$,

$\displaystyle \phi_t(a) = \phi_s(a)\,{\mathbb E}\left[e^{ia\cdot(X_t-X_s)}\right].$

By minimality of t, $\phi_s(a)$ is nonzero for $s<t$. Also, by continuity in probability, the second term on the right hand side tends to 1 as s increases to t, so is also nonzero for s close enough to t. So, $\phi_t(a)\not=0$, giving the required contradiction.
We have shown that $\phi$ is a continuous function from ${\mathbb R}_+\times{\mathbb R}^d$ to ${\mathbb C}\setminus\{0\}$. It is a standard result from algebraic topology that ${\mathbb C}$ is the covering space of ${\mathbb C}\setminus\{0\}$ with respect to the map $z\mapsto e^z$ and, therefore, as ${\mathbb R}_+\times{\mathbb R}^d$ is simply connected, $\phi$ has a unique continuous lift $\psi$ with $\psi_0(0)=0$. That is, $\phi=e^\psi$.
More explicitly, $\psi$ can be constructed as follows. For any positive constants T, K, the continuity of $\phi$ implies that there are times $0=t_0<t_1<\cdots<t_n=T$ such that $|\phi_t(a)/\phi_{t_{k-1}}(a)-1|<1$ for all t in the interval $[t_{k-1},t_k]$ and $|a|\le K$. So, $\phi_t(a)/\phi_{t_{k-1}}(a)$ lies in the right half-plane of ${\mathbb C}$. As the complex logarithm is uniquely defined as a continuous function ${\rm Log}$ on this region, satisfying ${\rm Log}\,1=0$, $\psi$ uniquely extends from $[0,t_{k-1}]\times\{|a|\le K\}$ to $[0,t_k]\times\{|a|\le K\}$ by

$\displaystyle \psi_t(a) = \psi_{t_{k-1}}(a) + {\rm Log}\left(\phi_t(a)/\phi_{t_{k-1}}(a)\right).$

So $\psi$ is uniquely defined over $[0,T]\times\{|a|\le K\}$ and, by letting T, K increase to infinity, it is uniquely defined on all of ${\mathbb R}_+\times{\mathbb R}^d$.
It only remains to show that (6) is a martingale, which follows from (4) and the independent increments property: for $s\le t$,

$\displaystyle {\mathbb E}\left[e^{ia\cdot(X_t-X_0)-\psi_t(a)}\,\middle|\,\mathcal F_s\right] = e^{ia\cdot(X_s-X_0)-\psi_t(a)}{\mathbb E}\left[e^{ia\cdot(X_t-X_s)}\right] = e^{ia\cdot(X_s-X_0)-\psi_s(a)}.\qquad\square$
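The explicit construction of the continuous logarithm can be mirrored numerically: sample a nonvanishing continuous function on a grid fine enough that successive ratios stay in the right half-plane, then accumulate principal logarithms of the ratios. A minimal sketch (the test function and grid are arbitrary choices):

```python
import numpy as np

def continuous_log(phi):
    """Continuous logarithm along a sampled nonvanishing path.

    Assumes the grid is fine enough that each ratio phi[k]/phi[k-1]
    lies in the right half-plane, where np.log (the principal branch)
    is continuous. Normalized so that psi[0] = Log(phi[0]).
    """
    ratios = phi[1:] / phi[:-1]
    return np.log(phi[0]) + np.concatenate(([0.0], np.cumsum(np.log(ratios))))

t = np.linspace(0.0, 1.0, 2001)
phi = np.exp(4j * np.pi * t)  # winds twice around 0; a principal log would jump

psi = continuous_log(phi)
print(psi[-1])  # close to 4*pi*i, not the principal value 0
```

The accumulated logarithm tracks the total winding of $\phi$, exactly as the stepwise extension in the proof does.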
Next, we would like to write the characteristic function of the increments in terms of $\psi$ which, by (4) and (5), is

$\displaystyle {\mathbb E}\left[e^{ia\cdot(X_t-X_s)}\right] = e^{\psi_t(a)-\psi_s(a)}$

for $s\le t$. With the aid of Ito’s lemma, it is possible to take logarithms of (6). This shows that, up to a deterministic process, X is a semimartingale, and also gives an expression for $\psi$ up to a martingale term.
Lemma 4 Let X be a continuous d-dimensional process with independent increments and $\psi$ be as in (5).
Then, there exists a continuous $b\colon{\mathbb R}_+\to{\mathbb R}^d$ such that $X-b$ is a semimartingale. Furthermore,

(7) $\displaystyle ia\cdot(X_t-X_0) - \tfrac12[a\cdot X]_t - \psi_t(a)$

is a square integrable martingale, for all $a\in{\mathbb R}^d$.
The proof of this makes use of complex-valued semimartingales, which are complex-valued processes whose real and imaginary parts are both semimartingales. It is easily checked that Ito’s lemma holds for complex semimartingales, simply by applying the result to the real and imaginary parts separately.
Proof: Fixing an $a\in{\mathbb R}^d$, set $U_t=e^{ia\cdot(X_t-X_0)-\psi_t(a)}$. Then, by Lemma 3, U is a martingale and, hence, a semimartingale. Then, by Ito’s lemma, $\log U_t=ia\cdot(X_t-X_0)-\psi_t(a)$ is a semimartingale. Note that, although the logarithm is not a well-defined twice differentiable function everywhere on ${\mathbb C}\setminus\{0\}$, this is true locally (in fact, on any half plane), so there is no problem in applying Ito’s lemma here.
We have shown that $Y_t\equiv ia\cdot(X_t-X_0)-\psi_t(a)$ is a semimartingale. Taking imaginary parts, $a\cdot(X_t-X_0)-{\rm Im}\,\psi_t(a)$ is a semimartingale. In particular, writing $a=e_k$ where $e_k$ is the unit vector along the k’th dimension, then $t\mapsto{\rm Im}\,\psi_t(e_k)$ is a continuous function from ${\mathbb R}_+$ to ${\mathbb R}$ and $X^k_t-{\rm Im}\,\psi_t(e_k)$ is a semimartingale. So, $b^k_t\equiv{\rm Im}\,\psi_t(e_k)$ defines a continuous function with $X-b$ a semimartingale.
Applying Ito’s lemma again, this time to $U=e^Y$,

$\displaystyle dU = U\,dY + \tfrac12U\,d[Y].$

As $|U_t|=e^{-{\rm Re}\,\psi_t(a)}$ with ${\rm Re}\,\psi$ continuous, U is uniformly bounded over any finite time interval and, in particular, is a square integrable martingale. Similarly, $U^{-1}$ is uniformly bounded on any finite time interval, so

(8) $\displaystyle M_t \equiv Y_t + \tfrac12[Y]_t = \int_0^tU_s^{-1}\,dU_s$

is also a square integrable martingale.
Now, Y can be written as $ia\cdot(X_t-X_0)-V_t$ for the process $V_t=\psi_t(a)$, which is both a semimartingale (being a difference of semimartingales) and a deterministic process. So, the elementary integrals $\int_0^t\xi\,dV$ must be bounded in probability over the set of all piecewise-constant processes $|\xi|\le1$ and, being deterministic, they are uniformly bounded. Therefore, V has bounded variation over each bounded time interval. We have shown that Y is $ia\cdot(X-X_0)$ plus an FV process and, recalling that continuous FV processes do not contribute to quadratic variations,

$\displaystyle [Y]_t = [ia\cdot X]_t = -[a\cdot X]_t.$

Substituting this and the definition of Y back into (8) shows that expression (7) is the square integrable martingale M. $\square$
Finally, taking expectations of (7) gives the required form for $\psi_t(a)$, showing that $X_t-X_0$ is a joint normal random variable for each $t\ge0$, and completing the proof of Theorem 2.
Lemma 5 Let X be a continuous d-dimensional process with independent increments, and $\psi$ be as in (5). Then, there are functions $b\colon{\mathbb R}_+\to{\mathbb R}^d$ and $\Sigma\colon{\mathbb R}_+\to{\mathbb R}^{d\times d}$ such that

$\displaystyle \psi_t(a) = ia\cdot b_t - \tfrac12a^{\rm T}\Sigma_ta.$
Proof: Taking the imaginary part of (7) shows that $a\cdot(X_t-X_0)-{\rm Im}\,\psi_t(a)$ is a martingale. In particular, $X_t$ is integrable and, taking expectations,

$\displaystyle {\rm Im}\,\psi_t(a) = a\cdot{\mathbb E}\left[X_t-X_0\right].$

Taking the real part of (7) shows that $-\frac12[a\cdot X]_t-{\rm Re}\,\psi_t(a)$ is a martingale. So, the $[a\cdot X]$ are integrable processes and

$\displaystyle {\rm Re}\,\psi_t(a) = -\tfrac12{\mathbb E}\left[[a\cdot X]_t\right] = -\tfrac12\sum_{j,k}a_ja_k{\mathbb E}\left[[X^j,X^k]_t\right].$

The result follows by taking $b_t={\mathbb E}[X_t-X_0]$ and $\Sigma^{jk}_t={\mathbb E}\left[[X^j,X^k]_t\right]$. $\square$
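As an illustrative sanity check (not part of the proof), the conclusion can be verified numerically in one dimension by comparing an empirical characteristic function with $e^{\psi_t(a)}$, where $\psi_t(a)=iab_t-\frac12a^2\Sigma_t$ with $b_t=bt$ and $\Sigma_t=\sigma^2t$. All parameter values below are arbitrary choices:

```python
import numpy as np

# For X_t - X_0 = sigma*B_t + b*t, the empirical characteristic function
# E[exp(i*a*(X_t - X_0))] should match exp(i*a*b*t - a^2*sigma^2*t/2).
rng = np.random.default_rng(2)
b, sigma, t = 0.3, 1.2, 1.5
n_paths = 200_000

X_t = b * t + sigma * np.sqrt(t) * rng.standard_normal(n_paths)

errs = []
for a in (0.5, 1.0, 2.0):
    empirical = np.mean(np.exp(1j * a * X_t))
    exact = np.exp(1j * a * b * t - a ** 2 * sigma ** 2 * t / 2)
    errs.append(abs(empirical - exact))

print(errs)  # all of order 1/sqrt(n_paths)
```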
Let Y and Z be two Brownian motions and X = pY + (1−p²)^0.5 Z the process, where p is between −1 and 1. Assume X is continuous and has marginal distributions N(0,t). Is X a Brownian motion?
Another similar example… if Z is a normal N(0,1), the process X(t) = t^0.5 Z is continuous and marginally distributed as a normal N(0,t). But is X a Brownian motion?
I am confused how to prove the independent increments property or how to verify it. Any suggestions?
Comment by Quants Trader — 22 April 11 @ 3:48 PM 
Hi. You don’t need any advanced results to consider the examples you mention. If Y, Z are independent Brownian motions then $X=pY+\sqrt{1-p^2}Z$ will be a Brownian motion. This just uses the fact that a sum of independent normals is normal, so you can calculate the distribution of X. If you aren’t assuming that they are independent then it will depend on precisely what you are assuming, and X does not have to be a Brownian motion in general.
The process $X_t=\sqrt{t}Z$ is not a Brownian motion, as its increments are all proportional to Z, so are not independent.
Comment by George Lowther — 25 April 11 @ 1:53 AM 
Hi, is your definition equivalent to the one commonly used:
For any $n$ and any times $0<s_1<t_1<\ldots<s_n<t_n$, the random variables $\{X_{t_i}-X_{s_i}\}$ are independent?
Comment by LimitSuperior — 3 May 18 @ 5:31 PM 