Almost Sure

8 November 09

Filtrations and Adapted Processes

In the previous post I introduced the concept of stochastic processes and their modifications. It is necessary to introduce a further concept to represent the information available at each time. A filtration {\{\mathcal{F}_t\}_{t\ge 0}} on a probability space {(\Omega,\mathcal{F},{\mathbb P})} is a collection of sub-sigma-algebras of {\mathcal{F}} satisfying {\mathcal{F}_s\subseteq\mathcal{F}_t} whenever {s\le t}. The idea is that {\mathcal{F}_t} represents the set of events observable by time {t}. The probability space together with the filtration, {(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge 0},{\mathbb P})}, is called a filtered probability space.
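On a finite sample space, each sigma-algebra in a filtration corresponds to a partition of {\Omega}, and the nesting {\mathcal{F}_s\subseteq\mathcal{F}_t} says that the later partition refines the earlier one. The following sketch (a toy coin-toss example of my own, not from the text) checks this refinement property:

```python
# Sketch: a filtration on a finite sample space, modelled by partitions of
# Omega -- a finer partition corresponds to a larger sigma-algebra.
# All names here are illustrative assumptions, not from the post.

def refines(finer, coarser):
    """True if every block of `finer` sits inside some block of `coarser`,
    i.e. sigma(coarser) is a sub-sigma-algebra of sigma(finer)."""
    return all(any(f <= c for c in coarser) for f in finer)

# Omega = outcomes of two coin tosses; F_t = what is observable after t tosses.
omega = {"HH", "HT", "TH", "TT"}
F0 = [frozenset(omega)]                                   # nothing observed yet
F1 = [frozenset({"HH", "HT"}), frozenset({"TH", "TT"})]   # first toss known
F2 = [frozenset({w}) for w in omega]                      # both tosses known

# The filtration property F_s subset-of F_t for s <= t becomes:
assert refines(F1, F0) and refines(F2, F1)
```

Each coin toss splits every block of the current partition in two, which is exactly the growth of observable information over time.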

Given a filtration, its right and left limits at any time and the limit at infinity are as follows

\displaystyle  \mathcal{F}_{t+}=\bigcap_{s>t}\mathcal{F}_s,\ \mathcal{F}_{t-}=\sigma\Big(\bigcup_{s<t}\mathcal{F}_s\Big),\ \mathcal{F}_{\infty}=\sigma\Big(\bigcup_{t\in{\mathbb R}_+}\mathcal{F}_t\Big).

Here, {\sigma(\cdot)} denotes the sigma-algebra generated by a collection of sets. The left limit as defined here only really makes sense at strictly positive times. Throughout these notes, I define the left limit at time zero as {\mathcal{F}_{0-}\equiv\mathcal{F}_0}. The filtration is said to be right-continuous if {\mathcal{F}_t=\mathcal{F}_{t+}} for all {t}.

A probability space {(\Omega,\mathcal{F},{\mathbb P})} is complete if {\mathcal{F}} contains every subset of each zero-probability set in {\mathcal{F}}. Any probability space can be extended to a complete probability space (its completion) in a unique minimal way, by enlarging the sigma-algebra to consist of all sets {A\subseteq\Omega} such that {B\subseteq A\subseteq C} for some {B,C\in\mathcal{F}} satisfying {{\mathbb P}(C\setminus B)=0}. Similarly, a filtered probability space is said to be complete if the underlying probability space is complete and {\mathcal{F}_0} contains all zero-probability sets.
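On a finite sample space, the completion can be computed directly from this squeeze characterisation: a set belongs to the completion precisely when it lies between two measurable sets whose difference is null. A minimal sketch (the three-point space and all names are my own illustration):

```python
# Sketch: completing a finite probability space by adjoining every set
# squeezed between two measurable sets of equal probability.
# Toy example of my own, not from the post.
from itertools import combinations

omega = ["a", "b", "c"]
# F = {emptyset, {a}, {b,c}, Omega}, with the event {b,c} null.
F = [frozenset(), frozenset({"a"}), frozenset({"b", "c"}), frozenset(omega)]
P = {frozenset(): 0.0, frozenset({"a"}): 1.0,
     frozenset({"b", "c"}): 0.0, frozenset(omega): 1.0}

def powerset(s):
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

# A is in the completion iff B <= A <= C for B, C in F with P(C \ B) = 0,
# and P(C \ B) = P(C) - P(B) since B is contained in C.
completion = [A for A in powerset(omega)
              if any(B <= A <= C and P[C] - P[B] == 0.0
                     for B in F for C in F if B <= C)]

assert len(completion) == 8   # every subset of Omega is now measurable
```

Here the null event {b,c} drags all of its subsets into the completed sigma-algebra, so every subset of Ω becomes measurable.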

Often, in stochastic process theory, filtered probability spaces are assumed to satisfy the usual conditions, meaning that the space is complete and the filtration is right-continuous. Note that any filtered probability space can be completed simply by completing the underlying probability space and then adding all zero probability sets to each {\mathcal{F}_t}. Furthermore, replacing {\mathcal{F}_t} by {\mathcal{F}_{t+}}, any filtration can be enlarged to a right-continuous one. By these constructions, any filtered probability space can be enlarged in a minimal way to one satisfying the usual conditions.

Throughout these notes I assume a complete filtered probability space, although many of the results can be extended to the non-complete case without much difficulty. However, for the sake of a bit more generality, I don’t assume that filtrations are right-continuous.

One reason for using filtrations is to define adapted processes. A stochastic process {X} is adapted if {X_t} is an {\mathcal{F}_t}-measurable random variable for each time {t\ge 0}. This just says that the value {X_t} is observable by time {t}. Conversely, the filtration generated by a process {X}, given by {\mathcal{F}^X_t=\sigma\left(X_s\colon s\le t\right)} and referred to as the natural filtration of {X}, is the smallest filtration with respect to which {X} is adapted.
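As a concrete toy illustration (my own example, not from the text), the natural filtration of a coin-toss random walk can be computed explicitly: the atoms of {\mathcal{F}^X_t} group together the outcomes whose first {t} steps agree, and adaptedness amounts to {X_t} being constant on each atom.

```python
# Sketch: the natural filtration of a coin-toss random walk on a finite
# sample space. All names are illustrative assumptions, not from the post.
from itertools import product

T = 3
omega = list(product([-1, 1], repeat=T))     # all coin-toss sequences

def walk(w, t):
    """X_t(w): position of the random walk at time t."""
    return sum(w[:t])

def atoms(t):
    """Atoms of F^X_t = sigma(X_s : s <= t): group outcomes by path up to t."""
    blocks = {}
    for w in omega:
        blocks.setdefault(w[:t], set()).add(w)
    return list(blocks.values())

# Adaptedness: X_t is constant on each atom of F^X_t, i.e. F^X_t-measurable.
for t in range(T + 1):
    for block in atoms(t):
        assert len({walk(w, t) for w in block}) == 1
```

Note that grouping by the whole path up to time {t}, rather than by the value {X_t} alone, is what makes {\mathcal{F}^X_t} the sigma-algebra generated by all of {X_s}, {s\le t}.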

As mentioned in the previous post, it is often necessary to impose measurability constraints on a process {X} considered as a map {{\mathbb R}_+\times\Omega\rightarrow{\mathbb R}}. Right-continuous and left-continuous processes are automatically jointly measurable. When considering more general processes, it is useful to combine the measurability concept with adaptedness. This can be done in any of the following three ways, in order of increasing generality (see Lemma 4 below).

Definition 1

  • The predictable sigma-algebra on {{\mathbb R}_+\times\Omega}, denoted by {\mathcal{P}}, is generated by the left-continuous and adapted processes. A stochastic process is said to be predictable if it is {\mathcal{P}}-measurable. Alternatively, the predictable processes are sometimes called previsible.
  • The optional sigma-algebra on {{\mathbb R}_+\times\Omega}, denoted by {\mathcal{O}}, is generated by the right-continuous and adapted processes. A stochastic process is said to be optional if it is {\mathcal{O}}-measurable.
  • A process {X} is progressively measurable, or just progressive, if for each {t\ge 0}, the map

    \displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} &\displaystyle [0,t]\times\Omega\rightarrow{\mathbb R},\smallskip\\ &\displaystyle (s,\omega)\mapsto X_s(\omega) \end{array}

    is {\mathcal{B}([0,t])\otimes\mathcal{F}_t}-measurable.

The most important of these definitions, at least in these notes, is that of predictable processes. While adapted right-continuous processes will be used extensively, there is not much need to generalize to optional processes. Similarly, progressive measurability isn’t used a lot except in the context of adapted right-continuous (and therefore optional) processes. On the other hand, predictable processes are extensively used as integrands for stochastic integrals and in the Doob-Meyer decomposition, and are often not restricted to the adapted and left-continuous case.

Given any set of real-valued functions on a set which is closed under multiplication, the functions measurable with respect to the generated sigma-algebra can be identified as follows: they form the smallest set of real-valued functions containing the generating set which is closed under taking linear combinations and increasing limits. So, for example, the predictable processes form the smallest set containing the adapted left-continuous processes which is closed under linear combinations and such that the limit of an increasing sequence of predictable processes is predictable.

Another way of defining predictable processes is in terms of continuous adapted processes.

Lemma 2 The predictable sigma-algebra is generated by the continuous and adapted processes.

Proof: Clearly every continuous adapted process is left-continuous and, therefore, is predictable. Conversely, if {X} is an adapted left-continuous process then it can be written as a limit of the continuous processes

\displaystyle  X^n_t = n\int_{t-1/n}^t1_{\{|X_{s\vee 0}|\le n\}}X_{s\vee 0}\,ds.

Continuity of {X^n} follows from the fact that {t\mapsto\int_{t-1/n}^tf(s)\,ds} is continuous for all bounded and measurable functions {f\colon{\mathbb R}\rightarrow{\mathbb R}}. This is easily seen to be true for piecewise constant functions, and the extension to all measurable functions is a standard application of the functional monotone class theorem. The pointwise convergence {X^n\rightarrow X} follows from the left-continuity of {X}, so {X} is measurable with respect to the sigma-algebra generated by the continuous adapted processes {X^n}. \Box
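The convergence {X^n\rightarrow X} in this proof can be sanity-checked numerically. The sketch below (my own, with the truncation {1_{\{|X|\le n\}}} omitted since the example path is bounded) evaluates the averaged process for a left-continuous step path:

```python
# Numeric sanity check (my own sketch, not from the post) of the proof of
# Lemma 2: X is left-continuous, X^n_t = n * integral_{t-1/n}^t X(max(s,0)) ds,
# and X^n_t -> X_t pointwise as n -> infinity.
def X(s):
    # A left-continuous step process: value 1 up to and including time 1,
    # jumping down just after time 1.
    return 1.0 if s <= 1.0 else 0.0

def X_n(t, n, steps=10000):
    # Midpoint-rule approximation of n * integral of X over [t - 1/n, t].
    h = (1.0 / n) / steps
    total = sum(X(max(t - 1.0/n + (k + 0.5) * h, 0.0)) for k in range(steps))
    return n * total * h

for t in [0.0, 0.5, 1.0, 1.5]:
    print(t, X(t), X_n(t, n=1000))   # X_n(t) is close to X(t) for large n
```

For large {n}, {X^n_t} agrees with {X_t} even at the jump time {t=1}, because the averaging window {(t-1/n,t]} only looks backwards; started from a right-continuous path instead, the same backward average would converge to the left limit rather than the value at the jump.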

A further method of defining the predictable sigma-algebra is in terms of simple sets generating it. The following is sometimes used.

Lemma 3 The predictable sigma-algebra is generated by the sets of the form

\displaystyle  \left\{(s,t]\times A\colon t>s\ge 0,\ A\in\mathcal{F}_s\right\}\cup\left\{\{0\}\times A\colon A\in\mathcal{F}_0\right\}. \ \ \ \ \ (1)


Proof: If {S} is any of the sets in the collection (1) then the process {X=1_S} defined by {X_t(\omega)=1_{\{(t,\omega)\in S\}}} is adapted and left-continuous, and therefore predictable. So, {S\in\mathcal{P}}.

Conversely, let {X} be left-continuous and adapted. Then it is the pointwise limit of the piecewise constant processes

\displaystyle  X^n_t=X_01_{\{t=0\}}+\sum_{k=1}^\infty X_{(k-1)/n}1_{\{(k-1)/n<t\le k/n\}}

as {n} goes to infinity. Each of the summands on the right hand side is easily seen to be measurable with respect to the sigma-algebra generated by the collection (1). So, {X=\lim_{n\rightarrow\infty}X^n} is also measurable. \Box
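The grid approximation in this proof can also be checked numerically (my own sketch, not from the post): {X^n} samples the path at the left endpoint {(k-1)/n} of the grid interval containing {t}, and left-continuity makes the error vanish as {n\rightarrow\infty}.

```python
# Numeric illustration (my own example) of the piecewise-constant
# approximation in the proof of Lemma 3: X^n samples X at the left grid
# point (k-1)/n, and left-continuity of X gives X^n_t -> X_t pointwise.
import math

def X(t):
    # A left-continuous path: continuous part plus a jump just after t = 1.
    return math.sin(t) + (2.0 if t > 1.0 else 0.0)

def X_grid(t, n):
    if t == 0.0:
        return X(0.0)
    k = math.ceil(t * n)           # t lies in ((k-1)/n, k/n]
    return X((k - 1) / n)

t = math.sqrt(2.0)                 # a generic time off the grid
errs = [abs(X_grid(t, n) - X(t)) for n in (10, 100, 1000)]
assert errs[0] > errs[1] > errs[2]   # approximation improves with n
```

Sampling at the left endpoint is what keeps each summand measurable with respect to the sets in (1); left-continuity then does the rest.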

In these notes, I refer to finite unions of sets in the collection (1) as the elementary, or elementary predictable, sets. Writing this collection of finite unions as {\mathcal{E}}, we then have {\mathcal{P}=\sigma(\mathcal{E})}.

Finally, the different forms of measurability can be listed in order of generality, starting with the predictable processes, up to the much larger class of jointly measurable adapted processes.

Lemma 4 Each of the following properties of a stochastic process implies the next

  1. predictable.
  2. optional.
  3. progressive.
  4. adapted and jointly measurable.

Proof: As the predictable sigma-algebra is generated by the continuous adapted processes (Lemma 2), which are right-continuous and therefore optional, it follows that all predictable processes are optional.

Now, if {X} is a right-continuous and adapted process and {T\ge 0}, then the stopped process {Y_t=X_{t\wedge T}} is right-continuous and {\mathcal{F}_T}-measurable at all times. By the joint measurability of right-continuous processes, {Y} is {\mathcal{B}({\mathbb R}_+)\otimes\mathcal{F}_T}-measurable, so its restriction to {[0,T]\times\Omega} is {\mathcal{B}([0,T])\otimes\mathcal{F}_T}-measurable, and this restriction agrees with that of {X}. As this holds for all times {T}, {X} is progressively measurable. Since the progressive processes are precisely those measurable with respect to a sigma-algebra on {{\mathbb R}_+\times\Omega} (the progressive sigma-algebra), this extends from the right-continuous adapted generators to all optional processes.

Finally, consider a progressively measurable process {X}. From the definitions, it is jointly measurable. Furthermore, for any time {t\ge 0}, {(s,\omega)\mapsto X_s(\omega)} restricted to {[0,t]\times\Omega} is {\mathcal{B}([0,t])\otimes\mathcal{F}_t}-measurable. Therefore, {X_s} is {\mathcal{F}_t}-measurable for all {s\le t} and, in particular, {X} is adapted. \Box



  1. I have been trying to find clarification on a question relating to filtrations and measurability for a while, but can’t find discussions of it. I figure someone knowledgeable about stochastic processes would know this. If you have a random variable X that is measurable with respect to a sigma algebra generated by a random variable Y, then there exists a function g such that X=g(Y). But what if the sigma algebra is generated by an uncountable number of random variables, such as F_t where {F_t} is the natural filtration of a stochastic process Y? What can be said in this case for a r.v. X measurable with respect to F_t? Is there any functional representation?

    Thanks for any help.

    Comment by Student — 29 September 10 @ 2:13 AM | Reply

    • Yes there is! Assuming that X is a real-valued random variable, then it can be expressed as a function of Y.
      If Y:Ω → E is a measurable map from the underlying probability space to a measurable space (E,ℰ) then any σ(Y)-measurable map X:Ω → ℝ can be written as X = g(Y) for a measurable g:E → ℝ. This is an application of the functional monotone class theorem. (i) Show that every X = 1A for A ∈ σ(Y) is of the form g(Y). (ii) Show that the set of random variables of the form g(Y) is closed under linear combinations and taking increasing limits. (iii) appeal to the monotone class theorem.
      If Yt is a (real-valued) process, it can be considered as a map to the space of real-valued functions on ℝ+.

      Comment by George Lowther — 30 September 10 @ 10:58 PM | Reply

  2. Hi,

    According to Lemma 2, is it true to say that all Brownian motions and martingales that are continuous are predictable processes?

    Comment by Su — 4 February 11 @ 5:00 AM | Reply

    • Yes, that’s correct. Assuming you are using an appropriate filtration, so that your processes are adapted, then they are predictable.

      Comment by George Lowther — 4 February 11 @ 9:06 PM | Reply

      • Thanks for your reply George.
        One more thing that’s bothering me, in the context of finding an equivalent martingale measure. Correct if I am wrong below:
        For discounted asset price process, Dt (assuming a constant discount rate), there are two reasons why the discounted asset price process could be predictable:
        a) Dt will be a martingale under say Q-probability measure, and is hence predictable (under this probability measure only?).
        b) If we assume say Geometric Brownian Motion Model for the asset price, the random component of the asset price will only be the Brownian Motion, which is predictable. Hence St (and Dt) is a predictable process.
        I have read somewhere that Dt is predictable, which is not very intuitive since asset prices from the market cannot be predictable. Hence my two reasoning above: Predictable only under certain probability measure OR predictable for asset prices which we have assumed a certain model.
        Hope you can understand where I am coming from and really appreciate if you could shed some light on this.

        Comment by Su — 6 February 11 @ 1:30 AM | Reply

      • Su,

        The reason why processes such as Brownian motion are predictable is just because they are continuous and adapted. It does not have anything to do with the martingale property. In fact, it does not have much to do with the measure either. The property of being predictable is not affected by an equivalent measure change.

        I think maybe you are a bit confused about the meaning of being predictable in the precise mathematical sense defined above, which is rather different from the (rather ill-defined) everyday notion of something being predictable. Here, predictability only means that you can (roughly speaking) predict the future value in an infinitesimal sense. Just because a Brownian motion is predictable does not mean that you can tell what value it will take at some future point in time. What you can say is that you can tell when the process is going to reach some value a just prior to it happening. This is simply because of continuity. If X_t=a at some time then, by continuity, \vert X_s-a\vert will be arbitrarily small for times s just before t.

        So, you can model asset prices by predictable processes. It simply means that they are continuous. Under Black-Scholes, the process is predictable in this sense. That is because it ignores the possibility that the price can suddenly jump.

        And, apologies for the late response. Hope that helps clear things up a bit.

        Comment by George Lowther — 25 February 11 @ 4:49 PM | Reply

  3. Hi George,

    Appreciate the explanation. Definitely cleared things up!

    Comment by Su — 12 March 11 @ 6:39 AM | Reply

  4. [...] is predictable, (see definition here) [...]

    Pingback by Levy Process-3: Ito’s formula « 01law's Blog — 27 April 11 @ 9:31 AM | Reply

  5. Nice proof of Lemma 2, although it took me a while to see why it does not get through when X is right continuous :). Adapt.

    Comment by yaoliang — 9 June 11 @ 4:56 AM | Reply

  6. BTW, would that mean if the filtration is right continuous, then optional sigma field = predictable sigma field?

    Comment by yaoliang — 9 June 11 @ 5:05 AM | Reply

    • Hi Yaoliang. The proof of Lemma 2 doesn’t work if X is only right-continuous because Xn would not converge to X. I’m not sure why you suggest that optional = predictable when the filtration is right-continuous though. This is not true. The filtration generated by a Poisson process is right-continuous (once you complete it), but Poisson processes are not predictable. However, it is often the case that optional = predictable. For example, this happens if the filtration is generated by a Brownian motion — see my latest post Predictable Stopping Times.

      Comment by George Lowther — 9 June 11 @ 10:07 PM | Reply

      • Thanks, George. Here is what I thought about right continuous processes: change the integral interval of X^n from [t-1/n,t] to [t, t+1/n]. Would this be enough to argue that X^n converges to X? But of course, X is only F_+ measurable, that’s why I thought optional=predictable when the filtration is right continuous (then F_+=F, hence X is adapted). I believe my claim is false, but still couldn’t see why the proof of Lemma 2 cannot go through.

        Comment by yaoliang — 9 June 11 @ 10:56 PM | Reply

        • Lemma 2 relies on the fact that the processes Xn are both continuous and adapted and, then, the fact that they converge to X means that X is measurable with respect to the sigma algebra generated by the continuous and adapted processes.
          If X was only assumed to be right-continuous then, as you say, you could define Xn by integrating over the range [t,t+1/n] instead. However, Xn would not be adapted. So, you can say that X is measurable with respect to the sigma-algebra generated by the continuous processes. But, it need not be measurable with respect to the sigma-algebra generated by the processes which are simultaneously continuous and adapted. It is true though, that X is measurable with respect to the processes which are continuous and \mathcal{F}_{\cdot+\epsilon}-adapted for any given \epsilon > 0. This is much weaker than being measurable with respect to the continuous and \mathcal{F}_{t+}-adapted processes.

          Comment by George Lowther — 9 June 11 @ 11:09 PM

  7. Hi George,
    Thanks for this great post, it certainly clears things up. Nevertheless I have some trouble understanding the proof of Lemma 4. I don’t completely understand how optional implies progressive. You proved that the generators of the optional sigma algebra are progressively measurable. But I don’t see how this implies that the same holds for any optional process. Do you assume that if the generators of a sigma algebra are B([0,t]) × Ft measurable on the subset [0,t] × Ω then the same holds for the functions measurable with respect to this sigma algebra? I hope you understand my problem, and I would really appreciate it if you would explain this to me.

    All the best,


    Comment by GuidovM — 14 June 11 @ 11:15 PM | Reply

    • GuidovM,

      Yes I did assume that if the generators of a sigma algebra are B([0,t]) × Ft measurable on the subset [0,t] × Ω then the same holds for the functions measurable with respect to this sigma algebra.

      Maybe I did not explain this bit well. You can show that the sets A ⊆ R+×Ω which are B([0,t])×Ft measurable on the subsets [0,t] × Ω forms a sigma-algebra (just check that it is closed under countable unions and taking complements). This is the progressive sigma-algebra. Furthermore, the definition of progressive processes given in Definition 1 is equivalent to saying that the process is measurable with respect to the progressive sigma-algebra. So, the statement “optional ⇒ progressive” is the same as saying that optional sigma-algebra ⊆ progressive sigma-algebra. To prove this you only need to look at processes generating the optional sigma-algebra.

      I should add a statement to Definition 1 defining the progressive sigma-algebra and stating that progressive ⇔ measurable with respect to the progressive sigma-algebra. I’ll do this at some point – thanks for mentioning it.

      Hope that clears things up.

      Comment by George Lowther — 15 June 11 @ 1:35 AM | Reply

  8. Hi George,

    First of all thanks for your great set of notes. They are among the best I have come across so far.

    Quick question:
    Is it true that if a function f is F1xF2 measurable and that its section f_y(x) is F1* measurable for all y in Omega2, then f is F1xF2* measurable?
    Your proof of “optional => progressive” seems to rely on the latter fact.

    Thanks in advance!
    – Tigran

    Comment by Tigran — 16 November 11 @ 2:51 PM | Reply

    • Actually I just realized that your proof does not require what I mentioned.
      Never mind!

      Comment by Tigran — 16 November 11 @ 3:38 PM | Reply

    • Hi.

      No, I don’t rely on that fact. Actually, that’s not true. Consider for example F1, F2 to be the Borel sigma-algebra on the reals. Let f be the indicator function of a non-measurable subset of the diagonal in R2. Then, fy(x) will be measurable for each y, but f itself is not measurable (I’m not sure what you mean by F1* and F1xF2*. Maybe the completion? That depends on having a measure though).

      Comment by George Lowther — 16 November 11 @ 10:46 PM | Reply

  9. Thanks for the reply. I realized after my initial comment that the proof doesn’t rely on this statement.
    I also just realized how unclear my initial statement actually was (and how inconsistent the notation was as well).

    What I meant to say is that suppose f(x,y) is measurable w.r.t. some joint sigma-algebra (F1xF2) and that its section in y (f_y(x)) is measurable w.r.t. a sigma-algebra smaller than F1 (this smaller sigma-algebra is denoted F1*) for each y.
    Note: F1* does not depend on y.
    The question is: Is f(x,y) measurable w.r.t. F1*xF2?
    I have a feeling the answer in no.

    Comment by Tigran — 21 November 11 @ 3:23 PM | Reply

    • The answer is indeed no. Consider any probability space (Ω,F1,P) on which we have defined a uniformly distributed random variable T: Ω → [0,1]. Let F1* consist of the sets in F1 with probability 0 or 1. Let F2 be the Borel sigma-algebra on [0,1].

      Now, look at the stochastic process Ω×[0,1] → R given by (ω,t) ↦ Xt(ω) = 1{T(ω)=t}. This is F1×F2 measurable but not F1*×F2 measurable. However, Xt is almost surely zero at each time t, so is F1* measurable.

      Comment by George Lowther — 21 November 11 @ 7:32 PM | Reply

      • Thanks again for your answer George. It’s a nice counter-example.
        The only bit that I have yet to understand completely is: why is 1_{T(ω)=t} not F1*×F2-measurable?
        It’s probably a trivial detail that I am missing and your input would be great to help me understand this.

        P.S.: Once again my hat is off to the clarity of your notes. They are extremely helpful in getting a better intuition of the technicalities in stochastic analysis.

        Comment by Tigran — 26 November 11 @ 2:55 PM | Reply

        • It is true that this process is not F1*xF2 measurable. However, it is not a trivial detail. Far from it. You can prove it by the measurable projection theorem. If it was measurable then this theorem would tell you that the set of ω ∈ Ω such that there exists a t ≤ 1/2 with Xt(ω) = 1 is measurable with respect to the completion of F1*, so has measure 0 or 1. However, this set is just {T ≤ 1/2}, which has probability 1/2. So X is not F1*xF2 measurable.

          Comment by George Lowther — 26 November 11 @ 3:11 PM

        • …actually the measurable projection theorem is very similar to the generalised form of the Debut theorem which says that the first time that a jointly measurable process hits a measurable set is itself measurable wrt the completion of the probability space (and is a stopping time if the process is progressive and the filtration is right continuous). In this case, the first time X hits 1 is T, which is not F1* measurable. So X is not F1*xF2 measurable.

          Comment by George Lowther — 26 November 11 @ 3:32 PM

  10. Dear George, thanks for the very nice notes – the explanation is highly helpful. I didn’t get the definition of the predictable sigma-algebra, which is generated by left-continuous and adapted processes. With respect to which filtration should the processes generating the predictable sigma-algebra be adapted?
    Thanks in advance,

    Comment by Ilya — 17 February 12 @ 4:44 PM | Reply

    • The definition of the predictable sigma-algebra does depend on what underlying filtration is used to define adapted processes. Usually, we are given one filtration \{\mathcal{F}_t\}_{t\ge0} (representing all observable events at time t). Then, we work exclusively with respect to this filtration, which is used in the definition of adapted processes (and predictable processes, optional processes, stopping times, etc). If we have multiple filtrations and it is not clear which one is being used from the context, then it should be stated. With multiple filtrations, we have multiple predictable sigma-algebras (and multiple optional sigma-algebras, etc). Where this happens in these notes I say something along the lines of “the \mathcal{F}_\cdot-predictable sigma-algebra” or “predictable with respect to \mathcal{F}_\cdot”.

      Comment by George Lowther — 22 February 12 @ 1:20 AM | Reply

  11. Hi!

    I don’t quite understand why it is important to assume the usual conditions. What exactly do we miss if we don’t assume them? Do stochastic integrals stop being adapted? Are we unable to make Doob-Meyer decompositions? Do martingales stop having cadlag versions? When we make a Girsanov transformation, does the new Brownian motion generate a filtration which does not coincide with the filtration of the original Bm?

    Thanks for these great notes!

    Comment by Chilorio — 4 January 13 @ 6:22 AM | Reply

    • Cadlag versions and stochastic integrals would both fail to be adapted. Check: compensated Poisson process X, pick its left-continuous version Y and the filtration generated by Y (= the predictable).

      Comment by MathSwineT — 18 January 13 @ 7:43 AM | Reply

  12. George, I would like to ask a question about this very useful thread. In Lemma 2, when you define the process $X_t^n = \int_{t-1/n}^t etc… I can see that this definition works when $X_t $ is a step process or a simple process. But for general $X_t $ how do you handle them inside the integral?…. Are you saying that one should prove it using the functional monotone class theorem? In this case which version do you use as there is more than one fMonClassTheorem? Thank you in advance if you could throw a few more details in there.

    Comment by Mario — 4 September 13 @ 7:09 AM | Reply

  13. Hi, I have a quite related question. Suppose X and Y are two continuous processes. For each t, Xt is measurable with respect to σ(Ys : s ≤ t). Considering X and Y as functions (t,w) → X(t,w) and (t,w) → Y(t,w), is the joint sigma-field generated by X contained in that generated by Y? This kind of joint measurability question is interesting and difficult.

    Comment by Chi Dong — 24 June 14 @ 12:29 AM | Reply

    • Sorry, there seems to be some system error. My question is like this: Suppose X and Y are two continuous processes. For each t, Xt is measurable with respect to σ(Ys : s ≤ t). Considering X and Y as functions (t,w) → X(t,w) and (t,w) → Y(t,w), is the joint sigma-field generated by X contained in that generated by Y?

      Comment by Chi Dong — 24 June 14 @ 12:31 AM | Reply

      • Sorry, still error. Suppose X and Y are two continuous processes. For each t, Xt is adapted with respect to {Ys:s< = t}.

        Comment by Chi Dong — 24 June 14 @ 12:32 AM | Reply

      • Considering X and Y as functions (t,w) –> X(t,w) and (t, w) –> Y(t,w), is the joint sigma-field generated by X contained in that generated by Y?

        Comment by Chi Dong — 24 June 14 @ 12:32 AM | Reply

