Serving the Quantitative Finance Community

 
User avatar
TraderJoe
Topic Author
Posts: 1
Joined: February 1st, 2005, 11:21 pm

Of Markov chains and martingales

July 31st, 2008, 2:01 am

What's the difference between a Markov chain and a martingale?
 
User avatar
Roderick
Posts: 0
Joined: June 6th, 2008, 8:25 am

Of Markov chains and martingales

July 31st, 2008, 8:49 am

My apologies to all mathematicians for the rough explanation below. Intuitively, a Markov process (in the discrete case often referred to as a chain) is a process that possesses the Markov property: for every future step of the process, the current state is the only information that matters. Past information is therefore irrelevant, and the process is often called memoryless. How the current state is used to obtain an expected value or probability distribution for the next step is immaterial. A martingale, on the other hand, is a process for which the expected value of the next step, given the present, equals the current state. So, though I am not completely sure about the mathematical proof of this statement, I would say every martingale also has the Markov property.
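A minimal sketch (mine, not from the post) of both definitions on a symmetric simple random walk, which happens to be both Markov and a martingale. It brute-forces all length-3 paths and checks both properties exactly with rational arithmetic:

```python
import itertools
from collections import Counter, defaultdict
from fractions import Fraction

# Symmetric simple random walk: S_n = Y_1 + ... + Y_n with Y_i = +/-1, each w.p. 1/2.
by_past = defaultdict(list)   # condition on the full past (Y_1, S_2)
by_state = defaultdict(list)  # condition on the current state S_2 only
for y in itertools.product([-1, 1], repeat=3):
    s2 = y[0] + y[1]
    s3 = s2 + y[2]
    by_past[(y[0], s2)].append(s3)
    by_state[s2].append(s3)

def law(vals):
    """Empirical distribution of a list of equally likely outcomes."""
    n = len(vals)
    return {k: Fraction(v, n) for k, v in Counter(vals).items()}

for (y1, s2), vals in by_past.items():
    # Martingale: the conditional mean of S_3 equals the current value S_2.
    assert Fraction(sum(vals), len(vals)) == s2
    # Markov: the conditional law of S_3 given the full past matches the law
    # given the current state S_2 alone, whichever path led there.
    assert law(vals) == law(by_state[s2])
```

Both assertions pass for every attainable history, illustrating that the two properties are distinct statements (one about conditional laws, one about conditional means) that here happen to hold simultaneously.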
 
User avatar
TheBridge
Posts: 1
Joined: November 22nd, 2005, 3:42 pm

Of Markov chains and martingales

July 31st, 2008, 9:34 am

Discounted price processes in a no-arbitrage, complete-market, discrete-time setting are martingales (fundamental theorem of asset pricing). If we take a path-dependent option, its discounted price process is then a martingale, but it is not Markovian. The other way around: if you take a Markov chain with non-constant expectation, it is not a martingale. For example X_n = R_n + n, where R_n is a Rademacher process.
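A quick numerical sketch (my own, not TheBridge's) of the second example, taking R_n to be i.i.d. Rademacher variables (+1 or -1, each with probability 1/2), so that the drift term n ruins the martingale property:

```python
from fractions import Fraction

# X_n = R_n + n with R_n i.i.d. Rademacher.
# Because R_{n+1} is independent of the past and E[R_{n+1}] = 0,
#   E[X_{n+1} | F_n] = E[R_{n+1}] + (n + 1) = n + 1.
def cond_mean_next(n):
    """E[X_{n+1} | F_n] computed exactly."""
    return Fraction(1, 2) * (1 + n + 1) + Fraction(1, 2) * (-1 + n + 1)

n = 3
for r_n in (-1, 1):
    x_n = r_n + n  # the two possible current values of X_n
    # The martingale property would require E[X_{n+1} | F_n] == X_n for both
    # realizations; it fails for r_n = -1, since the deterministic drift
    # pushes the conditional mean to n + 1 regardless of R_n.
    print(x_n, cond_mean_next(n), cond_mean_next(n) == x_n)
```

The process is still (trivially) Markov, since R_n is i.i.d., which is exactly the "Markov but not martingale" direction of the argument.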
 
User avatar
Roderick
Posts: 0
Joined: June 6th, 2008, 8:25 am

Of Markov chains and martingales

July 31st, 2008, 10:55 am

Hi, perhaps you can explain a little further, because now I seem to be confused. I agree with you that a Markov chain with non-constant expectation is not a martingale. However, I am not sure I agree with your other argument: for a path-dependent option, the (discounted) current price reflects the path already taken, right? Is it not possible to construct an alternative path that would result in the same current price? If so, I would argue that the current price is the only relevant information, and that whichever path was in fact taken is insignificant. This, I would say, equals the Markov property.
 
User avatar
TraderJoe
Topic Author
Posts: 1
Joined: February 1st, 2005, 11:21 pm

Of Markov chains and martingales

July 31st, 2008, 11:05 am

Also, how is this affected for discrete-time versus continuous-time processes? Thanks. Edit: cross-posted.
Last edited by TraderJoe on July 30th, 2008, 10:00 pm, edited 1 time in total.
 
User avatar
TraderJoe
Topic Author
Posts: 1
Joined: February 1st, 2005, 11:21 pm

Of Markov chains and martingales

July 31st, 2008, 11:16 am

From Baxter & Rennie:

Markov process: a process whose future behaviour, conditional on the present, is independent of the past.

Martingale: a process whose expected future value, conditional on the past, is its current value. That is, E[M_t | F_s] = M_s for all s < t.

Are these two statements equivalent, or is one a subset of the other?
 
User avatar
TraderJoe
Topic Author
Posts: 1
Joined: February 1st, 2005, 11:21 pm

Of Markov chains and martingales

July 31st, 2008, 11:31 am

Shreve makes it clearer:

Martingale: E[M(t) | F(s)] = M(s), 0 < s < t < T.

Markov: for every Borel-measurable function f there exists a function g such that E[f(X(t)) | F(s)] = g(X(s)), 0 < s < t < T.

M(t) and X(t) are adapted (can't see the future) stochastic processes. So the Markov property is about functions of X. Still curious to know how path dependence comes into it here. An application (in terms of asset pricing) would be nice.
Last edited by TraderJoe on July 30th, 2008, 10:00 pm, edited 1 time in total.
 
User avatar
Traden4Alpha
Posts: 3300
Joined: September 20th, 2002, 8:30 pm

Of Markov chains and martingales

July 31st, 2008, 11:49 am

A couple of additional differences. Markov chains can operate on a nominal state space (i.e., a set of potentially non-ordered states), so the expectation operator is not necessarily defined; martingales are defined on an interval space. Markov chains can show second-order dynamics (e.g., oscillating through a set of two or more states); martingales do not. Martingale is a subset of Markov.
 
User avatar
TheBridge
Posts: 1
Joined: November 22nd, 2005, 3:42 pm

Of Markov chains and martingales

July 31st, 2008, 12:39 pm

What do you think of this three-step process (a martingale but not a Markov chain)? Let Y_i = +1 with probability 1/2 and -1 with probability 1/2, for i = 1, 2, 3, independent of one another. Now define the process X_t, t = 1, 2, 3:

X_1 = Y_1
X_2 = Y_1 + Y_2
X_3 = Y_1 + Y_2 + Y_3 * 1_{X_1 = -1, X_2 = 0}

It is easy to see that this is a martingale (since E[Y_3] = 0). It is also easy to see that X is not a Markov chain, because the law of X_3 given (X_1 = -1, X_2 = 0), which is the law of Y_3, is not equal to the law of X_3 given only X_2 = 0, which is the law of Y_3 with probability 1/2 and the constant X_2 = 0 with probability 1/2. So the path through which you reach X_2 = 0 has an impact on the law of X_3.

In conclusion, the Markov property is a property of the information carried by the law of the process through time, while the martingale property is a property of conditional expectations, which cannot characterize the law of the process itself. Most of the time, though, both properties hold, for example for diffusion processes (it depends, of course, on which definition of a diffusion you take).
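The construction above is small enough to verify exhaustively. A sketch (my own code, not from the post) that enumerates all eight equally likely paths of (Y_1, Y_2, Y_3), confirms the martingale property, and exhibits the failure of the Markov property:

```python
import itertools
from collections import defaultdict
from fractions import Fraction

# The three-step process: Y_1, Y_2, Y_3 are independent fair +/-1 coin flips,
# X_1 = Y_1, X_2 = Y_1 + Y_2, X_3 = X_2 + Y_3 * 1{X_1 = -1, X_2 = 0}.
def X(y1, y2, y3):
    x1 = y1
    x2 = y1 + y2
    x3 = x2 + (y3 if (x1 == -1 and x2 == 0) else 0)
    return x1, x2, x3

paths = list(itertools.product([-1, 1], repeat=3))  # 8 equally likely paths

# Martingale: E[X_3 | X_1, X_2] = X_2 for every attainable history.
hist = defaultdict(list)
for y in paths:
    x1, x2, x3 = X(*y)
    hist[(x1, x2)].append(x3)
for (x1, x2), vals in hist.items():
    assert Fraction(sum(vals), len(vals)) == x2

# Not Markov: the law of X_3 given X_2 = 0 still depends on X_1.
law = defaultdict(lambda: defaultdict(int))
for y in paths:
    x1, x2, x3 = X(*y)
    if x2 == 0:
        law[x1][x3] += 1
print(dict(law[-1]))  # {-1: 1, 1: 1}: X_3 is +/-1 when the path came via X_1 = -1
print(dict(law[1]))   # {0: 2}: X_3 is frozen at 0 when the path came via X_1 = +1
```

The two printed conditional laws differ even though both condition on the same current state X_2 = 0, which is exactly the failure of the Markov property.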
Last edited by TheBridge on July 30th, 2008, 10:00 pm, edited 1 time in total.
 
User avatar
TraderJoe
Topic Author
Posts: 1
Joined: February 1st, 2005, 11:21 pm

Of Markov chains and martingales

July 31st, 2008, 1:21 pm

OK, thanks. Can we look at some concrete processes now? Brownian motion: martingale and Markov? Poisson? Compound Poisson? Various other Lévy processes (Gamma, NIG, etc.)? How does the fact that the probability densities are discrete or continuous affect the outcome? Edit: interesting to note that we have four different cases here: discrete time and continuous time; discrete and continuous probability densities. How about multivariate distributions f(X, Y, Z, ...)? Stationary and non-stationary increments? (OK, I think I need a book now.)
Last edited by TraderJoe on July 30th, 2008, 10:00 pm, edited 1 time in total.
 
User avatar
TheBridge
Posts: 1
Joined: November 22nd, 2005, 3:42 pm

Of Markov chains and martingales

July 31st, 2008, 1:53 pm

Lévy processes have the strong Markov property. As Lévy processes are semimartingales, you can decompose them (not uniquely) as the sum of a finite-variation process (representing the large jumps plus the drift) and a local martingale (Brownian motion plus the compensated infinite-activity part with "small" jumps), so you can isolate the local martingale part.
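A toy illustration (my own sketch, not from the post; `poisson_count` is a helper name I made up) of isolating a martingale part in the simplest Lévy case: for a plain Poisson process N_t with rate lam, the compensated process M_t = N_t - lam*t is a martingale, so in particular E[M_t] = 0, which a quick Monte Carlo makes plausible:

```python
import random

random.seed(0)
lam, horizon, n_sims = 2.0, 5.0, 100_000

def poisson_count(rate, t):
    """Number of arrivals of a rate-`rate` Poisson process up to time t,
    built from i.i.d. exponential inter-arrival times."""
    clock, n = 0.0, 0
    while True:
        clock += random.expovariate(rate)
        if clock > t:
            return n
        n += 1

# Monte Carlo estimate of E[M_t] = E[N_t - lam * t]; should be close to 0.
mean_M = sum(poisson_count(lam, horizon) - lam * horizon
             for _ in range(n_sims)) / n_sims
print(mean_M)
```

The same idea, jump process minus its compensator, is what produces the local-martingale piece of the decomposition mentioned above.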
 
User avatar
TheBridge
Posts: 1
Joined: November 22nd, 2005, 3:42 pm

Of Markov chains and martingales

July 31st, 2008, 2:22 pm

TraderJoe, since a discrete-time process can be embedded into a continuous-time process (by considering it constant between the times at which it changes), you can see that what I said in my previous post about the Markov property and the martingale property continues to hold for continuous-time processes. Concerning discrete versus continuous probability densities, to be complete you can also ask what happens for continuous singular distributions, even if they are not of much utility in everyday life (except if you consider local times of Brownian motion every day).
Last edited by TheBridge on July 30th, 2008, 10:00 pm, edited 1 time in total.
 
User avatar
TraderJoe
Topic Author
Posts: 1
Joined: February 1st, 2005, 11:21 pm

Of Markov chains and martingales

July 31st, 2008, 3:14 pm

Thanks, Bridge!
 
User avatar
TheBridge
Posts: 1
Joined: November 22nd, 2005, 3:42 pm

Of Markov chains and martingales

July 31st, 2008, 3:33 pm

De nada ("you're welcome"), Señor Joe.
 
User avatar
bilbo1408
Posts: 0
Joined: August 3rd, 2007, 12:50 pm

Of Markov chains and martingales

August 2nd, 2008, 1:12 pm

Quote, originally posted by TraderJoe: "Shreve makes it clearer: Martingale: E[M(t)|F(s)] = M(s), 0 < s < t < T. [...] So Markov applies to functions of X. Still curious to know how path dependence comes into it here. An application (in terms of asset pricing) would be nice."

You've pretty much summed it up here. If we think of functions of X, the Markov property states that for every (Borel-measurable) function f there must exist a function g, depending only on f and the times involved, such that E[f(X(t)) | F(s)] = g(X(s)). The existence of such a g is enough to ensure Markov. A martingale is the special case in which f(x) = g(x) = x. Therefore, Markov does not ensure martingale, because it does not require g(x) = x; and TheBridge's example above shows why martingale does not necessarily imply Markov.