ProbablyWrong
Topic Author
Posts: 0
Joined: March 2nd, 2010, 12:37 pm

Simple Random Walk

March 2nd, 2010, 1:14 pm

Hi everybody. I'm looking at a type of question I haven't come across before. I gather it is a "simple random walk", and I'm not sure how to go about solving it. The question is:

A square is defined by points A, B, C and D. A dog starts at point A and can either go left (to B) or down (to D) with equal probability p. At any other point the dog may likewise move to either adjacent point with equal probability p, i.e. at C it may move to B or D, each with probability p. If the dog moves once per day, what is the expected number of days until the dog returns to A?

I saw a similar question on this forum which used a triangle, but I had difficulty understanding the replies. If anybody can show me how these questions are approached, or can point me towards appropriate literature, that'd be great. Any help much appreciated!
Last edited by ProbablyWrong on March 1st, 2010, 11:00 pm, edited 1 time in total.
 
Eddi
Posts: 0
Joined: February 9th, 2010, 4:45 am

Simple Random Walk

March 2nd, 2010, 5:10 pm

I assume you mean p = 1/2 (if you mean that there is also a probability of staying in place, the following can be extended to cover that).

Let a = E(steps to reach A, starting from A), b = E(steps to reach A, starting from B), c = E(steps to reach A, starting from C), d = E(steps to reach A, starting from D). Note that b = d and a = c by the symmetries of the problem, but I keep them separate below for clarity.

a = 1/2 (1 + b) + 1/2 (1 + d)
b = 1/2 + 1/2 (1 + c)
c = 1/2 (1 + b) + 1/2 (1 + d)
d = 1/2 + 1/2 (1 + c)

Solving, you get a = 4.
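For anyone who wants to check this numerically, here is a minimal sketch (Python/NumPy, my choice of tool) that solves exactly the four equations above, with the unknowns a, b, c, d as defined in the post:

import numpy as np

# Unknowns [a, b, c, d]: expected number of steps to reach A starting
# from A, B, C, D respectively (a is the mean return time to A).
M = np.array([
    [1.0, -0.5,  0.0, -0.5],   # a = 1/2(1+b) + 1/2(1+d)
    [0.0,  1.0, -0.5,  0.0],   # b = 1/2      + 1/2(1+c)
    [0.0, -0.5,  1.0, -0.5],   # c = 1/2(1+b) + 1/2(1+d)
    [0.0,  0.0, -0.5,  1.0],   # d = 1/2      + 1/2(1+c)
])
rhs = np.ones(4)               # each equation rearranges to (...) = 1
a, b, c, d = np.linalg.solve(M, rhs)
print(a, b, c, d)              # 4.0  3.0  4.0  3.0

As a bonus you can read off b = d = 3, the expected time to reach A from either of its neighbours.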
 
ProbablyWrong
Topic Author
Posts: 0
Joined: March 2nd, 2010, 12:37 pm

Simple Random Walk

March 3rd, 2010, 1:33 pm

Ah I see. I think I was making the problem a lot more difficult than it actually is! Thanks for that!
 
avishalom
Posts: 0
Joined: October 8th, 2009, 8:26 am

Simple Random Walk

March 4th, 2010, 3:21 pm

Well, if p = 0.5 the problem is a lot simpler. If you pair the moves (the return time can never be an odd number), you can reduce the problem to being at point A or C after each pair of steps:

heads, you go to A (or stay there, if it is the first pair);
tails, you go to C (or stay).

How many times do you have to toss a coin until you see a head? <N> = 2. Each coin flip represents two steps, so the answer is 2 x 2 = 4.
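The pairing argument is easy to watch in a simulation, too. A rough Monte Carlo sketch (Python, p = 1/2; the sample size of 200,000 paths is an arbitrary choice of mine):

import random

def return_time():
    """Steps until the dog is back at A on the square A-B-C-D (p = 1/2)."""
    pos, steps = 0, 0                    # 0 = A, walking round the 4-cycle
    while True:
        pos = (pos + random.choice((1, -1))) % 4
        steps += 1
        if pos == 0:
            return steps

times = [return_time() for _ in range(200_000)]
print(sum(times) / len(times))                   # ~4: the expected return time
print(all(t % 2 == 0 for t in times))            # True: returns only happen on even steps
print(sum(t == 2 for t in times) / len(times))   # ~1/2: each paired "flip" comes up A with prob 1/2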
 
cvkmndfv
Posts: 0
Joined: November 11th, 2009, 2:17 am

Simple Random Walk

March 4th, 2010, 8:08 pm

Since moving from one vertex to an adjacent vertex always has the same probability, the long-run probabilities of being at any position (A, B, C, D) should be equal -- namely, 1/4 for each vertex. The expected number of steps to return to state A, starting from state A, is then 1/(1/4) = 4. By the same argument, the expected return time to your initial state is 4 whichever vertex you start from.
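One way to make the "long-run probabilities" explicit is to compute the invariant distribution of the transition matrix directly. A minimal NumPy sketch (vertices ordered A, B, C, D):

import numpy as np

# Transition matrix of the walk on the square, vertices ordered A, B, C, D.
P = np.array([
    [0.0, 0.5, 0.0, 0.5],   # from A: to B or D
    [0.5, 0.0, 0.5, 0.0],   # from B: to A or C
    [0.0, 0.5, 0.0, 0.5],   # from C: to B or D
    [0.5, 0.0, 0.5, 0.0],   # from D: to A or C
])

# Invariant distribution: the left eigenvector of P with eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi /= pi.sum()

print(pi)          # [0.25 0.25 0.25 0.25]
print(1 / pi[0])   # 4.0 = expected number of days to return to A

(Strictly, the walk on the square is periodic, so pi is the invariant distribution rather than a limiting one, but the return-time formula 1/pi_i still applies.)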
 
wileysw
Posts: 7
Joined: December 9th, 2006, 6:13 pm

Simple Random Walk

March 6th, 2010, 11:57 pm

cvkmndfv, your answer is curiously interesting.

It's easy to see from avishalom's answer that the stopping time is not a geometric random variable with probability 1/4 of success; rather it's a scaled version: the number of steps can only be even, and the probability of A appearing (rather than C) after each pair of steps is 1/2, hence 2 * 1/(1/2) = 4. However, I don't see a straightforward way to apply his method to general n-vertex graphs.

On the other hand, your seemingly erroneous approach not only gives the correct answer but can also be easily generalized. For the square, say the dog has a preference for going clockwise (with probability p > 1/2) over counterclockwise (with probability 1 - p); the answer is still 4.

In fact, for any n-vertex graph (and its transition matrix) with sufficient symmetry, e.g. a complete graph such as the tetrahedron with equal probability of moving from one vertex to each of the other three (also the cube: note the average of 8 steps to get back to the origin, and 10 to reach the opposite vertex), the answer is indeed 1/(1/n) = n. Could you justify why this is correct, or is it a pure coincidence?
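The cube figures quoted above (8 steps back to the start, 10 to the opposite corner) are easy to reproduce with a little linear algebra. A sketch, assuming Python/NumPy and labelling the cube's vertices by bit-triples (the helper names are mine):

import numpy as np
from itertools import product

# Vertices of the cube as bit-triples; two vertices are adjacent iff they
# differ in exactly one coordinate, and each move flips a coordinate
# chosen uniformly at random.
verts = list(product((0, 1), repeat=3))
idx = {v: i for i, v in enumerate(verts)}
n = len(verts)

P = np.zeros((n, n))
for v in verts:
    for k in range(3):
        w = list(v)
        w[k] ^= 1
        P[idx[v], idx[tuple(w)]] = 1 / 3

origin, opposite = idx[(0, 0, 0)], idx[(1, 1, 1)]

# hit[j] = expected steps to reach `origin` from j (0 at the origin itself);
# on the other states it solves (I - Q) h = 1, with Q = P restricted to them.
others = [j for j in range(n) if j != origin]
Q = P[np.ix_(others, others)]
h = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
hit = np.zeros(n)
hit[others] = h

print(1 + P[origin] @ hit)   # 8.0  -- mean return time to the starting vertex
print(hit[opposite])         # 10.0 -- mean time to reach the opposite vertex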
Last edited by wileysw on March 9th, 2010, 11:00 pm, edited 1 time in total.
 
FritzJacob
Posts: 3
Joined: January 22nd, 2010, 11:15 am

Simple Random Walk

March 7th, 2010, 6:26 pm

Is this approach correct? Consider an N-vertex cycle, and let p be the probability that the dog moves clockwise and 1 - p the probability that it moves anticlockwise. Let E(p) be the expected number of days to return home.

Now suppose E(p) is actually constant in p, and apply either of the following boundary conditions: when p = 1, E = N (since the dog can only move clockwise, it takes exactly N days to get home); likewise when p = 0, E = N.

Hence E = N for an N-vertex cycle.
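Whatever one makes of the boundary-condition step, the conclusion itself checks out numerically. A rough Monte Carlo sketch (Python; N = 4 and the trial count are arbitrary choices of mine):

import random

def mean_return_time(n, p, trials=100_000):
    """Simulated mean return time to vertex 0 on an n-cycle,
    moving clockwise with probability p and anticlockwise with 1 - p."""
    total = 0
    for _ in range(trials):
        pos, steps = 0, 0
        while True:
            pos = (pos + (1 if random.random() < p else -1)) % n
            steps += 1
            if pos == 0:
                total += steps
                break
    return total / trials

for p in (0.5, 0.7, 0.9, 1.0):
    print(p, round(mean_return_time(4, p), 2))   # ~4 for every p

One cleaner way to see why the answer does not depend on p: the biased walk's transition matrix is doubly stochastic, so its invariant distribution is uniform, and the expected return time is 1/(1/N) = N whatever p is.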
 
FritzJacob
Posts: 3
Joined: January 22nd, 2010, 11:15 am

Simple Random Walk

March 8th, 2010, 2:28 am

I think this kind of approximation will not work in a single dimension, though. Otherwise every derivative could be proved to be 0?
 
tristanm
Posts: 0
Joined: October 20th, 2009, 3:21 pm

Simple Random Walk

March 8th, 2010, 3:05 pm

Quote (originally posted by wileysw): "cvkmndfv, your answer is curiously interesting: [...] could you justify why this is correct, or is it a pure coincidence?"

It 'works' because a random walk on a (finite, connected) graph is always a recurrent Markov chain, with invariant distribution given by

pi_i = v_i / sigma,

where v_i is the number of edges at vertex i and sigma is the sum of the v_i. The expected return time to i is then 1/pi_i, by standard Markov chain results.
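To see the formula at work on a graph that is not regular, here is a small NumPy sketch; the example (the square with the diagonal A-C added) is just an arbitrary choice of mine:

import numpy as np

# Adjacency matrix of the square A-B-C-D with the diagonal A-C added,
# vertices ordered A, B, C, D.  Degrees are v = (3, 2, 3, 2), so sigma = 10.
adj = np.array([
    [0, 1, 1, 1],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 0],
], dtype=float)

v = adj.sum(axis=1)        # v_i = number of edges at vertex i
P = adj / v[:, None]       # simple random walk: uniform over the neighbours
pi = v / v.sum()           # invariant distribution pi_i = v_i / sigma

print(np.allclose(pi @ P, pi))   # True: pi really is invariant
print(1 / pi)                    # expected return times [3.33, 5, 3.33, 5]

A direct simulation (or the hitting-time system used for the cube above) gives the same return times.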
 
wileysw
Posts: 7
Joined: December 9th, 2006, 6:13 pm

Simple Random Walk

March 10th, 2010, 3:26 am

tristanm, many thanks. That is a very nice way of solving this type of problem.
Last edited by wileysw on March 9th, 2010, 11:00 pm, edited 1 time in total.
 
Vanubis
Posts: 0
Joined: February 18th, 2010, 2:53 pm

Simple Random Walk

March 11th, 2010, 10:05 am

I think it works even if it's not a random walk on a graph. If p(i) is the long-run probability of being at state i, the average time to return from i to i is 1/p(i).
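Indeed. A quick sketch with a made-up chain that is not a walk on a graph, just to illustrate the point (the 3-state transition matrix and the sample size are arbitrary choices of mine):

import numpy as np

rng = np.random.default_rng(0)

# An arbitrary irreducible 3-state chain (rows sum to 1; not a graph walk).
P = np.array([
    [0.1, 0.6, 0.3],
    [0.5, 0.2, 0.3],
    [0.4, 0.4, 0.2],
])

# Long-run probabilities p(i) from the post: solve pi P = pi with sum(pi) = 1.
system = np.vstack([P.T - np.eye(3), np.ones(3)])
target = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(system, target, rcond=None)

def mean_return(start, trials=50_000):
    """Simulated mean number of steps to come back to `start`."""
    total = 0
    for _ in range(trials):
        s, steps = start, 0
        while True:
            s = rng.choice(3, p=P[s])
            steps += 1
            if s == start:
                total += steps
                break
    return total / trials

print(1 / pi[0], mean_return(0))   # the two numbers agree up to Monte Carlo error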