March 9th, 2007, 1:06 am
Quote, originally posted by gentinex: "Let's generalize this. Say that you are playing a "best of 2n+1" series, in which the first team to win n+1 games wins the series. Now how much should you bet on the opening game, to guarantee that you win 100 if your team wins the series, and you lose 100 if your team loses the series? Also, how much should you bet on the next game if you're at the point where your team has won p games and the other team has won q games, with 0 <= p,q < n+1, to ensure the same aforementioned payoffs at the end?"

A solution using binomial trees is the least elegant way to solve the problem, although it best exposes the underlying technique. A more elegant method is to use state variables to capture the nodes of the tree. Even more elegant is basic probability. I'll show how probability lends itself to generalization in one case: a seven-game series. The question really is how much money you will have after one game. Since the risk-neutral probabilities are 1/2, we can solve it as follows.

Payoff(Win) = 100. How can you win the series after taking game 1? You have to win at least 3 of the remaining six (i.e. 3, 4, 5, or 6 of six), so the winning paths number (6 choose 3) + (6 choose 4) + (6 choose 5) + (6 choose 6) = 20 + 15 + 6 + 1 = 42.

Payoff(Loss) = -100. You have to lose at least 4 of the remaining 6, i.e. win at most 2, so the losing paths number (6 choose 0) + (6 choose 1) + (6 choose 2) = (6 choose 6) + (6 choose 5) + (6 choose 4) = 1 + 6 + 15 = 22.

How many total paths on this side of the tree? Count every way the remaining six games could go, including games played after the series is already decided: 2^7/2 = 128/2 = 2^6 = 64.

So E = (42 - 22)/64 * 100 = (6 choose 3)/64 * 100 = 31.25; every losing term cancels a winning term by symmetry except the middle term, (6 choose 3). Since the value of the bet before game 1 is 0 by symmetry, 31.25 is also the stake to place on the opening game.

It's pretty easy to abstract this: E = (2n choose n)/2^(2n) * 100.

I can't explain the theory behind this as well as the practicality of the solution, unfortunately. Essentially, if you've read Shreve on discrete finance, this is the value of a derivative at time N = 1, given that the first toss has been favorable. You're using the fact that the risk-neutral probabilities are 1/2 (so the value of the derivative at each node is the average of the two values at the next step), since your wager is two-way.
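To make the backward-induction version of the same answer concrete, here is a minimal Python sketch (the function names value and bet are mine, not from the thread). It prices the +/-100 series bet at every score (p, q) under risk-neutral probability 1/2 and reads off the stake for the next game; for the best-of-seven opener it reproduces 31.25 and matches the closed form above.

from functools import lru_cache
from math import comb

@lru_cache(maxsize=None)
def value(n, p, q):
    # Fair value of the +/-100 series bet when your team has won p games
    # and the opponent q games in a best-of-(2n+1) series, with each game
    # treated as a fair 50/50 wager (risk-neutral probability 1/2).
    if p == n + 1:
        return 100.0
    if q == n + 1:
        return -100.0
    # Backward induction: each node's value is the average of its two successors.
    return 0.5 * (value(n, p + 1, q) + value(n, p, q + 1))

def bet(n, p, q):
    # Stake on the next game at score (p, q): winning that game must move
    # you from value(n, p, q) up to value(n, p + 1, q).
    return value(n, p + 1, q) - value(n, p, q)

n = 3  # best of seven
print(bet(n, 0, 0))                   # opening stake: 31.25
print(100 * comb(2 * n, n) / 4 ** n)  # closed form:   31.25
print(bet(n, 2, 1))                   # stake when up 2-1: 37.5

The same bet function answers the second part of gentinex's question, the stake at any intermediate score (p, q); for example 37.5 when you lead 2-1 in a best-of-seven.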