Serving the Quantitative Finance Community

 
gtoutkast
Topic Author
Posts: 0
Joined: March 24th, 2008, 8:23 pm

Game Theory: Matching Pennies - repeated games

March 24th, 2008, 10:18 pm

Hi everyone, I've been stuck with a problem on matching pennies. The payoffs for the game are as follows (row player first):

          H        T
    H   1,-1    -1,1
    T   -1,1     1,-1

If you graph the payoffs, you get a line joining (-1,1) and (1,-1). In repeated games, this line represents the set of feasible payoffs (it's just a line, unlike other games) when delta is close to 1.

My question is: what would the set of rational payoffs be for repeated games? Intuitively I think it will be (0,0), because at that point no player has an advantage over the other. But I cannot prove it mathematically. This payoff also corresponds to the Nash equilibrium in mixed strategies. This game has no NE in pure strategies; using mixed strategies, we get an NE by playing H and T with probabilities (1/2, 1/2). Will there be an NE for repeated games? I know all finite games must have an NE in mixed strategies, but must all games have an NE when repeated?

Any comments would be greatly appreciated. Thanks!
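(For anyone who wants to check the "no pure-strategy NE" claim by hand, here is a minimal brute-force sketch; the dictionary layout and function names are my own, not from any library.)

```python
# Payoff matrix for matching pennies: key = (player 1's move, player 2's move),
# value = (player 1's payoff, player 2's payoff).
payoff = {
    ("H", "H"): (1, -1),
    ("H", "T"): (-1, 1),
    ("T", "H"): (-1, 1),
    ("T", "T"): (1, -1),
}
moves = ["H", "T"]

def is_pure_nash(a, b):
    """(a, b) is a pure NE iff neither player gains by unilaterally deviating."""
    u1, u2 = payoff[(a, b)]
    best1 = max(payoff[(x, b)][0] for x in moves)  # player 1's best reply to b
    best2 = max(payoff[(a, y)][1] for y in moves)  # player 2's best reply to a
    return u1 == best1 and u2 == best2

pure_equilibria = [(a, b) for a in moves for b in moves if is_pure_nash(a, b)]
print(pure_equilibria)  # [] -- no cell survives both best-reply checks
```

Every one of the four cells fails one player's best-reply check, which is exactly the cycling argument below.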
 
brotherbear1220
Posts: 0
Joined: July 12th, 2006, 9:43 pm

Game Theory: Matching Pennies - repeated games

March 25th, 2008, 4:34 am

First off, this doesn't exactly qualify for this forum. Nevertheless, so long as I am reading your payoff matrix correctly, the game is as follows: we each have a penny, and both of us must choose a strategy of either playing heads or playing tails (with no uncertainty), simultaneously revealing our selections. Prior to our choice of strategy, neither player has any information about the other not available to the market at large (i.e. no information asymmetries exist). After we reveal our strategies, payoffs are determined: if the strategies match (both heads or both tails), player 1 receives player 2's penny; otherwise, player 2 receives player 1's penny.

Given this scenario, you're right in saying that there are no pure-strategy Nash equilibria. Why? Given that player 2 will play heads, player 1 will play heads; given that player 2 will play tails, player 1 will play tails. On the other hand, given that player 1 will play heads, player 2 will play tails; and given that player 1 will play tails, player 2 will play heads. Some player always wants to switch, so there are no pure-strategy Nash equilibria.

As you said, there is, however, a mixed-strategy Nash equilibrium in which each player plays each option with probability 1/2 (equivalently, each places probability 1/2 on the other adopting either strategy).

In repeated games, this mixed strategy still holds. Why? Here's a better question: why not? If the game is repeated ad infinitum, then the expected payoff for both players is 0. To show this, consider the expected payoff from the infinitely repeated game under the mixed strategy described above. The expected payoff in each round equals the probability of winning that round (1/2) times the payoff from winning (1), plus the probability of losing (1/2) times the payoff from losing (-1). Therefore, in each round, the expected payoff (and thus utility) is 0. The expected payoff from the infinite game is then also 0.
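The per-round expectation above can be sketched as a two-line function (the parameterisation by p and q is mine): player 1's expected payoff when player 1 plays heads with probability p and player 2 plays heads with probability q.

```python
def expected_payoff_p1(p, q):
    """Player 1's expected per-round payoff: +1 on a match, -1 on a mismatch.
    Algebraically this simplifies to (2p - 1)(2q - 1)."""
    match = p * q + (1 - p) * (1 - q)
    mismatch = p * (1 - q) + (1 - p) * q
    return match * 1 + mismatch * (-1)

print(expected_payoff_p1(0.5, 0.5))  # 0.0 -- each round is worth zero in expectation
```

Since every round is worth 0 in expectation at (1/2, 1/2), any discounted sum over the infinite repetition is 0 as well.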
Since this is true of either player, it is true of both players, and the equilibrium payoff is (0,0) in the infinitely repeated game.

It should be noted, though, that the payoff itself does not represent a Nash equilibrium. A Nash equilibrium is a SET OF STRATEGIES from which no agent has an incentive to deviate; (0,0) is merely the payoff that equilibrium produces. What you need to show is that no agent has an incentive to deviate from the mixed strategy in which both players play each option with probability 1/2.

To do this, test whether the expected payoff from the infinite game improves by selecting a different mixed strategy. Say, then, that both players play heads with probability (1/2 + e) and tails with probability (1/2 - e). Does either have an incentive to deviate? Yes. Why? In the situation just described, there is now a higher probability that (heads, heads) comes up; in fact, it is the highest probability of the four outcomes, at 1/4 + e + e^2. More importantly, the probability of player 1 winning (i.e. both players adopting the same strategy) increases without question: P(heads, heads) + P(tails, tails) = (1/2 + e)^2 + (1/2 - e)^2 = (1/4 + e + e^2) + (1/4 - e + e^2) = 1/2 + 2e^2 > 1/2 for all e != 0. Consequently, player 2 now has an incentive to deviate and play tails with a higher probability. The only stable equilibrium, then, is the one first described.

You're welcome.
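The deviation argument is easy to verify numerically. A small sketch (e, the helper names, and the e = 0.1 example are my own choices): if both players tilt toward heads by e, the match probability rises above 1/2, and player 2's best reply is then to play tails outright.

```python
def match_probability(e):
    """P(match) when both players put probability 1/2 + e on heads."""
    p = 0.5 + e
    return p * p + (1 - p) * (1 - p)  # = 1/2 + 2*e**2

def expected_payoff_p2(p, q):
    """Player 2's expected payoff: +1 on a mismatch, -1 on a match."""
    mismatch = p * (1 - q) + (1 - p) * q
    match = p * q + (1 - p) * (1 - q)
    return mismatch - match

e = 0.1
print(match_probability(e))  # 0.52 -- strictly above 1/2 whenever e != 0
# Given player 1 at p = 1/2 + e > 1/2, player 2 does strictly better playing
# tails for sure (q = 0) than mirroring the tilted mix (q = 1/2 + e):
print(expected_payoff_p2(0.5 + e, 0.0) > expected_payoff_p2(0.5 + e, 0.5 + e))  # True
```

So any common tilt away from (1/2, 1/2) hands one player a profitable unilateral deviation, which is why the symmetric mix is the only stable point.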
 
gtoutkast
Topic Author
Posts: 0
Joined: March 24th, 2008, 8:23 pm

Game Theory: Matching Pennies - repeated games

March 26th, 2008, 5:45 am

Thanks, brotherbear, for your detailed reply! Perhaps not entirely related to finance, but I've been asked game theory questions at interviews, so I thought I'd give it a shot. I agree with what you said. It's a bit difficult distinguishing between the mixed-strategy NE and the repeated-game NE, hence the confusion. Anyway, thanks again for the time and effort!