March 25th, 2008, 4:34 am
First off, this doesn't exactly qualify for this forum. Nevertheless, so long as I am reading your payoff matrix appropriately, the game is as follows: we have two pennies (one each), and we decide to play a game. Both of us must choose a strategy of either playing heads or playing tails (with no uncertainty), and we simultaneously reveal our selections. Prior to our choice of strategy, neither player has any information about the other not available to the market at large (i.e. no information asymmetries exist). After we reveal our strategies, payoffs are determined: if the strategies match (both heads or both tails), player 1 receives player 2's penny; otherwise, player 2 receives player 1's penny.

Given this scenario, you're right in saying that there are no pure strategy Nash equilibria. Why? Given that player 2 will play heads, what will player 1 do? He'll play heads. Similarly, given that player 2 will play tails, what will player 1 do? He'll play tails. On the other hand, given that player 1 will play heads, player 2 will play tails; and given that player 1 will play tails, player 2 will play heads. As such, there is no pure strategy Nash equilibrium.

As you said, there is, however, a mixed strategy Nash equilibrium in which each player plays heads and tails each with probability 1/2.

In repeated games, this mixed strategy still holds. Why? Here's a better question: why not? If the game is repeated ad infinitum, then the expected payoff for both players is 0. To show this, consider the expected payoff from the infinitely repeated game under the mixed strategy described above. The expected payoff in each stage game equals the probability of winning that game (1/2) times the payoff from winning (+1), plus the probability of losing (1/2) times the payoff from losing (-1). Therefore, in each game, the expected payoff (and, thus, utility) is 0. The expected payoff from the infinite game, then, is also 0.
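If it helps, the stage-game arithmetic can be checked in a few lines of Python. This is just a sketch of the calculation above, not anything from your game; the variable names are my own:

```python
# Expected stage payoff to player 1 in matching pennies when both
# players mix 50/50. A match (HH or TT) pays player 1 +1; a mismatch
# (HT or TH) pays player 1 -1.
p = q = 0.5  # probability each player plays heads

p_match = p * q + (1 - p) * (1 - q)  # P(HH) + P(TT) = 1/4 + 1/4 = 1/2
expected_payoff = p_match * 1 + (1 - p_match) * (-1)
print(expected_payoff)  # 0.0 each stage, hence 0 over the repeated game
```

Since every stage has expectation 0, the sum (or average) over the infinitely repeated game is 0 as well.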
Since this is true of either player, it is true of both players, and the equilibrium payoff is (0, 0) in the infinitely repeated game.

It should be noted, though, that the payoff itself is not the Nash equilibrium. A Nash equilibrium is a SET OF STRATEGIES from which no agent has an incentive to deviate. Here, no agent has an incentive to deviate from the mixed strategy in which both players play each option with probability 1/2. All you need to do is show that there does not exist an incentive to deviate.

To do this, you test whether the expected payoff from the infinite game improves by selecting a different mixed strategy. Say, then, that both players play heads with probability (1/2 + e) and tails with probability (1/2 - e), for some e > 0. Does either have an incentive to deviate? Yes. Why? In the situation just described, there is now a higher probability that (heads, heads) will come up; in fact, it's the highest probability of the four outcomes (1/4 + e + e^2). More importantly, the probability of player 1 winning (i.e. both players adopting the same strategy) strictly increases:

P(heads, heads) + P(tails, tails) = (1/2 + e)(1/2 + e) + (1/2 - e)(1/2 - e) = (1/4 + e + e^2) + (1/4 - e + e^2) = 1/2 + 2e^2 > 1/2 for all e > 0.

Consequently, player 2 now has an incentive to deviate and play tails with a higher probability. The only stable equilibrium, then, is the one first described.

You're welcome.
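P.S. The deviation check is easy to verify numerically too. A quick sketch (again my own, and the helper name is mine, not standard notation): tilting both players toward heads by e pushes the match probability to 1/2 + 2e^2, which is exactly why player 2, who wins on a mismatch, wants to shift back toward tails.

```python
# Probability that the two pennies match when player 1 plays heads
# with probability p1_heads and player 2 with probability p2_heads.
def match_probability(p1_heads, p2_heads):
    """P(heads, heads) + P(tails, tails)."""
    return p1_heads * p2_heads + (1 - p1_heads) * (1 - p2_heads)

for e in (0.0, 0.1, 0.25):
    p = 0.5 + e
    # Direct sum vs. the closed form 1/2 + 2e^2 derived above.
    print(e, match_probability(p, p), 0.5 + 2 * e**2)
```

At e = 0 the match probability is exactly 1/2 (the equilibrium mix); for any e > 0 it exceeds 1/2, confirming that the joint tilt toward heads cannot be an equilibrium.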