The 'three hats problem' is a first puzzle dealing with the transmission of information between players about some state of nature. Three boys wear hats which, they are told, are either blue or red; in fact, the modeler knows they are all red. Each boy can see the color of the others' hats, but not his own. At the start, an observer sends a prior message stating that there is at least one red hat in the group. The observer also fixes the rules of the game: in successive periods, each boy announces whether or not he knows the color of his own hat. Each boy observes the others' announcements, which act as messages. The payoffs are not made explicit, but a boy wins if he announces the true color of his hat and loses otherwise. It can be shown (by induction on the number of boys) that, for perfectly rational players, everybody makes a negative announcement in the first two periods and a positive announcement in the third.

In this example, the process evolves in three steps from distributed knowledge to common knowledge about the combination of hat colors (each possible combination acts as a possible world). Several factors explain this result, which corresponds to a complete diffusion of information. Firstly, the players have no strategic interaction, since their payoffs do not depend on the others' actions. Secondly, the prior message transforms shared initial knowledge into common initial knowledge, and the players ground their reasoning on that common basis. Thirdly, since the number of possible worlds is finite, shared knowledge gains one level at each step and must converge towards common knowledge.
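The possible-worlds dynamics described above can be sketched in a few lines of Python (the function and parameter names are ours): the prior message removes the all-blue world, and each round of public announcements eliminates every world in which some boy would have announced differently.

```python
from itertools import product

def knows_own_color(i, world, worlds):
    # Boy i knows his color iff every remaining possible world that
    # matches what he sees (the others' hats) gives him the same color.
    seen = [w for w in worlds
            if all(w[j] == world[j] for j in range(len(world)) if j != i)]
    return len({w[i] for w in seen}) == 1

def solve(actual=('R', 'R', 'R')):
    n = len(actual)
    # The prior message "at least one red hat" removes the all-blue world.
    worlds = [w for w in product('RB', repeat=n) if 'R' in w]
    period = 0
    while True:
        period += 1
        announced = [knows_own_color(i, actual, worlds) for i in range(n)]
        if all(announced):
            return period
        # The public announcements eliminate every world in which some boy
        # would have announced differently; shared knowledge gains a level.
        worlds = [w for w in worlds
                  if all(knows_own_color(i, w, worlds) == announced[i]
                         for i in range(n))]
```

Running `solve()` on three red hats returns 3, and the same elimination mechanism yields a positive announcement in period n for n red hats.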

The 'two generals problem' is a second puzzle, which shows a direct transmission of information. Two allied generals fight against a common enemy in a situation that may be either favorable or unfavorable. If the situation is unfavorable, a simultaneous attack is bad for both generals, while not attacking is better for each, whatever the other does. If the situation is favorable, a simultaneous attack is good for both generals, while not attacking is bad for each, whatever the other does. One general stands on a hilltop and observes the situation, while the other stands in a valley and observes nothing. When the situation becomes favorable, the first general sends an attack message to the second, but the message has a small probability of getting lost. Upon receiving it, the second general sends a message back confirming receipt, but this message has the same probability of getting lost. The exchange continues until one message is lost. It can be shown that rational generals never attack, whatever the number of exchanged messages.

In this example, the simultaneous attack could only happen if common knowledge of the situation were achieved, but this never happens. Several factors explain this unsuccessful result, which corresponds to an incomplete diffusion of information. First, the players are in a strategic context, since each player's payoff depends on both actions; however, information is transmitted directly and not through their actions. Second, the only prior belief which is common knowledge is a common probability distribution over the situation. Third, since the number of possible worlds is infinite, shared knowledge of the situation never becomes common knowledge. However, the result is not robust to small changes in the assumptions. For instance, if the generals agree by a prior convention to attack after exactly n (greater than two) exchanges of messages, the attack will be implemented (provided the n messages are effectively exchanged) and will succeed.
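The n-exchange convention mentioned above admits a small Monte Carlo sketch (the function and parameter names are ours, and the loss probability of 0.1 is an arbitrary assumption): the convention succeeds exactly when all n messages get through, i.e. with probability (1 - p)^n.

```python
import random

def messages_delivered(p_loss, n, rng):
    # The generals alternate confirmations; the exchange stops as soon
    # as one message is lost.
    count = 0
    for _ in range(n):
        if rng.random() < p_loss:
            break
        count += 1
    return count

def convention_success_rate(p_loss=0.1, n=3, trials=100_000, seed=0):
    # Prior convention: attack after exactly n delivered messages.
    # The attack is implemented whenever all n messages get through,
    # i.e. with probability (1 - p_loss) ** n, here 0.9 ** 3 = 0.729.
    rng = random.Random(seed)
    hits = sum(messages_delivered(p_loss, n, rng) >= n for _ in range(trials))
    return hits / trials
```

The estimated success rate settles near 0.729, in contrast with the zero probability of attack when no convention is agreed in advance.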

The 'two restaurants problem' is a third illustration of the transmission of information, this time in an indirect way. There are two restaurants, one on the left side and one on the right side of a street; one is of good quality and the other of bad quality. Successive customers arrive to eat a meal. They aim for the better restaurant, but they do not know which it is. They have common information in the form of a prior probability indicating that the left one is slightly more likely to be better. They receive private information in the form of a message correlated with the quality of the left (or right) restaurant. They also receive public information, since they observe the behavior of the preceding customers. It can be shown that, after a certain number of periods in which the customers may alternate between the restaurants, they all end up going to the same restaurant, whether or not it is the better one.
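Under standard additional assumptions (binary private signals that are correct with probability q, and Bayesian customers), the herding dynamics above can be sketched as follows; the function and parameter names are ours.

```python
from math import log

def restaurant_choices(signals, p_left=0.55, q=0.7):
    # signals: one private signal per customer, True = "left looks better".
    # Each signal is correct with probability q; working in log-likelihoods,
    # public evidence accumulates as a simple sum.
    step = log(q / (1 - q))                # weight of one private signal
    public = log(p_left / (1 - p_left))    # prior slightly favoring the left
    choices = []
    for s in signals:
        go_left = public + (step if s else -step) > 0
        choices.append('L' if go_left else 'R')
        if abs(public) < step:
            # The choice still reveals the signal, so followers update.
            public += step if go_left else -step
        # Otherwise a cascade has started: the choice is uninformative
        # and the public belief stops moving.
    return ''.join(choices)
```

For instance, with a prior of 0.55 in favor of the left restaurant, two misleading 'right' signals at the start lock every subsequent customer into the same restaurant regardless of their own signals: `restaurant_choices([False, False, True, True, True])` returns `'RRRRR'`.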

In this example, it becomes common knowledge among the followers that one restaurant is probably the better one, even if it is the wrong one. The arbitration between the players' private information and their public information rapidly turns in favor of the latter. Several factors explain this paradoxical result. Firstly, the players have independent preferences, since their payoff does not depend on how many others occupy the restaurant. Secondly, they are weakly pre-coordinated by the prior probability distribution, which acts as a common reference. Thirdly, they have no long-term memory, since a different customer arrives in each period. As a more general framework, consider two players who hold a common prior probability of some event and exchange messages about their posterior probabilities in each period. It can be shown that their posterior probabilities of the event finally converge: the players cannot 'agree to disagree' (Aumann, 1976).
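The convergence of posteriors can be illustrated, under the usual assumptions of partitional information and a uniform common prior over a finite state space, by the exchange procedure of Geanakoplos and Polemarchakis (1982); the code below is our own sketch, not a construction from the text.

```python
from fractions import Fraction

def posterior(cell, public, event):
    # Posterior probability of the event given the player's partition cell
    # intersected with the publicly remaining states (uniform prior).
    info = cell & public
    return Fraction(len(info & event), len(info))

def agree(partitions, event, omega):
    # Players announce their posteriors in turn; each announcement publicly
    # rules out every state in which a different value would have been said.
    public = set().union(*partitions[0])   # nothing ruled out yet
    announced = [None, None]
    i = 0
    while announced[0] is None or announced[0] != announced[1]:
        cell = next(c for c in partitions[i] if omega in c)
        announced[i] = posterior(cell, public, event)
        public = {w for w in public
                  if posterior(next(c for c in partitions[i] if w in c),
                               public, event) == announced[i]}
        i = 1 - i
    return announced[0]
```

For example, with states {1, 2, 3, 4}, partitions {{1,2},{3,4}} and {{1,2,3},{4}}, event {1, 4} and true state 1, the initial posteriors are 1/2 and 1/3, yet the exchange ends with both players announcing 1/2.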

In the crossroads game, the first driver may initially be uncertain about whether the second is a go-getter or a cautious type. The first driver attributes a probability to each type, constituting the 'reputation' of the second. This reputation evolves during the game as a function of the second driver's observed actions. As long as he keeps going, his reputation as a go-getter slowly increases; as soon as he stops, his reputation as a go-getter drops straight to zero. In fact, it may be to the advantage of a cautious driver to mimic the behavior of a go-getter. More precisely, at the perfect Bayesian equilibrium, the second driver keeps going in all the early periods, in order to appear as a go-getter, and then keeps going or stops randomly. After stopping once, he stops forever, since he no longer has a reputation to defend.
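The reputation dynamics can be made concrete with a one-step Bayesian update (a sketch under our own naming; in the actual perfect Bayesian equilibrium the mimicking probability is determined endogenously, whereas here it is taken as a fixed parameter for illustration):

```python
def updated_reputation(prior, action, mimic_prob):
    # Bayes rule on the second driver's observed action. A go-getter always
    # keeps going; a cautious type mimics him with probability mimic_prob.
    if action == 'stop':
        return 0.0              # only the cautious type ever stops
    p_keep_going = prior + (1 - prior) * mimic_prob
    return prior / p_keep_going
```

For instance, starting from a prior of 0.5 with a mimicking probability of 0.8, one observed 'go' raises the reputation only to 5/9 (about 0.56), while a single 'stop' drops it to zero for good.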
