# Games with Multiple Nash Equilibria

Here is another example to try the Nash Equilibrium approach on.

Two radio stations (WIRD and KOOL) have to choose formats for their broadcasts. There are three possible formats: Country-Western (CW), Industrial Music (IM) or all-news (AN). The audiences for the three formats are 50%, 30%, and 20%, respectively. If they choose the same formats they will split the audience for that format equally, while if they choose different formats, each will get the total audience for that format. Audience shares are proportionate to payoffs. The payoffs (audience shares) are in Table 6-1.

Table 6-1

| WIRD \ KOOL | CW     | IM     | AN     |
|-------------|--------|--------|--------|
| **CW**      | 25, 25 | 50, 30 | 50, 20 |
| **IM**      | 30, 50 | 15, 15 | 30, 20 |
| **AN**      | 20, 50 | 20, 30 | 10, 10 |

(Payoffs are listed as WIRD, KOOL.)

You should be able to verify that this is a non-constant sum game, and that there are no dominant strategy equilibria. If we find the Nash equilibria by elimination, we find that there are two of them -- the upper-middle cell and the middle-left one. In both, one station chooses CW and gets a market share of 50 while the other chooses IM and gets 30. But it doesn't matter which station chooses which format.
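The elimination described above can be automated: a cell is a Nash equilibrium when neither player can gain by deviating unilaterally. Here is a minimal sketch of that check for Table 6-1 (the function name and data layout are illustrative, not from the text):

```python
# Find pure-strategy Nash equilibria of Table 6-1 by checking, for each
# cell, whether either station could gain by a unilateral deviation.
# Payoffs are stored as (WIRD, KOOL); rows are WIRD, columns are KOOL.

strategies = ["CW", "IM", "AN"]

payoffs = {
    ("CW", "CW"): (25, 25), ("CW", "IM"): (50, 30), ("CW", "AN"): (50, 20),
    ("IM", "CW"): (30, 50), ("IM", "IM"): (15, 15), ("IM", "AN"): (30, 20),
    ("AN", "CW"): (20, 50), ("AN", "IM"): (20, 30), ("AN", "AN"): (10, 10),
}

def nash_equilibria(payoffs, strategies):
    equilibria = []
    for r in strategies:          # WIRD's choice
        for c in strategies:      # KOOL's choice
            u_wird, u_kool = payoffs[(r, c)]
            # WIRD cannot do better by switching rows, given KOOL plays c...
            wird_ok = all(payoffs[(r2, c)][0] <= u_wird for r2 in strategies)
            # ...and KOOL cannot do better by switching columns, given r.
            kool_ok = all(payoffs[(r, c2)][1] <= u_kool for c2 in strategies)
            if wird_ok and kool_ok:
                equilibria.append((r, c))
    return equilibria

print(nash_equilibria(payoffs, strategies))  # [('CW', 'IM'), ('IM', 'CW')]
```

Running the check confirms exactly the two equilibria found by elimination: one station on CW, the other on IM, in either assignment.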

It may seem that this makes little difference, since

• the total payoff is the same in both cases, namely 80
• both are efficient, in that there is no larger total payoff than 80

There are multiple Nash Equilibria in which neither of these things is so, as we will see in some later examples. But even when they are both true, the multiplication of equilibria creates a danger. The danger is that both stations will choose the more profitable CW format -- and split the market, getting only 25 each! Actually, there is an even worse danger that each station might assume that the other station will choose CW, and each choose IM, splitting that market and leaving each with a market share of just 15.

More generally, the problem for the players is to figure out which equilibrium will in fact occur. In other words, a game of this kind raises a "coordination problem:" how can the two stations coordinate their choices of strategies and avoid the danger of a mutually inferior outcome such as splitting the market? Games that present coordination problems are sometimes called coordination games.

From a mathematical point of view, this multiplicity of equilibria is a problem. For a "solution" to a "problem," we want one answer, not a family of answers. And many economists would also regard it as a problem that has to be solved by some restriction of the assumptions that would rule out the multiple equilibria. But, from a social scientific point of view, there is another interpretation. Many social scientists (myself included) believe that coordination problems are quite real and important aspects of human social life. From this point of view, we might say that multiple Nash equilibria provide us with a possible "explanation" of coordination problems. That would be an important positive finding, not a problem!

That seems to have been Thomas Schelling's idea. Writing around 1960, Schelling proposed that any bit of information that all participants in a coordination game would have, and that would enable them all to focus on the same equilibrium, might solve the problem. In determining a national boundary, for example, the highest mountain between the two countries would be an obvious enough landmark that both might focus on setting the boundary there -- even if the mountain were not very high at all.

Another source of a hint that could solve a coordination game is social convention. Here is a game in which social convention could be quite important. That game has a long name: "Which Side of the Road to Drive On?" In Britain, we know, people drive on the left side of the road; in the US they drive on the right. In the abstract, how do we choose which side to drive on? There are two strategies: drive on the left side and drive on the right side. There are two possible outcomes: the two cars pass one another without incident, or they crash. We arbitrarily assign a value of one each to passing without problems and of -10 each to a crash. Here is the payoff table:

Table 6-2

| Buick \ Mercedes | L        | R        |
|------------------|----------|----------|
| **L**            | 1, 1     | -10, -10 |
| **R**            | -10, -10 | 1, 1     |

(Payoffs are listed as Buick, Mercedes.)

Verify that LL and RR are both Nash equilibria. But, if we do not know which side to choose, there is some danger that we will choose LR or RL at random and crash. How can we know which side to choose? The answer is, of course, that for this coordination game we rely on social convention. Indeed, we know that in this game, social convention is very powerful and persistent, and no less so in the country where the solution is LL than in the country where it is RR.
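The same best-response test used for the radio-station game verifies this. A sketch for Table 6-2 (variable names are illustrative; payoffs are stored as Buick, Mercedes):

```python
# Verify that (L, L) and (R, R) are the only pure-strategy Nash equilibria
# of the "Which Side of the Road to Drive On?" game in Table 6-2.
sides = ["L", "R"]
payoffs = {
    ("L", "L"): (1, 1),      ("L", "R"): (-10, -10),
    ("R", "L"): (-10, -10),  ("R", "R"): (1, 1),
}

# A cell is an equilibrium if neither driver gains by deviating alone.
equilibria = [
    (b, m) for b in sides for m in sides
    if all(payoffs[(b2, m)][0] <= payoffs[(b, m)][0] for b2 in sides)
    and all(payoffs[(b, m2)][1] <= payoffs[(b, m)][1] for m2 in sides)
]
print(equilibria)  # [('L', 'L'), ('R', 'R')]
```

At LR or RL, either driver can escape the crash payoff of -10 by switching sides, so neither miscoordinated cell survives the check.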

We will see another example in which multiple Nash equilibria provide an explanation for a social problem. First, however, we need to deal with one of the issues about the Prisoners' Dilemma that applies no less to all of our examples so far: they deal with only two players.

We will first look at one way to extend the Prisoners' Dilemma to more than two players -- an invention of my own, so of course I rather like it -- and then explore a more general technique for extending Nash equilibria to games of many participants.