MATH318: Utility

Name:
The purpose of this exercise is to practice determining indifference values, an important step in calculating utility. Work with a partner if possible.

What's worth more to you, 50 cents or a chance to win a dollar? What if you have to pay \$5 for a chance to win \$6? Opportunities for large payoffs often include chances of large losses -- for example, see the Pittsburgh Development Corporation's condominium development project of section 4.1.

In Chapter 4 we learned to use expected values to determine the best decision for a company. However, if the decision with the highest expected value comes with a risk of bankruptcy, it may be better to compute an "expected utility," which takes into account the fact that some losses are unacceptable.

When calculating the expected utility of a decision, an analyst starts with information about the utility of the possible payoffs, then applies it to the specific outcomes possible for that decision. In this worksheet you will learn where that utility information comes from by creating a utility table that describes how you personally value payoffs between \$0.00 and \$1.00. You will then use that information to predict your preferences and "check your work".

Professor Burgiel offered to play a game with a student in the class. She gave the student \$0.50; the student was then asked if he or she would like to spend that \$0.50 on a 50/50 chance to win \$1.00.

1. If you were offered this decision, would you keep the \$0.50 or gamble on winning \$1.00?

2. If Professor Burgiel gave you \$0.50, would you trade it back to her for a 75% chance of winning \$1.00?

3. If Professor Burgiel gave you \$0.50, would you trade it back to her for a 25% chance of winning \$1.00?

4. Fill in the blank: I would keep the \$0.50 unless the chance of winning \$1.00 was greater than or equal to:

_______%.

The gamble described above (win \$1 or nothing) is called a lottery (see 3 (a), p. 159). The probability you selected in the last question is the probability at which you are indifferent about whether you gamble or keep the \$0.50 (see 3 (b), p. 159); this is your indifference value of p for this lottery and the value \$0.50. You will calculate your utility for \$0.50 as described in 3 (c), p. 159 by computing:

U(\$0.50) = p U(\$1.00) + (1 - p) U(\$0.00) = p * 10.
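As a quick check of this arithmetic, the calculation can be sketched in Python. The function name and the sample value p = 0.6 are illustrative, not part of the worksheet; substitute your own answer from question 4.

```python
# Utility scale endpoints chosen to match the textbook example.
U_MAX = 10  # utility of $1.00
U_MIN = 0   # utility of $0.00

def utility_from_indifference(p):
    """Utility of a payoff, given the indifference probability p for
    the lottery: win $1.00 with probability p, else win $0.00."""
    return p * U_MAX + (1 - p) * U_MIN

# Hypothetical indifference value p = 0.6 gives a utility of 6.0:
print(utility_from_indifference(0.6))  # 6.0
```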

The table we will use to describe your utility values for money is shown below. The values 10 and 0 were chosen to match the example in the book, and to make our calculations simpler.

Enter the percentage you found on the last page in the "Indifference Value" column to the right of \$0.50. The remainder of this worksheet is dedicated to filling in the rest of the table, then checking it against an example.

| Payoff | Indifference value of p | Utility value |
|--------|-------------------------|---------------|
| \$1.00 | n/a                     | 10            |
| \$0.75 |                         |               |
| \$0.50 |                         |               |
| \$0.25 |                         |               |
| \$0.00 | n/a                     | 0             |

1. Ask your partner if he or she would pay \$0.75 to have a 50% chance of winning \$1.
• If your partner replies "yes", decrease the chance of winning -- would he or she pay \$0.75 for a 25% chance (1 in 4) of winning \$1.00?
• If your partner replies "no", increase the chance of winning -- would he or she pay \$0.75 for a 75% chance (3 in 4) of winning \$1.00?
2. Continue to adjust the probability of winning until you have identified the percentage at which their answer changes from "yes" to "no". (For example, they might pay \$0.75 for an 85% chance at \$1.00 but not for an 80% chance, giving an indifference value of roughly 82.5% = .825.)
3. Enter this indifference value in the table above, to the right of the value \$0.75.
Repeat this exercise to find the indifference value for \$0.25 -- what is the smallest probability of winning \$1 for which you would pay \$0.25 to play? Then multiply each entry by 10 to determine the utility value of each payoff. (This is the calculation in step 3(c) on page 159.) When you have finished filling in your utility table, help your partner fill in theirs.

In theory, you now have a table describing how much money is worth to you when compared to a chance of winning \$1.00. Is this description accurate? We'll find out by comparing the expected utility of two sample games.

Suppose a casino game gives you a 25% chance to win \$0.75. Losing the casino game has a value of \$0.00; winning has a value of \$0.75. The expected value of the casino game is \$0.1875. If you used only the expected value to make your decision, you would pay \$0.18 to play the game but not \$0.19.

EV = P(win) * (payoff of win) + P(loss) * (payoff of loss) = .25 * \$0.75 + .75 * \$0.00 = \$0.1875
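The same expected-value arithmetic, written as a short Python check (the function name is illustrative):

```python
def expected_value(p_win, payoff_win, payoff_loss=0.0):
    """Expected value of a two-outcome game."""
    return p_win * payoff_win + (1 - p_win) * payoff_loss

# 25% chance of winning $0.75, otherwise nothing:
print(expected_value(0.25, 0.75))  # 0.1875
```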

The expected utility of the casino game may provide a better estimate of how much you would be willing to pay to play it:

EU = P(win) * (utility of win) + P(loss) * (utility of loss) = .25 * (utility of \$0.75) + .75 * 0
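The expected-utility formula has the same shape as the expected-value formula; only the dollar payoffs are replaced by utilities. A sketch, using a hypothetical utility of 8 for \$0.75 (substitute the value from your own table):

```python
def expected_utility(p_win, u_win, u_loss=0):
    """Expected utility of a two-outcome game."""
    return p_win * u_win + (1 - p_win) * u_loss

# Hypothetical: suppose your table gives U($0.75) = 8.
print(expected_utility(0.25, 8))  # 2.0
```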

1. Use the utility value listed to the right of \$0.75 in the table to find the expected utility (to you) of this game.

EU = .25 * (utility of \$0.75) + .75 * 0 =

2. Calculate the expected utility of a game in which you have a 75% chance to win \$0.25 (and receive nothing if you lose). Please show your work carefully.

3. The two games described above (25% chance of \$.75 and 75% chance of \$.25) have the same expected value. The game with the higher expected utility should be the one you'd prefer to play. Would you actually prefer to play the game with the higher expected utility? If not, what might have gone wrong in your calculations?

If you preferred the 25% chance of \$0.75, you would be considered a risk taker (for payoffs in this range). If you preferred the 75% chance of \$0.25 you could be called a risk avoider. And if you had no preference you might be risk neutral. Figure 5.3 on page 164 shows sample graphs of monetary value versus utility for people or organizations with these three different attitudes toward risk.
1. Draw a graph below which has payoffs (\$0 to \$1) on the x-axis and utility (0 to 10) on the y-axis. (If you wish, you may instead make a note here and attach a piece of graph paper or a printout to the end of this worksheet.)

2. Compare your graph to the one on page 164. Are you a risk taker, a risk avoider, risk neutral, or none of the above?

Bonus: Compute the expected utility of a game that gives you a 1% chance of winning \$1.00 and an 18% chance of winning \$0.50 (see page 161 for examples). Would you pay \$0.10 to play this game? "Check" your answer by computing the utility of \$0.10 and comparing it to the expected utility of the game.
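For a lottery with more than two outcomes, the expected utility is just the probability-weighted sum over all outcomes. A sketch with hypothetical utility values (the function name and the utilities 10, 6, 0 are placeholders; use the entries from your own table):

```python
def expected_utility(lottery):
    """lottery: list of (probability, utility) pairs; the probabilities
    for all outcomes (including winning nothing) should sum to 1."""
    return sum(p * u for p, u in lottery)

# Bonus game: 1% chance of $1.00, 18% chance of $0.50, 81% chance of $0.00.
# Hypothetical utilities: U($1.00) = 10, U($0.50) = 6, U($0.00) = 0.
eu_game = expected_utility([(0.01, 10), (0.18, 6), (0.81, 0)])
print(eu_game)  # approximately 1.18
```

Comparing this number to your utility for \$0.10 tells you whether the \$0.10 price is worth paying, according to your table.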