What's worth more to you, 50 cents or a chance to win a dollar? What if you have to pay $5 for a chance to win $6? Opportunities for large payoffs often include chances of large losses -- for example, see the Pittsburgh Development Corporation's condominium development project of section 4.1.
In Chapter 4 we learned to use expected values to determine the best decision for a company. However, if the decision with the highest expected value comes with a risk of bankruptcy, it may be better to compute an "expected utility," which takes into account the fact that some losses are unacceptable.
When calculating the expected utility of a decision, an analyst starts with information on the utility of the possible payoffs, then applies it to the specific outcomes possible for that decision. In this worksheet you will learn where that utility information comes from by creating a utility table that describes how you personally value payoffs between $0.00 and $1.00. You will then use that information to predict your preferences and "check your work."
Please answer the questions below as honestly as you are able.
Professor Burgiel offered to play a game with a student in the class. She gave the student $0.50; the student was then asked if he or she would like to spend that $0.50 on a 50/50 chance to win $1.00.
I would be indifferent between keeping the $0.50 and gambling if the chance of winning were _______%.
The gamble described above (win $1 or nothing) is called a lottery (see 3 (a), p. 159). The probability you selected in the last question is the probability at which you are indifferent about whether you gamble or keep the $0.50 (see 3 (b), p. 159); this is your indifference value of p for this lottery and the value $0.50. You will calculate your utility for $0.50 as described in 3 (c), p. 159 by computing:
U($0.50) = p * U($1.00) + (1 - p) * U($0.00) = p * 10 + (1 - p) * 0 = 10p.
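As a quick sketch of the calculation above, the snippet below uses a hypothetical indifference probability of p = 0.6 (substitute your own answer from the previous page):

```python
# Utility of $0.50 computed from the indifference probability p.
# p = 0.6 is a hypothetical stand-in for your own indifference value.
p = 0.6
u_dollar = 10   # U($1.00), fixed by the worksheet
u_zero = 0      # U($0.00), fixed by the worksheet
u_fifty_cents = p * u_dollar + (1 - p) * u_zero
print(u_fifty_cents)  # 6.0
```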
The table we will use to describe your utility values for money is shown below. The values 10 and 0 were chosen to match the example in the book, and to make our calculations simpler.
Enter the percentage you found on the last page in the "Indifference Value" column to the right of $0.50. The remainder of this worksheet is dedicated to filling in the rest of the table, then checking it against an example.
Payoff | Indifference value of p | Utility value |
$1.00 | n/a | 10 |
$0.75 | | |
$0.50 | | |
$0.25 | | |
$0.00 | n/a | 0 |
In theory, you now have a table describing how much money is worth to you when compared to a chance of winning $1.00. Is this description accurate? We'll find out by comparing the expected utility of two sample games.
Suppose a casino game gives you a 25% chance to win $0.75. Losing the game pays $0.00; winning pays $0.75. The expected value of the game is $0.1875, so if you used only the expected value to make your decision, you would pay $0.18 to play the game but not $0.19.
EV = P(win) * (payoff of win) + P(loss) * (payoff of loss) = .25 * $0.75 + .75 * $0.00 = $0.1875
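The expected value formula above can be checked directly:

```python
# Expected value of the casino game:
# a 25% chance of winning $0.75, 75% chance of winning nothing.
p_win = 0.25
payoff_win = 0.75
payoff_loss = 0.00
ev = p_win * payoff_win + (1 - p_win) * payoff_loss
print(ev)  # 0.1875
```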
The expected utility of the casino game may provide a better estimate of how much you would be willing to pay to play it:
EU = P(win) * (utility of win) + P(loss) * (utility of loss) = .25 * (utility of $0.75) + .75 * 0
EU = .25 * (utility of $0.75) + .75 * 0 = _______
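As an illustration of this expected utility calculation, the sketch below assumes a hypothetical table entry U($0.75) = 8; substitute the utility value from your own table:

```python
# Expected utility of the casino game, with a hypothetical
# utility table entry U($0.75) = 8 (use your own value here).
u_075 = 8
eu = 0.25 * u_075 + 0.75 * 0
print(eu)  # 2.0
```

If your willingness to pay tracks expected utility rather than expected value, the dollar amount whose utility equals this EU is your actual price for the game.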
Bonus: Compute the expected utility of a game that gives you a 1% chance of winning $1.00 and an 18% chance of winning $0.50 (see page 161 for examples). Would you pay $0.10 to play this game? "Check" your answer by computing the utility of $0.10 and comparing it to the expected utility of the game.
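The bonus calculation follows the same pattern. The sketch below assumes a hypothetical table value U($0.50) = 6; U($1.00) = 10 is fixed by the worksheet:

```python
# Bonus game: 1% chance of $1.00, 18% chance of $0.50,
# and an 81% chance of winning nothing.
# U($1.00) = 10 is fixed; U($0.50) = 6 is a hypothetical table value.
u_100 = 10
u_050 = 6
eu_game = 0.01 * u_100 + 0.18 * u_050 + 0.81 * 0
print(eu_game)  # roughly 1.18
# To decide whether to pay $0.10 to play, find U($0.10) by the same
# indifference method and compare it to eu_game: play only if
# eu_game exceeds your utility for keeping the dime.
```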