
Consider the gambling game described in the St. Petersburg paradox and assume that the utility function U(x) for a change in fortune x is bounded, monotonically increasing on the real line, and satisfies U(0) = 0. Show that the utility of playing the game is negative for a large enough cost c.

Solution

Consider this theoretical approach to understanding the problem.

Under expected-utility reasoning, the gambler should play only if the expected utility of the net change in fortune is positive. Because U is bounded, the expected utility contributed by the winnings is finite, not infinite as in the classical form of the paradox. And because U is monotonically increasing with U(0) = 0, paying a cost c pushes every outcome downward; for a large enough c the near-certain disutility of the loss outweighs the small, bounded contribution of the rare large prizes, so the expected utility of playing is negative.
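Here is a minimal sketch of how that argument can be made precise. It assumes the standard payoff scheme (the first head on toss n, with probability 2^{-n}, pays 2^n), an upper bound U(x) ≤ M, and that the monotonicity is strict enough that U takes a negative value somewhere on the negatives, say U(-2) < 0; these are my reading of the problem, not extra facts from the source.

```latex
% Expected utility of playing at cost c (notation as in the problem statement):
\[
  EU(c) = \sum_{n=1}^{\infty} 2^{-n}\, U\!\left(2^{n} - c\right).
\]
% Fix an integer k \ge 1 and take c \ge 2^{k+1}.  For n \le k we have
% 2^n - c \le 2^k - 2^{k+1} = -2^k \le -2, so U(2^n - c) \le U(-2) < 0;
% for n > k we only use the upper bound U \le M.  Splitting the sum at n = k:
\[
  EU(c) \le \left(1 - 2^{-k}\right) U(-2) + M\, 2^{-k}
        \le \frac{1}{2}\, U(-2) + M\, 2^{-k}.
\]
% U(-2) is a fixed negative number, so choosing k large enough that
% 2^{-k} < |U(-2)| / (2M) makes the right-hand side negative; hence
% EU(c) < 0 for every cost c \ge 2^{k+1}.
```

The exact splitting point does not matter; what matters is that boundedness makes the tail as small as we like while monotonicity keeps the loss terms bounded away from zero.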

This is illustrated by the example below.

1. A Limit on Utility

Bernoulli's reaction to this problem was that one should distinguish the utility (the desirability or satisfaction produced) of a payoff from its dollar amount. We will come to his approach in a moment; first, consider the following way (not his) of trying to solve the problem. Suppose that one reaches a saturation point for utility: at some point, a larger dose of the good in question would not be enjoyed even a little bit more. Consider someone who loves chocolate ice-cream, and every tablespoon of it he eats gives him an equal pleasure, equivalent to one utile. But once he has eaten a pint of it (32 tablespoons), he suddenly cannot enjoy even a tiny bit more. Thus giving him 16 tablespoons of chocolate ice-cream would provide him with 16 utiles, and 32 tablespoons with 32 utiles; but if he were given any larger quantity, he would enjoy only 32 tablespoons of it, and get 32 utiles. Now imagine a St. Petersburg game in which the prizes are as above, except paid in tablespoons of chocolate ice-cream. The start of the table of utiles it provides is shown at the end of this solution.

The sum of the last column of that table is not infinite: it converges to 6. In the long run, a player of this game could expect an average payoff of six utiles, so a rational ice-cream eater would pay anything up to the cost of 6 tablespoons of chocolate ice-cream to play.
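That figure can be checked directly. Writing the capped worth of a prize of 2^n tablespoons as min(2^n, 32) utiles (the min notation is my shorthand, not the source's), the expected number of utiles is:

```latex
% Expected utiles when a prize of 2^n tablespoons is worth min(2^n, 32) utiles:
\[
  \sum_{n=1}^{\infty} 2^{-n} \min\!\left(2^{n}, 32\right)
    = \sum_{n=1}^{5} 2^{-n} \cdot 2^{n} + \sum_{n=6}^{\infty} 2^{-n} \cdot 32
    = 5 + 32 \cdot 2^{-5}
    = 6 .
\]
```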

I've chosen ice-cream for that example because utility topping out in this way is less implausible there than with other goods. Compare the original case, where payment is in dollars. If we simply substitute dollars for tablespoons ($X in place of X tablespoons), utility reaches its maximum at any dollar prize over $32, and the fair entry price is $6. But putting the saturation point there would mean that $10,000 was worth no more (provided no more utiles) than $32, and nobody feels that way. So should we raise the point at which an additional dollar has no additional value? Setting that point at, say, $17 million is better: that is such a large prize that you might feel anything larger would not be better. Setting the maximum there, by the way, means that the maximum rational entry price for this sort of game would be a little above $25, which matches Hacking's intuition about the reasonable maximum price to pay. So is that the point where utility maxes out, for money? Agreed, that is a lot of money, and it would take some imagination for some of us to figure out what to do with more; but it has seemed to most people who think about these questions that there is no point at which an additional dollar means literally nothing and confers no utility at all. The-more-the-better desires seem possible, even common.
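As a quick numerical check of the two figures above (6 utiles with the 32-tablespoon cap, and a little above $25 with a cap around $17 million), here is a small Python sketch; the function name and the truncation of the series at 200 terms are just conveniences of this illustration.

```python
# Expected utility of the St. Petersburg game when utility saturates at `cap`:
# the first head on toss n (probability 2**-n) pays a prize of 2**n,
# but that prize is only worth min(2**n, cap) utiles.

def capped_expected_utility(cap, terms=200):
    """Sum 2**-n * min(2**n, cap) for n = 1..terms; the neglected tail
    is at most cap * 2**-terms, which is negligible here."""
    return sum(min(2 ** n, cap) / 2 ** n for n in range(1, terms + 1))

print(capped_expected_utility(32))          # ice-cream cap: about 6 utiles
print(capped_expected_utility(17_000_000))  # cap near $17 million: about 25.01
```

Incidentally, placing the cap exactly at 2^24 = $16,777,216 makes the sum come out to exactly 25, which is presumably why a cap in the neighbourhood of $17 million lands "a little above $25".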

Table: the St. Petersburg game with prizes paid in tablespoons (tbs) of chocolate ice-cream, with utility capped at 32 utiles.

n    P(n)      Prize      Utiles of prize   Expected utility
1    1/2       2 tbs      2                 1
2    1/4       4 tbs      4                 1
3    1/8       8 tbs      8                 1
4    1/16      16 tbs     16                1
5    1/32      32 tbs     32                1
6    1/64      64 tbs     32                0.5
7    1/128     128 tbs    32                0.25
8    1/256     256 tbs    32                0.125
9    1/512     512 tbs    32                0.0625
10   1/1024    1024 tbs   32                0.03125

Hope this explains well.