A Conversation for Paradox

St. Petersburg Paradox

Post 1

Queex Quimwrangler (Not Egon)

For anyone who would like another paradox, try this on for size:

Let us say you are going to play a gambling game, and you will always take the choice with the highest average pay-off.

The crafty dealer tells you he's going to play a coin-flipping game. For the price of a £1 stake, you can enter. Once in the game, he flips a coin. On tails, you lose everything. On heads, you win triple your stake and are given the option to continue playing with the new stake. He even gives you your first stake free.

So, to play or not to play? If you back out, you have your stake of £1. If you play, you get nothing half the time and £3 half the time, making an average of £1.50. So you play.
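
A minimal sketch of that comparison (Python, just to make the rule explicit; the five-round horizon is arbitrary):

    # The rule from post 1: always take the choice with the highest average pay-off.
    stake = 1.0
    for round_no in range(5):
        quit_value = stake
        play_value = 0.5 * 0 + 0.5 * 3 * stake   # lose everything, or triple the stake
        print(round_no, quit_value, play_value)  # playing always averages 1.5x quitting
        stake *= 3                               # so the rule says: keep playing, forever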

Let's say you win. Do you play again? Well, by your rule, yes.

In fact, you will continue playing as long as you win. The problem is, the probability that you will eventually lose a round is 1. A certainty.

By following a strategy to maximise your winnings, you win nothing.

Can anyone offer the resolution?


St. Petersburg Paradox

Post 2

Dogster

The problem here is that if the crafty dealer had an infinite amount of money (which he'd need if this scheme were to work) he wouldn't need to be running his dodgy scam; he would be living the high life in Hawaii or something. If he had only a finite amount of capital (£N, say), he would have to stop playing after a certain number of rounds (about log(N)/log(3) - 1, I think), because he wouldn't be able to pay if the person won; he would therefore have an expected loss and the customer an expected gain. As a crafty dealer, he wouldn't offer this game.
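
A rough sketch of that calculation, assuming the dealer must be able to pay out the tripled stake at every round (the £1,000,000 bank is an illustrative assumption, not from the post):

    import math

    N = 1_000_000                                      # hypothetical bank of £N
    max_rounds = int(math.log(N) / math.log(3)) - 1    # roughly the bound quoted above
    # A player who plays until the dealer must stop either busts or collects 3**max_rounds.
    p_survive = 0.5 ** max_rounds
    expected_payout = p_survive * 3 ** max_rounds      # = 1.5**max_rounds, an expected gain
    print(max_rounds, expected_payout)                 # 11 rounds, expected payout about £86.50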


St. Petersburg Paradox

Post 3

Queex Quimwrangler (Not Egon)

Ah, but as the gamer will never leave the game until he loses and thus never collects his winnings, the shifty dealer is laughing. He doesn't have to tell the gamer that he can't cover the bet, and he's guaranteed not to have to pay up. Of course, this only applies if the gamer rigidly sticks to the maximisation rule.

The resolution is actually quite subtle.


St. Petersburg Paradox

Post 4

Dogster

Well, if I were the gamer I'd want to know he could cover his costs if I decided to quit at some point, especially since I'd certainly quit at the point where I could live comfortably for the rest of my life off the interest of my winnings. If I didn't take into account his ability to pay, then my calculation of my expected gain would be false. The paradox would then be resolved.

Another resolution, supposing that he did have infinite capital (perhaps he is the earthly manifestation of some malignant demon), is my finite lifespan. There are only so many times I can toss a coin before I die. This would also resolve the paradox. If I behaved short-sightedly I'd keep playing until I either lost or dropped dead (leaving an enormous inheritance for my close family); either way a win for the malignant demon.

Of course, if I were taking a long-sighted approach I might discount future winnings against current winnings; this would also resolve the paradox (we'd then be solving an infinite-horizon optimization problem with discounting).

The only other resolution I can offer at the moment is that the gamer might be maximizing a utility function which depends concavely on wealth. (After all, £1m for someone earning £20k a year is a lot better than £1m for someone earning £20m a year.) At some point it will be in his interests to quit.
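
A minimal sketch of that last point, assuming (purely for illustration) logarithmic utility and £10,000 of outside wealth; with those assumptions, another flip stops being worthwhile once the stake overtakes the outside wealth:

    import math

    wealth = 10_000.0                        # assumed outside wealth, illustrative only
    stake = 1.0                              # current stake at risk in the game
    round_no = 0
    while True:
        u_quit = math.log(wealth + stake)
        u_play = 0.5 * math.log(wealth) + 0.5 * math.log(wealth + 3 * stake)
        if u_play <= u_quit:                 # concavity eventually favours quitting
            break
        stake *= 3
        round_no += 1
    print(round_no, stake)                   # first round at which a log-utility player quits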

Did I get it right? Or is there something even cleverer that I've missed? :-)


St. Petersburg Paradox

Post 5

Queex Quimwrangler (Not Egon)

Utility is the classical resolution. Well done! If the shifty dealer knows your utility curve, he could even pull the same stunt by offering to triple your utility with a suitably large amount of money. This proves that utility should be finite (Bill Gates, take note).

But the paradox (rather than the problem) is that a 'maximising' strategy leads to a certain loss. Utility solves the problem, but not necessarily the paradox.

Finite lifespan ties in with the really subtle resolution, in that the strategy only maximises short-term gain. If you look at the long term, it certainly doesn't. The apparent paradox arises because we call it a maximising strategy when, in fact, it isn't. Discounting also helps, but the bet could be changed to inflate faster than your discount factor.
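
A sketch of the discounting point, with assumed numbers (the discount factors and the inflated multiplier are illustrative):

    # Expected discounted value of one more flip, relative to quitting with the stake now.
    def keep_playing(multiplier, discount):
        return 0.5 * multiplier * discount > 1.0

    print(keep_playing(3, 0.9))   # True: a 10% per-round discount doesn't stop the tripling game
    print(keep_playing(3, 0.6))   # False: discounting below 2/3 per round does
    print(keep_playing(6, 0.6))   # True: the dealer inflates the multiplier and the problem returns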

Isn't game theory wonderful?


St. Petersburg Paradox

Post 6

Martin Harper

Bleh. The paradox only works if the dealer has an infinite amount of money. If sie doesn't, then you have a finite chance of bankrupting hir, and thus a paradox-free net gain. This kind of paradox only works in a world where infinities are possible. We don't live in such a world...


St. Petersburg Paradox

Post 7

Queex Quimwrangler (Not Egon)

But the dealer never has to pay; as you've agreed to stick to your maximisation rule he knows you will never collect. In practice, there is only a 0.1% chance of getting as far as £60,000, and a 0.01% chance of getting up to a million.
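
Those figures follow from the tripling itself, assuming the stake starts at £1 as in post 1:

    # Chance that the £1 stake keeps tripling long enough to reach a given amount.
    print(3 ** 10, 0.5 ** 10)   # 59,049 (roughly the £60,000 figure), probability ~0.001, i.e. 0.1%
    print(3 ** 13, 0.5 ** 13)   # 1,594,323 (roughly the million), probability ~0.0001, i.e. 0.01%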

You may well want to bottle out if you reached that level, but you can't because you've already agreed to follow the 'maximisation' rule. Mephistopholis has you...


St. Petersburg Paradox

Post 8

Martin Harper

> "But the dealer never has to pay"

Not according to the formulation in post 1: in that formulation every time you win you are physically given triple your stake. At some point the dealer will be physically unable to offer "triple or quits" and so the maximisation strategy will stop.

(If the formulation is changed so that you are only given your stake when you choose to quit, then you are no longer gambling with money, you are gambling with *debt*. And a debt of one million dollars is not worth a million dollars - it is worth less than that, depending on the financial state of the debtor. At some point, following the maximisation strategy, it will not be worth accepting a "triple or quits" offer, because tripling the size of the debt will not double the real dollar value of that debt.)

This all applies only if you are maximising your dollar value. If you have different objectives (such as maximising utility) then different rules will apply, but a strategy of maximising your expected dollar return will always maximise your expected dollar return.


St. Petersburg Paradox

Post 9

Queex Quimwrangler (Not Egon)

It's true that the debt may not be worth its numeric value, but discounting for the uncertainty is wandering a little away from the scope of the problem.

Are you really going to do a credit check on Mephistopholis? ;-)

The crux of the paradox is that what is seemingly a maximisation strategy can, in fact, lead to a certain loss. It could be formulated outside a financial setting, without the problems outlined in this thread. The idea is to demonstrate the need for utility, and that the utility function must be bounded.


St. Petersburg Paradox

Post 10

Martin Harper

I would imagine that debts owed by Mephistopholis would be of extremely low value, given that demons are notoriously unreliable.

Here's a formulation that works a little better.

The demon Quibbla has an infinite number of queebles. She pledges to play triple or quits with Fred, who has only one queeble. Fred is trying to maximise the expected value (in a statistical sense) of the number of queebles he possesses. Playing triple or quits takes no time. Queebles take up no space or any other resources.

Clearly, at every stage, Fred's maximisation strategy is to play triple or quits. Suppose he plays the game N times. As N tends to infinity, the chance of him owning any queebles tends to zero, the chance of him owning no queebles tends to one, the number of queebles he owns, if he owns any, tends to infinity, and the expected value of the number of queebles he possesses tends to infinity.
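
A quick numerical illustration of those limits (the 50-round horizon is arbitrary):

    # After N rounds of triple or quits, starting from one queeble:
    N = 50
    p_has_any = 0.5 ** N        # chance Fred still owns any queebles -> 0
    holding_if_any = 3 ** N     # what he owns if he does -> infinity
    expected = 1.5 ** N         # expected number of queebles -> infinity
    print(p_has_any, holding_if_any, expected)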

This is all perfectly normal behaviour when dealing with infinities. Whether you think it means there is a paradox depends on whether you feel that this behaviour is paradoxical, and what basis you're using for your mathematics.

But it says *nothing* about the necessity or otherwise of utility. The problem didn't specify whether queebles are of positive or negative utility for Fred. Fred may or may not be a utilitarian: he may be a Kantian, maximising the number of queebles because he promised to do so. It certainly says nothing about whether the utility function must be bounded.

Don't get me wrong - I'm all in favour of utilitarianism: it has a great attraction for me. But the best way to demonstrate that utility is not directly proportional to wealth is to look at the experiments that have been done to prove that! :-)


St. Petersburg Paradox

Post 11

Queex Quimwrangler (Not Egon)

Yeah, utility rocks. I'm kind of glad that no-one brought this kind of analytical depth to the presentation on decision analysis I gave as part of my MSc...


St. Petersburg Paradox

Post 12

Martin Harper

:-D Well, such things are mainly theoretical, right? :-)

The demon who rewards directly in terms of utility is more interesting, though. But it's very hard to actually comprehend what a tripling of your utility might entail... more practical research needed, I feel... :-)


St. Petersburg Paradox

Post 13

Queex Quimwrangler (Not Egon)

The freaky thing is when utility curves bend the wrong way, implying risk-takers. National Lottery, anyone?


St. Petersburg Paradox

Post 14

Skatehorn

I think the paradox arises because you have failed to correctly handle the option. You have implicitly assumed that the option to replay is always 'in-the-money' (i.e. you are better off exercising it than not exercising it) and is therefore always exercised. Your initial argument does not show that the option is always in-the-money.

The argument that the expected pay-off from playing is £1.50 is correct for a one-shot version of the game with no option. Since the assumption that it is also correct for an infinitely repeatable game leads to an apparent contradiction, it must be false. So I conclude that the option to repeat the game always expires worthless.

However, I'm not sure how one would go about valuing the option. It's an option to walk away or to play a game which, if won, grants another similar (though not identical, as the exercise price is different) option. What is the replicating portfolio for that?
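
One way to see the difficulty, sketched under assumed risk-neutral valuation with no discounting (illustrative, not a proper replicating-portfolio argument): value the replay option over a finite horizon by backward induction. The value grows without bound as the horizon grows, so no finite premium prices the unlimited version.

    # Value of holding stake s with at most k more flips allowed (illustrative only).
    def value(s, k):
        if k == 0:
            return s                               # no replay option left: take the money
        return max(s, 0.5 * value(3 * s, k - 1))   # quit now, or flip and keep the option

    for k in range(1, 6):
        print(k, value(1.0, k))                    # 1.5, 2.25, 3.375, ... = 1.5**k, diverging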


St. Petersburg Paradox

Post 15

Gardener

I see no paradox here.
While I maximize my stake with every passing round thereby exicising the option to continue, my chances to lose wholly my progressively accumulative stake increase all the time i.e. my chances accumulate that the coin will turn up tails (or heads - I forgot what gives the negative outcome). Mathematically speaking,negative chances are fast approaching to 1 with the accretion of each futher outcome (1- (1/2*1/2*1/2*1/2... multiplying n times))-> 1 , if n-> sleeping 8. However fast my stake grows, I am dead certain that I will lose it in the end.
Or am I misinterpreting the terms of the bet? Please....
help me. I feel so powerless here. Some researchers even want to lug in options pricing theory here... and I see no scope for it:
If we assume that iterations of the betting will come on the heels of one another,than options will have no time value, just an intrinsic in-the-money value. So their premium will be equal to their intrinsic value,which in this case will constitute 1,5 dollars. Graphically,it will not be an option curve but an uprigth half-swastica piece. piece.


St. Petersburg Paradox

Post 16

AlexK the Twelve of Motion

A safer and more profitable way of dealing with this game is to simply play one game, take your winnings, and then play another game. Since you are getting a 3-1 payout on a 2-1 game, you are only risking 1 to win 3, and you will win 3 half the time. So even though it is profitable to keep "letting it ride", you risk losing everything half the time. If you only bet 1 each time, or whatever you want, you take almost no risk of losing in the long run. Well, you take 0 risk of losing in the LONG run. The way to defeat this game is simply not to bet everything you have every time. But I know that I am kinda destroying the whole paradox thing...
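
A rough simulation of that flat-betting idea, assuming the dealer will deal any number of independent £1 games (which, as the next post points out, he may not):

    import random

    def flat_betting(n_games):
        # Pay £1 per game, bank the £3 whenever the coin comes up heads.
        bankroll = 0.0
        for _ in range(n_games):
            bankroll -= 1.0
            if random.random() < 0.5:
                bankroll += 3.0
        return bankroll

    print(flat_betting(10_000))   # typically around +5,000: an expected £0.50 profit per game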


St. Petersburg Paradox

Post 17

Gardener

In other words, judging games of chance by their expected pay-off for a single play is not always reasonable or sufficient. Sometimes it is worthwhile to consider extreme circumstances and what they would lead to if they materialised. Again, I see no paradox here between these different measurements (i.e. average vs. extreme) of games of chance. After all, the normal way to describe and work with chance-driven variables is to analyse their averages (i.e. expected results) together with their dispersion (i.e. a kind of "average extreme" vis-a-vis the average).


St. Petersburg Paradox

Post 18

Martin Harper

> "simply play one game, take your winning, and then play another game"

Sorry sir, but that option isn't available. One game per person per lifetime. The management's decision is final. No correspondence shall be entered into.

