Comedy and entropy
Created | Updated Aug 9, 2003
Some people enjoy surprises, and especially jokes that contain some element of surprise. The technique of surprising the viewer is used to some extent in shows such as Monty Python's Flying Circus and Futurama. In addition, very few people laugh at jokes that they've heard before. These two observations suggest that there is an intimate relation between comedy and entropy1.
iaoth's Law of Entertainment Value
Readers who know a bit about entropy might be able to infer what the alleged relation between comedy and entropy is, but for readers who for some reason have not read the entry on entropy, here it is.
The entertainment value of a joke is proportional to the amount of information it provides.
How to follow it
For a joke to be very informative, you need a lot of entropy first. Therefore, the best approach to following this law seems to be the following technique: let the entropy of the joke reach a climax just before the punchline, and let the punchline be so informative that it cancels out almost all of that 'pent-up' entropy. If you want to be really sophisticated, divide the joke into a number of information quanta (1, 2, ..., N), where the last quantum, N, is the punchline. Let quanta (1, 2, ..., N-1) have very low mutual information, so that the entropy of the joke just before the punchline is rather high2. The mutual information between the punchline (quantum N) and all the other quanta should be as high as possible without giving the punchline away in advance. It's hard to find the perfect amount of mutual information; sometimes the punchline will be too predictable, and sometimes it just won't make any sense (to people with a low entropy threshold; see below). The complicated nature of jokes is probably why a keen sense of humour is often held to be related to intelligence.
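The entropy and mutual information invoked above are ordinary Shannon quantities, and the two extremes the technique warns about can be computed directly. A minimal sketch in Python (the tiny two-by-two "joke quanta" distributions are invented purely for illustration):

```python
from math import log2

def shannon_entropy(probs):
    """H(X) = -sum(p * log2(p)), in bits; zero-probability terms contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint distribution as a 2D list."""
    px = [sum(row) for row in joint]              # marginal over rows
    py = [sum(col) for col in zip(*joint)]        # marginal over columns
    flat = [p for row in joint for p in row]      # joint, flattened for H(X,Y)
    return shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(flat)

# Independent quanta: hearing one tells you nothing about the other (I = 0 bits),
# so the entropy just before the punchline stays high.
independent = [[0.25, 0.25], [0.25, 0.25]]

# Perfectly correlated quanta: the 'punchline' is fully determined by the setup
# (I = 1 bit), i.e. too predictable to be funny.
correlated = [[0.5, 0.0], [0.0, 0.5]]
```

A good punchline, on this account, would sit between the two: `mutual_information` well above zero, but below the value that makes quantum N a foregone conclusion.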
Entropy threshold
Of course, a joke is a very subjective thing (until a non-subjective joke is found), which in this case means that different people have different so-called 'entropy thresholds'. A person's entropy threshold is rather simple to measure. Begin by telling the person simple one-liners, and then move up the scale towards Monty Python, Douglas Adams and The Far Side3. Somewhere along the line, you might notice that the person in question stares blankly at you, gives a forced laugh or says "I don't get it." You have then reached that person's entropy threshold; the joke is simply too chaotic to be perceived as a joke by that person. You might, however, find a person who enjoys Pythonesque humour. If so, congratulations! Finally, someone with whom you can release all that pent-up entropy!
1 If the reader doubts that such a relation exists, he/she is urged to read the entry on entropy.
2 A lot of really disjoint facts is a rather chaotic thing.
3 There is currently no way to measure higher entropy thresholds, much like the fact (?) that you can't measure IQ values over 200.