A Conversation for Entropy versus Information

A452387 - Entropy and Information

Post 1

Martin Harper

http://www.h2g2.com/A452387

In a fit of organisation, I've decided to combine h2g2 with work by writing down my understanding of my lectures on h2g2 for all to disagree with. This is the first effort.

It's a complicated subject. I've tried to keep it simple by introducing nice-and-easy examples, but I dunno how well I've done - is this understandable?

I've not mentioned thermodynamics and suchlike, I'm afraid. The entry just talks about what entropy/information *is*, not how they move about, and what limits there are on them. I'd expect Thermodynamics to deserve a separate entry...

I'm probably missing a conclusion - I'll add that after this post.


A452387 - Entropy and Information

Post 2

Salamander the Mugwump

Do you need a degree in maths or physics to understand this? I would like to understand it but it's incomprehensible to me. The information isn't accessible to me. Is it at all possible to write it in such a way that a person who hasn't studied the subject could understand the concepts? I'm interested. I read James Gleick's book "Chaos" and most of that made sense to me.

If you've written this just to be read by people who are deeply into the subject, fair enough, but if it's for a wider audience, it needs a bit of editing (dumbing down, to be honest - for the likes of me). :-)


A452387 - Entropy and Information

Post 3

Jim diGriz

I've not read the article in detail yet.

All I'll say for now is: you are exceptionally brave to even think of attempting an article on this subject!

It can't be done, I tell you, it's impossible! (In true Hollywood style, that's a guarantee that you'll get it done by Tuesday teatime. ;-) )

Superficial observation: it's probably not a good idea to reference a footnote in the middle of an equation. At first I thought you were squaring the log, which didn't make much sense.

I'll have another little look over it and see what else I can suggest.


A452387 - Entropy and Information

Post 4

Gnomon - time to move on

A good entry, but the mathematics in it put even me off, although I'm interested in Maths. Try and introduce concepts without using Maths, and put as much as possible of the maths in a separate section.

The diagram was hopelessly mangled.

The sentence "It is written I(X;Y)." should be put immediately after the thing it is describing, before the sentence starting "Alternatively". As it is at the moment, it is not clear what is written as I(X;Y).

Why are computer bits not to be confused with information bits? Are they not exactly the same thing?


A452387 - Entropy and Information

Post 5

Martin Harper

I'm aiming the article at the same place every other Guide article is aimed... :-) Not very successfully, though, from the sound of it. I've had another go at reducing the maths content - care to try again?

Information bits are a measure of information, computer bits are a means of storing information. It just so happens that you need one computer bit to store one information bit.

I guess it's the difference between a metre and a metre-ruler... so not terribly important. I'll scrap it for clarity.

The equations have gone. I might add a section at the end on "calculating information" or suchlike - but I'm not sure that it would be terribly useful, so probably not.

I think the diagram is fixed now - it seems that previously it was fine in preview, but died in normal view :-(


A452387 - Entropy and Information

Post 6

Gnomon - time to move on

That's better!

I recommend this article as an Edited Guide Entry. I love your mathematical proof that truth is relative!


A452387 - Entropy and Information

Post 7

Martin Harper

dang - and now the diagram's mucked up again...
(serves me right for fiddling...)


A452387 - Entropy and Information

Post 8

Wampus

Well, I've taken three college classes in thermodynamics, and so I'm fairly familiar with entropy. However, most of that article went right over my head. Maybe it's because I haven't had my morning coffee yet...

Wampus


A452387 - Entropy and Information

Post 9

Salamander the Mugwump

It's a little bit easier to follow now. Your diagram looks like a definite something or other.

I'll give a couple of specific examples of problems I have understanding your article.

1) You seem to be taking a certain level of prior knowledge on the part of your reader for granted. The equations are beyond me. You have X and Y sometimes divided by a pipe, other times by a semicolon, and yet other times by a comma. I don't know what these different characters signify in this context.

2) I was trying to work out what the effect on entropy would be of knowing just X or Y about your subject. If, for example, you learn a person is 5' tall and believe on the basis of that fact that it's probably a woman - but it isn't, does that constitute an increase in entropy?

Hope my confusion helps you some. :-)


A452387 - Entropy and Information

Post 10

Martin Harper

1) Yep, I was thinking about this last night, and decided that I needed to devote a goodly bit at the start to saying what entropy and information are - more than is there.
The difference between ',', '|' and ';' is just notation: H(X,Y) is the joint entropy (of X and Y), while H(X|Y) is the conditional entropy (of X given Y). That's all the difference is - a way of distinguishing between the two types of entropy. Nothing more...

2) Well, regardless of whether they are or aren't a woman, the entropy in their gender *decreases*. Because it is now more likely that the person is female, there is less uncertainty (though still enough that you could guess wrong), and so less entropy.
Ok, I'll add that example, or one like it, somewhere.
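
To make that concrete, here's a rough sketch in Python - the specific probabilities are invented purely for illustration, not taken from any real height/gender data - showing how the entropy of the "man or woman" variable drops once the height is observed:

import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Invented numbers, purely for illustration.
prior = [0.5, 0.5]        # P(woman), P(man) before observing anything
posterior = [0.8, 0.2]    # P(woman), P(man) after learning the person is 5' tall

print(entropy(prior))     # 1.0 bit of uncertainty
print(entropy(posterior)) # ~0.72 bits - more likely female, so less uncertainty, less entropy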

Thanks for your confusion ;-)


A452387 - Entropy and Information

Post 11

Cefpret

I don't understand this 'Relative Truth' thing. I'm a physicist, not a computer scientist, so for me entropy is a very accurately defined quantity; I've never heard that it depends on who's observing the system.

I ask this without meaning to suggest anything -- my main subject wasn't thermodynamics, so my knowledge is quite rudimentary. :-(


A452387 - Entropy and Information

Post 12

Martin Harper

That's the difference between theory and practice - in theory the amount of entropy a system X has left, given that we have a whole bunch of observations Y, is the conditional entropy of X given Y.

Since everyone has a different Y, because some people are observing the thing closely, and some people are in a bar in Hawaii, everyone's view of conditional entropy can be different.

However, in physical systems almost all the entropy is present in the form of heat, and the conversion factor between joules per kelvin and bits is set by the Boltzmann constant - about 10^-23 - so if you'd gathered a terabyte of perfect data (i.e. every bit of data is totally independent of all the other data) you'd change the entropy of the body by about 10^-11 joules per kelvin. Big deal.
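
A quick back-of-the-envelope sketch of that arithmetic (the terabyte of perfectly independent bits is just an assumed round number):

import math

BOLTZMANN = 1.380649e-23                     # Boltzmann's constant, in J/K
J_PER_K_PER_BIT = BOLTZMANN * math.log(2)    # thermodynamic entropy carried by one bit

bits = 8e12                                  # roughly a terabyte of perfectly independent data
print(bits * J_PER_K_PER_BIT)                # ~7.7e-11 J/K - a change you can safely ignore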

In addition to this, the uncertainty principle limits the mutual information between X and Y anyway - it simply isn't possible to know that information because measuring something changes it.

So in practice, for physicists, entropy varies by such ridiculously tiny amounts that you can ignore it. Which means that all your precious laws of thermodynamics still hold. ;-)

---

Ok - I've added an initial explanation, and hopefully that'll help all the confused physicists who are used to entropy being a nice, round thing they can put in a poke and sell at the market. Sorry guys.

I've got rid of the H(X,Y) / I(X;Y) type stuff too - now the closest I get to equations is talking about variables... :-)


A452387 - Entropy and Information

Post 13

The Unmentionable Marauding Pillowcase

Hi, this is an interesting entry - high in information content, low in entropy. I find it quite understandable and entertainingly presented.

Just one thing - this entry is supposed to be about entropy and how it relates to information; isn't there a lot more you can say about such a rich subject? For instance: entropy increase and the subjective perception of the arrow of time, among other things.


A452387 - Entropy and Information

Post 14

Salamander the Mugwump

Your editing has borne fruit (for me at least). I think I have a good grasp now. You can probably get away with testing me - probably.

Now I think I understand what you're saying, I have a question: Under the heading "Opposites Attract" you say a point of maximum entropy never occurs. How do you know?


A452387 - Entropy versus Information

Post 15

Martin Harper

Umm.
Good question... ;-)

I'll be honest here - the entropy of something about which you have absolutely no information whatsoever is not something that is discussed in great detail. This is likely because it's not terribly useful to go around talking about things about which you know nothing. Because there's nothing much to say - if there was, they wouldn't have infinite entropy...

But, in practice, you always know something. Let's take as an example the variable that's the alcoholic content of a Pan-Galactic Gargleblaster (conditional on the existence of such a thing).

Now, we know very little about PGGBs, so the entropy of that variable is really quite huge. But we do know something - we know it can't be less than 0%, and can't be more than 100%. We have a description by DNA which is hugely unlikely to be correct, but if you make guesses as to the probability that telepathy exists, and the probability that DNA tuned into the mind of an alien from another planet, then you can decide that there is some correlation (and the probability that he's from another planet himself might be noticeable...).

We might also look at the link between alcohol content and name length - on earth, short names imply a low alcohol content (water, beer, wine) and long ones a high content, and little umbrellas. We might estimate the probability that this is a universe-wide concept and so forth. So, even this phenomenally unknown thing doesn't have an infinite entropy...
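
To put a rough number on "really quite huge, but not infinite": if all we know is that the alcohol content lies somewhere between 0% and 100%, and we only care about whole percentages (an assumption made purely for illustration), the entropy can't exceed that of a uniform guess over those 101 possibilities:

import math

outcomes = 101                     # 0%, 1%, ..., 100% - all we know is the range
max_entropy = math.log2(outcomes)  # the uniform distribution is the maximum-entropy case
print(max_entropy)                 # ~6.66 bits - huge by cocktail standards, but finite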

Now, what about the entropy of a variable whose existence we don't even know about? Well, I think that comes under trees falling in a wood making a sound, so I'll say nothing... A paradox just occurred to me - if we know nothing about something, then it has infinite entropy. But then we know something about it - its entropy. But then its entropy isn't infinite. Just a rehash of the old "I know nothing" statement, though... ;-)

--

I'm glad you like it more now. It's certainly nicer now that there isn't a logarithm anywhere to be seen! :-) Reading people's problems has certainly pointed out some of my own misconceptions, as well as pointing me at cool info like the conversion rate between bits and Joules per Kelvin.

--

Please, pretty please don't ask me to write about entropy as it relates to big balls of fire in the sky and the direction of time - I gave all that up a year ago!
In all seriousness, my own view is that such discussion is a topic for all those people who write entries on astrology and suchlike - if someone writes such an entry, I'd be perfectly happy for it to be combined with this one if the editors think that's a good idea - but myself, I'd keep the two separate but linked.

--
http://www.h2g2.com/A452387


A452387 - Entropy versus Information

Post 16

The Unmentionable Marauding Pillowcase

Come on, it's straightforward, kids' stuff! Consider: future - unknown; low information, high entropy. Past - relatively well known, high information, low entropy. Entropy always increases. Therefore time flows from the past to the future. Quite Easily Done!


A452387 - Entropy versus Information

Post 17

Martin Harper

In which case, you'd have to argue that the present was the point of lowest entropy, since we know the most about it... Do *you* know exactly what happened a million years ago? :-)

{Actually, that may not be true, thinking about it. It's possibly the case that a perfect entity could know exactly what happened - provided the universe has certain semi-bizarre properties in reverse-time-flow mode... ;-)}

There is actually a semi-equivalent in information theory of the thermodynamic law that entropy can never decrease - it's called the "Data Processing Inequality", and it shows that, provided the system behaves like a Markov Chain (memoryless) and no new observations are gathered, the amount of mutual information between our current observations and the system can never increase.
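
For the curious, here's a toy sketch of that inequality in Python - a made-up chain X -> Y -> Z, where each arrow is a channel that flips a bit 10% of the time - showing that the mutual information I(X;Z) comes out no bigger than I(X;Y):

import math

def h(p):
    """Binary entropy, in bits."""
    return 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

flip = 0.1                       # each link in the chain flips the bit 10% of the time
p_x = 0.5                        # X is a fair coin

# For a symmetric bit-flipping channel, I(input; output) = H(output) - h(flip rate).
i_xy = h(p_x * (1 - flip) + (1 - p_x) * flip) - h(flip)

# Z sees X through two flips in a row; the combined flip rate is 2 * flip * (1 - flip).
flip2 = 2 * flip * (1 - flip)
i_xz = h(p_x * (1 - flip2) + (1 - p_x) * flip2) - h(flip2)

print(i_xy, i_xz)                # ~0.53 bits vs ~0.32 bits: information only gets lost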

But I felt that was somewhat OTT for a guide entry, especially as I'd have to give a decent entry for a Markov Chain first! Maybe later...


A452387 - Entropy versus Information

Post 18

The Unmentionable Marauding Pillowcase

Whatever, I won't argue the point - it's probably good not to make the entry too complicated. But I like OTT stuff. Don't you think you should be a little bit adventurous with your entry? I mean, entropy and information - I don't think many people realise just how relevant the theories really are to everyday life. The more you can widen the scope of your discussion, the better, I say. But if you would like to keep it the way it is now, that's fine by me! :-)

And I would also be interested in any entries on related topics as well!


A452387 - Entropy versus Information

Post 19

Salamander the Mugwump

Hmmmmmmm, I see. Where can I get a Pan-Galactic Gargleblaster, I wonder. Gin and rum are rather short names, so I think I'd better dilute the PGGB with one or both of those because, according to DNA, it's a bit strong. ;-)

I like the entry and I think it's just the right length for me to read without giving myself a brain haemorrhage. If I wanted to badger you to make it longer (which I don't) I'd say: what about strange attractors? Don't they interfere with the laws of thermodynamics?

Pillowcase, perhaps you could do a complementary entry on the bits you feel would enhance this piece. :-)


A452387 - Entropy versus Information

Post 20

Martin Harper

OK - see "Why Bother" in the entry... :-)

{And I've put in some pointless links to other entries for the sub-ed to strip out}

