r6 and I discuss his theory that entropy is subjective
I've never been satisfied with the solutions I've seen to Maxwell's Demon.
I take r6's interpretation of entropy as an agent-dependent quantity related to the agent's knowledge, and a measure of what one can do with this knowledge: knowledge is power. According to his theory, an all-knowing being (Laplace's Genius) could, acting through a demon, decrease the entropy as measured by a more ignorant agent. The point seems to be that no one can decrease his/her own entropy.
I wonder what physicists have to say about this.
(no subject)
Date: 2005-06-01 02:23 am (UTC)
I briefly looked over the thread, and I noticed you asked "is entropy defined for a macroscopic particle". I think I know what you're asking here, but I'm not sure, so correct me if I misinterpreted. I think what you're asking is: does entropy apply to classical systems which have no constituents small enough for quantum mechanics to play a role? And the answer is yes. Entropy is an entirely classical concept; the theory of it was mostly sorted out before quantum mechanics was discovered. In the end, many questions about what's going to happen to the entropy of the universe (if such a quantity really makes sense to talk about) may be intimately tied to quantum mechanics, but that's a somewhat separate subject.
Another thing to realize about entropy is that you need a very large number of states in order to define it. And you also need the system to be in equilibrium. If there are only a small number of states (like... less than a million), then there is no well-defined "entropy" of it, nor is there a well-defined temperature. If the system is rapidly changing, so that it's not in equilibrium, then there is also no well-defined temperature or entropy. Hope this clears up some of your questions about entropy!
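To make the "number of states" idea concrete, here's a minimal sketch (my own illustration, not something from the thread) of Boltzmann's formula S = k ln Ω, which is the standard way of turning a microstate count into an entropy:

```python
import math

# Boltzmann's entropy formula: S = k * ln(Omega), where Omega is the
# number of accessible microstates and k is Boltzmann's constant (J/K).
K_B = 1.380649e-23

def boltzmann_entropy(num_microstates: int) -> float:
    """Entropy (in J/K) of a system with the given number of microstates."""
    return K_B * math.log(num_microstates)

# A "system" with only a million states has a vanishingly small entropy:
print(boltzmann_entropy(10**6))  # ~1.9e-22 J/K
# A mole-sized system has something like 10**(10**23) microstates, which is
# why entropy only becomes a useful quantity at very large state counts.
```

The logarithm is what makes the "less than a million states" remark above bite: a million states buys you almost no entropy at all.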
Accessible States
Date: 2005-06-01 06:32 am (UTC)
Oh, it’s the number of accessible states. For god’s sake, why didn’t the books and web pages say that? I had the impression that it was the number of possible states.
(no subject)
Date: 2005-06-01 08:56 am (UTC)
I briefly looked over the thread, and I noticed you asked "is entropy defined for a macroscopic particle". I think I know what you're asking here, but I'm not sure, so correct me if I misinterpreted. I think what you're asking is: does entropy apply to classical systems which have no constituents small enough for quantum mechanics to play a role? And the answer is yes. Entropy is an entirely classical concept;
What I meant was "macroscopic" in the sense of ping-pong balls: if we had a huge box in space (zero-gravity vacuum), with billions of ping-pong balls bouncing around in it, could we define entropy based on these observable states?
The problem for me is that it's not clear how you count states: at what level of detail do you look?
It seems to me that entropy is a "statistical" law, rather than a "physical" one: the 2nd law says that any system will tend to end up in the more probable set of states; and without knowledge of the system, one cannot bring it into a less probable set of states (these differences in probability being huge).
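Just to show how huge those probability differences are, here's a quick back-of-the-envelope sketch (my own numbers, using the ping-pong-ball box above): if each ball independently lands in either half of the box, the chance of finding all of them in the left half is (1/2)^N.

```python
# Hypothetical illustration: probability that all N independently
# bouncing balls happen to be in the left half of the box at once.
def prob_all_left(n_balls: int) -> float:
    return 0.5 ** n_balls

print(prob_all_left(10))    # ~0.001 -- plausible for a handful of balls
print(prob_all_left(1000))  # ~9e-302 -- never happens in practice
```

With billions of balls the number underflows anything a float can hold, which is the sense in which the 2nd law is "statistical" yet, for all practical purposes, absolute.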
Does this make sense?
(no subject)
Date: 2005-06-01 09:00 am (UTC)
So what's the solution to Maxwell's demon?
(no subject)
Date: 2005-06-01 02:23 pm (UTC)
Level of detail: see http://en.wikipedia.org/wiki/Entropy#Counting_of_microstates
(no subject)
Date: 2005-06-01 02:27 pm (UTC)
i.e. this works only when the demon is outside the system being considered in thermodynamic terms.
(no subject)
Date: 2005-06-01 06:48 pm (UTC)
What I meant was "macroscopic" in the sense of ping-pong balls: if we had a huge box in space (zero-gravity vacuum), with billions of ping-pong balls bouncing around in it, could we define entropy based on these observable states?
Yes, it's perfectly well defined. But only up to an additive constant, which has to do with how finely grained things are. You can always add a constant to the amount of entropy in a system and it's not going to change the dynamics. As long as you're consistent as to what constant you add, it doesn't matter. Kind of like defining a "ground voltage". In quantum mechanics, there's a natural definition for what that constant should be, but that need not be essential to the theory of statistical mechanics; it can be seen as just a convenience issue.
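The additive constant can be made visible with a tiny sketch (my own made-up grid, not from the comment): compute the entropy of a uniform distribution over a coarse grid and over a finer grid of the same box. Refining the graining by a factor f shifts the entropy by log(f), a constant independent of the state.

```python
import math

# Entropy (in nats) of a uniform distribution over num_cells grid cells.
# This is the discrete Shannon/Boltzmann entropy for equal probabilities.
def entropy_uniform(num_cells: int) -> float:
    return math.log(num_cells)

coarse = entropy_uniform(100)      # box described with 100 cells
fine = entropy_uniform(100 * 16)   # same box, a 16x finer grid

# The difference is log(16), no matter what the box contains:
print(fine - coarse)  # ~2.77
```

Entropy *differences* between states of one system don't depend on the graining, which is why the constant is a convention, much like the "ground voltage" analogy above.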
It seems to me that entropy is a "statistical" law, rather than a "physical" one: the 2nd law says that any system will tend to end up in the more probable set of states; and without knowledge of the system, one cannot bring it into a less probable set of states (these differences in probability being huge).
Yes, that's right... it's a law that comes entirely from statistics. There are some very basic physical assumptions that go into proving it, but other than that it's entirely a consequence of deductive reasoning, not something that just happens to be true in our physical world.
(no subject)
Date: 2005-06-01 06:50 pm (UTC)
(no subject)
Date: 2005-06-02 10:23 am (UTC)
Since he knows everything about the deterministic system inside the box, his knowledge is the same before and after his trick of making heat flow the "wrong" way, so no resetting of the memory is necessary.
Is the relevant question "could he do it again to the same box"? Obviously, you can't expect the box to go back to the exact same state as before, but I claim that he could take the box back to a "high entropy" state, and then run the heat pump again. His memory is not increasing, and neither is it being erased: it's being updated at every "iteration", in a fully reversible process.
Would you say the entropy of the universe increased as a result of this? What could have happened in the Genius's brain to offset the massive entropy loss due to his heat trick?
(no subject)
Date: 2005-06-02 10:24 am (UTC)
(no subject)
Date: 2005-06-02 10:30 am (UTC)
Which physical assumptions?
(no subject)
Date: 2005-06-02 03:41 pm (UTC)
Ergodicity is the main one, though. Ergodicity means that you have to be dealing with a system which spends an equal amount of time in each "state" in the long run. Or, if its phase space is continuous... then the probability of it being in a particular region has to be proportional to the volume of that region of phase space. ("Phase space" is the 2n-dimensional space where position and momentum are the axes and n is the dimensionality of the regular "position" space--typically 3.) It might make sense to say that ergodicity is just saying you need to define what a state is in a sane way. You can construct systems which don't satisfy ergodicity, but you could always say that that's just because you haven't labelled the states correctly.
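A toy picture of "time spent in a region equals the region's volume" (my own example, not from the comment) is the irrational rotation of the circle, a textbook ergodic system: iterate x → x + α (mod 1) with irrational α, and the fraction of time spent in any interval converges to that interval's length.

```python
import math

# Irrational rotation on the unit circle: ergodic, so time averages
# match "volume" (length) averages. Here the target interval has
# length 0.5, so the visit frequency should approach 0.5.
alpha = math.sqrt(2)  # irrational step size
x, hits, steps = 0.0, 0, 200_000
for _ in range(steps):
    x = (x + alpha) % 1.0
    if 0.25 <= x < 0.75:  # interval of length 0.5
        hits += 1

print(hits / steps)  # close to 0.5
```

A non-ergodic contrast would be a rational α, where the orbit visits only finitely many points and the time average depends on where you start.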
(no subject)
Date: 2005-06-02 04:04 pm (UTC)
Maxwell's demon is something that I don't understand fully either, and I would definitely like to. Maybe we should both read this book on the subject. I particularly think Zurek has some important things to say about Maxwell's demon. He has a lot of papers on it, and on combining it with quantum mechanics and information theory.
(no subject)
Date: 2005-06-02 04:19 pm (UTC)
Maybe there is a school of physics where they really emphasize logical foundations? A place where people couldn't be happy unless they knew how to solve such paradoxes...
During sophomore year, it became clear that I wouldn't get what I wanted out of a physics education. It was quite a disappointment, but it seems I made the right decision.
(no subject)
Date: 2005-06-02 04:35 pm (UTC)
hm... I don't know much about reversible computation, but I argued above that you should be able to simulate the system forwards or backwards in time without erasing information, and without needing more memory.
It would be interesting if computation were under the same constraints as the physical system it simulates (e.g. if your computer contained a physical copy of the system it simulates), and irreversible computations corresponded to irreversible changes in the system (i.e. entropy increasing).
(no subject)
Date: 2005-06-02 04:48 pm (UTC)
It seems that physics curricula tend to survey a bunch of ideas without analyzing their logical relationships to each other.
My approach is more to learn a little bit, stop... check how it fits with everything... and *then* proceed. This is the reason why I'm slower than most people: I'm always busy searching for contradictions (and inconsistencies). It's also the reason why my code is less buggy ;-)
what kind of book is this, btw? Maxwell's Demon's Greatest Hits? It seems each paper is from a different place.
(no subject)
Date: 2005-06-02 05:50 pm (UTC)
It would be interesting if computation were under the same constraints as the physical system it simulates (e.g. if your computer contained a physical copy of the system it simulates), and irreversible computations corresponded to irreversible changes in the system (i.e. entropy increasing).
Yes, that's what I'm saying is probably the case. But I don't know for sure, and the situation becomes more difficult when you ask the question of how quantum mechanics fits into the whole thing.
I have a pet "hypothesis" that in the many-worlds interpretation of quantum mechanics (the interpretation which I believe in) the total entropy in the multiverse should be constant (since everything is both reversible and non-chaotic). So far I haven't seen anyone addressing this question, but I plan on investigating it at some point if it hasn't already been dealt with. I wouldn't be too surprised either way, but I would really like to know the answer to this question.
(no subject)
Date: 2005-06-02 05:56 pm (UTC)
That's the reason I quit physics: someone can go as far as you (and much further) without being able to answer such questions.
When I took graduate statistical mechanics, the professor brought up Maxwell's demon one day and he said "I don't really understand this, but I'll explain it to the best of my ability." He explained it and then I asked him a few questions similar to the ones you asked. And he said "sorry, I just don't know. You'd have to ask an expert on it." The thing is, physics is a really large subject. Each person has a specialization. While they may understand most of the logical foundations of their specialization, nobody has the time to look into every question in infinite detail.
Maybe there is a school of physics where they really emphasize logical foundations? A place where people couldn't be happy unless they knew how to solve such paradoxes...
I'd say the school of physics that emphasizes logical foundations is "mathematics". Or philosophy, depending on which particular questions you're asking about. (See my post on my view of the differences between these fields.)
(no subject)
Date: 2005-06-02 06:05 pm (UTC)
My approach is more to learn a little bit, stop... check how it fits with everything... and *then* proceed. This is the reason why I'm slower than most people: I'm always busy searching for contradictions (and inconsistencies).
I work very much the same way. I'm by far the slowest physicist (and thinker in general) that I've met. I try harder to understand the foundations of things before I'm happy with them, and as a consequence it takes me longer than most to learn a subject. Or sometimes just longer to admit that I've "learned it". Although I must confess that over this past year, my standards have been slipping a bit due to necessity. The sheer volume of stuff we have to know how to use has started to outweigh the possibility of understanding any of it fully. So my plan at this point is to go along with the way they want me to learn now, and then go back and learn it to my satisfaction when I get the chance. Hopefully, I won't just keep saying that and never go back. :)
(no subject)
Date: 2005-06-02 06:21 pm (UTC)
During sophomore year, it became clear that I wouldn't get what I wanted out of a physics education. It was quite a disappointment, but it seems I made the right decision.
Oh, one more thing I meant to say. The thing that attracts me most to physics is that it does pose questions which are so difficult to see the logical implications of. I enjoy sitting down and thinking about "hard" problems where the answer is not immediately obvious. Some of the problems in physics are so non-obvious that they've taken half a century or longer for people to answer. (Maxwell's demon being one of those questions. I think it was posed back in the 1800's!) So I guess I've taken the opposite approach. Every time I see a physicist who I think doesn't really understand what he's doing, I see it as an encouragement that I have something to offer the field.
(no subject)
Date: 2005-06-03 07:36 am (UTC)
This is still a bit handwavey, but I think that's the shape of the argument you're looking for -- that at some point you need to interact with an uncontrolled environment for this to matter. Physical law is such that information is conserved in time evolution (well, modulo quantum measurement); if the amount of information in the 'outside' is reduced, the amount in the 'inside' must be increased.
(no subject)
Date: 2005-06-03 07:41 am (UTC)(no subject)
Date: 2005-06-03 04:30 pm (UTC)
Here's the deal, though... information is definitely conserved in the multiverse, that much is sure. The laws are all explicitly reversible. But the problem is, there are two different definitions of "reversible" and they're often confused. (Actually, I know of three different uses of the term in physics which all have very different meanings.) Although I'd like to see entropy conserved in the multiverse too, because it would make other stuff make more sense to me, I'm not sure it follows from the conservation of information. The old classical laws of physics were all reversible too, but entropy is not conserved there. I think the reason for that is that classical laws are chaotic. But I need to think about it more and read up more on these things to be sure.
As for thermodynamics coming from information conservation... I'd say that's overstating the case. Thermodynamics and statistical mechanics make sense for both classical and quantum systems... whether or not information is conserved. It might have a lot of bearing on how entropy works in our world, but the fact that you can still have a thermodynamics without it makes me cautious about making such claims.
(no subject)
Date: 2005-06-04 10:44 am (UTC)
My take on it was, information from a god's-eye point of view is conserved: a 1-to-1 local update function in a CA, symplectic flows in classical mechanics, unitarity in quantum mechanics. But if you flip a single bit in a reversible CA that's at all interesting, the trajectory rapidly diverges, and 'therefore' any imperfection in an agent within the system causes it to lose accurate information in its representation -- so the god's-eye conservation of information along with the interestingness of the dynamics means that agent's-eye information can decrease but not increase. (It can try to dump the lost information into parts of the system state it's not interested in, though.)
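Here's a toy version of that argument (my own construction, not anything from the comment), using Fredkin's second-order trick to build a reversible CA: next = rule(current) XOR previous, which is exactly invertible, yet a one-bit perturbation still spreads through the lattice.

```python
import random

N = 64  # ring of N cells, binary states

def rule90(s):
    # XOR of the two neighbors, with periodic boundary conditions.
    return [s[(i - 1) % N] ^ s[(i + 1) % N] for i in range(N)]

# Second-order reversible update: next = rule(current) XOR previous.
def step(prev, cur):
    return cur, [a ^ b for a, b in zip(rule90(cur), prev)]

# Exact inverse: previous = rule(current) XOR next.
def unstep(cur, nxt):
    return [a ^ b for a, b in zip(rule90(cur), nxt)], cur

random.seed(0)
state = ([random.randint(0, 1) for _ in range(N)],
         [random.randint(0, 1) for _ in range(N)])

# God's-eye reversibility: forward then backward recovers the state exactly.
assert unstep(*step(*state)) == state

# Agent's-eye divergence: flip one bit and watch the histories separate.
perturbed = (state[0][:], state[1][:])
perturbed[1][0] ^= 1
a, b = state, perturbed
for _ in range(50):
    a, b = step(*a), step(*b)
diff = sum(x ^ y for x, y in zip(a[0] + a[1], b[0] + b[1]))
print(diff, "cells differ after 50 steps")
```

Reversibility guarantees the difference can never die out entirely (a zero difference backward-evolves to a zero difference, contradicting the initial flip), which is the formal version of "any imperfection costs the agent accurate information".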
That seems like essentially the usual physics argument but from a bit of a more computer-sciencey point of view. I'm not a physicist, though.
That's interesting about Toffoli. The only physics paper of his I've read apparently claimed that the least-action principle actually amounts to having the universe waste the maximal amount of computation possible -- but I didn't understand it.
(no subject)
Date: 2005-06-04 05:26 pm (UTC)
My take on it was, information from a god's-eye point of view is conserved: a 1-to-1 local update function in a CA, symplectic flows in classical mechanics, unitarity in quantum mechanics. But if you flip a single bit in a reversible CA that's at all interesting, the trajectory rapidly diverges, and 'therefore' any imperfection in an agent within the system causes it to lose accurate information in its representation -- so the god's-eye conservation of information along with the interestingness of the dynamics means that agent's-eye information can decrease but not increase. (It can try to dump the lost information into parts of the system state it's not interested in, though.)
Yes, that's exactly how it works. From the point of view of any observer in quantum mechanics, information is always being destroyed. That's one of the main things I like about the many-worlds interpretation. It is the only interpretation where you can have a God's-eye view of the system where information is conserved. The illusion of the destruction of information is merely due to information flowing into different branches of the wavefunction which get causally isolated from each other due to decoherence (which is when it's useful to say the universe "splits" into disconnected pieces).
And because there's a close connection between information and entropy, this has led me to speculate that maybe entropy is conserved too. I'd like to see that turn out true, but it's only a guess. In thermodynamics, whenever things are "reversible" then entropy is conserved. But the usage of reversible in that context is a little different from the usage in the context of information. I don't completely understand the connections between the two, but I haven't had a chance to look into it more.
That's interesting about Toffoli. The only physics paper of his I've read apparently claimed that the least-action principle actually amounts to having the universe waste the maximal amount of computation possible -- but I didn't understand it.
There's a universal gate in quantum computing called the Toffoli gate which has great importance. Maybe they borrowed the name from elsewhere, but I assumed he had come up with it in the context of thinking about quantum computing. I don't know about that least-action comment, that sounds very strange!
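For what it's worth, the Toffoli gate itself is easy to sketch (my own illustration): a controlled-controlled-NOT that flips the target bit only when both control bits are 1. It is its own inverse, and with the target initialized to 0 it computes AND without discarding any information, which is what makes it universal for reversible classical logic.

```python
# Toffoli (CCNOT) gate on three bits: flips c iff a and b are both 1.
def toffoli(a, b, c):
    return a, b, c ^ (a & b)

# Self-inverse: applying it twice returns every input unchanged,
# so no information is ever erased.
for bits in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
    assert toffoli(*toffoli(*bits)) == bits

# AND from a reversible gate: with c = 0, the target becomes a AND b.
print(toffoli(1, 1, 0))  # (1, 1, 1)
print(toffoli(1, 0, 0))  # (1, 0, 0)
```

The contrast with an ordinary AND gate, which maps two bits to one and necessarily erases information, is the usual entry point into the Landauer/Bennett analysis of Maxwell's demon mentioned earlier in the thread.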
(no subject)
Date: 2005-06-05 08:06 am (UTC)
I'd be interested in your thoughts on reversibility and information if you get around to them.