my compiled rants against physicists
Feb. 17th, 2006 07:15 pm
In response to the like-minded quale, I wrote the following:
quale wrote:
I dropped out of being a physics major because everyone was just dogmatically accepting the notion of entropy as the "log of the number of states" and didn't want to question what the hell that really meant.
Me too! Not just the way they gloss over entropy, but also where the Schroedinger equation comes from, etc., and the way they avoid thinking about paradoxes (e.g. Maxwell's demon: is entropy subjective?, this one about classical mechanics). And the fact that nobody bothers to fix the very bad notation traditionally used in some physics is a pretty bad sign too (nobody except for my hero Sussman).
In college physics, I was just told to plug-and-play, which made me very unhappy. I was interested in finding logical relationships between sets of physical axioms (e.g. how to prove that energy is proportional to amplitude squared using only the additivity of amplitude and energy conservation).
Since I like my knowledge network to be dense / tight (i.e. certain), ignoring foundational questions and paradoxes is totally against my cognitive style. But if the goal is to make science progress, I wonder whether it might sometimes be a good idea to be less conservative and ignore foundational questions.
(no subject)
Date: 2006-02-17 07:21 pm (UTC)
Entropy being the log of the number of states is just a *definition* - there's no dogma about it. From that you can *derive* that this microscopic (statistical mechanical) definition yields a macroscopic (thermodynamic) entity which behaves in the way that we expect thermodynamic entropy to behave... (in particular, it's a logarithm because you need it to be an *extensive* quantity: double the volume of the system and you want the entropy to double - but the number of states of a system discretized on a lattice will *square* when you double the number of sites, so the log of the number of states will double... *shrug*).
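To make the extensivity argument concrete, here's a toy sketch (my own illustration, not from the comment; the lattice of independent two-state sites is a hypothetical model): for N sites with q states each, the number of microstates is Omega = q^N, so S = log(Omega) = N log q, and doubling N squares Omega but only doubles S.

```python
# Toy illustration of entropy extensivity (assumed model: a lattice of
# independent sites, each with the same number of possible states).
import math

def entropy(num_sites, states_per_site):
    # S = log(Omega), with Omega = states_per_site ** num_sites,
    # so S = num_sites * log(states_per_site) (Boltzmann constant set to 1).
    return num_sites * math.log(states_per_site)

omega_small = 2 ** 100       # microstates of 100 two-state sites
omega_large = 2 ** 200       # double the sites: the state count *squares*
assert omega_large == omega_small ** 2

S_small = entropy(100, 2)
S_large = entropy(200, 2)
print(S_large / S_small)     # -> 2.0: the *log* merely doubles
```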
To get a feel for what it really means, you really need to do a lot of condensed matter physics (so that you're always playing with systems with a large number of degrees of freedom). And if you really want to confuse yourself, do some gravitational physics.
As for questions of "deriving" Schroedinger's equation, are you wondering what "minimal" set of axioms is needed to arrive at it? There's a ton of equivalent ways of getting it (either from "canonical quantization", where you take a classical mechanical system and promote observables to Hermitian operators and Poisson brackets to commutators, or from the Feynman path integral), but essentially it's a phenomenological observation: a wave equation was needed (that much was clear from experiment), and it needed to be first order in time derivatives (for technical reasons), so that was the one that worked.
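The "canonical quantization" route mentioned above can be sketched schematically (the standard textbook recipe, stated here without justification): promote observables to Hermitian operators, Poisson brackets to commutators, and let the quantized Hamiltonian generate the time evolution.

```latex
% Canonical quantization, schematically:
\{q, p\} = 1 \;\longrightarrow\; [\hat{q}, \hat{p}] = i\hbar,
\qquad
H(q, p) \;\longrightarrow\; \hat{H},
\qquad
i\hbar \, \frac{\partial \psi}{\partial t} = \hat{H} \psi .
```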
Classical mechanics, of course, works just fine; it's consistent and doesn't have any paradoxes - the proper conservation laws are derivable from Noether's theorem, and the whole setup follows from either starting with Newton's laws or the variational principle in a Lagrangian setup. Why Newton's laws are true is kind of a strange question if you ask it - why are any of the laws of physics true? We (physicists) typically don't try to pare down our mathematical edifice to the minimal number of axioms, because we're not mathematicians - we want laws that work (i.e. make predictions) - we don't care if they have redundant internal definitions (as long as they're consistent, but even that we don't need to worry about until someone comes along and shows us that we can make two contradictory predictions with the same laws; *then* we worry about fixing the consistency).
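As a one-line illustration of the Noether's theorem remark (the simplest standard case, not part of the original comment): if the Lagrangian is unchanged by translating a coordinate, the momentum conjugate to that coordinate is conserved.

```latex
% Translation invariance: L does not depend on q, only on \dot{q}.
\frac{\partial L}{\partial q} = 0
\quad\Longrightarrow\quad
\frac{d}{dt}\left( \frac{\partial L}{\partial \dot{q}} \right)
  = \frac{\partial L}{\partial q} = 0
\quad\Longrightarrow\quad
p \equiv \frac{\partial L}{\partial \dot{q}} = \text{const.}
```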
But finding logical relationships between axioms is important, and we often do it (just not as undergrads - at that point we rarely *know* enough of the relevant axioms to do much of that), if it helps us understand the laws they determine.
Since I like my knowledge network to be dense / tight (i.e. certain), ignoring foundational questions and paradoxes is totally against my cognitive style, but I wonder if being less conservative might sometimes be a good idea, if the goal is to make the science progress: it might sometimes be a good idea to ignore foundational questions.
This is empirically at least 99% true: nearly *every* physicist, by the time they get to actually working in the field, spends most of their time ignoring foundational questions *even* when actually coming up with new laws of physics that have no foundations yet! They guess new phenomenological laws, work out some of their consequences, then tweak the original guess, work it out again, all trying to get physics that looks like something we know, but with some slight modifications to accommodate new data (or new theoretical data about what we imagine must be happening). Almost none of the time is spent wondering about the foundational consistency or minimal axiomatization.
.... one last thing
Date: 2006-02-17 07:22 pm (UTC)
(no subject)
Date: 2006-02-17 07:56 pm (UTC)
Not just the way they gloss over entropy, but also where the Schroedinger equation comes from, etc., and the way they avoid thinking about paradoxes
Okay, there are several things wrong with this comment. First of all, the primary reason why everything is glossed over in undergraduate physics classes is that the students are not experienced enough in the subject to be able to understand such things at the outset. This is the reason the Schroedinger equation is not explained further. It would not be possible to give the students a derivation of it or an explanation of what it means unless they had far more background. The entropy issue is maybe a little different, because the foundations of statistical mechanics are a bit clouded even for experienced physicists. But the way physics is done and thought about by physicists is very different from how it's taught to undergrads. Undergraduates are deliberately given the most dumbed-down, superficial version of everything possible, because most would get bored and quit if they had to spend a year learning how to do their very first problem in physics in the most rigorous way possible. You can solve many simple problems without understanding why the formulas work before you learn the foundations, which is why this approach is taken.
As for paradoxes, physicists love thinking about paradoxes, which is why there are so many classic examples of paradoxes in physics (all of which were devised by physicists, specifically because they love thinking about them). Although I think there should be a distinction here between experimentalists and theorists. Theorists are the ones who are paid to think about and resolve paradoxes. It's not part of an experimentalist's job to do so, but some of them do so anyway in their spare time simply because it's interesting.
In college physics, I was just told to plug-and-play, which made me very unhappy.
Yep, that's the way undergrad physics is taught, and deliberately so. If you want more, you have to learn it on your own... which is what usually separates the bright students from the dumb ones. By far, the most common complaint that I hear from undergrads is that they want us to give them more plug-n-chug stuff, because they don't want to bother thinking about it too much. So we cater to the masses, even though most of us wish we could teach it in a more interesting way for the bright ones. Of course, this is done more for the first and second year classes; by the third and fourth year, it starts to get a little bit less plug-n-chug.
Since I like my knowledge network to be dense / tight (i.e. certain), ignoring foundational questions and paradoxes is totally against my cognitive style
Again, this is an experimentalist/theorist distinction. Only theorists think about paradoxes, and experimentalists outnumber theorists by at least 4 to 1. So if you got the (false) impression somehow that physicists don't like to think about paradoxes... it might be because you were talking to an experimentalist.
(no subject)
Date: 2006-02-17 09:32 pm (UTC)
The key word here is Understanding. If what you are trying to produce is a prediction machine then yes, the way physicists teach is good. In fact I would be inclined to agree that for the vast majority of physicists, i.e., experimentalists, this is the best way to educate them (though, like many subjects, physics still hasn't properly incorporated computers into education for evaluating horrible integrals and other things). At least about general physics rather than their area of specialization. This method of teaching guarantees that everyone can accurately perform pragmatic calculations and come up with results. What it is not good at doing is creating understanding.
There is a very serious difference between the two and, before you say it, much of physics intuition is not understanding but rather just a sense of what is likely to happen because you have done so many problems, i.e., it means you can unconsciously sorta guess the right answer or picture the problem in a way that lets you read off the right answer. Being able to quickly calculate an answer is not understanding. If we made a computer program that did the same thing we wouldn't think it was a triumph of AI because the computer really understood things; it would just be good at applying heuristic rules to a limited technical domain.

Understanding is seeing why something is true. Of course you aren't going to get any ultimate 'whys' out of physics, but what it means in this context is seeing why something is going to be true given certain assumptions hold. These can either go in the forward direction (supposing the basic principle of relativity, why must the Lorentz contraction be such and such) or backward (why do we have fundamental principle X if we want the predictions to look like Y). Understanding in physics thus is often quite mathematical, because it involves deductive relationships between laws, though the mathematics is often informed/aided by physical intuition. It also has a large inductive component of figuring out what sorts of properties are reasonable to make laws about and this sort of thing.

Good physics intuition will combine this deductive understanding of the underlying principles with an ability to predict outcomes, the way someone who is really good at relativity can come up with answers by considering equivalent reference frames. However, often what is called physical intuition is just being able to picture the setup in your mind and predict what will happen.
Perhaps if it was only math people like me who felt this way I would be more inclined towards your point. However, my friends who stayed in physics all agree with me that understanding requires this sort of deductive relationship between principles and results. Interestingly, the ones who are still on track to be theoreticians support my contention that most of what happens in physics classes is just grunt work that provides no real understanding.
This perfectly mirrors my experience at Caltech. I was the source of the original quote, and what convinced me physics wasn't for me wasn't just the fact that the course never bothered to explain what the hell all the calculations we were doing really meant, i.e., what this entropy thing really was (or at least the book didn't, which is the same thing for me). It was that I went around and asked people, and most of the proto-physicists (my theoretically inclined friends aside) who could calculate the entropy in a problem in a flash didn't even seem to understand what I was asking, much less explain what entropy was. This was just the straw that broke the camel's back. I never had an interest in playing with things in a lab; I wanted to come up with new theories and understand why things were true. Seeing that all my fellow students could solve mad hard problems but couldn't really explain why things were true convinced me that physics education (for valid reasons... we need more experimentalists than theoreticians) isn't focused on producing understanding.
(no subject)
Date: 2006-02-17 09:38 pm (UTC)
Also, I guess I should be more clear. Yes, I knew plenty of people who didn't seem concerned about more than having physical intuition and picturing the situation so they could generate a prediction. My friends were all just more theoretically inclined, though many of them have dropped down to be experimentalists by now. I don't believe this is a bad thing, any more than being interested in engineering as opposed to physics is, but it is a focus on knowing what is true / what happens rather than knowing why it is true.
For most physicists, especially in calculations they only need as tools, an answer and the physical intuition to know that answer is reasonable *is* more important than knowing why something is true. However, if what you want to do is come up with new fundamental laws or make genuine revolutions (rather than just incrementally advance the field), understanding is of vital importance. But everyone can't be Einstein.
Re: .... one last thing
Date: 2006-02-17 09:43 pm (UTC)
The difference between Einstein and Lorentz is that while they both had the same predictive equations, Einstein thought about *why* things should be a certain way and derived his results from a general principle. This did provide a big improvement in physics.
Now when learning relativity some people can do lots of problems and get a very good sort of intuition about what things are going to be true without ever really understanding how it all derives from the equivalence principle. Conversely, one can read through Einstein's eminently readable argument for SR and understand *why* the equations are true even though you have very little intuition about what happens in any particular circumstance.
(no subject)
Date: 2006-02-17 09:57 pm (UTC)
Besides, who cares what the undergrads say they want? It isn't like there are so many physics jobs on the market that we need to fill them with hacks. At least my physics courses at Caltech were all about pain, as much pain as they could possibly make you do in 140 hours of work a week (well, minus some other classes), and grad school in physics isn't exactly a piece of cake from what I hear. So force a few more of the undergrads to drop out; it will probably be good for them. That way they can take another major and won't find themselves jobless and grad-school-less on graduation.
I'm somewhat inclined to agree with you about Schroedinger's equation. However, Schroedinger got it somehow, and if you taught the kids enough math first you could go through an idealized version of the same steps. Still, I'm happy enough with them putting off the explanation of this. It is the entropy thing that really bugged me (and many similar things). Phys 12 at Caltech wasn't holding off on this because it was too difficult but for some other reason.
While it may be true that the physicists themselves don't really have a good handle on entropy, this just illustrates the point in the above post. I heard a philosopher give a very good explanation of what 'entropy' is just a week or so ago; you just need to be willing to give up the notion that there is one quantity called entropy. Instead there is entropy relative to a certain set of properties, e.g., entropy relative to pressure, volume, and type of molecule (e.g. have a semi-permeable membrane to distinguish). This seems like a clear case of dogma/preferred intuition winning out over explanation.
(no subject)
Date: 2006-02-18 01:24 am (UTC)
All the best experimentalists I know would put most theorists to shame in their grasp of the principles.
Of course, physicists are not in the business of applying axioms -- we're in the business of showing their incompleteness. That is even more a matter of intuition.
I was taught this way as an undergrad, though anything that would require too strong/long a technical digression to elucidate was left for grad school.
(no subject)
Date: 2006-02-18 06:05 am (UTC)
It is *not* a deductive science.
What's a "deductive science"? Are there any examples?
(no subject)
Date: 2006-02-18 06:08 am (UTC)
I heard a philosopher give a very good explanation of what 'entropy' is just a week or so ago; you just need to be willing to give up the notion that there is one quantity called entropy. Instead there is entropy relative to a certain set of properties, e.g., entropy relative to pressure, volume, and type of molecule (e.g. have a semi-permeable membrane to distinguish). This seems like a clear case of dogma/preferred intuition winning out over explanation.
I'm as clueless as
(no subject)
Date: 2006-02-18 06:50 am (UTC)
Secondly, you have a choice as to how you define terms. Saying that it doesn't make sense to question whether entropy really is such-and-such is just like saying that Special Relativity is just bullshit because it involves setting kinetic energy to something other than (1/2)mv^2, and that is just the *definition* of kinetic energy (as arguably it was before SR). Worse, in pre-relativity physics someone might very well have insisted that there was an absolute fact about the amount of kinetic energy an object has (use the velocity relative to the absolute reference frame). The point is, when you discover a particular definition is inappropriate in physics but another definition will allow the term to serve the same purpose in the overall theory, you change the definition. When I say that entropy really exists only relative to a set of observational quantities, this is the same sort of change one might make in saying that kinetic energy is defined only relative to a reference frame (different observers give different kinetic energies).
Actually, though, I'm not really suggesting we abandon the idea that entropy is the log of the multiplicity. The problem is that there is no one number that is the multiplicity of the system. If I ask you how many microstates are compatible with a box of size X containing n moles of gas, I get a different number than if I ask about a box of size X, n moles, and pressure P. The more properties I insist that the box have, the lower the multiplicity becomes. At the extreme, of course, where you know the state of every particle, the multiplicity is always 1.
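The point that multiplicity shrinks as you pile on constraints can be seen in a toy model (my own, entirely hypothetical example, with dice rolls standing in for microstates and constraints standing in for imposed macroscopic properties):

```python
# Toy model: three dice play the role of a microscopic system, and each
# "macroscopic property" is a constraint on the allowed microstates.
from itertools import product

microstates = list(product(range(1, 7), repeat=3))  # all 6**3 = 216 rolls

def multiplicity(constraints):
    # Count microstates compatible with *every* imposed constraint.
    return sum(all(c(s) for c in constraints) for s in microstates)

total_is_10 = lambda s: sum(s) == 10
first_is_even = lambda s: s[0] % 2 == 0

print(multiplicity([]))                            # 216: no constraints
print(multiplicity([total_is_10]))                 # 27
print(multiplicity([total_is_10, first_is_even]))  # 13: fewer still
```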
You might try to say, well, we mean the number of possibilities relative to all possible macroscopic properties, but this just doesn't work. Certainly relative to this you could never really calculate the entropy, as you would have to account for all possible macroscopic observations. Why would we even believe the laws of thermodynamics in this case, since we can't ever really verify that this notion of entropy really increases? Worse, there is no systematic distinction between macroscopic and microscopic properties. For instance, what happens when you want to talk about the entropy of a cell, where the microstructure of DNA codes for macroscopic behavior?
In order to make this make sense, one thing you can do is just say we only have entropy relative to certain properties. That is, the box has one entropy relative to pressure and volume and another relative to pressure, volume, and temperature.
(no subject)
Date: 2006-02-19 03:46 am (UTC)
You might try to say, well, we mean the number of possibilities relative to all possible macroscopic properties, but this just doesn't work. Certainly relative to this you could never really calculate the entropy, as you would have to account for all possible macroscopic observations.
All you need is a complete set of macroscopic parameters... every parameter that you'd need to put in the Hamiltonian in order to determine what the dynamics of the microstates are. Internal energy is always one such parameter. For the case of an ideal monatomic gas, there is a second macroscopic parameter, volume, which is necessary. This is because the gas is trapped in a potential... the walls confining the gas determine the shape of the potential energy term in the Hamiltonian, which affects the dynamics of each of the gas particles.
Pressure is a bad example to use, because it is the thermodynamic conjugate to the volume parameter (just as temperature is the conjugate to entropy). It's true that you can determine a macrostate for an ideal gas in terms of pressure and volume, but that's because pressure is defined as -(dE/dV) during a slow non-dissipative (reversible) process. Therefore, knowing P and V gives you E, the total internal energy, which is the second of the 2 fundamental parameters.
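A quick numerical check of the claim that P and V fix the internal energy of a monatomic ideal gas (using only the standard relations PV = NkT and E = (3/2)NkT; the specific numbers below are my own illustrative choices):

```python
# For a monatomic ideal gas: PV = N k T and E = (3/2) N k T,
# so E = (3/2) P V -- pressure and volume determine the internal energy.
import math

k = 1.380649e-23      # Boltzmann constant, J/K
N = 6.022e23          # one mole of atoms
T = 300.0             # temperature, K
V = 0.0248            # volume in m^3 (roughly one mole at 1 atm, 300 K)

P = N * k * T / V     # ideal gas law gives the pressure

E_from_T = 1.5 * N * k * T   # energy computed from the temperature
E_from_PV = 1.5 * P * V      # energy computed from pressure and volume only
print(math.isclose(E_from_T, E_from_PV))   # -> True
```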
If you were to add more macroscopic parameters, then one of two things is true: either they affect the dynamics or they do not. If they do not, they are redundant and can be ignored. If they do, then this will affect the calculation of the multiplicity, and therefore the entropy. An example would be adding an electric field or a magnetic field to ions in a gas chamber. This is an extra external parameter, and therefore you have to recalculate the entropy to account for it. Adding redundant parameters doesn't do anything, so it won't affect the entropy. And even if you do include them, it would just add a constant to it, which is irrelevant anyway.
The more properties I insist that the box have the lower the multiplicity becomes. At the extreme of course where you know the state of every particle the multiplicity is always 1.
Keep in mind that, in order for entropy to have any meaning, or for any of the laws of thermodynamics to hold, you have to be talking about a large ensemble of states... one large enough for the central limit theorem to apply. So you couldn't do any statistical mechanics or define an entropy if the multiplicity were 1. However, you do raise an interesting question as to whether you could more finely grain the macrostates and hence reduce the multiplicity. This sounds fairly reasonable, and if that's true then you would end up with a different version of entropy. However, I'm not sure exactly what that would mean for most cases. Using parameters like energy and volume seems pretty natural... if you tried to more finely grain it, you'd need to introduce more specific parameters which partially describe the states, but any way of doing that I can think of wouldn't work very well. You might have to get a pretty special set of parameters or come up with an entirely new way of doing physics in order to get it to work.

Now that I think of it, I believe it would be akin to dividing the space of possible states up into smaller regions and writing down the equations of motion for a particle in each of those regions. So you'd end up with an entropy for each "region", but when you add them all back up again, you'd still get the same total entropy. I think. The truth is... these are very subtle issues, and while there are certainly physicists who understand them and have sorted this all out... not all of us need to know that in order to do our jobs. Granted, I would personally like to understand it a lot more... and I suspect I will eventually... but there are a zillion other things I can say the same thing about, and it's a matter of prioritizing my time.
(no subject)
Date: 2006-02-19 04:56 am (UTC)
As you seemed to understand, the real problem with defining it to be the number of states relative to all macroscopic properties is the difficulty of defining what it is to be a macroscopic property. In other words, this definition of entropy would presuppose that there is a principled distinction between what is a macroscopic and a microscopic property of a system, and it is unclear that such a distinction exists.
If you say, fine, we will just define entropy as the log of the number of states compatible with a given description, then you are basically endorsing the solution I described. However, it is important to see that there is no longer any description-independent notion of the entropy that some box of gas has. What entropy it gets is going to depend on what sorts of experiments we are going to do on it.
The other problem, which I forgot to mention last time, is that the number of possible microstates is going to depend on what we count as distinct microscopic states. Do we count two different isotopes as being in different microstates? Do we count atoms whose electrons have different spin as being in two different microstates? Either we have to throw up our hands and admit that (even classically) we can never even approximate the entropy of a real system until we have a complete theory of all possible properties of the microscopic atoms, or we need to make our notion of entropy relative to some set of possible microscopic states, i.e., we have many different concepts of entropy, one treating isotopes as different things and another not doing so.
Let me give you the example I heard which convinced me of the problem. Consider a box with a divider down the center, with oxygen on one side and nitrogen on the other (equal pressure, blah blah). Now imagine we remove the divider and consider the increase in entropy. If we view oxygen and nitrogen as indistinguishable ("molecules of air") we get one answer (is it 0?), but if we view them as distinguishable we get another answer. So long as we can't tell oxygen and nitrogen apart in the experiment, viewing them as indistinguishable is perfectly fine, but the second we can make them respond differently this is a problem. For instance, if we can insert a divider between the two compartments that only allows oxygen to pass through, we can use the osmotic increase in pressure to extract energy, which will look like a violation of the laws of thermo to the person who doesn't count being oxygen or nitrogen as being in differing microstates.
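For the record, the two answers in the divider example can be written down explicitly (this is the standard entropy-of-mixing formula; the particle number below is my own arbitrary choice): treating the species as distinguishable, each gas expands into twice the volume and dS = 2Nk ln 2; treating them as identical "air molecules", nothing macroscopic changes and dS = 0.

```python
# Gibbs-paradox bookkeeping for removing the divider between N molecules
# of O2 and N molecules of N2 (N here is an arbitrary illustrative number).
import math

k = 1.380649e-23   # Boltzmann constant, J/K
N = 1e22           # molecules on each side of the divider

def mixing_entropy(distinguishable):
    # Distinguishable: each species expands into twice the volume,
    # contributing N k ln 2 apiece. Indistinguishable: no change at all.
    return 2 * N * k * math.log(2) if distinguishable else 0.0

print(mixing_entropy(True) > 0)    # True: positive entropy of mixing
print(mixing_entropy(False))       # 0.0: same event, coarser description
```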
Sure, it might be a bit silly not to count the type of molecule when calculating entropy, but we neglect what isotope the molecule is all the time. However, in uranium-purification-type setups this probably becomes important to consider, the same way the introduction of the semi-permeable membrane above made the type of molecule important to consider.
There just isn't one determinate thing we can call the number of microstates compatible with a system. Unless we know what physical properties we consider macroscopic and what differing properties count as different microstates, the notion just isn't coherent. Unlike teaching classical mechanics, which is a consistent approximation to how things work, or teaching someone the Schroedinger equation without telling them where it comes from, just telling someone that the entropy is the log of the multiplicity doesn't make sense. Sure, there have been attempts by real physicists to fix this problem (Gibbs springs to mind), but the above example with oxygen and nitrogen causes his theory issues.
What I am accusing physicists of dogmatism with respect to is the insistence that there is some objective (non-description-relative) notion of entropy (or at least some such notion that does useful work in our theories). Everything works out just fine if you agree that there are many different notions of entropy (one with respect to every possible choice of properties we use to describe the system) and that, so long as we choose 'reasonable' ways to describe the system, the laws of thermodynamics hold for each one of these.
(no subject)
Date: 2006-02-19 04:58 am (UTC)
Telling physics students that entropy is the log of the number of states is just as bad as telling math students that any collection of sets picked out by a formula is a set. You are setting them up for a fall when they realize your definition just isn't coherent (Russell's paradox in set theory / description relativity with entropy). Not to mention that continuing to treat entropy as if it were a real property of a system, rather than a way to characterize our state of ignorance about a system, totally misleads the public in popularizations of physics (sure, it doesn't 'harm' them, but if the point isn't to make them believe true things, why bother?).
Ultimately I think the reason this confusion persists has little to do with lack of time and much more to do with a reluctance to admit that entropy isn't a real objective property of a system. I'm not entirely sure why this reluctance exists, but it certainly seems to.
(no subject)
Date: 2006-02-19 05:13 am (UTC)
Don't get me wrong: I totally agree that the person who first understands how the equations derive from the equivalence principle is going to have better intuition, all other things being equal. This is the very reason I think it would be better to put things on a firm theoretical foundation (if possible) before working tons of examples.
In fact, many physicists implicitly support my point with their common statements about renormalization. There are tons of physicists out there who can manipulate the equations of QCD to get the right results through renormalization, but nonetheless many of them feel dissatisfied about something. What is it that they feel is missing, if not an understanding of why what they are doing works, i.e., some general overarching principle with intuitive appeal that entails the moves they make in doing the calculations? Yet surely one wouldn't say that these physicists lack intuition about the results of their predictions. Hell, this is one of the main reasons the technique can generate useful results: people can tell whether the answer they get is reasonable or not.
I'm not interested in having some argument about whether experimentalists or theoreticians have bigger dicks. However, from personal experience it is evident to me that if I wanted to know what was going to happen in some actual physical situation, I would probably go ask some experimentalist. If I wanted to know about the mathematical underpinnings of using Dirac delta 'functions' or what makes certain symmetry groups special, I would ask a theoretician. Unless you are making the ridiculous claim that theoreticians are just totally useless and physics would be better off if they were all fired and replaced with experimentalists, you have to admit there is something they are better at, and this thing is probably what I mean when I use the word understanding.
Oh, as one more non-physics example of why it doesn't take a deep understanding of the principles to intuit what will happen, consider the example of the prime-number twins, who could recognize huge primes in split seconds yet were actually fairly retarded and surely had no deep grasp of number theory.
(no subject)
Date: 2006-02-19 05:24 am (UTC)
That being said, I'm unsure whether truly aiming for real understanding when teaching physics would be good for the profession overall. As a practical matter, physics needs lots of people to work lab equipment and be able to make actual predictions for real experiments. It is tempting to slough this job off on engineers or something, but experimental physics requires problem solving, approximation, and other skills that engineers just don't have.
In other words, I think the situation might be a lot like it is for teaching calculus to frosh. The smart ones who really want to know what is going on and why things work suffer because of the stupid plug-n-chug approach which dominates. Nevertheless, I've talked to GSIs of professors who try to really teach what is going on, and it is a disaster because most of the class just can't keep up. Despite my uneasiness with it, it really is better to teach the frosh calculus via plug-n-chug, because many of them can't really understand it and just need to use it as a tool. Those that do want to understand it just have to suffer and learn later on their own.
Still, the situation in physics is a bit different. Jobs in physics aren't exactly lacking for applicants, so maybe a few more people who can't hack the deep theory dropping out wouldn't be such a bad thing. Additionally, I understand that the way physics is taught in England is much more theoretically/mathematically oriented, with the actual physics only being taught after a sufficient mathematical background to really understand what one is doing. However, I have no first-hand experience here and don't know if what happens in England would be good on a wider scale (maybe they produce an inordinate number of theoreticians).
Usually the answer in these situations is to track people, i.e., offer a theory-based track and a more applied track. However, at least in my experience of physics education, everyone would just stick with the theory track because it was more hardcore, until they had to drop out because they were too far behind. Physics students seem to be concerned about showing they have a big penis more than any other major, but maybe that was just a quirk of Caltech.
(no subject)
Date: 2006-02-19 06:16 am (UTC)The problem is that we cannot contextualize -- we can say that this mathematical technique will yield these results, or that if we devise a certain situation, we know which techniques to apply and what motions will lead to reasonable results. However, I think it's an important skill to be able to intuit what is reasonable at all in a situation that is not well understood. Building a picture by teasing apart the threads, to paraphrase Feynman. This requires a deep understanding of the physical principles around which to orient one's thinking, and in the special moment to guess how the principles may be wrong. Experimentalists are in this frame of mind more often, IMHO, because they deal with tangible issues on a day-to-day basis in getting their stuff to work.
(no subject)
Date: 2006-02-19 07:47 am (UTC)The problem is in the word "macrostate". It's an unfortunate word in that it sounds like it's talking about a macrostate of the actual system, but if you think about it, it's really talking about the state of the environment. The so-called "macroscopic" states are labelled by what are sometimes referred to as "external parameters". The external really means that it's some property of the environment that your system resides in. Sort of like you have a box and inside it there's a system, but on the box there are dials, and every time you turn one you change an "external parameter". The volume is one such parameter because it's defined by where the walls are... the walls aren't really a part of the system, they're what confines the system... they are a constraint on it... a boundary condition... I think it's probably better to think of them as defining the environment rather than an approximate macroscopic understanding of the system.
There is a total amount of entropy in any system, given fixed external parameters... that is, given a particular state of the environment (and a particular value for the total internal energy, which also serves the role of an "external" parameter because energy conservation is a constraint on the system which restricts the phase space it can explore to a particular hypersurface.) But even though there is an objective total amount of entropy, for many purposes you can ignore most of the entropy and just look at portions of it. This would be the case when the system is very unlikely to get excited into certain states. At high temperatures, these extra degrees of freedom come into play and then you have to take into account more of the total entropy that's there. But at low temperatures there is no need so it can be neglected for the sake of practicality. Your examples of the spin of an electron and nuclear excitations are exactly what I have in mind here... they definitely contribute to the entropy, but since entropy is only defined up to an additive constant and at low temperatures that part of the entropy is roughly constant, we can just subtract it off and ignore it.
So that leaves the question of the external parameters... this is the place where I was saying you sort of have a point. The entropy is only defined for a particular choice of these external parameters; however, there are usually only a small number of external parameters which confine the system. Most of the rest you might think of adding don't affect anything, so the only thing they could do is add a constant to the entropy. If you re-express the same parameters in a different way, you still are going to get the same entropy. And after I've finished thinking about it, there's really no issue with finely or coarsely graining the system here; I don't think I should have said that in my prior comment... I was still thinking of the macrostates as labelling actual states of the system, but they don't... it's better to think of them as labelling the environment.
finished in next comment due to char limit...
...continued
Date: 2006-02-19 07:47 am (UTC)
(no subject)
Date: 2006-02-19 07:59 am (UTC)Not to mention the fact that continuing to treat entropy as if it were a real property of a system rather than a way to characterize our state of ignorance about a system is totally misleading the public in popularizations of physics
I could be wrong here, but my understanding is that when people first introduced the idea of entropy, it was thought that it had something to do with our ignorance about a system. But after enough people thought about it deeply, they figured out that that was an incorrect way of looking at it. So you're proposing that we go back to the old way of looking at entropy. I don't quite have enough confidence in my understanding of it to say that's not right... but that is my suspicion. Either way, you don't need to understand that subtle point in order to apply it to most systems... so while it would be nice to mention if there is a consensus on the issue, it's unnecessary for learning how to use it to calculate thermodynamic properties at the undergrad level (or graduate level).
(no subject)
Date: 2006-02-19 08:13 am (UTC)Pressure isn't entirely determined by volume and number of atoms in the box. It also depends on temperature, i.e. average velocity, so this shouldn't be redundant.
I didn't say it was... it's determined by knowing the total internal energy as a function of the volume. Pressure measures how the energy changes as you change the volume. It isn't a thermodynamic coordinate in itself, just a conjugate to one. The "average velocity" you mention is equivalent to knowing the total internal energy--that and the volume are the two things necessary to completely specify the macrostate (aka external parameters) of an ideal gas.
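To make that relationship concrete, here's a minimal numerical sketch (my illustration, not from the thread): take the ideal-gas entropy as a function of E and V with constants dropped and k_B = 1, and recover both temperature (1/T = dS/dE) and pressure (P = T dS/dV) from its derivatives, which reproduces PV = NT.

```python
import math

def entropy(E, V, N):
    # Ideal-gas entropy up to an additive constant (k_B = 1);
    # the dropped constants don't affect the derivatives below.
    return N * math.log(V) + 1.5 * N * math.log(E)

def temperature(E, V, N, h=1e-6):
    # 1/T = (dS/dE) at fixed volume, via a central difference
    dSdE = (entropy(E + h, V, N) - entropy(E - h, V, N)) / (2 * h)
    return 1.0 / dSdE

def pressure(E, V, N, h=1e-6):
    # P = T * (dS/dV) at fixed energy: how energy trades off against volume
    dSdV = (entropy(E, V + h, N) - entropy(E, V - h, N)) / (2 * h)
    return temperature(E, V, N) * dSdV

E, V, N = 150.0, 2.0, 100
print(pressure(E, V, N) * V, N * temperature(E, V, N))  # both ~100: P V = N k_B T
```

The point of the sketch is just that once S is known as a function of (E, V), pressure and temperature both fall out as derivatives; neither is an independent coordinate.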
Re: ...continued
Date: 2006-02-19 08:13 pm (UTC)As an interesting side note, my advisor believes that the total multiplicity of the universe is equal to the cosmological constant when measured in the right units. But there is at least one other physicist I know of who disagrees with him, and many more who've just never thought about it.
Actually, I went back and double checked the paper where he discusses this and it seems I got at least two things wrong here. You're probably less interested in this than the rest, but just for my own benefit and anyone else who's curious let me correct myself...
First, I should have said the inverse cosmological constant, and second, it's equal to the entropy and not the number of states. The additional hypothesis made by my advisor, which is disputed by Leonard Susskind (and possibly others), is that the entropy is approximately the log of the multiplicity, which is usually only true for classical entropy (with quantum entropy, you need to define it as -Tr ρ ln ρ, where ρ is the "density matrix", which is a bit different). But he thinks this approximation is valid enough to use in an asymptotically de Sitter spacetime (which appears to be what we live in) for various technical reasons I don't understand. So if he is correct, then I think this says that the (approximate) total number of quantum states in our universe is 2^(10^120) (since the cosmological constant is 10^-120). So bigger than a googolplex, but still finite!
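For the curious, the quantum definition S = -Tr ρ ln ρ is easy to compute for small systems. Here's a sketch (the example states are mine, not from the thread) evaluating it via the eigenvalues of the density matrix:

```python
import numpy as np

def von_neumann_entropy(rho):
    # S = -Tr(rho ln rho); for a Hermitian density matrix this equals
    # -sum(p ln p) over its eigenvalues p
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]            # 0 * ln 0 -> 0 by convention
    return float(-np.sum(p * np.log(p)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])  # pure state: zero entropy
mixed = np.eye(2) / 2.0                    # maximally mixed qubit: ln 2
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))
```

Note that for a maximally mixed state over Ω states (ρ = I/Ω) this reduces to ln Ω, which is exactly the "log of the multiplicity" approximation under discussion.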
(no subject)
Date: 2006-02-19 09:19 pm (UTC)This is one of the main points of Kuhn (1970). It's a really readable book, which you wouldn't necessarily expect from the most important work in 20th-century philosophy of science.
(no subject)
Date: 2006-03-12 09:58 pm (UTC)I think the answer to this question depends on the type of physics you are doing. If we are talking about solid state, condensed matter, plasma, or any other discipline of physics other than fundamental theory (string theory, quantum gravity, that shit), I don't know enough to say if you are right or wrong.
On the other hand, if you are trying to come up with truly new laws of physics, then this sort of intuition can actually be a harm. The important unsolved problems in fundamental physics have to deviate significantly from the way things work in our well-understood problems; otherwise we would apply the same methods and we would be done.
Also, I would dispute the idea that these types of physical principles are better known by the experimentalist. The physical principles at issue at the fundamental level are almost mathematical statements, and by definition you just can't have a deep understanding of these principles without understanding what the principle really says.
Sure, you can have a surface understanding, or even be good at guessing/intuiting/calculating known predictions, the same way you can calculate with the Dirac delta function without understanding the theory of distributions. However, if you really want to understand the Dirac delta function, and make use of it in complex limits, you do need to understand the theory of distributions. Similarly, if you are trying to expand the limits of fundamental physics, you need to really understand what the principle says, and that means knowing the theorist stuff.
Read This not Other
Date: 2006-03-12 10:28 pm (UTC)Often by entropy people mean what I am calling entropy relative to energy and two other state variables of a type I forget. What I was calling entropy is the type that takes parameters, i.e., you specify macrostates that are fixed, and that notion of entropy certainly isn't objective. It appears this was a confusion on my part, caused by the horrible tendency of physics texts to focus on calculating rather than defining rigorously. And what you have been calling entropy isn't vulnerable to this criticism.
However, if you use that definition you have another problem. That is figuring out when two microstates count as 'the same' microstate. I understand that these are defined in terms of basic quantum mechanical states but consider the following.
Imagine we have the current particles and we add to them a new property P; each particle can be either +P or -P. Now the P property has no effect on any normal property, but every normal quantum interaction has some accompanying (but totally unconnected) P event (i.e. there are laws which change which particles have P). This now multiplies the multiplicity by 2^N while being provably mathematically equivalent to the original theory.
Thus, by modifying the theory in mathematically equivalent ways, the amount of entropy changes. This, of course, is in addition to the fact that all our estimates of entropy turn out to be way off whenever current theory is supplanted by a new theory which allows a greater number of states.
Hence this notion of entropy seems to be deeply flawed. Unless we believe there is no deeper theory which postulates a greater number of states (if you want, in the range of thermal equilibrium), we should believe all our statements about entropy are just false (they are based on a dramatically flawed notion of what counts as a distinct state).
I think a better option is to just define entropy as being relative to a specified notion of distinct microstate.
too late...
Date: 2006-03-12 11:17 pm (UTC)There were two issues you brought up, one being ambiguity in what constitutes a microscopic state (include nuclear spin or not, etc.). I think it's well agreed upon that this is not really a problem and can be defined objectively. The other is the ambiguity in what counts as a macroscopic state... that's where the real controversy lies. I think it's relatively objective. But I recently read some comments from Gell-Mann which indicate he has a view of it similar to yours... where it's all relative to how coarsely grained a description you have. My feeling is that whether entropy is subjective mostly depends on what you mean by the question, which is why there is so much confusion on the issue.
When I said there is no 'objective' notion of entropy, all I meant was that you can't point to a box and demand to know 'what is the entropy of the gas in that box?'; you have to say relative to what parameters, e.g., macroscopic constraints.
It's relative to what constraints are on the system, and what you are including as part of the system. But the reason I would still argue for calling entropy objective is that, to me, when you change these things you're really talking about a different system. Although others might say you're talking about the same system defined in a different way or within a different context. These are thorny issues which a lot of physicists avoid thinking about since it gets to sounding more like philosophy or semantics.
One way in which I would agree that entropy is relative, though, is with respect to the time-scale on which you're doing the experiment. I think this is a somewhat different issue than you were raising, although maybe it's the same and I'm just not making the connection. Anyway, many systems will explore a subspace of phase-space on one time-scale, but if you look on a larger time scale it might escape out of that subspace and explore a much larger total phase space. Quantum mechanical tunneling is one example of how this can happen, although there are classical examples as well. So in this case, it has one entropy if you're looking on one time-scale and another entropy if you're looking on a longer time-scale. Although again, you could just say "I'm always going to look at the longest possible time-scale" which may or may not be a meaningful statement.
These problems of what to define entropy with respect to are not unique to entropy. The same sort of problems arise with just about any other quantity in physics, such as energy, velocity, mass, etc. All of them have different meanings in different contexts... and depending on the constraints you put on the system and on how much of the total energy, mass, velocity, etc. you're ignoring, you might assign it different values. For example, the total energy of a system is not an absolute quantity; it's relative to the reference frame in which you are looking at the system. Mass is even more difficult to define, and there are a whole lot of different ways you can define it in different situations. Sometimes potential energy in bonds is counted as mass, and sometimes it's not. Kinetic energy is sometimes counted as mass (for instance, if it's not all going in one direction, but confined in a localized region of space similar to a gas) but usually it's not. So almost all of the quantities you can define in physics are in some way "relative" to what you mean when you write them down. Temperature is the rate of change in energy with respect to entropy, so it's "doubly relative" if energy and entropy are. In particle physics, even how many particles are in a system is not a well-defined question, since it is relative to which gauge you're working in when you write down the equations.
(no subject)
Date: 2006-03-12 11:59 pm (UTC)Alright, one of our problems comes down to just terminology
Agreed. So let me clarify my definition a bit...
The definition of entropy I'm using here is the log of the number of accessible states in a system. (I mentioned earlier that there's a definition of quantum entropy in terms of a density matrix, but let's avoid that for this discussion, since nothing we've been arguing about here really depends on it.) The word accessible makes the actual value depend on a few things... first, it depends on what constraints are on the system, since adding another constraint will restrict the system to fewer accessible states (for instance, if the system is held at constant volume, constant pressure, etc.). Second, it can depend on the timescale over which you're interested in the system's behavior... so the only way it truly becomes an absolute objective concept is if you're talking about the entropy of the entire universe on the longest possible timescale (the life of the entire universe). Otherwise there's always the possibility that your system will decay into another state or do something unexpected far in the future.
As for the counting of microstates, I don't think that bears on the objectivity of this concept. And here's why...
Imagine we have the current particles and we add to them a new property P; each particle can be either +P or -P. Now the P property has no effect on any normal property, but every normal quantum interaction has some accompanying (but totally unconnected) P event (i.e. there are laws which change which particles have P). This now multiplies the multiplicity by 2^N while being provably mathematically equivalent to the original theory.
Adding the property P to the system changes the total entropy, yes. But does that make it an ill-defined concept? No. Because we know that as long as P doesn't affect the rest of the system, all it does is add a constant to the entropy. So it means that, even though that entropy is there, we are free to ignore it in our calculations for practical reasons. This is akin to ignoring the potential energy stored in the moon's attraction to a car on the highway, in order to calculate how it's going to move through traffic. Yes, in principle it does factor into its total energy... but no, you don't need to include it for most calculations.
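A tiny sketch of why (my own illustration, toy numbers): tacking the inert property P onto a theory multiplies every multiplicity by 2^N, which shifts every entropy by the same constant N ln 2, so entropy differences between macrostates (the part that shows up in measurements) are untouched.

```python
import math

def entropy(multiplicity):
    # S = ln(number of microstates), k_B = 1
    return math.log(multiplicity)

N = 10                       # toy particle count
omega_A, omega_B = 400, 7    # multiplicities of two macrostates in the old theory

# The inert property P doubles the state count per particle: omega -> omega * 2^N
shift_A = entropy(omega_A * 2**N) - entropy(omega_A)
shift_B = entropy(omega_B * 2**N) - entropy(omega_B)

# Both shifts equal N ln 2, so S_A - S_B is the same in either theory.
print(shift_A, shift_B, N * math.log(2))
```

Since only entropy differences enter thermodynamic calculations, the constant can be subtracted off and ignored, which is the point being made above.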
Unless we believe there is no deeper theory which postulates a greater number of states (if you want in the range of the thermo equilibrium) then we should believe all our statements about entropy are just false
But science never tries to make statements which are true or false. It only tries to give descriptions which are more accurate than the prior ones. By looking at the behavior of a gas or any large ensemble of states, we know approximately how much entropy is there that matters. There may always be more there which we don't know about, but we can be sure that it doesn't affect things much. The same is true for energy, or any other quantity. You can always discover new stuff, but you know it will only slightly affect your prior calculations. That's the nature of science, and why it's distinct from philosophy, logic, or mathematics.... all of which involve themselves in "truth" where science does not.
(no subject)
Date: 2006-03-13 05:25 am (UTC)That's the nature of science, and why it's distinct from philosophy, logic, or mathematics.... all of which involve themselves in "truth" where science does not.
Actually, I should say binary truth rather than truth here. Binary truth makes very little sense in the real world... I personally think its only use is in a limited (hypothetical) context within mathematics, but that's getting into my own epistemological beliefs so I should stop there before it leads us astray.
(no subject)
Date: 2006-03-14 01:43 am (UTC)If the experimental phenomena are richer than the theory, the theorist must make an effort to apply the theory to understand observations. I think this applies to something like atomic physics.
If the theoretical landscape is richer than the observed phenomena, the experimentalist must understand the theoretical issues to extract results from his apparatus. This might apply to something like particle physics.