QM implements EQUAL

Feb. 11th, 2008 06:33 pm
gusl: (Default)
[personal profile] gusl
This blew me away.

Today, Ed Fredkin taught me that double-slit experiments can tell whether two particles (electrons, atoms, molecules) are *identical* or not: if they are, an interference pattern appears (since it is then impossible, even in principle, to track which particle went through which hole).

When we see an interference pattern with gold atoms, we know that those two atoms really are *identical*. This means that we've reached the bottom-level in some sense: there is no deeper micro level in which these atoms could differ detectably ("detectably" according to nature's definition).

If we tag one of the atoms (e.g. by moving an electron in a small way), the interference pattern disappears completely.

I know I'm behind the times. But this is cool. Just imagine using this technology as a kind of quality assurance for nanotech replicators.

--

I'm wondering how these double-slit phenomena play out with reversibility, "irreversibility" (i.e. 2nd law of thermodynamics), and all other principles. It would seem to violate continuity... except that the universe is supposed to be discrete at that scale.

--

Tangentially, about interference patterns in general: the summer after my freshman year of college, I came up with a cute proof that, from energy conservation, power is proportional to the square of the amplitude.
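One standard version of this argument, for a sinusoidal wave on a string (a reconstruction, not necessarily the original proof — μ is the linear mass density and v the wave speed):

```latex
y(x,t) = A\sin(kx - \omega t), \qquad
\frac{\partial y}{\partial t} = -A\omega\cos(kx - \omega t)
% kinetic energy per unit length of the string element:
dK = \tfrac{1}{2}\mu\left(\frac{\partial y}{\partial t}\right)^2 dx
   = \tfrac{1}{2}\mu A^2\omega^2\cos^2(kx - \omega t)\,dx
% averaging cos^2 over a cycle gives 1/2, and the potential-energy
% contribution matches the kinetic one; the wave carries this energy
% forward at speed v, so the time-averaged power is
\langle P \rangle = \tfrac{1}{2}\mu A^2 \omega^2 v \;\propto\; A^2.
```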

(no subject)

Date: 2008-02-12 12:07 am (UTC)
From: [identity profile] cdtwigg.livejournal.com
Don't you have to shoot a whole bunch of them to figure out whether you've got an interference pattern tho? I mean, these are statistical effects.
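Right — one detection tells you almost nothing; the fringes only emerge in the accumulated statistics. A minimal Monte Carlo sketch (the slit separation, wavelength, and screen distance below are illustrative toy values, not from any real experiment):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-slit model: intensity at screen position x goes as
# cos^2(pi * d * x / (lam * L)), for hypothetical slit separation d,
# wavelength lam, and screen distance L.
def intensity(x, d=1e-3, lam=500e-9, L=1.0):
    return np.cos(np.pi * d * x / (lam * L)) ** 2

# Each particle is detected at a single random position, drawn from
# the interference distribution by rejection sampling.
def detect(n, x_max=2e-3):
    xs = []
    while len(xs) < n:
        x = rng.uniform(-x_max, x_max)
        if rng.uniform() < intensity(x):
            xs.append(x)
    return np.array(xs)

few = detect(10)        # 10 dots: looks like noise
many = detect(20000)    # 20000 dots: fringes stand out in the histogram
counts, edges = np.histogram(many, bins=40, range=(-2e-3, 2e-3))
```

With 40 bins, the bins near the intensity zeros collect far fewer counts than the bins near the maxima, which is exactly the fringe pattern appearing out of individually random detections.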

(no subject)

Date: 2008-02-12 12:40 am (UTC)
From: [identity profile] bram.livejournal.com
The double slit experiment and interference patterns are very provocative and bear deep thinking over--especially if, like Fredkin (and to some extent me too!) you are interested in explaining what happens in terms of underlying information.

As for reversibility, you can think of quantum mechanics as following from two postulates, which Penrose calls U (for Unitary) and R (for Reduction). U is reversible and destroys no information; it's described by the Schrödinger equation or its generalizations, and basically rotates the complex number describing the probability amplitude. R is what happens when an observation occurs: a random choice of outcome. It is not reversible. The mystery is whether both U and R are real, and what constitutes a measurement.

(no subject)

Date: 2008-02-12 12:43 am (UTC)
From: [identity profile] bram.livejournal.com
"Rotates the complex number describing the probability amplitude" is an oversimplification. There's an operator H called the Hamiltonian, and the time evolution over a time interval Δt is given by exp(-i H Δt). You can define exponentiation of operators through Taylor series. This operator is unitary because it preserves norm (total probability): if you multiply the time-evolution operator by its Hermitian conjugate (transpose it and replace i with -i), you get the identity.
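This is easy to check numerically. A minimal sketch (the 4×4 Hermitian matrix here is an arbitrary stand-in for a real Hamiltonian, with ℏ set to 1):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)

# Any Hermitian matrix can play the role of a Hamiltonian H.
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (A + A.conj().T) / 2

dt = 0.1
U = expm(-1j * H * dt)      # time-evolution operator for one step of size dt

# Unitarity: U times its Hermitian conjugate is the identity...
assert np.allclose(U @ U.conj().T, np.eye(4))

# ...so evolving any normalized state preserves its norm (total probability).
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)
assert np.isclose(np.linalg.norm(U @ psi), 1.0)
```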

(no subject)

Date: 2008-02-12 01:08 am (UTC)
From: [identity profile] gwillen.livejournal.com
Damnit, I forgot to come today. :-( And it sounds like it was interesting, too. Does the blackboard site for the class have all this stuff on it? Did he do anything else today besides this?

(no subject)

Date: 2008-02-12 03:26 am (UTC)
From: [identity profile] gustavolacerda.livejournal.com
Today was about the 2nd law, and Maxwell's Demon from an informational perspective.

(no subject)

Date: 2008-02-12 04:36 am (UTC)
From: [identity profile] gwillen.livejournal.com
Er, and apparently also something about the double-slit experiment?

(no subject)

Date: 2008-02-12 04:39 am (UTC)
From: [identity profile] gustavolacerda.livejournal.com
that was after class.

(no subject)

Date: 2008-02-12 04:40 am (UTC)
From: [identity profile] gustavolacerda.livejournal.com
I think Blackboard has everything. I got a login, by asking Anand.

(no subject)

Date: 2008-02-13 06:17 am (UTC)
From: [identity profile] spoonless.livejournal.com
Regarding what [livejournal.com profile] bram says, I would point out that Penrose is a bit of a nut. Most good physicists today would reject the R postulate, which was mostly something used by certain physicists back in the mid-1900s. There is a similar postulate that some physicists still think you need in order to get probabilities out; however, it doesn't involve a "reduction" or "wave function collapse". The modern understanding of quantum mechanics is in terms of decoherence, which nearly everyone who counts accepts now.

A few years ago, you and I had a discussion about entropy and Maxwell's demon. I'd like to update what I said regarding that, since I've learned more since then. However, I must admit that I still don't fully understand entropy.

Best I can tell, it goes something like this: if our universe did not have a cosmological constant (dark energy), then the "fine-grained" entropy of the multiverse as you evolve it forwards in time would always be zero, since it would always be in a single state. However, we live in a universe which does have a cosmological constant, and one of the weird implications of that is that there is a sort of intrinsic entropy to the universe, on the order of 10^120 times Boltzmann's constant k. In some sense, this entropy is due to the entanglement of patches of our universe with invisible parts that are hidden behind our cosmological horizon and not really a part of the universe in some sense. With a cosmological constant, the fine-grained entropy remains constant at 10^120.

On top of this intrinsic entropy due to entanglement of the visible universe with the invisible parts of it, you can also have thermodynamic entropy that results from entanglement of different visible parts of our universe with each other. In order to see this as entropy, you have to do a sort of "coarse graining" and imagine that our universe is broken up into a number of distinct blocks of some volume. As long as the blocks are large enough that each one contains many microstates, you get the same amount of total entropy per unit volume... independent of what size the blocks are. So this "coarse-grained" entropy is a well-defined, measurable macroscopic variable as well.

So last time, when we were arguing about whether entropy is due to coarse graining or not... you were mostly right: most of the usual thermodynamic entropy we talk about is in some sense due to coarse graining. However, it doesn't matter too much how you do the coarse graining. One of the reasons I thought it wasn't due to coarse graining is that I had heard the estimate of 10^120 for the total amount of entropy in the universe... however, all of that is fine-grained entropy, and there is a lot of entropy on top of it if you coarse grain (which you usually would). The coarse-grained entropy is in some sense information leaking out into the rest of the quantum multiverse... which is something I hypothesized before, and it turns out that part is exactly right (I just didn't call it "coarse grained").

Incidentally, entropy was the first way in which it was discovered that particles of a given type (such as the electron) are identical. You can test whether a gas is made of identical or non-identical particles because the combinatorics involved is different, so you get different answers. Near the beginning of the development of thermodynamics, it was discovered empirically that particles are identical, because that was the combinatorial formula that worked in calculating the number of microstates (much to the surprise of the physicists who were calculating it!). So the double slit is just one of many ways of testing whether particles are identical.
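A toy illustration of that combinatorial difference (the numbers here are arbitrary small values chosen so brute-force enumeration is feasible): with N particles and M single-particle states, distinguishable particles give M^N microstates, while identical (bosonic) particles give only "stars and bars" many, C(N+M-1, N), since permuting the particles changes nothing.

```python
from itertools import product
from math import comb

M, N = 4, 3   # 4 single-particle states, 3 particles (toy numbers)

# Distinguishable particles: every assignment of particles to states
# is its own microstate.
distinguishable = len(list(product(range(M), repeat=N)))
assert distinguishable == M ** N            # 4^3 = 64

# Identical particles: only the occupation numbers matter, so
# assignments differing by a permutation of particles coincide.
identical = len({tuple(sorted(s)) for s in product(range(M), repeat=N)})
assert identical == comb(N + M - 1, N)      # C(6, 3) = 20
```

Counting 64 states when nature only has 20 gives measurably wrong thermodynamic predictions, which is how the identity of particles first showed up empirically.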

(no subject)

Date: 2008-02-13 06:18 am (UTC)
From: [identity profile] gustavolacerda.livejournal.com
yes. Please reinterpret what I wrote accordingly.

(no subject)

Date: 2008-02-13 06:38 am (UTC)
From: [identity profile] spoonless.livejournal.com

I'm wondering how these double-slit phenomena play out with reversibility, "irreversibility" (i.e. 2nd law of thermodynamics), and all other principles. It would seem to violate continuity... except that the universe is supposed to be discrete at that scale.

What do you mean by "violate continuity"? Continuity of what?

Irreversibility is due to the coarse graining mentioned in my earlier comment. Basically, the different blocks in the universe get so hopelessly entangled that it would be impractical to trace back through how exactly they got that way, or to untangle them again. Picture thousands of long strings that are initially separate and end up all in the same horribly tangled knot. In principle, there is a way to undo the knot by reversing each step... but in practice it's very unlikely to happen unless some superintelligence were carefully doing it deliberately. And if this superintelligent being works on the same random statistical principles as the rest of the universe, it's hopeless, period (which is the case in our universe for any intelligent beings we know of).

(no subject)

Date: 2008-02-13 11:22 pm (UTC)
From: [identity profile] spoonless.livejournal.com

Near the beginning of the development of thermodynamics

I should probably say "near the beginning of the development of statistical mechanics"... in other words, the first time they tried to explain things like heat and temperature in terms of the statistics of individual particle motion. The study of heat, temperature, and pressure was around long before that.

(no subject)

Date: 2008-02-14 02:04 am (UTC)
From: [identity profile] gustavolacerda.livejournal.com
btw, are you a Dr. now, or doing a semester-at-Google?
