Decoherence in QM and Entropy
+1 to decoherence theories in quantum mechanics:

Decoherence has been marketed as a potential way of linking QM with the second law of thermodynamics.
As an object interacts with its environment, it constantly exchanges pieces of energy with its surroundings. This exchange can be seen as the object losing information, and hence its wave function finds itself constantly being reset, if you like. The environment effectively becomes a way to measure or sample the object's properties. This is what's happening in the double-slit experiment: once you know which slit the photon went through, your interference goes down the drain (manipulating this sort of 'information' is also how quantum computations work). The environment gains the information the object loses.
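A toy sketch of that loss of interference (all numbers illustrative, not from any real experiment): take a qubit in an equal superposition, let the environment damp the off-diagonal coherences of its density matrix exponentially, and note that the fringe visibility you'd see is set by the size of those off-diagonals.

```python
import math

def rho(coherence):
    """Density matrix of an equal superposition whose off-diagonal
    terms have been damped by the factor `coherence` (1 = fully coherent,
    0 = fully decohered)."""
    c = 0.5 * coherence
    return [[0.5, c], [c, 0.5]]

def fringe_visibility(density_matrix):
    """Interference fringe visibility (I_max - I_min)/(I_max + I_min);
    for this state it is twice the magnitude of the off-diagonal element."""
    return 2 * abs(density_matrix[0][1])

# Exponential decay of coherence as the environment 'measures' the system:
# the fringes fade even though the diagonal (which-slit) probabilities
# never change.
for t in [0, 1, 2, 5]:
    print(t, round(fringe_visibility(rho(math.exp(-t))), 3))
```

The diagonal entries stay at 0.5 throughout; decoherence only kills the off-diagonal terms, which is exactly the part the interference pattern depends on.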

Decoherence is something entirely quantum - information doesn't hold this kind of sway in classical mechanics.

If you add an observer to the mix, the information they've gained about the state of the object by measuring it will be altered over time and also (generally) diminished. Interestingly, if the observer and the environment are "measuring" the same quantity, e.g. energy, the observer doesn't lose as much information. This is because the environment and observer are both simultaneously "looking" at the object as a superposition of quantised energies. This is poetically termed the predictability sieve, and there's a related concept called einselection (environment-induced superselection).

John von Neumann defined entropy quantum-mechanically in terms of the density matrix of an open quantum system (the density matrix playing a role similar to ψ). He thought measurements were irreversible; another physicist, Zeh, picked up on this and linked it to the arrow of time: as systems become more macroscopic, getting information costs the environment less*, and indeed the von Neumann entropy grows (and ends up agreeing with the classical entropy), while all of the superposition-y behaviour disappears (or really is never seen in the first place for large systems). It has even been proposed that what's driving the increase in entropy, full stop, is decoherence.
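To make the entropy claim concrete (a sketch using the same toy qubit as above, with a real off-diagonal element c): the von Neumann entropy is S(ρ) = -Tr(ρ ln ρ), computed from the eigenvalues of ρ. For ρ = [[0.5, c], [c, 0.5]] the eigenvalues are 0.5 + c and 0.5 - c, so as decoherence shrinks c, S climbs from 0 (pure superposition) to ln 2, which is exactly the classical (Shannon/Gibbs) entropy of a 50/50 mixture.

```python
import math

def von_neumann_entropy(c):
    """S(rho) = -Tr(rho ln rho) for rho = [[0.5, c], [c, 0.5]] with real c.
    The eigenvalues of this matrix are 0.5 + c and 0.5 - c."""
    s = 0.0
    for lam in (0.5 + c, 0.5 - c):
        if lam > 0:              # the convention 0 * ln 0 = 0
            s -= lam * math.log(lam)
    return s

print(von_neumann_entropy(0.5))  # pure superposition: S = 0
print(von_neumann_entropy(0.0))  # fully decohered: S = ln 2,
                                 # matching the classical 50/50 mixture
```

So "the von Neumann entropy ends up being the same as the classical entropy" is literally visible here: once the coherences are gone, nothing distinguishes S(ρ) from the ordinary entropy of a classical probability distribution over the two outcomes.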

I don't think any of the above settles whether decoherence theory favours a SUCK or an Everett-type interpretation. The literature is driven largely by the desire to show that everyday, classical physics can in fact be explained by quantum mechanics, and also to join the dots mathematically. It is important to note that decoherence doesn't equal wave function collapse (though we could probably discuss that too).

*in terms of action (a nifty abstract quantity that rears its head everywhere in physics and coincidentally has the same physical dimensions as Planck's constant) per bit of information.

Siobhan Tobin u5354170

Thanks, Siobhan.

Very helpful.

As you just said very clearly, but it bears repeating, this needs to be ADDED to one of the theories of QM. It doesn't REPLACE them. In particular, it needs a definition of measurement (Zeh has said this too). And that's not trivial, because you can't have every interaction counting as a measurement (at least, not equally): if you did, you wouldn't get all the observed interference effects, and also, in the Elitzur-Vaidman "bomb" experiment there are measurements (in the non-technical sense) in places where there's no exchange of energy.
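For anyone who hasn't met the "bomb" experiment: the point can be made with the standard Mach-Zehnder beam-splitter amplitudes (a sketch using one common phase convention, a factor of i on reflection and nothing on transmission). With no bomb, the two paths to the "dark" output port cancel; a live bomb in one arm acts as a which-path measurement, and a quarter of the time the photon exits the dark port, certifying a live bomb it never touched.

```python
import math

t = 1 / math.sqrt(2)     # transmission amplitude at each beam splitter
r = 1j / math.sqrt(2)    # reflection amplitude (factor of i on reflection)

# Empty interferometer: the two paths to the dark port interfere destructively.
dark_empty = t * t + r * r
print(round(abs(dark_empty) ** 2, 12))   # 0.0 -- the dark port never clicks

# Live bomb in the reflected arm: it 'measures' which path the photon took.
p_explode = abs(r) ** 2                  # photon absorbed by the bomb
dark_bomb = t * t                        # only one path left, so no interference
print(round(p_explode, 12))              # 0.5
print(round(abs(dark_bomb) ** 2, 12))    # 0.25 -- a dark-port click reveals
                                         # a live bomb without touching it
```

The puzzle for decoherence-style accounts is visible in the numbers: the dark-port outcome carries genuine information about the bomb, yet no energy was exchanged with it on those runs.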

Also, I suspect the entropy part of this only works if we use coarse-graining, a trick which is common in calculations of entropy but which I think doesn't make sense as an explanation of the Second Law. I haven't said anything about coarse-graining in class, but if you know what it is and you're interested in criticisms of it see Michael Redhead, From Physics To Metaphysics, Cambridge: CUP, 1995, p.31.

It sounds like you don't need reading on decoherence in general, but in case you want to read something really good, or in case anyone else does, I strongly recommend this: It's written by one of the people who taught me this stuff (not that I'm fit to tie his shoelaces; he's a real expert on this topic and I'm not). The same author, Guido Bacciagaluppi, has also written this, which is precisely about the question of whether decoherence can explain an arrow of time: he thinks it can, but not as a fundamental explanation, only as what we see in situations that happen to be typical, which is the same as the Second Law.