iMechanica - arrow of time
https://www.imechanica.org/taxonomy/term/1056
Understanding the 2nd Law of Thermodynamics
https://www.imechanica.org/node/1563
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>
Sean Carroll at <a href="http://cosmicvariance.com/" target="_blank">CosmicVariance</a> has an interesting set of <a href="http://preposterousuniverse.com/talks/time-colloq-07/" target="_blank">slides</a> on the 2nd law of thermodynamics and the arrow of time. In this age of simulations, one question that came to my mind was whether we could do a molecular dynamics simulation of a glass of ice cubes, allow them to melt, and then run the simulation backward in time to recover the ice cubes. Probably not. Yet the fundamental physical laws have no built-in time directionality. So what gives? More puzzling still, the universe appears to have had a low-entropy beginning, yet a naive time-symmetric statistical argument would suggest that we are at an entropy minimum at this point in time, with entropy increasing toward both the past and the future.
</p>
<p>
Some answers can be found in the comments on Sean's post.
</p>
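<p>
The time symmetry of the microscopic laws can be seen in miniature with a velocity Verlet integrator: run it forward, flip the velocities, run it again, and the trajectory retraces itself almost exactly. The sketch below uses an invented one-dimensional quartic oscillator, not an actual ice-melting MD run:
</p>

```python
# Velocity Verlet for a quartic oscillator, V(x) = x**4 / 4 (an invented
# toy potential). Verlet is exactly time-reversible: flipping the velocity
# and integrating the same number of steps retraces the trajectory,
# up to floating-point rounding.
def accel(x):
    return -x ** 3  # force = -dV/dx for the quartic potential

def verlet(x, v, dt, nsteps):
    a = accel(x)
    for _ in range(nsteps):
        x = x + v * dt + 0.5 * a * dt ** 2
        a_new = accel(x)
        v = v + 0.5 * (a + a_new) * dt
        a = a_new
    return x, v

x0, v0 = 1.0, 0.0
x1, v1 = verlet(x0, v0, dt=1e-3, nsteps=10_000)   # run forward
x2, v2 = verlet(x1, -v1, dt=1e-3, nsteps=10_000)  # flip velocity, run again
print(abs(x2 - x0), abs(v2 + v0))  # both tiny: the dynamics retrace themselves
```

<p>
For one smooth degree of freedom the reversal is essentially perfect. For a real many-particle system like melting ice, chaotic dynamics amplify rounding errors exponentially, which is one practical reason the backward run would fail even though the equations themselves are reversible.
</p>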
<ul><li>
<p>
<a href="http://math.ucr.edu/home/baez/">John Baez</a> clarifies some points (including the Boltzmann H-theorem), saying:
</p>
<p>
<em><br />
So, Sean wrote:<br /></em>
</p>
<ul><li>
<p>
<em><br />
Boltzmann’s H-theorem, while interesting and important, is even worse. It makes an assumption that is not true (molecular chaos) to reach a conclusion that is not true (the entropy is certain, not just likely, to increase toward the future — and also to the past).<br /></em>
</p>
</li>
</ul><p>
<em>whereas when I last thought about this theorem, I claimed otherwise (see Blake Stacey’s comment). </em>
</p>
<p>
<em><br />
So, is the assumption behind Boltzmann’s H-theorem time-symmetric or not? And what about the theorem’s conclusion: time-symmetric or not? Is the conclusion that entropy increases </em><em>both in the future and the past… or just towards the future? I’m pretty darn sure the assumptions and conclusions are time-asymmetric. Boltzmann actually called his assumption the “Stosszahlansatz”, or “collision number assumption”.<br /></em>
</p>
<p>
<em><br />
So: what’s the Stosszahlansatz? It goes like this. </em>
</p>
<p>
<em><br />
Suppose we have a homogeneous gas of particles and the density of them with momentum p is f(p). Consider only 2-particle interactions and let w(p1, p2; p1′, p2′) be the transition rate at which pairs of particles with momenta p1, p2 bounce off each other and become pairs with momenta p1′, p2′. To keep things simple let me assume symmetry of the basic laws under time and space reversal, which gives:<br /></em>
</p>
<p>
<em><br />
w(p1, p2; p1′, p2′) = w(p1′, p2′; p1, p2).<br /></em>
</p>
<p>
<em><br />
In this case the Stosszahlansatz says:<br /></em>
</p>
<p>
<em><br />
df(p1)/dt = integral w(p1, p2; p1′, p2′) [f(p1′)f(p2′) - f(p1)f(p2)] dp2 dp1′ dp2′<br /></em>
</p>
<p>
<em><br />
This is very sensible-looking if you think about it. Using this, Boltzmann proves the H-theorem. Namely, the derivative of the following function is less than or equal to zero:<br /></em>
</p>
<p>
<em><br />
H(t) = integral f(p) ln f(p) dp<br /></em>
</p>
<p>
<em><br />
This is basically minus the entropy. So, entropy increases, given the Stosszahlansatz! The proof is an easy calculation, and you can find it in section 3.1 Zeh’s </em><em>The Physical Basis of the Direction of Time (a good book). Now: since the output of the H-theorem is time-asymmetric, and all the inputs are time-symmetric except the Stosszahlansatz, we should immediately suspect that the Stosszahlansatz is time-asymmetric.<br /></em>
</p>
<p>
<em><br />
And it is! <br /></em>
</p>
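<p>
The H-theorem Baez describes can be checked numerically in a minimal setting. The sketch below (my own illustration, not part of the quoted comment) uses an invented four-velocity Broadwell-type gas whose Stosszahlansatz-style collision term is a product of one-particle densities, and verifies that H(t) never increases:
</p>

```python
import math

# Broadwell-type discrete-velocity gas: velocities E, W, N, S.
# Head-on pairs scatter into each other: (E, W) <-> (N, S).
# The molecular-chaos (Stosszahlansatz) collision term uses products
# of one-particle densities, f_N f_S - f_E f_W, with no pair correlations.
def step(f, dt):
    fE, fW, fN, fS = f
    gain = fN * fS - fE * fW  # net rate into the (E, W) pair
    return [fE + dt * gain, fW + dt * gain,
            fN - dt * gain, fS - dt * gain]

def H(f):
    # Boltzmann's H-function: the integral of f ln f becomes a sum here
    return sum(x * math.log(x) for x in f)

f = [0.4, 0.4, 0.1, 0.1]  # out of equilibrium: E/W overpopulated
hs = [H(f)]
for _ in range(2000):
    f = step(f, 0.01)
    hs.append(H(f))

# H decreases monotonically toward its equilibrium value ln(1/4)
assert all(hs[i + 1] <= hs[i] + 1e-12 for i in range(len(hs) - 1))
print(hs[0], hs[-1])
```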
</li>
</ul><ul><li>
<p>
Sean's response was:
</p>
<p>
<em><br />
John’s comment <a rel="nofollow" href="http://cosmicvariance.com/2007/06/11/latest-declamations-about-the-arrow-of-time/#comment-285605">above</a> about the Stosszahlanzatz is basically right, but there is a little bit more behind the story. (And it is all there in Zeh’s book, if you bother to unpack it.)<br /></em>
</p>
<p>
<em><br />
Boltzmann actually did two innocuous-sounding things that enabled him to miraculously derive a time-asymmetric conclusion from time-symmetric microscopic laws. First, he worked with distribution functions defined in the </em><em>single-particle</em> <em>phase space. That is, if you have two particles, you can either keep track of them as one point in a two-particle phase space (separate dimensions for the position and momentum of each particle) or as two points in the single-particle phase space. Both are completely equivalent. But if you go to a</em> <em>distribution function in that single-particle phase space, f(q,p), then you have thrown away information — implicitly, you’ve coarse-grained. In particular, you can’t keep track of the correlations between different particle momenta. There’s no way, for example, to encode “the momentum of particle two is opposite to that of particle one” in such a distribution function. (You can keep track of it with a distribution function on the full multi-particle phase space, which is what Gibbs ultimately used. But there entropy is strictly conserved, unless you coarse-grain by hand.)<br /></em>
</p>
<p>
<em><br />
Because you’ve thrown away information, there is no autonomous dynamics for the distribution function. That is, given f(p,q), you can’t just derive an equation for its time derivative from the Hamiltonian equations of motion. You need to make another assumption, which for Boltzmann was the Stosszahlansatz referred to above. You can justify it by saying that “at the initial moment, there truly are no momentum correlations” (plus some truly innocent technical assumptions, like neglecting collisions with more than two particles). But of course the real Hamiltonian dynamics then instantly creates momentum correlations. So that innocent-sounding assumption is equivalent to “there are no momentum correlations </em><em>before the collisions (even though there will be afterwards).” Which begins to sound a bit explicitly time-asymmetric.<br /></em>
</p>
<p>
<em><br />
The way I presented the story in my talk was to strictly impose molecular chaos (no momentum correlations) at one moment in time. That’s really breaking time-translation invariance, not time-reversal. From that you could straightforwardly derive that entropy should increase to the past and the future, given the real Hamiltonian dynamics. What the real Boltzmann equation does is effectively to assume molecular chaos, chug forward one timestep, and then re-assume molecular chaos. It’s equivalent to a dynamical coarse-graning, because the distribution function on the single-particle phase space can’t carry along all the fine-grained information</em>.
</p>
</li>
</ul>
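<p>
Sean's point that a single-particle distribution throws away pair correlations is easy to see concretely. In the sketch below (an invented two-momentum toy, not from either quoted comment), a joint distribution with perfectly anticorrelated momenta and a fully independent one have identical one-particle marginals, so the marginal alone cannot distinguish them:
</p>

```python
import itertools

momenta = [-1, +1]

# Joint distribution with perfect anticorrelation: p2 is always -p1.
correlated = {(p1, p2): (0.5 if p2 == -p1 else 0.0)
              for p1, p2 in itertools.product(momenta, repeat=2)}

# Independent (molecular-chaos-like) joint with the same marginals.
independent = {(p1, p2): 0.25
               for p1, p2 in itertools.product(momenta, repeat=2)}

def marginal(joint):
    # Project the two-particle distribution onto a single-particle f(p)
    return {p: sum(joint[(p, q)] for q in momenta) for p in momenta}

def momentum_correlation(joint):
    return sum(w * p1 * p2 for (p1, p2), w in joint.items())

# The one-particle distributions are identical...
assert marginal(correlated) == marginal(independent)
# ...but the pair correlation the marginal cannot see is very different.
print(momentum_correlation(correlated), momentum_correlation(independent))  # -1.0 0.0
```

<p>
This is the coarse-graining hidden in working with f(q,p): the information "particle two moves opposite to particle one" simply has nowhere to live.
</p>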
<p>
What do you think?
</p>
</div></div></div>
Posted Fri, 15 Jun 2007 18:58:56 +0000 by Biswajit Banerjee, https://www.imechanica.org/node/1563