Learning temperature, pressure, and chemical potential

Zhigang Suo

I have updated sections of my notes on thermodynamics.  A few thoughts on learning are collected here.  Of our world the following facts are known:

  • An isolated system has a set of quantum states, or microstates, for want of a single word.
  • The isolated system flips rapidly and ceaselessly from one microstate to another.
  • After a system is isolated for a long time, all microstates of the system are equally probable (the fundamental postulate).

For an account of these facts, see notes on Isolated Systems (node/290).
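As a toy illustration (my own sketch, not from the notes), the three facts can be mimicked in a few lines of code: a small fixed set of microstates, rapid random flips between them, and, after many flips, near-equal visiting frequencies, consistent with the fundamental postulate.

```python
import random
from collections import Counter

# Toy model of an isolated system: a small, fixed set of microstates.
# The system flips rapidly and ceaselessly from one microstate to another;
# modeling the flips as uniformly random jumps is an assumption of this sketch.
microstates = range(6)
flips = 100_000

visits = Counter(random.choice(microstates) for _ in range(flips))

# After many flips, every microstate is visited with nearly equal frequency,
# i.e. all microstates are equally probable.
for state in microstates:
    print(state, visits[state] / flips)  # each frequency is close to 1/6
```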

A particular way to exploit these facts is as follows. A subset of microstates of an isolated system is called a macrostate. For a system that has been isolated for a long time, among a given family of macrostates of the system, the most probable macrostate is the one with the largest number of microstates.

The logarithm of the number of microstates is called the entropy.
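A standard toy example (two-state spins, not taken from the notes) makes the counting concrete: each microstate is a particular up/down configuration, a macrostate lumps together all configurations with the same number of up spins, and the most probable macrostate is the one with the largest number of microstates, hence the largest entropy.

```python
import math

# An isolated system of N two-state spins. A macrostate is the set of all
# microstates with the same number n of up spins.
N = 20

# Number of microstates in macrostate n: the binomial coefficient C(N, n).
omega = {n: math.comb(N, n) for n in range(N + 1)}

# Entropy of a macrostate is the logarithm of its number of microstates.
entropy = {n: math.log(omega[n]) for n in range(N + 1)}

# With all 2**N microstates equally probable, the most probable macrostate
# is the one with the largest number of microstates (largest entropy).
most_probable = max(omega, key=omega.get)
print(most_probable)        # n = N // 2 = 10
print(sum(omega.values()))  # total number of microstates: 2**20 = 1048576
```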

Thermodynamics is then formulated by constructing isolated systems and macrostates. One object of thermodynamics is to determine entropies for various macrostates, by a combination of experiments and calculations.  Another object of thermodynamics is to pick the macrostate with the largest entropy.

We have seen three basic examples:

  • By allowing two systems to exchange energy, we introduce temperature (node/291).
  • By allowing two systems to exchange energy and space, we introduce pressure (node/885).
  • By allowing two systems to exchange energy and matter, we introduce chemical potential (node/911).

The mathematics is simple enough.  For a system with variable energy, volume, and the number of a species of molecules, the entropy of the system is a function of the three variables.  When two systems are in contact, the composite of the two systems is an isolated system.  The entropy of the composite is the sum of the entropy of one system and the entropy of the other system.

The maximization problem leads to the partial derivatives of the entropy with respect to energy, volume, and number of molecules.  We then name the partial derivatives using combinations of temperature, pressure, and chemical potential.  Each of the three quantities, however, poses distinct difficulties to learn.
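A minimal numerical sketch of the maximization, in the simplest case: two systems exchange only energy, with the total energy fixed. The entropy functions S_i(U) = C_i ln U are illustrative assumptions (they mimic ideal gases), not taken from the notes; the point is that the maximum occurs where the derivatives dS/dU, i.e. the quantities 1/T, are equal.

```python
import math

# Illustrative entropy functions S_i(U) = C_i * ln(U); C1, C2 are assumed
# constants, and the total energy of the composite is fixed.
C1, C2 = 3.0, 5.0
U_total = 80.0

def composite_entropy(U1):
    # Entropy of the composite is the sum of the two entropies.
    return C1 * math.log(U1) + C2 * math.log(U_total - U1)

# Find the partition of energy that maximizes the composite entropy by a
# brute-force scan over candidate values of U1.
U1_best = max((u / 1000 * U_total for u in range(1, 1000)),
              key=composite_entropy)

# At the maximum, the partial derivatives dS1/dU1 and dS2/dU2 are equal;
# this is the condition of equal temperature, 1/T1 = 1/T2.
print(U1_best, C1 / U1_best, C2 / (U_total - U1_best))  # 30.0 0.1 0.1
```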

Temperature is familiar to everyone from daily experience. The familiarity, however, seems to breed not only contempt but also misunderstanding. The essential steps in learning are to strip away extraneous ideas associated with the daily use of temperature, and to link temperature to the derivative of entropy with respect to energy. After these steps, the familiarity with temperature aids, rather than retards, the understanding of thermodynamics.
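In symbols, the link is the standard definition:

```latex
\begin{align}
  \frac{1}{T} = \frac{\partial S(U, V, N)}{\partial U}
\end{align}
```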

Pressure is familiar to us mechanicians. The ideas of force and work are so embedded that any other way to introduce pressure seems to be unnatural. Yet this familiarity sheds no insight into phenomena such as osmosis and ideal gas. To equilibrate two systems exchanging volume, what matters is the derivative of entropy with respect to volume. However, instead of giving this derivative a distinct name, we call this derivative the ratio of the pressure over temperature.
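As a check that this definition of pressure does illuminate the ideal gas, recall that the number of microstates of N non-interacting molecules grows with volume as V^N, so the entropy contains a term N k_B ln V (the volume-independent terms are suppressed here). The derivative of entropy with respect to volume then recovers the ideal gas law:

```latex
\begin{align}
  S &= N k_B \ln V + \cdots,\\
  \frac{\partial S}{\partial V} &= \frac{N k_B}{V} = \frac{p}{T}
  \quad\Longrightarrow\quad pV = N k_B T.
\end{align}
```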

Chemical potential is introduced within thermodynamics, so that we do not need to fight against familiarity. The difficulty in learning chemical potential is becoming familiar with it. How is chemical potential determined in experiments? What does chemical potential do? The definition of chemical potential has the same issue as that of pressure. To equilibrate two systems exchanging molecules, what matters is the derivative of entropy with respect to the number of molecules. However, instead of giving this derivative a distinct name, we call this derivative the ratio of the chemical potential over temperature.
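In the standard sign convention, the derivative that controls the exchange of molecules carries a minus sign:

```latex
\begin{align}
  \frac{\partial S(U, V, N)}{\partial N} = -\frac{\mu}{T}
\end{align}
```

Two systems exchanging both energy and molecules equilibrate when T_1 = T_2 and \mu_1 = \mu_2.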

Some comfort is gained when we examine energy as a function of entropy, volume, and the number of molecules. But this comfort comes at a cost: it removes us one more step away from the fundamental postulate.  We twist our presentation to conform to Gibbs's preference.
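In this energy representation, the same three quantities appear as the standard partial derivatives of energy:

```latex
\begin{align}
  \mathrm{d}U &= T\,\mathrm{d}S - p\,\mathrm{d}V + \mu\,\mathrm{d}N,\\
  T = \frac{\partial U}{\partial S}, \quad
  p &= -\frac{\partial U}{\partial V}, \quad
  \mu = \frac{\partial U}{\partial N}.
\end{align}
```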

Comments

Rui Huang

Dear Zhigang,

I find that I can always learn more about mechanics and thermodynamics from your notes. Your discussions on temperature, pressure, and chemical potential are very enlightening to me: not because I do not understand these fundamental concepts, but because they help me understand them at a higher level. Although I have yet to digest all your collected thoughts and will have to spend more time with your notes, there is one question I cannot wait to ask. While pressure is commonly introduced as a state variable in thermodynamics, I have not seen much of the same for stress. As for many conventional mechanicians, I see pressure as part of stress (for both solids and fluids). The question is then: should stress play the same role as pressure, for example, in determining the number of microstates or entropy?

Thanks. 

RH

Zhigang Suo

Dear Rui:  Thank you for your kindness.  I have also updated my notes on finite deformation, where stress is discussed within the context of thermodynamics.  The notes need more work, but they might be of some use to you.  Let's talk more about the ideas when you have a chance to look at the notes.

Arash Yavari

Dear Zhigang:

Thank you for sharing your notes. I had a quick look and have a couple of quick questions/comments.

On the second page, you mention that: "The combined growth and deformation clearly does not preserve the identity of each material particle." I think the traditional formulation needs fundamental changes for surface growth (new material points are added) but for bulk growth, there are many works in the literature that assume material points are preserved. In "growth" theories, conservation of mass is modified as mass is added or removed. Another key approach in the existing works is a multiplicative decomposition of deformation gradient into elastic and growth parts, very much like what is traditionally done in finite plasticity.

On the same page you mention that "we may not be able to always set the reference state as the unstressed state." This is correct in the case of Euclidean spaces; however, you may be able to embed a body with residual stresses in a non-Euclidean space such that, in the new space, it is stress free.

On page 17, you assume that entropy density depends on deformation gradient. On page 23 you mention that material-frame-indifference enables one to write the free energy density as a function of C. Why wouldn't you do the same for entropy density? If entropy density depends on F, then it depends on the first Piola-Kirchhoff stress as well (at least implicitly)? Perhaps this is related to Rui's question?

Regards,
Arash

Zhigang Suo

Dear Arash:  Really appreciate your comments on the notes on finite deformation.  Quick responses follow.

  • Growth.  I agree with your comments.  People have already had some theories about growth, and it will be fascinating to study them as theories and as applications to real phenomena.
  • Reference state.  The reference state does not need to be an actual state of the body.  Of course, one may wish to select the reference state from the actual states.
  • Entropy.  I'll add a sentence to the notes to state that the entropy is a function of C.  Linking the entropy to stress requires the condition of thermodynamic equilibrium, as discussed in the notes.  This requirement is similar to relating the entropy of a system to the temperature.