Entropy (thermodynamics)
Revision as of 07:29, 8 November 2009


Entropy is a function of the state of a thermodynamic system. It is a size-extensive[1] quantity, invariably denoted by S, with the dimension of energy divided by temperature (SI unit: joule/K). Unlike volume, a comparable size-extensive state parameter, entropy has no obvious mechanical meaning. Moreover, entropy cannot be measured directly; there is no such thing as an entropy meter, whereas state parameters like volume and temperature are easily determined. Consequently, entropy is one of the least understood concepts in physics.[2]

Excerpt from Clausius (1865).
Translation: In search of a distinctive name for S, one could call the quantity S the transformation content of the body, similarly as one could call the quantity U the heat and work content of the body. As I deem it better to derive the name for a quantity that is so important for science from the antique languages, so that it can be used without change in all modern languages, I propose to call the quantity S the entropy of the body, after the Greek word for transformation ἡ τροπή. On purpose I have constructed the word entropy to resemble as much as possible the word energy, as both quantities to be called by these words are so closely related in their physical meaning. Hence a certain similarity in assigning their names seemed useful to me.

The state variable "entropy" was introduced by Rudolf Clausius in 1865,[3] when he gave a mathematical formulation of the second law of thermodynamics (see the inset for his original text).

The traditional way of introducing entropy is by means of a Carnot engine, an abstract engine conceived in 1824 by Sadi Carnot[4] as an idealization of a steam engine. Carnot's work foreshadowed the second law of thermodynamics. The "engineering" manner—by an engine—of introducing entropy will be discussed below. In this approach, entropy is the amount of heat (per kelvin) gained or lost by a thermodynamic system that makes a transition from one state to another. The second law states that the entropy of an isolated system increases in spontaneous (natural) processes leading from one state to another, whereas the first law states that the internal energy of the system is conserved.

In 1877 Ludwig Boltzmann[5] gave a definition of entropy in the context of the kinetic theory of gases, a branch of physics that developed into statistical thermodynamics. Boltzmann's definition was extended by John von Neumann[6] to a quantum statistical definition. The quantum statistical point of view, too, will be reviewed in the present article. In the statistical approach the entropy of an isolated (constant-energy) system is kB log Ω, where kB is Boltzmann's constant and Ω is the number of different wave functions ("microstates") of the system belonging to the system's energy; Ω is the degree of degeneracy, and in equilibrium the system is equally likely to be found in any one of the Ω microstates. The function log stands for the natural (base e) logarithm.
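The statistical definition can be made concrete with a toy model. The following sketch is not from the original article: the two-level system and all numbers are illustrative assumptions. It counts the microstates Ω = C(N, n) of N independent two-level particles with n excited (all such states share the same energy), takes S = kB log Ω, and checks that doubling the system at fixed intensive state roughly doubles S, i.e., that this entropy is size-extensive:

```python
from math import comb, log

kB = 1.380649e-23  # Boltzmann's constant, J/K

def entropy(N, n):
    """S = kB * log(Omega) for N two-level particles with n excited:
    Omega = C(N, n) microstates share the same total energy."""
    return kB * log(comb(N, n))

S1 = entropy(1000, 300)
S2 = entropy(2000, 600)   # same intensive state (30% excited), twice the size
print(S2 / S1)            # close to 2: this entropy is size-extensive
```

The ratio is not exactly 2 because of sub-extensive (logarithmic) corrections to log C(N, n), which become negligible for large N.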

Not satisfied with the engineering type of argument, the mathematician Constantin Carathéodory gave in 1909 a new axiomatic formulation of entropy and the second law of thermodynamics.[7] His theory was based on Pfaffian differential equations. His axiom replaced the earlier Kelvin-Planck and the equivalent Clausius formulation of the second law and did not need Carnot engines. Carathéodory's work was taken up by Max Born,[8] and it is treated in a few textbooks.[9] Since it requires more mathematical knowledge than the traditional approach based on Carnot engines, and since this mathematical knowledge is not needed by most students of thermodynamics, the traditional approach is still dominant in the majority of introductory works on thermodynamics.

Traditional definition

The state (a point in state space) of a thermodynamic system is characterized by a number of variables, such as pressure p, temperature T, amount of substance n, volume V, etc. Any thermodynamic parameter can be seen as a function of an arbitrary independent set of other thermodynamic variables, hence the terms "property", "parameter", "variable" and "function" are used interchangeably. The number of independent thermodynamic variables of a system is equal to the number of energy contacts of the system with its surroundings.

An example of a reversible (quasi-static) energy contact is offered by the prototype thermodynamic system, a gas-filled cylinder with piston. Such a cylinder can perform work on its surroundings,

    DW = p dV,

where dV stands for a small increment of the volume V of the cylinder, p is the pressure inside the cylinder, and DW stands for a small amount of work. Work by expansion is a form of energy contact between the cylinder and its surroundings. This process can be reversed: the volume of the cylinder is decreased, the gas is compressed, and the surroundings perform the work DW = p dV < 0 on the cylinder.

The small amount of work is indicated by D, and not by d, because DW is not necessarily the differential of a function. However, when we divide DW by p, the quantity DW/p becomes equal to the differential dV of the differentiable state function V. State functions depend only on the actual values of the thermodynamic parameters (they are local in state space), and not on the path along which the state was reached (the history of the state). Mathematically this means that integration from point 1 to point 2 along path I in state space is equal to integration along a different path II,

    ∫ dV (path I, 1→2) = ∫ dV (path II, 1→2) = V2 − V1.

The amount of work (divided by p) performed reversibly along path I is equal to the amount of work (divided by p) along path II. This condition is necessary and sufficient for DW/p to be the differential of a state function. So, although DW is not a differential, the quotient DW/p is one.
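As a numerical illustration (my own sketch, not part of the original article; the pressures and volumes are arbitrary assumptions), compare two paths between the same two states, each built from an isobaric and an isochoric leg. The work ∫ DW = ∫ p dV differs between the paths, while ∫ DW/p = ∫ dV gives V2 − V1 on both:

```python
# Two states in (p, V) space (arbitrary illustrative values).
p1, V1 = 2.0e5, 1.0e-3   # Pa, m^3
p2, V2 = 1.0e5, 3.0e-3

# Path I: isobaric expansion at p1 (V1 -> V2), then isochoric drop to p2.
# The isochoric leg has dV = 0, so it contributes neither work nor dV.
W_I  = p1 * (V2 - V1)    # integral of DW = p dV along path I
dV_I = V2 - V1           # integral of DW/p = dV along path I

# Path II: isochoric drop to p2 at V1, then isobaric expansion at p2.
W_II  = p2 * (V2 - V1)
dV_II = V2 - V1

print(W_I, W_II)         # 400 J vs 200 J: DW is path-dependent
print(dV_I == dV_II)     # True: DW/p integrates to the state function V
```

The work done depends on which leg carries the expansion, but dividing out p collapses both integrals to the same volume change.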

Reversible absorption of a small amount of heat DQ is another energy contact of a system with its surroundings; DQ is again not the differential of some function. In a manner completely analogous to that for DW/p, the following result can be shown for the heat DQ (divided by T) absorbed reversibly by the system along two different paths (along both paths the absorption is reversible):

    ∫ DQ/T (path I, 1→2) = ∫ DQ/T (path II, 1→2).        (1)

Hence the quantity dS defined by

    dS ≡ DQ/T

is the differential of a state variable S, the entropy of the system. In the next subsection equation (1) will be proved from the Kelvin-Planck principle. Observe that this definition of entropy fixes only entropy differences:

    S2 − S1 = ∫ DQ/T (1→2, reversible).

Note further that entropy has the dimension of energy per temperature (joule per kelvin) and, recalling the first law of thermodynamics (the differential dU of the internal energy satisfies dU = DQ − DW), it follows that

    dS = (dU + p dV)/T,   i.e.,   dU = T dS − p dV.
(For convenience's sake only a single work term was considered here, namely DW = p dV, the work done by the system.) The internal energy is an extensive quantity. The temperature T is an intensive property, independent of the size of the system. It follows that the entropy S is an extensive property. In that sense the entropy resembles the volume of the system. We reiterate that volume is a state function with a well-defined mechanical meaning, whereas entropy is introduced by analogy and is not easily visualized. Indeed, as is shown in the next subsection, it requires a fairly elaborate argument to prove that S is a state function, i.e., that equation (1) holds.
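The path independence of ∫ DQ/T can be checked numerically for a concrete system. The following sketch is my own illustration, not part of the article: it assumes 1 mol of monatomic ideal gas (Cv = 3R/2, DQ = Cv dT + (RT/V) dV) and arbitrary end states, integrates DQ and DQ/T along two different paths, and confirms that the heat depends on the path while the entropy change does not:

```python
R  = 8.314              # gas constant, J/(mol K)
Cv = 1.5 * R            # monatomic ideal gas, 1 mol (assumption)
T1, V1 = 300.0, 1.0e-3  # initial state (K, m^3), illustrative
T2, V2 = 600.0, 2.0e-3  # final state

def integrate(path, n=100000):
    """Sum DQ = Cv*dT + (R*T/V)*dV and DQ/T along a parametrized
    path s -> (T(s), V(s)), using midpoint values of T and V."""
    S = Q = 0.0
    for i in range(n):
        (Ta, Va), (Tb, Vb) = path(i / n), path((i + 1) / n)
        Tm, Vm = (Ta + Tb) / 2, (Va + Vb) / 2
        DQ = Cv * (Tb - Ta) + (R * Tm / Vm) * (Vb - Va)
        S += DQ / Tm
        Q += DQ
    return S, Q

# Path I: heat at constant volume, then expand at constant temperature.
pathI  = lambda s: (T1 + (T2-T1)*min(2*s, 1), V1 + (V2-V1)*max(2*s-1, 0))
# Path II: straight line in (T, V) state space.
pathII = lambda s: (T1 + (T2-T1)*s, V1 + (V2-V1)*s)

S_I, Q_I   = integrate(pathI)
S_II, Q_II = integrate(pathII)
print(S_I, S_II)  # both ≈ Cv*ln(T2/T1) + R*ln(V2/V1) ≈ 14.41 J/K
print(Q_I, Q_II)  # clearly different: heat Q is path-dependent
```

Both paths reproduce the closed-form entropy change Cv ln(T2/T1) + R ln(V2/V1), while the absorbed heats differ by roughly a kilojoule.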

Proof that entropy is a state function

Equation (1) gives the sufficient condition that the entropy S is a state function. The standard proof of equation (1), given below, is physical: it proceeds by means of an engine performing Carnot cycles and is based on the Kelvin-Planck formulation of the second law of thermodynamics.

[Figure: an arbitrary closed system C coupled to a reversible heat engine E, which exchanges heat with a large heat reservoir R of constant temperature T0.]

Consider the figure. A system, consisting of an arbitrary closed system C (only heat goes in and out) and a reversible heat engine E, is coupled to a large heat reservoir R of constant temperature T0. The system C undergoes a cyclic state change 1-2-1. Since no work is performed on or by C, and C returns to its initial state, it follows that

    0 = ∮ dU = ∮ DQ,

where DQ is the heat delivered reversibly by the engine E to C.

For the heat engine E it holds (by the definition of thermodynamic temperature) that

    DQ0/T0 = DQ/T,

where DQ0 is the heat that E extracts from the reservoir R while it delivers DQ to C at temperature T.

Hence, integrating over the closed cycle 1-2-1,

    ∮ DQ0 = T0 ∮ DQ/T.

From the Kelvin-Planck principle it follows that the total work W performed by the engine E during the cycle is necessarily less than or equal to zero, because there is only the single heat source R from which W is extracted. Invoking the first law of thermodynamics (for the engine E, DW0 = DQ0 − DQ) we get

    W = ∮ (DQ0 − DQ) = ∮ DQ0 = T0 ∮ DQ/T ≤ 0,

where ∮ DQ = 0 was used, so that

    ∮ DQ/T = ∫ DQ/T (path I, 1→2) − ∫ DQ/T (path II, 1→2) ≤ 0.

Because the processes inside C and E are assumed reversible, all arrows can be reversed (the cycle is traversed with path II first) and in the very same way it is shown that

    ∫ DQ/T (path II, 1→2) − ∫ DQ/T (path I, 1→2) ≤ 0,

so that the two inequalities can hold simultaneously only as equalities, and equation (1) follows:

    ∫ DQ/T (path I, 1→2) = ∫ DQ/T (path II, 1→2).
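The key identity in this proof, DQ0/T0 = DQ/T for a reversible engine, can be sanity-checked numerically. The sketch below is my own illustration, not part of the article: it assumes a Carnot cycle of 1 mol of ideal gas between two reservoirs (temperatures and volumes are arbitrary) and uses the standard result that the two adiabats force the cold isotherm to have the same volume ratio as the hot one. It verifies that DQ/T sums to zero over the reversible cycle and that the efficiency equals 1 − Tc/Th:

```python
from math import log

R = 8.314                 # gas constant, J/(mol K)
Th, Tc = 500.0, 300.0     # reservoir temperatures, K (illustrative)
Va, Vb = 1.0e-3, 2.0e-3   # volumes bounding the hot isotherm, m^3

# Isothermal legs of an ideal gas exchange Q = R*T*log(V_end/V_start);
# the adiabats give the cold isotherm the same volume ratio Vb/Va.
Qh = R * Th * log(Vb / Va)   # heat absorbed from the hot reservoir
Qc = R * Tc * log(Vb / Va)   # heat rejected to the cold reservoir

print(Qh / Th - Qc / Tc)          # ≈ 0: DQ/T sums to zero over the cycle
print(1 - Qc / Qh, 1 - Tc / Th)   # equal Carnot efficiencies (0.4 here)
```

That the sum of DQ/T vanishes for this reversible cycle is exactly the state-function property that the proof above establishes in general.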

Quantum statistical entropy

Footnotes

  1. A size-extensive property of a system becomes x times larger when the system is enlarged by a factor x, provided all intensive parameters remain the same upon the enlargement. Intensive parameters, like temperature, density, and pressure, are independent of size.
  2. It is reported that in a conversation with Claude Shannon, John (Johann) von Neumann said: "In the second place, and more important, nobody knows what entropy really is [..]”. M. Tribus, E. C. McIrvine, Energy and information, Scientific American, vol. 224 (September 1971), pp. 178–184.
  3. R. J. E. Clausius, Über verschiedenen für die Anwendung bequeme Formen der Hauptgleichungen der Mechanischen Wärmetheorie [On several forms of the fundamental equations of the mechanical theory of heat that are useful for application], Annalen der Physik (at the time Poggendorff's Annalen der Physik und Chemie), vol. 125, pp. 352–400 (1865). Around the same time Clausius wrote a two-volume treatise: R. J. E. Clausius, Abhandlungen über die mechanische Wärmetheorie [Treatise on the mechanical theory of heat], F. Vieweg, Braunschweig (vol. I: 1864, vol. II: 1867). The 1865 Annalen paper was reprinted in the second volume of the Abhandlungen and included in the 1867 English translation.
  4. S. Carnot, Réflexions sur la puissance motrice du feu et sur les machines propres à développer cette puissance (Reflections on the motive power of fire and on machines suited to develop that power), Chez Bachelier, Paris (1824).
  5. L. Boltzmann, Über die Beziehung zwischen dem zweiten Hauptsatz der mechanischen Wärmetheorie und der Wahrscheinlichkeitsrechnung respektive den Sätzen über das Wärmegleichgewicht, [On the relation between the second fundamental law of the mechanical theory of heat and the probability calculus with respect to the theorems of heat equilibrium] Wiener Berichte vol. 76, pp. 373-435 (1877)
  6. J. von Neumann, Mathematische Grundlagen der Quantenmechanik, [Mathematical foundation of quantum mechanics] Springer, Berlin (1932)
  7. C. Carathéodory, Untersuchungen über die Grundlagen der Thermodynamik [Investigation on the foundations of thermodynamics], Mathematische Annalen, vol. 67, pp. 355-386 (1909).
  8. M. Born, Physikalische Zeitschrift, vol. 22, pp. 218, 249, 282 (1921)
  9. H. B. Callen, Thermodynamics and an Introduction to Thermostatistics. John Wiley and Sons, New York, 2nd edition, (1965); E. A. Guggenheim, Thermodynamics, North-Holland, Amsterdam, 5th edition (1967)
